WorldWideScience

Sample records for sources including quantitative

  1. Pulmonary nodule characterization, including computer analysis and quantitative features.

    Science.gov (United States)

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  2. Do the enigmatic "Infrared-Faint Radio Sources" include pulsars?

    Science.gov (United States)

    Hobbs, George; Middelberg, Enno; Norris, Ray; Keith, Michael; Mao, Minnie; Champion, David

    2009-04-01

    The Australia Telescope Large Area Survey (ATLAS) team have surveyed seven square degrees of sky at 1.4 GHz. During processing some unexpected infrared-faint radio sources (IFRS sources) were discovered. The nature of these sources is not understood, but it is possible that some of these sources may be pulsars within our own galaxy. We propose to observe the IFRS sources with steep spectral indices using standard search techniques to determine whether or not they are pulsars. A pulsar detection would 1) remove a subset of the IFRS sources from the ATLAS sample, so they would not need to be observed with large optical/IR telescopes to find their hosts, and 2) be intrinsically interesting, as the pulsar would be a millisecond pulsar and/or have an extreme spatial velocity.

  3. Source term reduction at DAEC (including stellite ball recycling)

    International Nuclear Information System (INIS)

    Smith, R.; Schebler, D.

    1995-01-01

    The Duane Arnold Energy Center was seeking methods to reduce drywell dose rates due to Co-60. Duane Arnold is known in the industry to have one of the highest drywell dose rates on the industry-standardized 'BRAC' point survey. A prime method to reduce dose rates due to Co-60 is the accelerated replacement of stellite pins and rollers in control rod blades, owing to their high stellite (cobalt) content; the cobalt content of stellite alloys is usually greater than 60% by weight. During the RFO-12 refueling outage at Duane Arnold, all of the remaining cobalt-bearing control rod blades were replaced and new stellite-free control rod blades were installed in the core. This left Duane Arnold with the disposal of highly radioactive stellite pins and rollers. The processing of control rod blades for disposal is a very difficult evolution. First, the velocity limiter (the bottom portion of the component) and the highly radioactive upper stellite pins and rollers are separated from the control rod blade. Next, the remainder of the control rod blade is processed (chopped and/or crushed) to aid packaging the waste for disposal. The stellite bearings are then often carefully placed in with the rest of the waste in a burial liner to provide shielding for disposal, or more often are left as 'orphans' in the spent fuel pool because their high specific activity creates shipping and packaging problems. Further investigation by the utility showed that the stellite balls and pins could be recycled to a source manufacturer rather than disposed of in a low-level burial site. The cost savings to the utility was on the order of $200,000, with a gross savings of $400,000 in burial site charges. A second advantage of the recycling of the stellite pins and rollers was a reduction in radioactive waste shipments

  4. 77 FR 6463 - Revisions to Labeling Requirements for Blood and Blood Components, Including Source Plasma...

    Science.gov (United States)

    2012-02-08

    ... Blood Components, Including Source Plasma; Correction AGENCY: Food and Drug Administration, HHS. ACTION..., Including Source Plasma,'' which provided incorrect publication information regarding a 60-day notice that...

  5. Portable instrumentation for quantitatively measuring radioactive surface contaminations, including 90Sr

    International Nuclear Information System (INIS)

    Brodzinski, R.L.

    1983-10-01

    In order to measure the effectiveness of decontamination efforts, a quantitative analysis of the radiocontamination is necessary, both before and after decontamination. Since it is desirable to release the decontaminated material for unrestricted use or disposal, the assay equipment must provide adequate sensitivity to measure the radioactivity at or below the release limit. In addition, the instrumentation must be capable of measuring all kinds of radiocontaminants including fission products, activation products, and transuranic materials. Finally, the survey instrumentation must be extremely versatile in order to assay the wide variety of contaminated surfaces in many environments, some of which may be extremely hostile or remote. This communication describes the development and application of portable instrumentation capable of quantitatively measuring most transuranics, activation products, and fission products, including 90Sr, on almost any contaminated surface in nearly any location.

  6. Visible light scatter as quantitative information source on milk constituents

    DEFF Research Database (Denmark)

    Melentiyeva, Anastasiya; Kucheryavskiy, Sergey; Bogomolov, Andrey

    2012-01-01

    Fat and protein are two major milk nutrients that are routinely analyzed in the dairy industry. Growing food quality requirements promote the dissemination of spectroscopic analysis, enabling real ... analysis. The main task here is to extract individual quantitative information on milk fat and total protein content from spectral data. This is a particularly challenging problem in the case of raw natural milk, where the fat globule sizes may differ essentially depending on the source. ... A designed set of raw milk samples with simultaneously varying fat, total protein and particle size distribution has been analyzed in the Vis spectral region. The feasibility of raw milk analysis by PLS regression on spectral data has been proved. The root mean-square errors were below 0.10% and 0.04% for fat ...

  7. Quantitative Real-Time PCR Fecal Source Identification in the ...

    Science.gov (United States)

    Rivers in the Tillamook Basin play a vital role in supporting a thriving dairy and cheese-making industry, as well as providing a safe water resource for local human and wildlife populations. Historical concentrations of fecal bacteria in these waters are at times too high to allow for safe use, leading to economic loss, endangerment of local wildlife, and poor conditions for recreational use. In this study, we employ host-associated qPCR methods for human (HF183/BacR287 and HumM2), ruminant (Rum2Bac), cattle (CowM2 and CowM3), canine (DG3 and DG37), and avian (GFD) fecal pollution combined with high-resolution geographic information system (GIS) land use data and general indicator bacteria measurements to elucidate water quality spatial and temporal trends. Water samples (n=584) were collected over a 1-year period at 29 sites along the Trask, Kilchis, and Tillamook rivers and tributaries (Tillamook Basin, OR). A total of 16.6% of samples (n=97) yielded E. coli levels considered impaired based on Oregon Department of Environmental Quality bacteria criteria (406 MPN/100 mL). Host-associated genetic indicators were detected at frequencies of 39.2% (HF183/BacR287), 16.3% (HumM2), 74.6% (Rum2Bac), 13.0% (CowM2), 26.7% (CowM3), 19.8% (DG3), 3.2% (DG37), and 53.4% (GFD) across all water samples (n=584). Seasonal trends in avian, cattle, and human fecal pollution sources were evident over the study area. On a sample site basis, quantitative fecal source identification and

  8. Teaching Children How to Include the Inversion Principle in Their Reasoning about Quantitative Relations

    Science.gov (United States)

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Bell, Daniel; Barros, Rossana

    2012-01-01

    The basis of this intervention study is a distinction between numerical calculus and relational calculus. The former refers to numerical calculations and the latter to the analysis of the quantitative relations in mathematical problems. The inverse relation between addition and subtraction is relevant to both kinds of calculus, but so far research…

  9. Quantitative evaluation of SIMS spectra including spectrum interpretation and Saha-Eggert correction

    International Nuclear Information System (INIS)

    Steiger, W.; Ruedenauer, F.G.

    1978-01-01

    A spectrum identification program is described, using a computer algorithm which solely relies on the natural isotopic abundances for identification of elemental, molecular and cluster ions. The thermodynamic approach to the quantitative interpretation of SIMS spectra, through the use of the Saha-Eggert equation, is discussed, and a computer program is outlined. (U.K.)
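
    For orientation, the Saha–Eggert equation used for the correction above relates ion, electron and neutral densities in a plasma in local thermal equilibrium. A standard textbook form (notation assumed here, not reproduced from the paper):

```latex
% Saha-Eggert ionization equilibrium (standard form; notation assumed):
% n_i, n_e, n_a = ion, electron and neutral number densities;
% Z_i, Z_a = partition functions; E_i = ionization energy; T = temperature.
\frac{n_i\, n_e}{n_a}
  = 2\, \frac{Z_i(T)}{Z_a(T)}
    \left( \frac{2\pi m_e k T}{h^2} \right)^{3/2}
    \exp\!\left( -\frac{E_i}{kT} \right)
```

    Correcting measured ion currents with this equilibrium is what puts relative SIMS sensitivities on a quantitative footing.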

  10. Energy-Water Nexus Relevant to Baseload Electricity Source Including Mini/Micro Hydropower Generation

    Science.gov (United States)

    Fujii, M.; Tanabe, S.; Yamada, M.

    2014-12-01

    Water, food and energy are three sacred treasures that are necessary for human beings. However, recent factors such as population growth and the rapid increase in energy consumption have generated conflicts between water and energy. For example, enhanced energy use has caused conflicts between hydropower generation and riverine ecosystems and service water, between shale gas and groundwater, and between geothermal energy and hot spring water. This study aims to provide quantitative guidelines necessary for capacity building among various stakeholders to minimize water-energy conflicts in enhancing energy use. Among various kinds of renewable energy sources, we target baseload sources, especially focusing on renewable energy whose installation is required socially not only to reduce CO2 and other greenhouse gas emissions but also to stimulate the local economy. Such renewable energy sources include micro/mini hydropower and geothermal. Three municipalities in Japan, Beppu City, Obama City and Otsuchi Town, are selected as the primary sites of this study. Based on the calculated potential supply and demand of micro/mini hydropower generation in Beppu City, for example, we estimate that the electricity demand of tens to hundreds of households can be covered by installing new micro/mini hydropower generation plants along each river. However, this result is based on existing infrastructure such as roads and electric lines, which means that more potential is expected if the local society chooses options that enhance the infrastructure to increase micro/mini hydropower generation plants. In addition, further capacity building in the local society is necessary. In Japan, for example, regulations under the river law and irrigation rights restrict new entry by actors to the river. Possible influences on riverine ecosystems in installing new micro/mini hydropower generation plants should also be well taken into account. Deregulation of the existing laws relevant to rivers and

  11. Development of a quantitative safety assessment method for nuclear I and C systems including human operators

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2004-02-01

    Conventional PSA (probabilistic safety analysis) is performed in the framework of event tree analysis and fault tree analysis. In conventional PSA, I and C systems and human operators are assumed to be independent for simplicity. But the dependency of human operators on I and C systems, and the dependency of I and C systems on human operators, are gradually being recognized as significant. I believe that it is time to consider the interdependency between I and C systems and human operators in the framework of PSA. But, unfortunately, it seems that we do not have appropriate methods for incorporating the interdependency between I and C systems and human operators in the framework of PSA. Conventional human reliability analysis (HRA) methods were not developed to consider the interdependency, and the modeling of the interdependency using conventional event tree analysis and fault tree analysis seems to be, even though it does not seem to be impossible, quite complex. To incorporate the interdependency between I and C systems and human operators, we need a new method for HRA and a new method for modeling the I and C systems, man-machine interface (MMI), and human operators for quantitative safety assessment. As a new method for modeling the I and C systems, MMI and human operators, I develop a new system reliability analysis method, reliability graph with general gates (RGGG), which can substitute for conventional fault tree analysis. RGGG is an intuitive and easy-to-use method for system reliability analysis, while as powerful as conventional fault tree analysis. To demonstrate the usefulness of the RGGG method, it is applied to the reliability analysis of the Digital Plant Protection System (DPPS), which is the actual plant protection system of the Ulchin 5 and 6 nuclear power plants located in the Republic of Korea. The latest version of the fault tree for DPPS, which was developed by the Integrated Safety Assessment team at the Korea Atomic Energy Research Institute (KAERI), consists of 64

  12. Quantitative characterization of urban sources of organic aerosol by high-resolution gas chromatography

    International Nuclear Information System (INIS)

    Hildemann, L.M.; Mazurek, M.A.; Cass, G.R.; Simoneit, B.R.T.

    1991-01-01

    Fine aerosol emissions have been collected from a variety of urban combustion sources, including an industrial boiler, a fireplace, automobiles, diesel trucks, gas-fired home appliances, and meat cooking operations, by use of a dilution sampling system. Other sampling techniques have been utilized to collect fine aerosol samples of paved road dust, brake wear, tire wear, cigarette smoke, tar pot emissions, and vegetative detritus. The organic matter contained in each of these samples has been analyzed via high-resolution gas chromatography. By use of a simple computational approach, a quantitative, 50-parameter characterization of the elutable fine organic aerosol emitted from each source type has been determined. The organic mass distribution fingerprints obtained by this approach are shown to differ significantly from each other for most of the source types tested, using hierarchical cluster analysis
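
    As an illustration of the comparison step described above (hierarchical clustering of the 50-parameter fingerprints), here is a minimal sketch in Python; the data, cluster count and linkage choices are hypothetical, not those of the study:

```python
# Sketch: group emission-source fingerprints by hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
# Rows: source types; columns: the 50 mass-distribution parameters.
fingerprints = rng.random((12, 50))
# Normalize so sources are compared by profile shape, not total mass.
profiles = fingerprints / fingerprints.sum(axis=1, keepdims=True)

# Agglomerative clustering on Euclidean distances between profiles.
Z = linkage(profiles, method="average", metric="euclidean")
labels = fcluster(Z, t=4, criterion="maxclust")  # cut dendrogram into 4 groups
print(labels)
```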

  13. Human fecal source identification with real-time quantitative PCR

    Science.gov (United States)

    Waterborne diseases represent a significant public health risk worldwide, and can originate from contact with water contaminated with human fecal material. We describe a real-time quantitative PCR (qPCR) method that targets a Bacteroides dorei human-associated genetic marker for...

  14. Qualitative and Quantitative Evaluation of Multi-source Piroxicam ...

    African Journals Online (AJOL)

    The qualitative and quantitative evaluation of eleven brands of piroxicam capsules marketed in Nigeria is presented. The disintegration time, dissolution rate and absolute drug content were determined in simulated intestinal fluid (SIF) and simulated gastric fluid (SGF) without enzymes. Weight uniformity test was also ...

  15. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying the electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and a method of "external standard" were employed for nondestructive analysis of the 163Ho source, as well as an additional method of "internal standard". (author)
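
    For background, thin-target PIXE quantitation of this kind rests on the proportionality between characteristic X-ray yield and the number of target atoms. In a standard form (notation assumed, not taken from the paper):

```latex
% Thin-target PIXE yield (standard relation; notation assumed):
% Y = X-ray counts, N_p = incident protons, N_t = areal density of atoms,
% sigma_x = X-ray production cross section at proton energy E_p,
% eps = detector efficiency, Omega = detector solid angle.
Y = N_p\, N_t\, \sigma_x(E_p)\, \varepsilon\, \frac{\Omega}{4\pi}
\qquad\Longrightarrow\qquad
N_t = \frac{Y}{N_p\, \sigma_x(E_p)\, \varepsilon\, \Omega / 4\pi}
```

    Measuring an external standard of known N_t in the same geometry cancels N_p, ε and Ω, which is the essence of the "external standard" method mentioned above.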

  16. Multi-source quantitative photoacoustic tomography in a diffusive regime

    International Nuclear Information System (INIS)

    Bal, Guillaume; Ren, Kui

    2011-01-01

    Photoacoustic tomography (PAT) is a novel hybrid medical imaging technique that aims to combine the large contrast of optical coefficients with the high-resolution capabilities of ultrasound. We assume that the first step of PAT, namely the reconstruction of a map of absorbed radiation from ultrasound boundary measurement, has been done. We focus on quantitative photoacoustic tomography, which aims at quantitatively reconstructing the optical coefficients from knowledge of the absorbed radiation map. We present a non-iterative procedure to reconstruct such optical coefficients, namely the diffusion and absorption coefficients, and the Grüneisen coefficient when the propagation of radiation is modeled by a second-order elliptic equation. We show that PAT measurements allow us to uniquely reconstruct only two out of the above three coefficients, even when data are collected using an arbitrary number of radiation illuminations. We present uniqueness and stability results for the reconstructions of two such parameters and demonstrate the accuracy of the reconstruction algorithm with numerical reconstructions from two-dimensional synthetic data
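
    For context, the second-order elliptic model referred to above is typically the diffusion approximation to radiative transport. In a standard formulation (notation assumed), the first-step data H and the three unknown coefficients enter as:

```latex
% Standard diffusive QPAT model (notation assumed):
% u = photon fluence, g = boundary illumination, D = diffusion coefficient,
% mu_a = absorption coefficient, Gamma = Grueneisen coefficient.
-\nabla \cdot \bigl( D(x)\, \nabla u(x) \bigr) + \mu_a(x)\, u(x) = 0
  \quad \text{in } \Omega,
\qquad u = g \ \text{on } \partial\Omega,
\qquad H(x) = \Gamma(x)\, \mu_a(x)\, u(x)
```

    The paper's uniqueness result says that from data of the form H, only two of the triple (D, μ_a, Γ) can be recovered, however many illuminations g are used.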

  17. Auralization of airborne sound insulation including the influence of source room

    DEFF Research Database (Denmark)

    Rindel, Jens Holger

    2006-01-01

    The paper describes a simple and acoustically accurate method for the auralization of airborne sound insulation between two rooms by means of room acoustic simulation software (ODEON). The method makes use of a frequency-independent transparency of the transmitting surface combined ... with a frequency-dependent power setting of the source in the source room. The acoustic properties in terms of volume and reverberation time as well as the area of the transmitting surface are all included in the simulation. The user only has to select the position of the source in the source room and the receiver ... of the transmitting surface is used for the simulation of sound transmission. Also the reduced clarity of the auralization due to the reverberance of the source room is inherent in the method. Currently the method is restricted to transmission loss data in octave bands.

  18. A quantitative PGNAA study for use in aqueous solution measurements using Am–Be neutron source and BGO scintillation detector

    Energy Technology Data Exchange (ETDEWEB)

    Ghal-Eh, N., E-mail: ghal-eh@du.ac.ir [School of Physics, Damghan University, P.O. Box 36716-41167, Damghan (Iran, Islamic Republic of); Ahmadi, P. [School of Physics, Damghan University, P.O. Box 36716-41167, Damghan (Iran, Islamic Republic of); Doost-Mohammadi, V. [Nuclear Science and Technology Research Center, AEOI, P.O. Box 11365-8486, Tehran (Iran, Islamic Republic of)

    2016-02-01

    A prompt gamma neutron activation analysis (PGNAA) system including an Am–Be neutron source and a BGO scintillation detector is used for quantitative analysis of bulk samples. Both Monte Carlo-simulated and experimental data are considered as input data libraries for two different procedures based on neural network and least squares methods. The results confirm the feasibility and precision of the proposed methods.
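
    The least-squares procedure mentioned above amounts to unfolding a measured spectrum as a mixture of per-element library spectra. A minimal sketch under that interpretation (synthetic arrays, not the authors' code or data):

```python
# Sketch: non-negative least-squares unfolding of a PGNAA spectrum.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_channels, n_elements = 1024, 5
library = rng.random((n_channels, n_elements))  # per-element response spectra
true_w = np.array([0.8, 0.0, 1.5, 0.3, 0.0])    # hypothetical concentrations
measured = library @ true_w + rng.normal(0.0, 0.01, n_channels)  # noisy spectrum

weights, _ = nnls(library, measured)  # weights constrained to be non-negative
print(np.round(weights, 2))           # estimated element contributions
```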

  19. Quantitative determination of minor and trace elements in rocks and soils by spark source mass spectrometry

    International Nuclear Information System (INIS)

    Ure, A.M.; Bacon, J.R.

    1978-01-01

    Experimental details are given of the quantitative determination of minor and trace elements in rocks and soils by spark source mass spectrometry. The effects of interfering species, and corrections that can be applied, are discussed. (U.K.)

  20. Pseudodynamic Source Characterization for Strike-Slip Faulting Including Stress Heterogeneity and Super-Shear Ruptures

    KAUST Repository

    Mena, B.

    2012-08-08

    Reliable ground‐motion prediction for future earthquakes depends on the ability to simulate realistic earthquake source models. Though dynamic rupture calculations have recently become more popular, they are still computationally demanding. An alternative is to invoke the framework of pseudodynamic (PD) source characterizations that use simple relationships between kinematic and dynamic source parameters to build physically self‐consistent kinematic models. Based on the PD approach of Guatteri et al. (2004), we propose new relationships for PD models for moderate‐to‐large strike‐slip earthquakes that include local supershear rupture speed due to stress heterogeneities. We conduct dynamic rupture simulations using stochastic initial stress distributions to generate a suite of source models in the magnitude range Mw 6–8. This set of models shows that local supershear rupture speed prevails for all earthquake sizes, and that the local rise‐time distribution is not controlled by the overall fault geometry, but rather by local stress changes on the faults. Based on these findings, we derive a new set of relations for the proposed PD source characterization that accounts for earthquake size, buried and surface ruptures, and includes local rise‐time variations and supershear rupture speed. By applying the proposed PD source characterization to several well‐recorded past earthquakes, we verify that significant improvements in fitting synthetic ground motions to observed ones are achieved when comparing our new approach with the model of Guatteri et al. (2004). The proposed PD methodology can be implemented into ground‐motion simulation tools for more physically reliable prediction of shaking in future earthquakes.

  1. Synchrotron radiation as a source for quantitative XPS: advantages and consequences

    International Nuclear Information System (INIS)

    Rosseel, T.M.; Carlson, T.A.; Negri, R.E.; Beall, C.E.; Taylor, J.W.

    1986-01-01

    Synchrotron radiation (SR) has a variety of properties which make it an attractive source for quantitative x-ray photoelectron spectroscopy (XPS). Among the most significant are high intensity and tunability. In addition, the intensity of the dispersed radiation is comparable to that of laboratory line sources. Synchrotron radiation is also a clean source, i.e., it will not contaminate the sample, because it operates under ultra-high vacuum conditions. We have used these properties to demonstrate the advantages of SR as a source for quantitative XPS. We have also found several consequences associated with this source which can either limit its use or provide unique opportunities for analysis and research. Using the tunability of SR, we have measured the energy dependence of the 3p photoionization cross sections of Ti, Cr, and Mn from 50 to 150 eV above threshold at the University of Wisconsin's Tantalus electron-storage ring.

  2. Spectro-refractometry of individual microscopic objects using swept-source quantitative phase imaging.

    Science.gov (United States)

    Jung, Jae-Hwang; Jang, Jaeduck; Park, Yongkeun

    2013-11-05

    We present a novel spectroscopic quantitative phase imaging technique with a wavelength swept-source, referred to as swept-source diffraction phase microscopy (ssDPM), for quantifying the optical dispersion of microscopic individual samples. Employing the swept-source and the principle of common-path interferometry, ssDPM measures the multispectral full-field quantitative phase imaging and spectroscopic microrefractometry of transparent microscopic samples in the visible spectrum with a wavelength range of 450-750 nm and a spectral resolution of less than 8 nm. With unprecedented precision and sensitivity, we demonstrate the quantitative spectroscopic microrefractometry of individual polystyrene beads, 30% bovine serum albumin solution, and healthy human red blood cells.
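
    For reference, the quantity measured at each swept wavelength is the optical phase delay, which in the standard quantitative-phase-imaging relation (notation assumed, not quoted from the paper) ties to sample dispersion via:

```latex
% Phase-to-refractive-index relation (standard QPI form; notation assumed):
% h = sample thickness, n_s, n_m = sample and medium refractive indices.
\phi(x, y; \lambda) = \frac{2\pi}{\lambda}\,
  \bigl[ n_s(x, y; \lambda) - n_m(\lambda) \bigr]\, h(x, y)
```

    Sweeping λ across 450-750 nm therefore yields the dispersion n_s(λ) once the thickness h is known or decoupled.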

  3. A practical algorithm for distribution state estimation including renewable energy sources

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher [Electronic and Electrical Department, Shiraz University of Technology, Modares Blvd., P.O. 71555-313, Shiraz (Iran); Firouzi, Bahman Bahmani [Islamic Azad University Marvdasht Branch, Marvdasht (Iran)

    2009-11-15

    Renewable energy is energy that is in continuous supply over time. These kinds of energy sources are divided into five principal renewable sources of energy: the sun, the wind, flowing water, biomass and heat from within the earth. According to some studies carried out by the research institutes, about 25% of the new generation will be generated by Renewable Energy Sources (RESs) in the near future. Therefore, it is necessary to study the impact of RESs on the power systems, especially on the distribution networks. This paper presents a practical Distribution State Estimation (DSE) including RESs and some practical consideration. The proposed algorithm is based on the combination of Nelder-Mead simplex search and Particle Swarm Optimization (PSO) algorithms, called PSO-NM. The proposed algorithm can estimate load and RES output values by Weighted Least-Square (WLS) approach. Some practical considerations are var compensators, Voltage Regulators (VRs), Under Load Tap Changer (ULTC) transformer modeling, which usually have nonlinear and discrete characteristics, and unbalanced three-phase power flow equations. The comparison results with other evolutionary optimization algorithms such as original PSO, Honey Bee Mating Optimization (HBMO), Neural Networks (NNs), Ant Colony Optimization (ACO), and Genetic Algorithm (GA) for a test system demonstrate that PSO-NM is extremely effective and efficient for the DSE problems. (author)

  4. Quantitative Analysis of VIIRS DNB Nightlight Point Source for Light Power Estimation and Stability Monitoring

    Directory of Open Access Journals (Sweden)

    Changyong Cao

    2014-12-01

    The high sensitivity and advanced onboard calibration of the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB) enable accurate measurements of low light radiances, which lead to enhanced quantitative applications at night. The finer spatial resolution of DNB also allows users to examine socioeconomic activities at urban scales. Given the growing interest in the use of the DNB data, there is a pressing need for better understanding of the calibration stability and absolute accuracy of the DNB at low radiances. The low light calibration accuracy was previously estimated at a moderate 15% using extended sources, while the long-term stability has yet to be characterized. There are also several science-related questions to be answered, for example: how the Earth's atmosphere and surface variability contribute to the stability of the DNB measured radiances; how to separate them from instrument calibration stability; whether or not SI (International System of Units) traceable active light sources can be designed and installed at selected sites to monitor the calibration stability, radiometric and geolocation accuracy, and point spread functions of the DNB; and furthermore, whether or not such active light sources can be used for detecting environmental changes, such as aerosols. This paper explores the quantitative analysis of nightlight point sources, such as those from fishing vessels, bridges, and cities, using fundamental radiometry and radiative transfer, which would be useful for a number of applications including search and rescue in severe weather events, as well as calibration/validation of the DNB. Time series of the bridge light data are used to assess the stability of the light measurements and the calibration of VIIRS DNB. It was found that the light radiant power computed from the VIIRS DNB data matched relatively well with independent assessments based on the in situ light installations, although estimates have to be

  5. Subjective Response to Foot-Fall Noise, Including Localization of the Source Position

    DEFF Research Database (Denmark)

    Brunskog, Jonas; Hwang, Ha Dong; Jeong, Cheol-Ho

    2011-01-01

    ... annoyance, using simulated binaural room impulse responses, with sources being a moving point source or a nonmoving surface source, and rooms being a room with a reverberation time of 0.5 s or an anechoic room. The paper concludes that no strong effect of the source localization on the annoyance can ...

  6. The European source-term evaluation code ASTEC: status and applications, including CANDU plant applications

    International Nuclear Information System (INIS)

    Van Dorsselaere, J.P.; Giordano, P.; Kissane, M.P.; Montanelli, T.; Schwinges, B.; Ganju, S.; Dickson, L.

    2004-01-01

    Research on light-water reactor severe accidents (SA) is still required in a limited number of areas in order to confirm accident-management plans. Thus, 49 European organizations have linked their SA research in a durable way through SARNET (Severe Accident Research and management NETwork), part of the European 6th Framework Programme. One goal of SARNET is to consolidate the integral code ASTEC (Accident Source Term Evaluation Code, developed by IRSN and GRS) as the European reference tool for safety studies; SARNET efforts include extending the application scope to reactor types other than PWR (including VVER) such as BWR and CANDU. ASTEC is used in IRSN's Probabilistic Safety Analysis level 2 of 900 MWe French PWRs. An earlier version of ASTEC's SOPHAEROS module, including improvements by AECL, is being validated as the Canadian Industry Standard Toolset code for FP-transport analysis in the CANDU Heat Transport System. Work with ASTEC has also been performed by Bhabha Atomic Research Centre, Mumbai, on IPHWR containment thermal hydraulics. (author)

  7. Quantitative phase imaging of living cells with a swept laser source

    Science.gov (United States)

    Chen, Shichao; Zhu, Yizheng

    2016-03-01

    Digital holographic phase microscopy is a well-established quantitative phase imaging technique. However, interference artifacts from inside the system, typically induced by elements whose optical thicknesses are within the source coherence length, limit the imaging quality as well as the sensitivity. In this paper, a technique based on a swept laser source is presented. Spectra acquired at a number of wavelengths can, after Fourier transform, be used to identify the sources of the interference artifacts. With proper tuning of the optical pathlength difference between the sample and reference arms, it is possible to avoid these artifacts and achieve sensitivity below 0.3 nm. Performance of the proposed technique is examined in live cell imaging.

  8. 76 FR 62452 - Avon Products, Inc. Including On-Site Leased Workers From Spherion/Source Right, Springdale, OH...

    Science.gov (United States)

    2011-10-07

    .... Including On-Site Leased Workers From Spherion/Source Right, Springdale, OH; Amended Certification Regarding... workers of the subject firm. The company reports that workers leased from Spherion/Source Right were...., including on-site leased workers from Spherion/Source Right, Springdale, Ohio, who became totally or...

  9. 76 FR 62451 - Avon Products, Inc., Including On-Site Leased Workers From Spherion/Source Right, Springdale...

    Science.gov (United States)

    2011-10-07

    ...., Including On-Site Leased Workers From Spherion/Source Right, Springdale, Ohio; Amended Certification... workers of the subject firm. The company reports that workers leased from Spherion/Source Right were...., including on-site leased workers from Spherion/Source Right, Springdale, Ohio, who became totally or...

  10. Hybrid Design of Electric Power Generation Systems Including Renewable Sources of Energy

    Science.gov (United States)

    Wang, Lingfeng; Singh, Chanan

    2008-01-01

    With the stricter environmental regulations and diminishing fossil-fuel reserves, there is now higher emphasis on exploiting various renewable sources of energy. These alternative sources of energy are usually environmentally friendly and emit no pollutants. However, the capital investments for those renewable sources of energy are normally high,…

  11. Factors affecting the repeatability of gamma camera calibration for quantitative imaging applications using a sealed source

    International Nuclear Information System (INIS)

    Anizan, N; Wahl, R L; Frey, E C; Wang, H; Zhou, X C

    2015-01-01

    Several applications in nuclear medicine require absolute activity quantification of single photon emission computed tomography images. Obtaining a repeatable calibration factor that converts voxel values to activity units is essential for these applications. Because source preparation and measurement of the source activity using a radionuclide activity meter are potential sources of variability, this work investigated instrumentation and acquisition factors affecting repeatability using planar acquisition of sealed sources. The calibration factor was calculated for different acquisition and geometry conditions to evaluate the effect of the source size, lateral position of the source in the camera field-of-view (FOV), source-to-camera distance (SCD), and variability over time using sealed Ba-133 sources. A small region of interest (ROI) based on the source dimensions and collimator resolution was investigated to decrease the background effect. A statistical analysis with a mixed-effects model was used to evaluate quantitatively the effect of each variable on the global calibration factor variability. A variation of 1 cm in the measurement of the SCD from the assumed distance of 17 cm led to a variation of 1–2% in the calibration factor measurement using a small disc source (0.4 cm diameter) and less than 1% with a larger rod source (2.9 cm diameter). The lateral position of the source in the FOV and the variability over time had small impacts on calibration factor variability. The residual error component was well estimated by Poisson noise. Repeatability of better than 1% in a calibration factor measurement using a planar acquisition of a sealed source can be reasonably achieved. The best reproducibility was obtained with the largest source with a count rate much higher than the average background in the ROI, and when the SCD was positioned within 5 mm of the desired position. In this case, calibration source variability was limited by the quantum
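
    As background, a planar calibration factor of the kind discussed is conventionally defined as net count rate per unit activity; a generic form (assumed notation, not quoted from the paper):

```latex
% Generic planar calibration factor (assumed notation):
% R_ROI = count rate in the source ROI, R_bkg = background rate in the ROI,
% A = source activity measured with the radionuclide activity meter.
CF = \frac{R_{\mathrm{ROI}} - R_{\mathrm{bkg}}}{A}
  \qquad \bigl[ \mathrm{counts\; s^{-1}\; MBq^{-1}} \bigr]
```

    Each term carries its own variability, which is why source size, SCD and ROI definition propagate directly into the repeatability of CF.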

  12. Improving quantitative gas chromatography-electron ionization mass spectrometry results using a modified ion source: demonstration for a pharmaceutical application.

    Science.gov (United States)

    D'Autry, Ward; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Van Schepdael, Ann

    2011-07-01

    Gas chromatography-mass spectrometry is a well-established analytical technique. However, mass spectrometers with electron ionization sources may suffer from signal drift, thereby negatively influencing quantitative performance. To demonstrate this phenomenon for a real application, a static headspace-gas chromatography method in combination with electron ionization-quadrupole mass spectrometry was optimized for the determination of residual dichloromethane in coronary stent coatings. In validating the method, the quantitative performance of an original stainless steel ion source was compared to that of a modified ion source. The ion source modification included the application of a gold coating on the repeller and exit plate. Several validation aspects such as limit of detection, limit of quantification, linearity and precision were evaluated using both ion sources. It was found that, as expected, the stainless steel ion source suffered from signal drift. As a consequence, non-linearity and high RSD values for repeated analyses were obtained. An additional experiment was performed to check whether an internal standard compound would lead to better results. It was found that the signal drift patterns of the analyte and internal standard were different, consequently leading to high RSD values for the response factor. With the modified ion source, however, a more stable signal was observed, resulting in acceptable linearity and precision. Moreover, it was also found that sensitivity improved compared to the stainless steel ion source. Finally, the optimized method with the modified ion source was applied to determine residual dichloromethane in the coating of coronary stents. The solvent was detected but found to be below the limit of quantification. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. A study on the quantitative evaluation for the software included in digital systems of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Kang, H. G.; Lee, K. Y.; Park, J. K. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    In general, probabilistic safety analysis (PSA) has been used as one of the most important methods to evaluate the safety of NPPs. Because most NPPs have installed and used analog I and C systems, PSA has been performed from a hardware perspective. In addition, since the tendency to use digital I and C systems, including software, instead of analog I and C systems is increasing, the need for quantitative evaluation methods with which to perform PSA is also increasing. Nevertheless, several factors, such as the fact that software does not age and that software failure rates are difficult to estimate due to software's non-linearity, make the performance of PSA difficult. In this study, in order to perform PSA including software more efficiently, test-based software reliability estimation methods are reviewed to suggest a preliminary procedure that can provide reasonable guidance for quantifying software failure rates. In addition, requisite activities to enhance the applicability of the suggested procedure are also discussed. 67 refs., 11 figs., 5 tabs. (Author)

  14. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    Science.gov (United States)

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

    Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm2 and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm2). Based on visual inspections of "markup" images, the CD3 quantitation algorithms produced adequate accuracy. Additionally, the CD3 quantitation algorithms correlated with each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to ...). CD3 quantitation with these algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
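
    A minimal Python sketch of the thresholding/positive-pixel idea behind the custom CD3+% metric (the published workflow used ImageJ; the threshold values and synthetic image here are hypothetical):

```python
# Sketch: percent-positive scoring by thresholding a stain channel.
import numpy as np

rng = np.random.default_rng(2)
stain = rng.random((512, 512))          # stand-in for a deconvolved CD3 channel
tissue_mask = stain > 0.05              # crude tissue/foreground mask
positive = (stain > 0.6) & tissue_mask  # pixels above the staining threshold

cd3_percent = 100.0 * positive.sum() / tissue_mask.sum()
print(f"custom CD3+%: {cd3_percent:.1f}")
```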

  15. A Study on Conjugate Heat Transfer Analysis of Reactor Vessel including Irradiated Structural Heat Source

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kunwoo; Cho, Hyuksu; Im, Inyoung; Kim, Eunkee [KEPCO EnC, Daejeon (Korea, Republic of)

    2015-10-15

    Though material reliability programs (MRPs) are intended to provide evaluation and management methodologies for operating RVI, similar evaluation methodologies can be applied to the APR1400 fleet in the design stage for the evaluation of neutron irradiation effects. The purposes of this study are: to predict the thermal behavior with and without the irradiated structural heat source; and to evaluate the effective thermal conductivity (ETC) in relation to isotropic and anisotropic conductivity of porous media for the APR1400 Reactor Vessel. CFD simulations are performed to evaluate the thermal behavior with and without the irradiated structural heat source and the effective thermal conductivity for the APR1400 Reactor Vessel. When the irradiated structural heat source is included, the maximum temperatures of the fluid and core shroud for isotropic ETC are 325.8 °C and 341.5 °C. The total amount of irradiated structural heat source is about 5.41 MWth and does not affect the fluid temperature.

  16. dcmqi: An Open Source Library for Standardized Communication of Quantitative Image Analysis Results Using DICOM.

    Science.gov (United States)

    Herz, Christian; Fillion-Robin, Jean-Christophe; Onken, Michael; Riesmeier, Jörg; Lasso, Andras; Pinter, Csaba; Fichtinger, Gabor; Pieper, Steve; Clunie, David; Kikinis, Ron; Fedorov, Andriy

    2017-11-01

    Quantitative analysis of clinical image data is an active area of research that holds promise for precision medicine, early assessment of treatment response, and objective characterization of disease. Interoperability, data sharing, and the ability to mine the resulting data are of increasing importance, given the explosive growth in the number of quantitative analysis methods being proposed. The Digital Imaging and Communications in Medicine (DICOM) standard is widely adopted for images and metadata in radiology. dcmqi (DICOM for Quantitative Imaging) is a free, open source library that implements conversion of data stored in commonly used research formats into the standard DICOM representation. dcmqi source code is distributed under a BSD-style license. It is freely available as a precompiled binary package for every major operating system, as a Docker image, and as an extension to 3D Slicer. Installation and usage instructions are provided in the GitHub repository at https://github.com/qiicr/dcmqi. Cancer Res; 77(21); e87-90. ©2017 AACR.

  17. 13 CFR 120.102 - Funds not available from alternative sources, including personal resources of principals.

    Science.gov (United States)

    2010-01-01

    ... source) when that owner's liquid assets exceed the amounts specified in paragraphs (a) (1) through (3) of... applicant must inject any personal liquid assets which are in excess of two times the total financing... the applicant must inject any personal liquid assets which are in excess of one and one-half times the...

  18. Quantitative identification and source apportionment of anthropogenic heavy metals in marine sediment of Hong Kong

    Science.gov (United States)

    Zhou, Feng; Guo, Huaicheng; Liu, Lei

    2007-10-01

    Based on ten heavy metals collected twice annually at 59 sites from 1998 to 2004, enrichment factors (EFs), principal component analysis (PCA) and multivariate linear regression of absolute principal component scores (MLR-APCS) were used in the identification and source apportionment of the anthropogenic heavy metals in marine sediment. EFs with Fe as a normalizer and local background as reference values were properly tested and found suitable in Hong Kong; Zn, Ni, Pb, Cu, Cd, Hg and Cr mainly originated from anthropogenic sources, while Al, Mn and Fe were derived from rock weathering. Rotated PCA and GIS mapping further identified two types of anthropogenic sources and their impacted regions: (1) electronic industrial pollution, riparian runoff and vehicle exhaust, which impacted the entire Victoria Harbour, inner Tolo Harbour, Eastern Buffer, inner Deep Bay and Cheung Chau; and (2) discharges from textile factories and paint, which influenced Tsuen Wan Bay, Kwun Tong typhoon shelter and Rambler Channel. In addition, MLR-APCS was successfully introduced to quantitatively determine the source contributions, with uncertainties almost all below 8%: the first anthropogenic source was responsible for 50.0, 45.1, 86.6, 78.9 and 87.5% of the Zn, Pb, Cu, Cd and Hg, respectively, whereas 49.9% of the Ni and 58.4% of the Cr came from the second anthropogenic source.
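
    For reference, the enrichment factor used above, with Fe as the normalizing element, takes the standard form (notation assumed):

```latex
% Enrichment factor with Fe normalization (standard definition):
% [M] = concentration of metal M; "background" = local background values.
EF = \frac{\bigl( [M]/[\mathrm{Fe}] \bigr)_{\mathrm{sample}}}
          {\bigl( [M]/[\mathrm{Fe}] \bigr)_{\mathrm{background}}}
```

    EF near 1 indicates a crustal origin, while values well above 1 flag anthropogenic enrichment; this is how Zn, Ni, Pb, Cu, Cd, Hg and Cr were separated from the crustally derived Al, Mn and Fe.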

  19. Sensitivity of a search for cosmic ray sources including magnetic field effects

    Energy Technology Data Exchange (ETDEWEB)

    Urban, Martin; Erdmann, Martin; Mueller, Gero [III. Physikalisches Institut A, RWTH Aachen University (Germany)

    2016-07-01

    We analyze the sensitivity of a new method investigating correlations between ultra-high energy cosmic rays and extragalactic sources taking into account deflections in the galactic magnetic field. In comparisons of expected and simulated arrival directions of cosmic rays we evaluate the directional characteristics and magnitude of the field. We show that our method is capable of detecting anisotropy in data sets with a low signal fraction.

  20. SU-D-210-03: Limited-View Multi-Source Quantitative Photoacoustic Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Feng, J; Gao, H [Shanghai Jiao Tong University, Shanghai, Shanghai (China)

    2015-06-15

    Purpose: This work investigates a novel limited-view multi-source acquisition scheme for the direct and simultaneous reconstruction of optical coefficients in quantitative photoacoustic tomography (QPAT), which has potentially improved signal-to-noise ratio and reduced data acquisition time. Methods: Conventional QPAT is often considered in two steps: first, reconstruct the initial acoustic pressure from the full-view ultrasonic data after each optical illumination, and then quantitatively reconstruct optical coefficients (e.g., absorption and scattering coefficients) from the initial acoustic pressure, using a multi-source or multi-wavelength scheme. Based on the novel limited-view multi-source scheme here, we have to consider the direct reconstruction of optical coefficients from the ultrasonic data, since the initial acoustic pressure can no longer be reconstructed as an intermediate variable due to the incomplete acoustic data in the proposed limited-view scheme. In this work, based on a coupled photoacoustic forward model combining the diffusion approximation and the wave equation, we develop a limited-memory quasi-Newton method (LBFGS) for image reconstruction that utilizes the adjoint forward problem for fast computation of gradients. Furthermore, tensor framelet sparsity is utilized to improve the image reconstruction, which is solved by the Alternating Direction Method of Multipliers (ADMM). Results: A simulation was performed on a modified Shepp-Logan phantom to validate the feasibility of the proposed limited-view scheme and its corresponding image reconstruction algorithms. Conclusion: A limited-view multi-source QPAT scheme is proposed, i.e., partial-view acoustic data acquisition accompanying each optical illumination, followed by simultaneous rotations of both optical sources and ultrasonic detectors for the next optical illumination. Moreover, LBFGS and ADMM algorithms are developed for the direct reconstruction of optical coefficients from the
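
    To illustrate only the optimization pattern named above (adjoint-style gradients fed to L-BFGS), here is a self-contained toy in Python; the linear forward operator is a stand-in assumption, not the paper's coupled diffusion/wave model:

```python
# Sketch: recover a coefficient vector from indirect data with L-BFGS.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
A = rng.random((200, 50))        # placeholder linear forward operator
mu_true = rng.random(50)         # "optical coefficients" to recover
data = A @ mu_true               # noiseless synthetic measurements

def objective(mu):
    r = A @ mu - data            # data misfit residual
    return 0.5 * r @ r, A.T @ r  # objective value and its gradient

res = minimize(objective, np.zeros(50), jac=True, method="L-BFGS-B",
               bounds=[(0.0, None)] * 50)  # keep coefficients non-negative
print("max recovery error:", float(np.abs(res.x - mu_true).max()))
```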

  1. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the

  2. Controlled Carbon Source Addition to an Alternating Nitrification-Denitrification Wastewater Treatment Process Including Biological P Removal

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Henze, Mogens

    1995-01-01

    The paper investigates the effect of adding an external carbon source on the rate of denitrification in an alternating activated sludge process including biological P removal. Two carbon sources were examined, acetate and hydrolysate derived from biologically hydrolyzed sludge. Preliminary batch ...

  3. Projects of SR sources including research and development for insertion devices in the USSR

    International Nuclear Information System (INIS)

    Kulipanov, G.

    1990-01-01

    Some technical information is given on the electron and positron storage rings - SR sources - that are being constructed, used or developed at the Novosibirsk Institute of Nuclear Physics (INP). The parameters and construction of wigglers and undulators (electromagnetic, superconducting, and based on permanent magnets) that are intended to be used at such storage rings are described. Various schemes for the installation of wigglers, undulators and FELs at storage rings are considered. The ways of minimizing the influence of their magnetic fields on particle motion in storage rings are treated. (author)

  4. Pseudodynamic Source Characterization for Strike-Slip Faulting Including Stress Heterogeneity and Super-Shear Ruptures

    KAUST Repository

    Mena, B.; Dalguer, L. A.; Mai, Paul Martin

    2012-01-01

    ... (2004), we propose new relationships for PD models for moderate‐to‐large strike‐slip earthquakes that include local supershear rupture speed due to stress heterogeneities. We conduct dynamic rupture simulations using stochastic initial stress ...

  5. Estimating true human and animal host source contribution in quantitative microbial source tracking using the Monte Carlo method.

    Science.gov (United States)

    Wang, Dan; Silkie, Sarah S; Nelson, Kara L; Wuertz, Stefan

    2010-09-01

    Cultivation- and library-independent, quantitative PCR-based methods have become the method of choice in microbial source tracking. However, these qPCR assays are not 100% specific and sensitive for the target sequence in their respective hosts' genome. The factors that can lead to false positive and false negative information in qPCR results are well defined. It is highly desirable to have a way of removing such false information to estimate the true concentration of host-specific genetic markers and help guide the interpretation of environmental monitoring studies. Here we propose a statistical model based on the Law of Total Probability to predict the true concentration of these markers. The distributions of the probabilities of obtaining false information are estimated from representative fecal samples of known origin. Measurement error is derived from the sample precision error of replicated qPCR reactions. Then, the Monte Carlo method is applied to sample from these distributions of probabilities and measurement error. The set of equations given by the Law of Total Probability allows one to calculate the distribution of true concentrations, from which their expected value, confidence interval and other statistical characteristics can be easily evaluated. The output distributions of predicted true concentrations can then be used as input to watershed-wide total maximum daily load determinations, quantitative microbial risk assessment and other environmental models. This model was validated by both statistical simulations and real world samples. It was able to correct the intrinsic false information associated with qPCR assays and output the distribution of true concentrations of Bacteroidales for each animal host group. Model performance was strongly affected by the precision error. It could perform reliably and precisely when the standard deviation of the precision error was small (≤ 0.1). Further improvement on the precision of sample processing and q
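
    A schematic Python sketch of the Monte Carlo step described above, under a deliberately simplified observation model (measured = sensitivity × true on the linear scale, plus log-scale precision error; all parameter values are hypothetical):

```python
# Sketch: propagate assay sensitivity and qPCR precision error to a
# distribution of "true" marker concentrations.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000                               # Monte Carlo draws
measured_log10 = 3.0                      # measured conc., log10 copies/100 mL

# Distributions estimated from reference fecal samples of known origin:
sensitivity = rng.beta(45, 5, size=n)     # P(assay detects marker | host present)
precision = rng.normal(0.0, 0.1, size=n)  # replicate qPCR error, log10 scale

# Invert the toy model draw by draw to obtain the true-concentration spread.
true_log10 = (measured_log10 - precision) - np.log10(sensitivity)

lo, med, hi = np.percentile(true_log10, [2.5, 50.0, 97.5])
print(f"true conc. (log10): median {med:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

    The full model additionally removes false-positive signal via the Law of Total Probability; the resulting distribution then feeds TMDL determinations and quantitative microbial risk assessment as described.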

  6. Occurrence of Staphylococcus nepalensis strains in different sources including human clinical material.

    Science.gov (United States)

    Nováková, Dana; Pantůcek, Roman; Petrás, Petr; Koukalová, Dagmar; Sedlácek, Ivo

    2006-10-01

    Five isolates of coagulase-negative staphylococci were obtained from human urine, the gastrointestinal tract of squirrel monkeys, pig skin and the environment. All key biochemical characteristics of the tested strains corresponded with the description of the species Staphylococcus xylosus. However, partial 16S rRNA gene sequences obtained from the analysed strains corresponded with those of Staphylococcus nepalensis reference strains, except for two strains which differed in one residue. Ribotyping with EcoRI and HindIII restriction enzymes, whole cell protein profile analysis performed by SDS-PAGE and SmaI macrorestriction analysis were used for more precise characterization and identification of the analysed strains. The obtained results showed that EcoRI and HindIII ribotyping and whole cell protein fingerprinting are suitable and reliable methods for the differentiation of S. nepalensis strains from the other novobiocin-resistant staphylococci, whereas macrorestriction analysis was found to be a good tool for strain typing. The isolation of S. nepalensis is sporadic, and to the best of our knowledge this study is the first report of the occurrence of this species in human clinical material as well as in other sources.

  7. Quantitative Image Feature Engine (QIFE): an Open-Source, Modular Engine for 3D Quantitative Feature Extraction from Volumetric Medical Images.

    Science.gov (United States)

    Echegaray, Sebastian; Bakr, Shaimaa; Rubin, Daniel L; Napel, Sandy

    2017-10-06

    The aim of this study was to develop an open-source, modular, locally run or server-based system for 3D radiomics feature computation that can be used on any computer system and included in existing workflows for understanding associations and building predictive models between image features and clinical data, such as survival. The QIFE exploits various levels of parallelization for use on multiprocessor systems. It consists of a managing framework and four stages: input, pre-processing, feature computation, and output. Each stage contains one or more swappable components, allowing run-time customization. We benchmarked the engine using various levels of parallelization on a cohort of CT scans presenting 108 lung tumors. Two versions of the QIFE have been released: (1) the open-source MATLAB code posted to Github, (2) a compiled version loaded in a Docker container, posted to DockerHub, which can be easily deployed on any computer. The QIFE processed 108 objects (tumors) in 2:12 (h:mm) using one core, and in 1:04 (h:mm) using four cores with object-level parallelization. We developed the Quantitative Image Feature Engine (QIFE), an open-source feature-extraction framework that focuses on modularity, standards, parallelism, provenance, and integration. Researchers can easily integrate it with their existing segmentation and imaging workflows by creating input and output components that implement their existing interfaces. Computational efficiency can be improved by parallelizing execution at the cost of memory usage. Different parallelization levels provide different trade-offs, and the optimal setting will depend on the size and composition of the dataset to be processed.
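
    The modular, object-parallel design described in this record can be sketched in Python as a chain of swappable stage functions mapped over objects with a process pool. The stage names and toy data below are illustrative assumptions, not the QIFE (MATLAB) API itself.

        from multiprocessing import Pool
        import numpy as np

        # Swappable stage components, mirroring the input -> pre-processing ->
        # feature computation -> output design (names are illustrative).
        def load_object(obj_id):
            rng = np.random.default_rng(obj_id)
            return rng.random((32, 32, 16))       # stand-in for a tumor volume

        def preprocess(vol):
            return (vol - vol.mean()) / (vol.std() + 1e-9)

        def compute_features(vol):
            return {"volume_voxels": vol.size,
                    "mean_intensity": float(vol.mean()),
                    "p90_intensity": float(np.percentile(vol, 90))}

        def run_one(obj_id):                      # one object = one work unit
            return obj_id, compute_features(preprocess(load_object(obj_id)))

        if __name__ == "__main__":
            with Pool(processes=4) as pool:       # object-level parallelization
                results = dict(pool.map(run_one, range(108)))
            print(results[0])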

  8. Space-time quantitative source apportionment of soil heavy metal concentration increments.

    Science.gov (United States)

    Yang, Yong; Christakos, George; Guo, Mingwu; Xiao, Lu; Huang, Wei

    2017-04-01

    Assessing the space-time trends and detecting the sources of heavy metal accumulation in soils have important consequences in the prevention and treatment of soil heavy metal pollution. In this study, we collected soil samples in the eastern part of the Qingshan district, Wuhan city, Hubei Province, China, during the period 2010-2014. The Cd, Cu, Pb and Zn concentrations in soils exhibited a significant accumulation during 2010-2014. The spatiotemporal Kriging (STK) technique, based on a quantitative characterization of soil heavy metal concentration variations in terms of non-separable variogram models, was employed to estimate the spatiotemporal soil heavy metal distribution in the study region. Our findings showed that the Cd, Cu, and Zn concentrations have an obvious incremental tendency from the southwestern to the central part of the study region. However, the Pb concentrations exhibited an obvious tendency from the northern part to the central part of the region. Then, spatial overlay analysis (SOA) was used to obtain absolute and relative concentration increments of adjacent 1- or 5-year periods during 2010-2014. The spatial distribution of soil heavy metal concentration increments showed that the larger increments occurred in the center of the study region. Lastly, principal component analysis combined with the multiple linear regression method (PCA-MLR) was employed to quantify the source apportionment of the soil heavy metal concentration increments in the region. Our results led to the conclusion that the sources of soil heavy metal concentration increments should be ascribed to industry, agriculture and traffic. In particular, 82.5% of the soil heavy metal concentration increment during 2010-2014 was ascribed to industrial/agricultural activity sources.

  9. Quantitative analysis of directional spontaneous emission spectra from light sources in photonic crystals

    International Nuclear Information System (INIS)

    Nikolaev, Ivan S.; Lodahl, Peter; Vos, Willem L.

    2005-01-01

    We have performed angle-resolved measurements of spontaneous-emission spectra from laser dyes and quantum dots in opal and inverse opal photonic crystals. Pronounced directional dependencies of the emission spectra are observed: angular ranges of strongly reduced emission adjoin angular ranges of enhanced emission. It appears that emission from embedded light sources is affected both by the periodicity and by the structural imperfections of the crystals: the photons are Bragg diffracted by lattice planes and scattered by unavoidable structural disorder. Using a model comprising diffuse light transport and photonic band structure, we quantitatively explain the directional emission spectra. This work provides detailed understanding of the transport of spontaneously emitted light in real photonic crystals, which is essential in the interpretation of quantum optics in photonic-band-gap crystals and for applications wherein directional emission and total emission power are controlled.

  10. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    International Nuclear Information System (INIS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-01-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature.

  11. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems, using annular sources, are widely used in view of their simplicity, wide availability, relatively low price for the complete system and good overall performance with respect to accuracy and detection limits. However some problems arise when the use of fundamental parameter techniques for quantitative analysis is attempted. These problems are due to the fact that the systems operate with large solid angles for incoming and emerging radiation and both the incident and take-off angles are not trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and prediction of the response of inhomogeneous samples. A study of the sensitivity of the results due to sample characteristics and a comparison of the results with experimentally determined values for incident and take-off angles is also presented. A flexible and user-friendly computer program was developed in order to perform efficiently the lengthy calculations involved. (author). 14 refs. 5 figs
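
    A minimal Python sketch of the kind of MC integration described here: emission points are sampled uniformly on the annular source, target points on the sample disk, and the grazing (incident) angle of each ray is averaged with a point-source flux weight. The geometry values and the simple flux weighting are simplifying assumptions for illustration, not the paper's full excitation-detection model.

        import numpy as np

        rng = np.random.default_rng(0)

        def effective_incident_angle(r_in=5.0, r_out=8.0, src_height=10.0,
                                     sample_radius=6.0, n=200_000):
            """Monte Carlo estimate of a flux-weighted effective incident
            angle for an annular source above a flat sample (dimensions are
            illustrative). The angle is measured from the sample surface."""
            u = rng.random(n)                               # area-uniform annulus
            rs = np.sqrt(u * (r_out**2 - r_in**2) + r_in**2)
            ps = 2 * np.pi * rng.random(n)
            rp = sample_radius * np.sqrt(rng.random(n))     # area-uniform disk
            pp = 2 * np.pi * rng.random(n)
            dx = rp * np.cos(pp) - rs * np.cos(ps)
            dy = rp * np.sin(pp) - rs * np.sin(ps)
            dist = np.sqrt(dx**2 + dy**2 + src_height**2)
            grazing = np.degrees(np.arcsin(src_height / dist))  # angle to surface
            w = src_height / dist**3   # point-source flux on a horizontal plane
            return float(np.average(grazing, weights=w))

        print(f"effective incident angle ~ {effective_incident_angle():.1f} deg")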

  12. Free and Open Source Chemistry Software in Research of Quantitative Structure-Toxicity Relationship of Pesticides

    Directory of Open Access Journals (Sweden)

    Rastija Vesna

    2017-01-01

    Full Text Available Pesticides are toxic chemicals aimed at destroying pests on crops. Numerous data provide evidence of the toxicity of pesticides to aquatic organisms. Since pesticides with similar properties tend to have similar biological activities, toxicity may be predicted from structure. Their structural features and properties are encoded by means of molecular descriptors. Molecular descriptors can capture anything from quite simple two-dimensional (2D) chemical structures to highly complex three-dimensional (3D) chemical structures. The quantitative structure-toxicity relationship (QSTR) method uses linear regression analyses to correlate the toxicity of chemicals with their structural features expressed as molecular descriptors. Molecular descriptors were calculated using the open source software PaDEL and an in-house built PyMOL plugin (PyDescriptor). PyDescriptor is a new script implemented within the commonly used visualization software PyMOL for the calculation of a large and diverse set of easily interpretable molecular descriptors encoding pharmacophoric patterns and atomic fragments. PyDescriptor has several advantages: it is free and open source, and works on all major platforms (Windows, Linux, MacOS). The QSTR method allows prediction of the toxicity of pesticides without experimental assay. In the present work, a QSTR analysis of the toxicity of a dataset comprising five classes of pesticides has been performed.

  13. Intravenous streptokinase therapy in acute myocardial infarction: Assessment of therapy effects by quantitative 201Tl myocardial imaging (including SPECT) and radionuclide ventriculography

    International Nuclear Information System (INIS)

    Koehn, H.; Bialonczyk, C.; Mostbeck, A.; Frohner, K.; Unger, G.; Steinbach, K.

    1984-01-01

    To evaluate a potential beneficial effect of systemic streptokinase therapy in acute myocardial infarction, 36 patients treated with streptokinase intravenously were assessed by radionuclide ventriculography and quantitative 201 Tl myocardial imaging (including SPECT) in comparison with 18 conventionally treated patients. Patients after thrombolysis had significantly higher EF, PFR, and PER as well as fewer wall motion abnormalities compared with controls. These differences were also observed in the subset of patients with anterior wall infarction (AMI), but not in patients with inferior wall infarction (IMI). Quantitative 201 Tl imaging demonstrated significantly smaller percent myocardial defects and fewer pathological stress segments in patients with thrombolysis compared with controls. The same differences were also found in both AMI and IMI patients. Our data suggest a favorable effect of intravenous streptokinase on recovery of left ventricular function and myocardial salvage. Radionuclide ventriculography and quantitative 201 Tl myocardial imaging seem to be reliable tools for objective assessment of therapy effects. (orig.)

  14. A GATE evaluation of the sources of error in quantitative 90Y PET.

    Science.gov (United States)

    Strydhorst, Jared; Carlier, Thomas; Dieudonné, Arnaud; Conti, Maurizio; Buvat, Irène

    2016-10-01

    Accurate reconstruction of the dose delivered by 90 Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose-response relationships for treatment of hepatocellular carcinoma with 90 Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of 90 Y PET. PET acquisitions of two phantoms (a NEMA PET phantom and the NEMA IEC PET body phantom) containing either 90 Y or 18 F were simulated using GATE. Simulated projections were created with subsets of the simulation data, allowing the contributions of random, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. The quantitative accuracy of the 90 Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated 90 Y data was slightly poorer than that for simulated 18 F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Simulations of 90 Y PET confirm that quantitative 90 Y is achievable with the same approach as that used for 18 F, and that there is likely very little margin for improvement by attempting to model aspects unique to 90 Y, such as the much higher random fraction or the presence of bremsstrahlung in the singles data. © 2016 American Association of Physicists in Medicine.

  15. A GATE evaluation of the sources of error in quantitative {sup 90}Y PET

    Energy Technology Data Exchange (ETDEWEB)

    Strydhorst, Jared, E-mail: jared.strydhorst@gmail.com; Buvat, Irène [IMIV, U1023 Inserm/CEA/Université Paris-Sud and ERL 9218 CNRS, Université Paris-Saclay, CEA/SHFJ, Orsay 91401 (France); Carlier, Thomas [Department of Nuclear Medicine, Centre Hospitalier Universitaire de Nantes and CRCNA, Inserm U892, Nantes 44000 (France); Dieudonné, Arnaud [Department of Nuclear Medicine, Hôpital Beaujon, HUPNVS, APHP and Inserm U1149, Clichy 92110 (France); Conti, Maurizio [Siemens Healthcare Molecular Imaging, Knoxville, Tennessee, 37932 (United States)

    2016-10-15

    Purpose: Accurate reconstruction of the dose delivered by 90Y microspheres using a postembolization PET scan would permit the establishment of more accurate dose-response relationships for treatment of hepatocellular carcinoma with 90Y. However, the quality of the PET data obtained is compromised by several factors, including poor count statistics and a very high random fraction. This work uses Monte Carlo simulations to investigate what impact factors other than low count statistics have on the quantification of 90Y PET. Methods: PET acquisitions of two phantoms (a NEMA PET phantom and the NEMA IEC PET body phantom) containing either 90Y or 18F were simulated using GATE. Simulated projections were created with subsets of the simulation data allowing the contributions of random, scatter, and LSO background to be independently evaluated. The simulated projections were reconstructed using the commercial software for the simulated scanner, and the quantitative accuracy of the reconstruction and the contrast recovery of the reconstructed images were evaluated. Results: The quantitative accuracy of the 90Y reconstructions was not strongly influenced by the high random fraction present in the projection data, and the activity concentration was recovered to within 5% of the known value. The contrast recovery measured for simulated 90Y data was slightly poorer than that for simulated 18F data with similar count statistics. However, the degradation was not strongly linked to any particular factor. Using a more restricted energy range to reduce the random fraction in the projections had no significant effect. Conclusions: Simulations of 90Y PET confirm that quantitative 90Y is achievable with the same approach as that used for 18F, and that there is likely very little margin for improvement by attempting to model aspects unique to 90Y, such as the much higher random fraction or the presence of bremsstrahlung in the singles data.

  16. Free software, Open source software, licenses. A short presentation including a procedure for research software and data dissemination

    OpenAIRE

    Gomez-Diaz, Teresa

    2014-01-01

    4 pages. A Spanish version is available: "Software libre, software de código abierto, licencias", which proposes a procedure for the distribution of research software and data. The main goal of this document is to help the research community to understand the basic concepts of software distribution: free software, open source software, licenses. This document also includes a procedure for research software and data dissemination.

  17. Swept source optical coherence tomography for quantitative and qualitative assessment of dental composite restorations

    Science.gov (United States)

    Sadr, Alireza; Shimada, Yasushi; Mayoral, Juan Ricardo; Hariri, Ilnaz; Bakhsh, Turki A.; Sumi, Yasunori; Tagami, Junji

    2011-03-01

    The aim of this work was to explore the utility of swept-source optical coherence tomography (SS-OCT) for quantitative evaluation of dental composite restorations. The system (Santec, Japan) with a center wavelength of around 1300 nm and axial resolution of 12 μm was used to record data during and after placement of light-cured composites. The Fresnel phenomenon at the interfacial defects resulted in brighter areas indicating gaps as small as a few micrometers. The gap extension at the interface was quantified and compared to the observation by confocal laser scanning microscope after trimming the specimen to the same cross-section. Also, video imaging of the composite during polymerization could provide information about real-time kinetics of contraction stress and resulting gaps, distinguishing them from those gaps resulting from poor adaptation of composite to the cavity prior to polymerization. Some samples were also subjected to a high resolution microfocus X-ray computed tomography (μCT) assessment; it was found that differentiation of smaller gaps from the radiolucent bonding layer was difficult with 3D μCT. Finally, a clinical imaging example using a newly developed dental SS-OCT system with an intra-oral scanning probe (Panasonic Healthcare, Japan) is presented. SS-OCT is a unique tool for clinical assessment and laboratory research on resin-based dental restorations. Supported by GCOE at TMDU and NCGG.

  18. Artificial intelligence methods applied for quantitative analysis of natural radioactive sources

    International Nuclear Information System (INIS)

    Medhat, M.E.

    2012-01-01

    Highlights: ► Basic description of artificial neural networks. ► Natural gamma ray sources and the problem of their detection. ► Application of a neural network for peak detection and activity determination. - Abstract: The artificial neural network (ANN) is one of the artificial intelligence methods used for modeling and uncertainty estimation in different applications. The objective of the proposed work was to apply ANN to identify isotopes and to predict the uncertainties of their activities for some natural radioactive sources. The method was tested by analyzing gamma-ray spectra emitted from natural radionuclides in soil samples, detected by high-resolution gamma-ray spectrometry based on HPGe (high purity germanium). The principle of the suggested method is described, including the definition of relevant input parameters, input data scaling, and network training. There is satisfactory agreement between the obtained and predicted results using the neural network.
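
    A toy Python sketch in the spirit of this approach, assuming scikit-learn is available: synthetic HPGe-like spectra are built from Gaussian peaks whose amplitudes scale with (arbitrary) activities, and a small multilayer perceptron is trained to predict the activities back. The peak channels, widths, and noise level are invented for illustration.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)
        channels = np.arange(512)
        peak_ch = {"K-40": 365, "Tl-208": 420, "Bi-214": 280}  # toy channels

        def spectrum(acts):
            s = np.zeros(channels.size)
            for a, c in zip(acts, peak_ch.values()):
                s += a * np.exp(-0.5 * ((channels - c) / 3.0) ** 2)  # peaks
            return s + rng.normal(0, 0.05, channels.size)            # noise

        Y = rng.uniform(0.1, 5.0, (400, 3))     # activities (arbitrary units)
        X = np.array([spectrum(y) for y in Y])  # input spectra
        net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000,
                           random_state=0)
        net.fit(X[:300], Y[:300])
        print("test R^2:", round(net.score(X[300:], Y[300:]), 3))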

  19. Quantitative spark-source analysis of UO2-PuO2 for rare earths and tantalum, tungsten

    International Nuclear Information System (INIS)

    Alkire, G.J.

    A quantitative analytical technique good to 20% for the determination of Sm, Eu, Gd, Dy, Ta, and W in Pu-U mixed oxides by spark source mass spectrography has been developed. The technique uses La as an added internal standard and an electronic integrator to measure peak areas of each line photometered on the densitometer. 3 tables

  20. Quantitative assessment of pure aortic valve regurgitation with dual-source CT

    Energy Technology Data Exchange (ETDEWEB)

    Li, Z., E-mail: lzlcd01@126.com [Department of Radiology, West China Hospital, Sichuan University, 37 Guo Xue Xiang, Chengdu, Sichuan 610041 (China); Huang, L.; Chen, X.; Xia, C.; Yuan, Y.; Shuai, T. [Department of Radiology, West China Hospital, Sichuan University, 37 Guo Xue Xiang, Chengdu, Sichuan 610041 (China)

    2012-07-15

    Aim: To assess the severity of pure aortic regurgitation by measuring regurgitation volumes (RV) and fractions (RF) with dual-source computed tomography (DSCT) as compared to magnetic resonance imaging (MRI) and echocardiography. Materials and methods: Thirty-eight patients (15 men, 23 women; mean age 46 ± 11 years) with isolated aortic valve regurgitation underwent retrospectively electrocardiogram (ECG)-gated DSCT, echocardiography, and MRI. Stroke volumes of the left and right ventricles were measured at DSCT and MRI. Thus, RVs and RFs were calculated and compared. The agreement between DSCT and MRI was tested by intraclass correlation coefficient and Bland-Altman analyses. Spearman's rank order correlation and weighted κ tests were used for testing correlations of AR severity between DSCT results and corresponding echocardiographic grades. Results: The RV and RF measured by DSCT were not significantly different from those measured using MRI (p = 0.71 and 0.79). DSCT correlated well with MRI for the measurement of RV (r_I = 0.86, p < 0.001) and calculation of the RF (r_I = 0.90, p < 0.001). Good agreement between the techniques was obtained by using Bland-Altman analyses. The severity of regurgitation estimated by echocardiography correlated well with DSCT (r_s = 0.95, p < 0.001) and MRI (r_s = 0.95, p < 0.001). Inter-technique agreement between DSCT and two-dimensional transthoracic echocardiography (2DTTE) regarding the grading of the severity of AR was excellent (κ = 0.90), and good agreement was also obtained between MRI and 2DTTE assessments of the severity of AR (κ = 0.87). Conclusion: DSCT using a volume approach can be used to quantitatively determine the severity of pure aortic regurgitation when compared with MRI and echocardiography.
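
    A minimal sketch of the volumetric relation behind these RV/RF measurements, assuming isolated aortic regurgitation so that the left-right stroke volume difference is the regurgitant volume; the input values are invented for illustration.

        def regurgitation(lv_stroke_ml, rv_stroke_ml):
            """Volumetric approach: in isolated aortic regurgitation the
            regurgitant volume (RV) is the excess of left- over right-
            ventricular stroke volume, and the regurgitant fraction (RF)
            is RV relative to total LV stroke volume."""
            rv = lv_stroke_ml - rv_stroke_ml
            rf = 100.0 * rv / lv_stroke_ml
            return rv, rf

        rv, rf = regurgitation(lv_stroke_ml=110.0, rv_stroke_ml=70.0)
        print(f"RV = {rv:.0f} mL, RF = {rf:.0f}%")   # -> RV = 40 mL, RF = 36%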

  1. Track-a-worm, an open-source system for quantitative assessment of C. elegans locomotory and bending behavior.

    Directory of Open Access Journals (Sweden)

    Sijie Jason Wang

    Full Text Available A major challenge of neuroscience is to understand the circuit and gene bases of behavior. C. elegans is commonly used as a model system to investigate how various gene products function at specific tissue, cellular, and synaptic foci to produce complicated locomotory and bending behavior. The investigation generally requires quantitative behavioral analyses using an automated single-worm tracker, which constantly records and analyzes the position and body shape of a freely moving worm at a high magnification. Many single-worm trackers have been developed to meet lab-specific needs, but none has been widely implemented for various reasons, such as hardware difficult to assemble, and software lacking sufficient functionality, having closed source code, or using a programming language that is not broadly accessible. The lack of a versatile system convenient for wide implementation makes data comparisons difficult and compels other labs to develop new worm trackers. Here we describe Track-A-Worm, a system rich in functionality, open in source code, and easy to use. The system includes plug-and-play hardware (a stereomicroscope, a digital camera and a motorized stage, custom software written to run with Matlab in Windows 7, and a detailed user manual. Grayscale images are automatically converted to binary images followed by head identification and placement of 13 markers along a deduced spline. The software can extract and quantify a variety of parameters, including distance traveled, average speed, distance/time/speed of forward and backward locomotion, frequency and amplitude of dominant bends, overall bending activities measured as root mean square, and sum of all bends. It also plots worm travel path, bend trace, and bend frequency spectrum. All functionality is performed through graphical user interfaces and data is exported to clearly annotated and documented Excel files. These features make Track-A-Worm a good candidate for implementation in other laboratories.
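
    The bend parameters described above can be sketched from the 13 spline markers alone. The following Python fragment (NumPy) computes the 11 interior bend angles, their sum, and the root-mean-square bending for a toy sinusoidal posture; the marker count follows the abstract, everything else is illustrative.

        import numpy as np

        def bend_angles(markers):
            """Interior bend angles (degrees) along a worm midline.
            markers: (13, 2) array of x, y points on the deduced spline."""
            v = np.diff(markers, axis=0)              # 12 segment vectors
            ang = np.arctan2(v[:, 1], v[:, 0])        # segment headings
            d = ang[1:] - ang[:-1]                    # 11 heading changes
            d = (d + np.pi) % (2 * np.pi) - np.pi     # wrap to (-pi, pi]
            return np.degrees(d)

        def rms_bending(bends):
            """Overall bending activity as root mean square of bend angles."""
            return float(np.sqrt(np.mean(bends ** 2)))

        # toy sinusoidal posture standing in for one tracked frame
        x = np.linspace(0.0, 1.0, 13)
        pts = np.column_stack([x, 0.1 * np.sin(2 * np.pi * x)])
        b = bend_angles(pts)
        print("sum of bends:", round(float(b.sum()), 1), "deg;",
              "RMS:", round(rms_bending(b), 1), "deg")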

  2. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis.

    Science.gov (United States)

    Delorme, Arnaud; Makeig, Scott

    2004-03-15

    We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.

  3. Perceived relevance and information needs regarding food topics and preferred information sources among Dutch adults: results of a quantitative consumer study

    NARCIS (Netherlands)

    Dillen, van S.M.E.; Hiddink, G.J.; Koelen, M.A.; Graaf, de C.; Woerkum, van C.M.J.

    2004-01-01

    Objective: For more effective nutrition communication, it is crucial to identify sources from which consumers seek information. Our purpose was to assess perceived relevance and information needs regarding food topics, and preferred information sources by means of quantitative consumer research.

  4. Quantitative EEG and Current Source Density Analysis of Combined Antiepileptic Drugs and Dopaminergic Agents in Genetic Epilepsy: Two Case Studies.

    Science.gov (United States)

    Emory, Hamlin; Wells, Christopher; Mizrahi, Neptune

    2015-07-01

    Two adolescent females with absence epilepsy were classified, one as attention deficit and the other as bipolar disorder. Physical and cognitive exams identified hypotension, bradycardia, and cognitive dysfunction. Their initial electroencephalograms (EEGs) were considered slightly slow, but within normal limits. Quantitative EEG (QEEG) data included relative theta excess and low alpha mean frequencies. A combined treatment of antiepileptic drugs with a catecholamine agonist/reuptake inhibitor was sequentially used. Both patients' physical and cognitive functions improved and they have remained seizure free. The clinical outcomes were correlated with statistically significant changes in QEEG measures toward normal Z-scores in both anterior and posterior regions. In addition, low resolution electromagnetic tomography (LORETA) Z-scored source correlation analyses of the initial and treated QEEG data showed normalized patterns, supporting a neuroanatomic resolution. This study presents preliminary evidence for a neurophysiologic approach to patients with absence epilepsy and comorbid disorders and may provide a method for further research. © EEG and Clinical Neuroscience Society (ECNS) 2014.

  5. Automation of reading and interpreting photographically recorded spark source mass spectra for the quantitative analysis of solids

    International Nuclear Information System (INIS)

    Naudin, Guy.

    1976-01-01

    Quantitative analysis of solids by spark source mass spectrometry involves the study of photographic plates by means of a microdensitometer. After graphic treatment of the data from the plate, a scientific program is used to calculate the concentrations of isotopes. The automation of these three parts has been achieved with a computer program written in the laboratory for a small computer (Multi 8, Intertechnique). [fr]

  6. Beyond nutrient-based food indices: a data mining approach to search for a quantitative holistic index reflecting the degree of food processing and including physicochemical properties.

    Science.gov (United States)

    Fardet, Anthony; Lakhssassi, Sanaé; Briffaz, Aurélien

    2018-01-24

    Processing has major impacts on both the structure and composition of food and hence on nutritional value. In particular, high consumption of ultra-processed foods (UPFs) is associated with increased risks of obesity and diabetes. Unfortunately, existing food indices only focus on food nutritional content while failing to consider either food structure or the degree of processing. The objectives of this study were thus to link non-nutrient food characteristics (texture, water activity (aw), glycemic and satiety potentials (FF), and shelf life) to the degree of processing; search for associations between these characteristics and nutritional composition; search for a holistic quantitative technological index; and determine quantitative rules for a food to be defined as UPF using data mining. Among the 280 most widely consumed foods by the elderly in France, 139 solid/semi-solid foods were selected for textural and aw measurements, and classified according to three degrees of processing. Our results showed that minimally-processed foods were less hyperglycemic, more satiating, had better nutrient profiles, higher aw, shorter shelf life, lower maximum stress, and higher energy at break than UPFs. Based on 72 food variables, multivariate analyses differentiated foods according to their degree of processing. Then technological indices including food nutritional composition, aw, FF and textural parameters were tested against technological groups. Finally, a LIM score (nutrients to limit) ≥8 per 100 kcal and a number of ingredients/additives >4 are relevant, but not sufficient, rules to define UPFs. We therefore suggest that food health potential should be first defined by its degree of processing.
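
    The two screening rules stated at the end of the abstract translate directly into code; a minimal sketch, remembering that the authors stress these rules are relevant but not sufficient on their own.

        def upf_candidate(lim_per_100kcal, n_ingredients_additives):
            """Screening rule from the abstract: LIM score >= 8 per 100 kcal
            AND more than 4 ingredients/additives. Necessary-style criteria,
            not a sufficient definition of an ultra-processed food."""
            return lim_per_100kcal >= 8 and n_ingredients_additives > 4

        print(upf_candidate(11.5, 9))   # True: consistent with a UPF profile
        print(upf_candidate(5.0, 12))   # False: fails the LIM criterion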

  7. The mzqLibrary--An open source Java library supporting the HUPO-PSI quantitative proteomics standard.

    Science.gov (United States)

    Qi, Da; Zhang, Huaizhong; Fan, Jun; Perkins, Simon; Pisconti, Addolorata; Simpson, Deborah M; Bessant, Conrad; Hubbard, Simon; Jones, Andrew R

    2015-09-01

    The mzQuantML standard has been developed by the Proteomics Standards Initiative for capturing, archiving and exchanging quantitative proteomic data, derived from mass spectrometry. It is a rich XML-based format, capable of representing data about two-dimensional features from LC-MS data, and peptides, proteins or groups of proteins that have been quantified from multiple samples. In this article we report the development of an open source Java-based library of routines for mzQuantML, called the mzqLibrary, and associated software for visualising data called the mzqViewer. The mzqLibrary contains routines for mapping (peptide) identifications on quantified features, inference of protein (group)-level quantification values from peptide-level values, normalisation and basic statistics for differential expression. These routines can be accessed via the command line, via a Java programming interface, or via a basic graphical user interface. The mzqLibrary also contains several file format converters, including import converters (to mzQuantML) from OpenMS, Progenesis LC-MS and MaxQuant, and exporters (from mzQuantML) to other standards or useful formats (mzTab, HTML, csv). The mzqViewer contains in-built routines for viewing the tables of data (about features, peptides or proteins), and connects to the R statistical library for more advanced plotting options. The mzqLibrary and mzqViewer packages are available from https://code.google.com/p/mzq-lib/. © 2015 The Authors. PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Quantitative identification of nitrate pollution sources and uncertainty analysis based on dual isotope approach in an agricultural watershed.

    Science.gov (United States)

    Ji, Xiaoliang; Xie, Runting; Hao, Yun; Lu, Jun

    2017-10-01

    Quantitative identification of nitrate (NO3−-N) sources is critical to the control of nonpoint source nitrogen pollution in an agricultural watershed. Combined with water quality monitoring, we adopted the environmental isotope (δD-H2O, δ18O-H2O, δ15N-NO3−, and δ18O-NO3−) analysis and the Markov Chain Monte Carlo (MCMC) mixing model to determine the proportions of riverine NO3−-N inputs from four potential NO3−-N sources, namely, atmospheric deposition (AD), chemical nitrogen fertilizer (NF), soil nitrogen (SN), and manure and sewage (M&S), in the ChangLe River watershed of eastern China. Results showed that NO3−-N was the main form of nitrogen in this watershed, accounting for approximately 74% of the total nitrogen concentration. A strong hydraulic interaction existed between the surface and groundwater for NO3−-N pollution. The variations of the isotopic composition in NO3−-N suggested that microbial nitrification was the dominant nitrogen transformation process in surface water, whereas significant denitrification was observed in groundwater. MCMC mixing model outputs revealed that M&S was the predominant contributor to riverine NO3−-N pollution (contributing 41.8% on average), followed by SN (34.0%), NF (21.9%), and AD (2.3%) sources. Finally, we constructed an uncertainty index, UI90, to quantitatively characterize the uncertainties inherent in NO3−-N source apportionment and discussed the reasons behind the uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.
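
    A stripped-down Python sketch of an MCMC mixing model of this general kind: source fractions are parameterized through a softmax (which supplies an implicit prior on the simplex), the isotope mass balance gives the predicted mixture signature, and a random-walk Metropolis sampler explores the posterior. The end-member signatures, observed values, and error terms below are placeholders, not the study's calibrated inputs.

        import numpy as np

        rng = np.random.default_rng(7)

        # Illustrative end-member signatures (d15N, d18O of nitrate, per mil)
        SRC = np.array([[ 2.0, 60.0],   # atmospheric deposition (AD)
                        [ 0.0,  0.0],   # chemical N fertilizer (NF)
                        [ 5.0,  5.0],   # soil nitrogen (SN)
                        [12.0,  3.0]])  # manure & sewage (M&S)
        OBS = np.array([8.0, 6.0])      # riverine sample signature (invented)
        SD  = np.array([1.5, 2.0])      # assumed measurement/process error

        def log_post(z):
            f = np.exp(z - z.max()); f /= f.sum()    # softmax -> fractions
            mix = f @ SRC                            # isotope mass balance
            return -0.5 * np.sum(((mix - OBS) / SD) ** 2), f

        z = np.zeros(4); lp, f = log_post(z); samples = []
        for i in range(20_000):                      # random-walk Metropolis
            z_new = z + rng.normal(0, 0.3, 4)
            lp_new, f_new = log_post(z_new)
            if np.log(rng.random()) < lp_new - lp:
                z, lp, f = z_new, lp_new, f_new
            if i > 5_000 and i % 10 == 0:            # thin after burn-in
                samples.append(f)

        for name, m in zip(["AD", "NF", "SN", "M&S"], np.mean(samples, axis=0)):
            print(f"{name}: {m:.1%}")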

  9. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for large liquid sample analysis. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector, and gamma attenuation factors calculated using MCNP-5. The relative and absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.
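
    The proposed normalization amounts to dividing the prompt-gamma peak count rate by the measured thermal flux and the computed attenuation factor, then scaling against a standard; a minimal relative-method sketch with invented numbers.

        def element_mass(peak_rate_cps, flux_monitor_cps, atten_factor, k=1.0):
            """Relative-method sketch: the prompt-gamma peak count rate is
            normalized by the thermal-flux monitor reading (He-3 detector)
            and by the MCNP-computed gamma attenuation factor, then scaled
            by a constant k calibrated on a standard. Values are invented."""
            return k * peak_rate_cps / (flux_monitor_cps * atten_factor)

        # calibrate k on a standard containing 10 mg of Cd ...
        k = 10.0 / element_mass(peak_rate_cps=4.2, flux_monitor_cps=150.0,
                                atten_factor=0.83)
        # ... then apply it to an unknown phosphoric acid sample
        print(element_mass(peak_rate_cps=2.9, flux_monitor_cps=148.0,
                           atten_factor=0.79, k=k), "mg Cd (approx.)")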

  10. Novel Method To Identify Source-Associated Phylogenetic Clustering Shows that Listeria monocytogenes Includes Niche-Adapted Clonal Groups with Distinct Ecological Preferences

    DEFF Research Database (Denmark)

    Nightingale, K. K.; Lyles, K.; Ayodele, M.

    2006-01-01

    Listeria monocytogenes isolates were obtained from various sources, including humans, animals, and food. If the null hypothesis that the genetic distances for isolates within and between source populations are identical can be rejected (SourceCluster test), then particular clades in the phylogenetic tree with significant overrepresentation of sequences from a given source population are identified (TreeStats test). Analysis of sequence data for 120 L. monocytogenes isolates revealed evidence of clustering between isolates from the same source, based on the phylogenies inferred from actA and inlA (P = 0.02 and P = 0.07, respectively; SourceCluster test). Overall, the SourceCluster and TreeStats test results are biologically valid: our data show that (i) the SourceCluster and TreeStats tests can identify biologically meaningful source-associated phylogenetic clusters and (ii) L. monocytogenes includes clonal groups that have adapted to infect specific host species or colonize nonhost environments.

  11. MSQuant, an Open Source Platform for Mass Spectrometry-Based Quantitative Proteomics

    DEFF Research Database (Denmark)

    Mortensen, Peter; Gouw, Joost W; Olsen, Jesper V

    2010-01-01

    Mass spectrometry-based proteomics critically depends on algorithms for data interpretation. A current bottleneck in the rapid advance of proteomics technology is the closed nature and slow development cycle of vendor-supplied software solutions. We have created an open source software environment, MSQuant, for mass spectrometry-based quantitative proteomics.

  12. Quantitative comparison of genotoxic (mutagenic and carcinogenic) risks and the choice of energy sources

    International Nuclear Information System (INIS)

    Latarjet, R.

    1983-01-01

    For 25 years, pollution by radiation has been governed by restrictive rules enacted and periodically revised by an international commission, and adopted by all countries. Nothing similar exists for mutagenic and carcinogenic chemicals. Since these substances affect the genetic material in the cells with reactions often similar to those caused by radiation, quantitative comparisons are possible, in particular for some of those compounds produced by the combustion of coal, oil and gas. This paper describes the main results obtained at the Institut Curie, since 1975, with ethylene, ethylene oxide and vinyl chloride monomer. The consequences are discussed for: a) the establishment of control rules for the main genotoxic chemical pollutions; b) the assessment of long-term risks in the cases of nuclear energy and of the energies obtained by combustion. [fr]

  13. Quantitation of mandibular symphysis volume as a source of bone grafting.

    Science.gov (United States)

    Verdugo, Fernando; Simonian, Krikor; Smith McDonald, Roberto; Nowzari, Hessam

    2010-06-01

    Autogenous intramembranous bone grafts present several advantages, such as minimal resorption and a high concentration of bone morphogenetic proteins. A method for measuring the amount of bone that can be harvested from the symphysis area has not been reported in real patients. The aim of the present study was to intrasurgically quantitate the volume of symphysis bone graft that can be safely harvested in live patients and compare it with AutoCAD (version 16.0, Autodesk, Inc., San Rafael, CA, USA) tomographic calculations. The AutoCAD software program quantitated the symphysis bone graft in 40 patients using computerized tomographies. Direct intrasurgical measurements were recorded thereafter and compared with the AutoCAD data. The bone volume was measured at the recipient sites of a subgroup of 10 patients, 6 months post sinus augmentation. The volume of bone graft measured by AutoCAD averaged 1.4 mL (SD 0.6 mL, range: 0.5-2.7 mL). The volume of bone graft measured intrasurgically averaged 2.3 mL (SD 0.4 mL, range 1.7-2.8 mL). The statistical difference between the two measurement methods was significant. The bone volume measured at the recipient sites 6 months post sinus augmentation averaged 1.9 mL (SD 0.3 mL, range 1.3-2.6 mL) with a mean loss of 0.4 mL. AutoCAD did not overestimate the volume of bone that can be safely harvested from the mandibular symphysis. The use of the design software program may improve surgical treatment planning prior to sinus augmentation.

  14. Quantitative identification of moisture sources over the Tibetan Plateau and the relationship between thermal forcing and moisture transport

    Science.gov (United States)

    Pan, Chen; Zhu, Bin; Gao, Jinhui; Kang, Hanqing; Zhu, Tong

    2018-02-01

    Despite the importance of the Tibetan Plateau (TP) to the surrounding water cycle, the moisture sources of the TP remain uncertain. In this study, the moisture sources of the TP are quantitatively identified based on a 33-year simulation with a horizontal resolution of 1.9° × 2.5° using the Community Atmosphere Model version 5.1 (CAM5.1), in which atmospheric water tracer technology is incorporated. Results demonstrate that the major moisture sources differ over the southern TP (STP) and northern TP (NTP). During the winter, Africa, the TP, and India are the dominant source regions, contributing nearly half of the water vapour over the STP. During the summer, the tropical Indian Ocean (TIO) supplies 28.5 ± 3.6% of the water vapour over the STP and becomes the dominant source region. The dominant moisture source regions of the water vapour over the NTP are Africa (19.0 ± 2.8%) during the winter and the TP (25.8 ± 2.4%) during the summer. The overall relative contribution of each source region to the precipitation is similar to the contribution to the water vapour over the TP. Like most models, CAM5.1 generally overestimates the precipitation over the TP, yielding uncertainty in the absolute contributions to the precipitation. Composite analyses exhibit significant variations in the TIO-supplied moisture transport and precipitation over the STP during the summer alongside anomalous TP heating. This relationship between moisture transport from the TIO and the TP heating primarily involves the dynamic change in the TIO-supplied moisture flux, which further controls the variation in the TIO-contributed precipitation over the STP.

  15. Assessing the Applicability of Currently Available Methods for Attributing Foodborne Disease to Sources, Including Food and Food Commodities

    DEFF Research Database (Denmark)

    Pires, Sara Monteiro

    2013-01-01

    The choice of a source attribution method depends on the public health question being addressed, on the data requirements, on the advantages and limitations of the method, and on the data availability of the country or region in question. Previous articles have described available methods for source attribution, but have focused only on foodborne microbiological hazards.

  16. A Quantitative Methodology for Vetting Dark Network Intelligence Sources for Social Network Analysis

    Science.gov (United States)

    2012-06-01

    Figures include "Source Stress Contributions for the Example" and "ROC Curve for the Example". Resilience is the ability of the organization "to avoid disintegration when coming under stress" (Milward & Raab, 2006, p. 351). Despite numerous ... members of the network. Examples include subordinates directed to meetings in place of their superiors, and virtual participation via telecommuting.

  17. A quantitative approach to the loading rate of seismogenic sources in Italy

    Science.gov (United States)

    Caporali, Alessandro; Braitenberg, Carla; Montone, Paola; Rossi, Giuliana; Valensise, Gianluca; Viganò, Alfio; Zurutuza, Joaquin

    2018-06-01

    To investigate the transfer of elastic energy between a regional stress field and a set of localized faults, we project the stress rate tensor inferred from the Italian GNSS (Global Navigation Satellite Systems) velocity field onto faults selected from the Database of Individual Seismogenic Sources (DISS 3.2.0). For given Lamé constants and friction coefficient, we compute the loading rate on each fault in terms of the Coulomb failure function (CFF) rate. By varying the strike, dip and rake angles around the nominal DISS values, we also estimate the geometry of planes that are optimally oriented for maximal CFF rate. Out of 86 Individual Seismogenic Sources (ISSs), all well covered by GNSS data, 78-81 (depending on the assumed friction coefficient) load energy at a rate of 0-4 kPa yr-1. The faults displaying larger CFF rates (4-6 ± 1 kPa yr-1) are located in the central Apennines and are all characterized by a significant strike-slip component. We also find that the loading rate of 75% of the examined sources is less than 1 kPa yr-1 lower than that of optimally oriented faults. We also analysed the 2016 August 24 and October 30 central Apennines earthquakes (Mw 6.0 and 6.5, respectively). The strike of their causative faults based on seismological and tectonic data and the geodetically inferred strike differ by <30°. Some sources exhibit a strike oblique to the direction of maximum strain rate, suggesting that in some instances the present-day stress acts on inherited faults. The choice of the friction coefficient only marginally affects this result.

  18. A quantitative approach to the loading rate of seismogenic sources in Italy

    Science.gov (United States)

    Caporali, Alessandro; Braitenberg, Carla; Montone, Paola; Rossi, Giuliana; Valensise, Gianluca; Viganò, Alfio; Zurutuza, Joaquin

    2018-03-01

    To investigate the transfer of elastic energy between a regional stress field and a set of localized faults we project the stress rate tensor inferred from the Italian GNSS velocity field onto faults selected from the Database of Individual Seismogenic Sources (DISS 3.2.0). For given Lamé constants and friction coefficient we compute the loading rate on each fault in terms of the Coulomb Failure Function (CFF) rate. By varying the strike, dip and rake angles around the nominal DISS values, we also estimate the geometry of planes that are optimally oriented for maximal CFF rate. Out of 86 Individual Seismogenic Sources (ISSs), all well covered by GNSS data, 78 to 81 (depending on the assumed friction coefficient) load energy at a rate of 0-4 kPa/yr. The faults displaying larger CFF rates (4 to 6 ± 1 kPa/yr) are located in the central Apennines and are all characterized by a significant strike-slip component. We also find that the loading rate of 75 per cent of the examined sources is less than 1 kPa/yr lower than that of optimally oriented faults. We also analyzed the 24 August and 30 October 2016 central Apennines earthquakes (Mw 6.0 and 6.5, respectively). The strike of their causative faults based on seismological and tectonic data and the geodetically inferred strike differ by < 30°. Some sources exhibit a strike oblique to the direction of maximum strain rate, suggesting that in some instances the present-day stress acts on inherited faults. The choice of the friction coefficient only marginally affects this result.
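
    The CFF rate computation described in both versions of this record reduces to projecting the stress rate tensor onto the fault's normal and slip directions: the traction rate on the plane is split into a normal component and an along-rake shear component, which combine as dCFF/dt = dτ/dt + μ dσn/dt. The Python sketch below uses the Aki & Richards fault-geometry convention in a north-east-down frame; the friction coefficient and the example tensor are illustrative, not values from the study.

        import numpy as np

        def cff_rate(stress_rate, strike_deg, dip_deg, rake_deg, mu=0.4):
            """Coulomb failure function rate on a fault plane:
              dCFF/dt = d(tau)/dt (resolved along rake) + mu * d(sigma_n)/dt,
            tension positive; stress_rate is a 3x3 tensor in a
            north-east-down frame (Pa/yr). A generic projection sketch."""
            s, d, r = np.radians([strike_deg, dip_deg, rake_deg])
            # fault normal and slip vectors, Aki & Richards convention
            n = np.array([-np.sin(d) * np.sin(s),
                          np.sin(d) * np.cos(s),
                          -np.cos(d)])
            u = np.array([np.cos(r) * np.cos(s) + np.cos(d) * np.sin(r) * np.sin(s),
                          np.cos(r) * np.sin(s) - np.cos(d) * np.sin(r) * np.cos(s),
                          -np.sin(r) * np.sin(d)])
            t = stress_rate @ n          # traction rate on the plane
            sigma_n = n @ t              # normal component (tension positive)
            tau = u @ t                  # shear component resolved along rake
            return tau + mu * sigma_n

        # example: 3 kPa/yr of N-S / E-W shear loading a vertical strike-slip fault
        S = np.array([[0.0, 3e3, 0.0], [3e3, 0.0, 0.0], [0.0, 0.0, 0.0]])
        print(f"{cff_rate(S, strike_deg=0, dip_deg=90, rake_deg=0):.0f} Pa/yr")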

  19. Quantitative evaluation of emission controls on primary and secondary organic aerosol sources during Beijing 2008 Olympics

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-08-01

    Full Text Available To assess the primary and secondary sources of fine organic aerosols after the aggressive implementation of air pollution controls during the 2008 Beijing Olympic Games, 12 h PM2.5 values were measured at an urban site at Peking University (PKU) and an upwind rural site at Yufa during the CAREBEIJING-2008 (Campaigns of Air quality REsearch in BEIJING and surrounding region) summer field campaign. The average PM2.5 concentrations were 72.5 ± 43.6 μg m−3 and 64.3 ± 36.2 μg m−3 (average ± standard deviation, the same below) at PKU and Yufa, respectively, showing the lowest concentrations in recent years. Combining the results from a CMB (chemical mass balance) model and a secondary organic aerosol (SOA) tracer-yield model, five primary and four secondary fine organic aerosol sources were compared with the results from previous studies in Beijing. The relative contribution of mobile sources to PM2.5 concentrations was increased in 2008, with diesel engines contributing 16.2 ± 5.9% and 14.5 ± 4.1% and gasoline vehicles contributing 10.3 ± 8.7% and 7.9 ± 6.2% to organic carbon (OC) at PKU and Yufa, respectively. Due to the implementation of emission controls, the absolute OC concentrations from primary sources were reduced during the Olympics, and the contributions from secondary formation of OC represented a larger relative source of fine organic aerosols. Compared with the non-controlled period prior to the Olympics, primary vehicle contributions were reduced by 30% at the urban site and 24% at the rural site. The reductions in coal combustion contributions were 57% at PKU and 7% at Yufa. Our results demonstrate that the emission control measures implemented in 2008 significantly alleviated the primary organic particle pollution in and around Beijing. However, additional studies are needed to provide a more comprehensive assessment of the emission control effectiveness on SOA formation.

  20. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part II: Data Sources from Specific Library Applications

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the second part of a two-part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.

  1. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Science.gov (United States)

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software package based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  2. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    Directory of Open Access Journals (Sweden)

    Sho Manabe

    Full Text Available In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixtures based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software package based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  3. Qualitative and quantitative analysis of Dibenzofuran, Alkyldibenzofurans, and Benzo[b]naphthofurans in crude oils and source rock extracts

    Science.gov (United States)

    Li, Meijun; Ellis, Geoffrey S.

    2015-01-01

    Dibenzofuran (DBF), its alkylated homologues, and benzo[b]naphthofurans (BNFs) are common oxygen-heterocyclic aromatic compounds in crude oils and source rock extracts. A series of positional isomers of alkyldibenzofuran and benzo[b]naphthofuran were identified in mass chromatograms by comparison with internal standards and standard retention indices. The response factors of dibenzofuran in relation to internal standards were obtained by gas chromatography-mass spectrometry analyses of a set of mixed solutions with different concentration ratios. Perdeuterated dibenzofuran and dibenzothiophene are optimal internal standards for quantitative analyses of furan compounds in crude oils and source rock extracts. The average concentration of the total DBFs in oils derived from siliciclastic lacustrine rock extracts from the Beibuwan Basin, South China Sea, was 518 μg/g, which is about 5 times that observed in the oils from carbonate source rocks in the Tarim Basin, Northwest China. The BNFs occur ubiquitously in source rock extracts and related oils of various origins. The results of this work suggest that the relative abundance of benzo[b]naphthofuran isomers, that is, the benzo[b]naphtho[2,1-d]furan/{benzo[b]naphtho[2,1-d]furan + benzo[b]naphtho[1,2-d]furan} ratio, may be a potential molecular geochemical parameter to indicate oil migration pathways and distances.

  4. Quantitative diagnosis of moisture sources and transport pathways for summer precipitation over the mid-lower Yangtze River Basin

    Science.gov (United States)

    Wang, Ning; Zeng, Xin-Min; Guo, Wei-Dong; Chen, Chaohui; You, Wei; Zheng, Yiqun; Zhu, Jian

    2018-04-01

    Using a moisture tracking model with 32-year reanalysis data and station precipitation observations, we diagnosed the sources of moisture for summer (June 1-August 31) precipitation in mid-lower reaches of the Yangtze River Basin (YRB). Results indicate the dominant role of oceanic evaporation compared to terrestrial evapotranspiration, and the previously overlooked southern Indian Ocean, as a source region, is found to contribute more moisture than the well-known Arabian Sea or Bay of Bengal. Terrestrial evapotranspiration appears to be important for summer precipitation, especially in early June when moisture contribution is more than 50%. The terrestrial contribution then decreases and is generally less than 40% after late June. The Indian Ocean is the most important oceanic source before mid-July, with its largest contribution during the period of heavy precipitation, while the Pacific Ocean becomes the more important oceanic source after mid-July. To quantitatively analyze paths of moisture transport to YRB, we proposed the Trajectory Frequency Method. The most intense branch of water vapor transport to YRB stretches from the Arabian Sea through the Bay of Bengal, the Indochina Peninsula, the South China Sea, and South China. The other main transport branches are westerly moisture fluxes to the south of the Tibetan Plateau, cross-equatorial flows north of Australia, and separate branches located in the north and equatorial Pacific Ocean. Significant intraseasonal variability for these branches is presented. Additionally, the importance of the South China Sea for moisture transport to YRB, especially from the sea areas, is emphasized.

  5. Combining emission inventory and isotope ratio analyses for quantitative source apportionment of heavy metals in agricultural soil.

    Science.gov (United States)

    Chen, Lian; Zhou, Shenglu; Wu, Shaohua; Wang, Chunhui; Li, Baojie; Li, Yan; Wang, Junxiao

    2018-08-01

    Two quantitative methods (emission inventory and isotope ratio analysis) were combined to apportion the source contributions of heavy metals entering agricultural soils in the Lihe River watershed (Taihu region, east China). Source apportionment based on the emission inventory method indicated that for Cd, Cr, Cu, Pb, and Zn, the mean percentage input from atmospheric deposition was highest (62-85%), followed by irrigation (12-27%) and fertilization (1-14%). Thus, these heavy metals were derived mainly from industrial activities and traffic emissions. For Ni, the combined percentage input from irrigation and fertilization was approximately 20% higher than that from atmospheric deposition, indicating that Ni was mainly derived from agricultural activities. Based on isotope ratio analysis, atmospheric deposition accounted for 57-93% of Pb entering the soil, with a mean value of 69.3%, indicating that this was the major source of Pb entering soil in the study area. The mean contributions of irrigation and fertilization to Pb pollution of the soil ranged from 0% to 10%, indicating that they played only a marginal role. Overall, the results obtained using the two methods were similar. This study provides a reliable approach for source apportionment of heavy metals entering agricultural soils in the study area, and clearly has potential application for future studies in other regions. Copyright © 2018 Elsevier Ltd. All rights reserved.
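
    Isotope-ratio apportionment of Pb is typically a two-end-member mixing calculation; assuming that form (the 206Pb/207Pb end-member ratios below are placeholders, not the study's values):

        def source_fraction(r_sample, r_source, r_background):
            """Fraction of Pb attributable to the source end-member from isotope ratios."""
            return (r_sample - r_background) / (r_source - r_background)

        # Example: soil sample between atmospheric-deposition and parent-material end-members.
        f_atm = source_fraction(1.155, 1.120, 1.200)
        print(f"atmospheric deposition share: {f_atm:.0%}")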

  6. Electromagnetic Pulse (EMP) from the Magnetic Bubble Source as a Discriminator of Underground Nuclear Explosions, Including Cavity Decoupling

    Science.gov (United States)

    2011-02-01

    [No abstract available. The indexed text contains only report fragments: recommendations to coordinate with planned shock physics experiments (SPE), to design/develop a very low frequency (VLF)/ELF pulsar to serve as an underground calibration source, and to carry out underground (in tunnels, etc.) pulsar calibration experiments, followed by acronym-appendix and distribution-list remnants (e.g., CORRTEX: Continuous Reflectometry...).]

  7. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman speaking without bias or prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; enhanced engineered waste disposal not being presented to the public accurately; numerous sources of disinformation regarding low-level and high-level radiation; the elusive nature of the scientific community; the resources of Federal and State health agencies to address comparative risk; and regulatory agencies speaking out without the support of the scientific community.

  8. Tracers application method for the quantitative determination of the source of oxygenic inclusions in steel

    International Nuclear Information System (INIS)

    Rewienska-Kosciukowa, B.; Dalecki, W.; Michalik, J.S.

    1976-01-01

    The rationale for, and feasibility of, applying radioactive and nonradioactive isotopic tracers to investigations of the origin of oxygenic nonmetallic inclusions are presented. The methods discussed address investigations such as the origin of exogenous inclusions, which enter the steel from external sources (refractory lining, slag), and of endogenous inclusions formed during steel deoxidation. The choice of tracers for refractory materials and the subsequent investigations to determine the origin of nonmetallic inclusions are discussed, as is the question of so-called isotopic replacement tracers for the main steel deoxidizing agents. A criterion for identifying oxygenic inclusions formed during steel deoxidation is also discussed. Several results of laboratory and industrial investigations, together with examples of the application of these methods at industrial scale, are presented. (author)

  9. Quantitative analysis of biological responses to low dose-rate γ-radiation, including dose, irradiation time, and dose-rate

    International Nuclear Information System (INIS)

    Magae, J.; Furukawa, C.; Kawakami, Y.; Hoshi, Y.; Ogata, H.

    2003-01-01

    Full text: Because biological responses to radiation are complex processes that depend on irradiation time as well as total dose, dose, dose-rate and irradiation time must be considered simultaneously to predict the risk of low dose-rate irradiation. In this study, we analyzed the quantitative relationships among dose, irradiation time and dose-rate, using chromosomal breakage and proliferation inhibition in human cells. For evaluation of chromosome breakage we assessed micronuclei induced by radiation. U2OS cells, a human osteosarcoma cell line, were exposed to gamma rays in an irradiation room containing 50,000 Ci of 60Co. After irradiation, the cells were cultured for 24 h in the presence of cytochalasin B to block cytokinesis; cytoplasm and nucleus were stained with DAPI and propidium iodide, and the number of binuclear cells bearing micronuclei was determined by fluorescence microscopy. For proliferation inhibition, cells were cultured for 48 h after irradiation and pulsed with [3H]thymidine for 4 h before harvesting. The dose-rate in the irradiation room was measured with a photoluminescence dosimeter. While irradiation times of less than 24 h did not affect the dose-response curves for either biological response, the curves were markedly attenuated as exposure time increased beyond 7 days. Both responses depended on dose-rate rather than dose when cells were irradiated for 30 days. Moreover, the percentage of micronucleus-forming cells cultured continuously for more than 60 days at a constant dose-rate gradually decreased in spite of the accumulating total dose. These results suggest that biological responses at low dose-rate are strongly affected by exposure time, that they depend on dose-rate rather than total dose in the case of long-term irradiation, and that cells become resistant to radiation after continuous irradiation for 2 months. The effects of irradiation time and dose-rate must therefore be adequately included to evaluate risk

  10. Tissue sources of serum alkaline phosphatase in 34 hyperthyroid cats: a qualitative and quantitative study.

    Science.gov (United States)

    Foster, D J; Thoday, K L

    2000-02-01

    The concentration of serum alkaline phosphatase (SALP) is commonly elevated in hyperthyroid cats. Agarose gel electrophoresis, in tris-barbital-sodium barbital buffer, with and without the separation enhancer neuraminidase, was used to investigate the sources of the constituent isoenzymes of SALP in serum samples from 34 hyperthyroid cats, comparing them to sera from five healthy cats and to tissue homogenates from liver, kidney, bone and duodenum. Contrary to previous reports, treatment of serum with neuraminidase made differentiation of the various isoenzymes more difficult to achieve. A single band corresponding to the liver isoenzyme (LALP) was found in 100 per cent of healthy cats. Eighty-eight per cent of the hyperthyroid cats showed two bands, corresponding to the liver and bone (BALP) isoenzymes, while 12 per cent showed a LALP band alone. In hyperthyroid cats, there was a significant correlation between serum L-thyroxine concentrations and SALP concentrations. These findings suggest pathological changes in both bone and liver in most cases of feline thyrotoxicosis. Copyright 2000 Harcourt Publishers Ltd.

  11. Importance of Including the Acoustic Medium in Rooms on the Transmission Path between Source and Receiver Rooms within a Building

    DEFF Research Database (Denmark)

    Andersen, Lars; Kirkegaard, Poul Henning; Dickow, Kristoffer Ahrens

    2011-01-01

    Low-frequency noise is a potential nuisance to inhabitants in lightweight building structures. Hence, development of efficient and accurate methods for prediction of noise in such buildings is important. The aim of this paper is to assess the necessity of including the acoustic medium in rooms along...

  12. Interlaboratory comparison of three microbial source tracking quantitative polymerase chain reaction (qPCR) assays from fecal-source and environmental samples

    Science.gov (United States)

    Stelzer, Erin A.; Strickler, Kriston M.; Schill, William B.

    2012-01-01

    During summer and early fall 2010, 15 river samples and 6 fecal-source samples were collected in West Virginia. These samples were analyzed by three laboratories for three microbial source tracking (MST) markers: AllBac, a general fecal indicator; BacHum, a human-associated fecal indicator; and BoBac, a ruminant-associated fecal indicator. MST markers were analyzed by means of the quantitative polymerase chain reaction (qPCR) method. The aim was to assess interlaboratory precision when the three laboratories used the same MST marker and shared deoxyribonucleic acid (DNA) extracts of the samples, but different equipment, reagents, and analyst experience levels. The term assay refers to both the markers and the procedure differences listed above. Interlaboratory precision was best for all three MST assays when using the geometric mean absolute relative percent difference (ARPD) and Friedman's statistical test as a measure of interlaboratory precision. Adjustment factors (one for each MST assay) were calculated using results from fecal-source samples analyzed by all three laboratories and applied retrospectively to sample concentrations to account for differences in qPCR results among labs using different standards and procedures. Following the application of adjustment factors to qPCR results, ARPDs were lower; however, statistically significant differences between labs were still observed for the BacHum and BoBac assays. This was a small study and two of the MST assays had 52 percent of samples with concentrations at or below the limit of accurate quantification; hence, more testing could be done to determine if the adjustment factors would work better if the majority of sample concentrations were above the quantification limit.
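
    The ARPD statistic used above to gauge interlaboratory precision is straightforward to reproduce; a sketch with invented marker concentrations:

        import numpy as np

        def arpd(a, b):
            """Absolute relative percent difference between paired lab measurements."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            return np.abs(a - b) / ((a + b) / 2.0) * 100.0

        lab1 = np.array([3.2e4, 1.1e5, 8.7e3])  # marker copies per 100 mL (illustrative)
        lab2 = np.array([2.5e4, 1.6e5, 9.9e3])
        print(np.exp(np.mean(np.log(arpd(lab1, lab2)))))  # geometric mean ARPD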

  13. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging

    International Nuclear Information System (INIS)

    Rong, Xing; Du, Yong; Frey, Eric C

    2012-01-01

    Quantitative Yttrium-90 (90Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of the 90Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for 90Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, for situations such as 90Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction, there will be bias in the activity estimates. In 90Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Variance alone is thus not a complete measure of the reliability of the estimates, and hence not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative 90Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the energy absorbed from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were derived for

  14. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging.

    Science.gov (United States)

    Rong, Xing; Du, Yong; Frey, Eric C

    2012-06-21

    Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of the (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters, where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, for situations such as (90)Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction, there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this is especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Variance alone is thus not a complete measure of the reliability of the estimates, and hence not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the energy absorbed from the radiation per unit mass of tissue, in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were
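
    The proposed figure of merit combines the bias caused by model-mismatch with the variance of the activity estimates; per volume of interest, RMSE = sqrt(bias^2 + variance), aggregated with mass weights. A minimal sketch assuming that aggregation form (inputs are illustrative):

        import numpy as np

        def mass_weighted_rmse(bias, std, mass):
            """Mass-weighted RMSE over VOIs; each VOI's RMSE is sqrt(bias^2 + std^2)."""
            bias, std, mass = map(np.asarray, (bias, std, mass))
            rmse = np.sqrt(bias**2 + std**2)
            return np.sum(mass * rmse) / np.sum(mass)

        print(mass_weighted_rmse(bias=[0.05, 0.12], std=[0.03, 0.07], mass=[1.2, 0.4]))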

  15. A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model

    International Nuclear Information System (INIS)

    Hiatt, Jessica R.; Davis, Stephen D.; Rivard, Mark J.

    2015-01-01

    Purpose: The model S700 Axxent electronic brachytherapy source by Xoft, Inc., was characterized by Rivard et al. in 2006. Since then, the source design was modified to include a new insert at the source tip. Current study objectives were to establish an accurate source model for simulation purposes, dosimetrically characterize the new source and obtain its TG-43 brachytherapy dosimetry parameters, and determine dose differences between the original simulation model and the current model S700 source design. Methods: Design information from measurements of dissected model S700 sources and from vendor-supplied CAD drawings was used to aid establishment of an updated Monte Carlo source model, which included the complex-shaped plastic source-centering insert intended to promote water flow for cooling the source anode. These data were used to create a model for subsequent radiation transport simulations in a water phantom. Compared to the 2006 simulation geometry, the influence of volume averaging close to the source was substantially reduced. A track-length estimator was used to evaluate collision kerma as a function of radial distance and polar angle for determination of TG-43 dosimetry parameters. Results for the 50 kV source were determined every 0.1 cm from 0.3 to 15 cm and every 1° from 0° to 180°. Photon spectra in water with 0.1 keV resolution were also obtained from 0.5 to 15 cm and polar angles from 0° to 165°. Simulations were run for 10^10 histories, resulting in statistical uncertainties on the transverse plane of 0.04% at r = 1 cm and 0.06% at r = 5 cm. Results: The dose-rate distribution ratio for the model S700 source as compared to the 2006 model exceeded unity by more than 5% for roughly one quarter of the solid angle surrounding the source, i.e., θ ≥ 120°. The radial dose function diminished in a similar manner as for an 125I seed, with values of 1.434, 0.636, 0.283, and 0.0975 at 0.5, 2, 5, and 10 cm, respectively. The radial dose function

  16. A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model

    Energy Technology Data Exchange (ETDEWEB)

    Hiatt, Jessica R. [Department of Radiation Oncology, Rhode Island Hospital, The Warren Alpert Medical School of Brown University, Providence, Rhode Island 02903 (United States); Davis, Stephen D. [Department of Medical Physics, McGill University Health Centre, Montreal, Quebec H3G 1A4 (Canada); Rivard, Mark J., E-mail: mark.j.rivard@gmail.com [Department of Radiation Oncology, Tufts University School of Medicine, Boston, Massachusetts 02111 (United States)

    2015-06-15

    Purpose: The model S700 Axxent electronic brachytherapy source by Xoft, Inc., was characterized by Rivard et al. in 2006. Since then, the source design was modified to include a new insert at the source tip. Current study objectives were to establish an accurate source model for simulation purposes, dosimetrically characterize the new source and obtain its TG-43 brachytherapy dosimetry parameters, and determine dose differences between the original simulation model and the current model S700 source design. Methods: Design information from measurements of dissected model S700 sources and from vendor-supplied CAD drawings was used to aid establishment of an updated Monte Carlo source model, which included the complex-shaped plastic source-centering insert intended to promote water flow for cooling the source anode. These data were used to create a model for subsequent radiation transport simulations in a water phantom. Compared to the 2006 simulation geometry, the influence of volume averaging close to the source was substantially reduced. A track-length estimator was used to evaluate collision kerma as a function of radial distance and polar angle for determination of TG-43 dosimetry parameters. Results for the 50 kV source were determined every 0.1 cm from 0.3 to 15 cm and every 1° from 0° to 180°. Photon spectra in water with 0.1 keV resolution were also obtained from 0.5 to 15 cm and polar angles from 0° to 165°. Simulations were run for 10{sup 10} histories, resulting in statistical uncertainties on the transverse plane of 0.04% at r = 1 cm and 0.06% at r = 5 cm. Results: The dose-rate distribution ratio for the model S700 source as compared to the 2006 model exceeded unity by more than 5% for roughly one quarter of the solid angle surrounding the source, i.e., θ ≥ 120°. The radial dose function diminished in a similar manner as for an {sup 125}I seed, with values of 1.434, 0.636, 0.283, and 0.0975 at 0.5, 2, 5, and 10 cm, respectively. The radial dose

  17. A revised dosimetric characterization of the model S700 electronic brachytherapy source containing an anode-centering plastic insert and other components not included in the 2006 model.

    Science.gov (United States)

    Hiatt, Jessica R; Davis, Stephen D; Rivard, Mark J

    2015-06-01

    The model S700 Axxent electronic brachytherapy source by Xoft, Inc., was characterized by Rivard et al. in 2006. Since then, the source design was modified to include a new insert at the source tip. Current study objectives were to establish an accurate source model for simulation purposes, dosimetrically characterize the new source and obtain its TG-43 brachytherapy dosimetry parameters, and determine dose differences between the original simulation model and the current model S700 source design. Design information from measurements of dissected model S700 sources and from vendor-supplied CAD drawings was used to aid establishment of an updated Monte Carlo source model, which included the complex-shaped plastic source-centering insert intended to promote water flow for cooling the source anode. These data were used to create a model for subsequent radiation transport simulations in a water phantom. Compared to the 2006 simulation geometry, the influence of volume averaging close to the source was substantially reduced. A track-length estimator was used to evaluate collision kerma as a function of radial distance and polar angle for determination of TG-43 dosimetry parameters. Results for the 50 kV source were determined every 0.1 cm from 0.3 to 15 cm and every 1° from 0° to 180°. Photon spectra in water with 0.1 keV resolution were also obtained from 0.5 to 15 cm and polar angles from 0° to 165°. Simulations were run for 10(10) histories, resulting in statistical uncertainties on the transverse plane of 0.04% at r = 1 cm and 0.06% at r = 5 cm. The dose-rate distribution ratio for the model S700 source as compared to the 2006 model exceeded unity by more than 5% for roughly one quarter of the solid angle surrounding the source, i.e., θ ≥ 120°. The radial dose function diminished in a similar manner as for an (125)I seed, with values of 1.434, 0.636, 0.283, and 0.0975 at 0.5, 2, 5, and 10 cm, respectively. The radial dose function ratio between the current
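
    The tabulated radial dose function values quoted in these records can be interpolated for intermediate distances; a small sketch (log-linear interpolation is an assumption here, though it is common practice in TG-43 implementations):

        import numpy as np

        r_cm = np.array([0.5, 2.0, 5.0, 10.0])        # distances from the record
        g_r = np.array([1.434, 0.636, 0.283, 0.0975])

        def radial_dose_function(r):
            """Interpolate g(r) log-linearly between the tabulated points."""
            return float(np.exp(np.interp(r, r_cm, np.log(g_r))))

        print(radial_dose_function(3.0))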

  18. Quantitative analysis of Internet television and video (WebTV): A study of formats, content, and source

    Directory of Open Access Journals (Sweden)

    José Borja ARJONA MARTÍN

    2014-07-01

    Full Text Available Due to the significant increase over the last five years in audiovisual content distribution over the web, this paper focuses on a study aimed at the description and classification of a wide sample of audiovisual initiatives accessed by means of the World Wide Web. The purpose of this study is to promote debate concerning the different names of these incipient media, as well as their categorization and description, so as to provide an organised view of the WebTV phenomenon. An analysis of formats and content is carried out on the basis of quantitative techniques in order to propose a categorization typology. These formats and content are studied under three key variables: "Content", "Source" and "Domain .tv". "Content" helps define the programmatic lines of the study sample; "Source" refers to the origin of a particular item of study (native WebTV, or WebTV representative of a conventional medium); and "Domain .tv" specifies the proportion of case studies hosted under the .tv domain. The results offer researchers and professionals a comprehensive description of the models currently adopted in the field of video and television on the net.

  19. Field Measurements of Trace Gases and Aerosols Emitted by Undersampled Combustion Sources Including Wood and Dung Cooking Fires, Garbage and Crop Residue Burning, and Indonesian Peat Fires

    Science.gov (United States)

    Stockwell, C.; Jayarathne, T. S.; Goetz, D.; Simpson, I. J.; Selimovic, V.; Bhave, P.; Blake, D. R.; Cochrane, M. A.; Ryan, K. C.; Putra, E. I.; Saharjo, B.; Stone, E. A.; DeCarlo, P. F.; Yokelson, R. J.

    2017-12-01

    Field measurements were conducted in Nepal and in the Indonesian province of Central Kalimantan to improve characterization of trace gases and aerosols emitted by undersampled combustion sources. The sources targeted included cooking with a variety of stoves, garbage burning, crop residue burning, and authentic peat fires. Trace gas and aerosol emissions were studied using a land-based Fourier transform infrared spectrometer, whole air sampling, photoacoustic extinctiometers (405 and 870 nm), and filter samples that were analyzed off-line. These measurements were used to calculate fuel-based emission factors (EFs) for up to 90 gases, PM2.5, and PM2.5 constituents. The aerosol optical data measured included EFs for the scattering and absorption coefficients, the single scattering albedo (at 870 and 405 nm), as well as the absorption Ångström exponent. The emissions varied significantly by source, although light absorption by both brown and black carbon (BrC and BC, respectively) was important for all non-peat sources. For authentic peat combustion, the emissions of BC were negligible and absorption was dominated by organic aerosol. The field results from peat burning were in reasonable agreement with recent lab measurements of smoldering Kalimantan peat and compare well to the limited data available from other field studies. The EFs can be used with estimates of fuel consumption to improve regional emissions inventories and assessments of the climate and health impacts of these undersampled sources.

  20. Performance of two quantitative PCR methods for microbial source tracking of human sewage and implications for microbial risk assessment in recreational waters

    Science.gov (United States)

    Before new, rapid quantitative PCR (qPCR) methods for recreational water quality assessment and microbial source tracking (MST) can be useful in a regulatory context, an understanding of the ability of the method to detect a DNA target (marker) when the contaminant source has been...

  1. Quantitative and Qualitative Sources of Affect: How Unexpectedness and Valence Relate to Pleasantness and Preference. Technical Report No. 293.

    Science.gov (United States)

    Iran-Nejad, Asghar; Ortony, Andrew

    Optimal-level theories maintain that the quality of affect is a function of a quantitative arousal potential dimension. An alternative view is that the quantitative dimension merely modulates preexisting qualitative properties and is therefore only responsible for changes in the degree of affect. Thus, the quality of affect, whether it is positive…

  2. Including pathogen risk in life cycle assessment of wastewater management. 2. Quantitative comparison of pathogen risk to other impacts on human health.

    Science.gov (United States)

    Heimersson, Sara; Harder, Robin; Peters, Gregory M; Svanström, Magdalena

    2014-08-19

    Resource recovery from sewage sludge has the potential to save natural resources, but the potential risks connected to human exposure to heavy metals, organic micropollutants, and pathogenic microorganisms attract stakeholder concern. The purpose of the presented study was to include pathogen risks to human health in life cycle assessment (LCA) of wastewater and sludge management systems, as this is commonly omitted from LCAs due to methodological limitations. Part 1 of this article series estimated the overall pathogen risk for such a system with agricultural use of the sludge, in a way that enables the results to be integrated in LCA. This article (part 2) presents a full LCA for two model systems (with agricultural utilization or incineration of sludge) to reveal the relative importance of pathogen risk in relation to other potential impacts on human health. The study showed that, for both model systems, pathogen risk can constitute an important part (in this study up to 20%) of the total life cycle impacts on human health (expressed in disability adjusted life years) which include other important impacts such as human toxicity potential, global warming potential, and photochemical oxidant formation potential.

  3. Repeatability, interocular correlation and agreement of quantitative swept-source optical coherence tomography angiography macular metrics in healthy subjects.

    Science.gov (United States)

    Fang, Danqi; Tang, Fang Yao; Huang, Haifan; Cheung, Carol Y; Chen, Haoyu

    2018-05-29

    To investigate the repeatability, interocular correlation and agreement of quantitative swept-source optical coherence tomography angiography (SS-OCTA) metrics in healthy subjects. Thirty-three healthy normal subjects were enrolled. The macula was scanned four times by an SS-OCTA system using the 3 mm×3 mm mode. The superficial capillary map images were analysed using a MATLAB program. A series of parameters were measured: foveal avascular zone (FAZ) area, FAZ perimeter, FAZ circularity, parafoveal vessel density, fractal dimension and vessel diameter index (VDI). The repeatability of the four scans was determined by the intraclass correlation coefficient (ICC). The averaged results were then analysed for intereye difference, correlation and agreement using the paired t-test, Pearson's correlation coefficient (r), ICC and Bland-Altman plots. The repeatability assessment of the macular metrics yielded high ICC values (ranging from 0.853 to 0.996). There was no statistically significant difference in the OCTA metrics between the two eyes. FAZ area (ICC=0.961, r=0.929) and FAZ perimeter (ICC=0.884, r=0.802) showed excellent binocular correlation. Fractal dimension (ICC=0.732, r=0.578) and VDI (ICC=0.707, r=0.547) showed moderate binocular correlation, while parafoveal vessel density had poor binocular correlation. Bland-Altman plots showed that the range of agreement was from -0.0763 to 0.0954 mm² for FAZ area and from -0.0491 to 0.1136 for parafoveal vessel density. The macular metrics obtained using SS-OCTA showed excellent repeatability in healthy subjects. We found high intereye correlation in FAZ area and perimeter, moderate correlation in fractal dimension and VDI, while vessel density had poor correlation in normal healthy subjects. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
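
    The Bland-Altman limits of agreement reported above follow directly from the mean and standard deviation of the paired intereye differences; a minimal sketch with invented FAZ-area readings:

        import numpy as np

        def bland_altman_limits(x, y):
            """Return (bias, lower LoA, upper LoA) = mean difference +/- 1.96 SD."""
            d = np.asarray(x, float) - np.asarray(y, float)
            bias, sd = d.mean(), d.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        right = [0.31, 0.28, 0.35, 0.30]  # FAZ area, mm^2 (illustrative)
        left = [0.30, 0.29, 0.33, 0.32]
        print(bland_altman_limits(right, left))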

  4. Monte Carlo calculations and neutron spectrometry in quantitative prompt gamma neutron activation analysis (PGNAA) of bulk samples using an isotopic neutron source

    International Nuclear Information System (INIS)

    Spyrou, N.M.; Awotwi-Pratt, J.B.; Williams, A.M.

    2004-01-01

    An activation analysis facility based on an isotopic neutron source (185 GBq 241Am/Be), which can perform both prompt and cyclic activation analysis on bulk samples, has been used for more than 20 years in many applications, including 'in vivo' activation analysis and the determination of the composition of bio-environmental samples such as landfill waste and coal. Although the comparator method is often employed, because of the variety in shape, size and elemental composition of these bulk samples it is often difficult and time-consuming to construct appropriate comparator samples for reference. One obvious problem is the distribution and energy of the neutron flux in these bulk and comparator samples. In recent years, we attempted to adopt the absolute method based on a monostandard and to make calculations using a Monte Carlo code (MCNP4C2) to explore this further. In particular, a model of the irradiation facility was made using the MCNP4C2 code in order to investigate the factors contributing to the quantitative determination of elemental concentrations through prompt gamma neutron activation analysis (PGNAA) and, most importantly, to estimate how the neutron energy spectrum and neutron dose vary with penetration depth into the sample. The simulation is compared against the scattered and transmitted neutron energy spectra determined experimentally and empirically using a portable neutron spectrometry system. (author)

  5. Quantitative hepatic CT perfusion measurement: Comparison of Couinaud's hepatic segments with dual-source 128-slice CT

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xuan [The Department of Radiology, Peking Union Medical College Hospital, Dongcheng District, Beijing, 100730 (China); Xue, Hua-dan, E-mail: bjdanna95@hotmail.com [The Department of Radiology, Peking Union Medical College Hospital, Dongcheng District, Beijing, 100730 (China); Jin, Zheng-yu, E-mail: jin_zhengyu@163.com [The Department of Radiology, Peking Union Medical College Hospital, Dongcheng District, Beijing, 100730 (China); Su, Bai-yan; Li, Zhuo; Sun, Hao; Chen, Yu; Liu, Wei [The Department of Radiology, Peking Union Medical College Hospital, Dongcheng District, Beijing, 100730 (China)

    2013-02-15

    Purpose: To compare the quantitative liver computed tomography perfusion (CTP) differences among eight hepatic segments. Materials and methods: This retrospective study was based on 72 acquired upper abdomen CTP scans for detecting suspected pancreas tumor. Patients with primary or metastatic liver tumor, any focal liver lesions except simple cyst (<3 cm in diameter), history of liver operation or splenectomy, evidence of liver cirrhosis or invasion of portal vein were excluded. The final analysis included 50 patients (M:F = 21:29, mean age = 43.2 years, 15–76 years). Arterial liver perfusion (ALP), portal-venous perfusion (PVP), total hepatic perfusion (THP = ALP + PVP), and hepatic perfusion index (HPI) of each hepatic segment were calculated and compared by means of one-way analysis of variance (ANOVA) and the Bonferonni correction method. Results: Compared to hepatic segments 5, 6, 7 and 8, segments 2 and 3 showed a tendency of higher ALPs, lower PVPs, and higher HPIs, most of which were statistically significant (p < 0.05). Hepatic segments 1 and 4 had higher mean values of ALP and HPI and lower mean values of PVP than segments 5, 6, 7 and 8 as well, although no significant differences were detected except for ALP and HPI for liver segments 1 and 7 (p = 0.001 and 0.035 respectively), and ALP for liver segments 1 and 5 (p = 0.039). Higher ALP and HPI were showed in hepatic segment 3 compared to segment 4 (p = 0.000 and 0.000 respectively). No significant differences were found for THP among eight segments. Conclusions: Intra-hepatic perfusion differences exist in normal hepatic parenchyma especially between lateral sector (segments 2 and 3) and right lobe (segments 5, 6, 7 and 8). This might have potential clinical significance in liver-perfusion-related protocol design and result analysis.
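
    The perfusion quantities in this record combine as THP = ALP + PVP; assuming HPI is the arterial fraction of total perfusion (a common convention, not confirmed by the record):

        def hepatic_perfusion_index(alp, pvp):
            """HPI = ALP / (ALP + PVP); inputs in mL/min per 100 mL of tissue."""
            return alp / (alp + pvp)

        # Illustrative values for a lateral-sector versus a right-lobe segment.
        print(hepatic_perfusion_index(25.0, 90.0), hepatic_perfusion_index(18.0, 110.0))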

  6. Geologic sources and concentrations of selenium in the West-Central Denver Basin, including the Toll Gate Creek watershed, Aurora, Colorado, 2003-2007

    Science.gov (United States)

    Paschke, Suzanne S.; Walton-Day, Katherine; Beck, Jennifer A.; Webbers, Ank; Dupree, Jean A.

    2014-01-01

    Toll Gate Creek, in the west-central part of the Denver Basin, is a perennial stream in which concentrations of dissolved selenium have consistently exceeded the Colorado aquatic-life standard of 4.6 micrograms per liter. Recent studies of selenium in Toll Gate Creek identified the Denver lignite zone of the non-marine Cretaceous to Tertiary-aged (Paleocene) Denver Formation underlying the watershed as the geologic source of dissolved selenium to shallow ground-water and surface water. Previous work led to this study by the U.S. Geological Survey, in cooperation with the City of Aurora Utilities Department, which investigated geologic sources of selenium and selenium concentrations in the watershed. This report documents the occurrence of selenium-bearing rocks and groundwater within the Cretaceous- to Tertiary-aged Denver Formation in the west-central part of the Denver Basin, including the Toll Gate Creek watershed. The report presents background information on geochemical processes controlling selenium concentrations in the aquatic environment and possible geologic sources of selenium; the hydrogeologic setting of the watershed; selenium results from groundwater-sampling programs; and chemical analyses of solids samples as evidence that weathering of the Denver Formation is a geologic source of selenium to groundwater and surface water in the west-central part of the Denver Basin, including Toll Gate Creek. Analyses of water samples collected from 61 water-table wells in 2003 and from 19 water-table wells in 2007 indicate dissolved selenium concentrations in groundwater in the west-central Denver Basin frequently exceeded the Colorado aquatic-life standard and in some locations exceeded the primary drinking-water standard of 50 micrograms per liter. The greatest selenium concentrations were associated with oxidized groundwater samples from wells completed in bedrock materials. Selenium analysis of geologic core samples indicates that total selenium

  7. An innovative expression model of human health risk based on the quantitative analysis of soil metals sources contribution in different spatial scales.

    Science.gov (United States)

    Zhang, Yimei; Li, Shuai; Wang, Fei; Chen, Zhuang; Chen, Jie; Wang, Liqun

    2018-09-01

    Toxicity of heavy metals from industrialization poses critical concerns, and analysis of the sources associated with potential human health risks is of unique significance. Assessing the human health risk of pollution sources (factored health risk) concurrently for the whole region and its sub-regions can provide more instructive information for protecting specific potential victims. In this research, we establish a new expression model of human health risk based on quantitative analysis of source contributions at different spatial scales. Larger-scale grids and their spatial codes are used to initially identify the level of pollution risk, the type of pollution source, and the sensitive population at high risk. Smaller-scale grids and their spatial codes are used to identify the contribution of the various pollution sources to each sub-region (larger grid) and to assess the health risks posed by each source in each sub-region. The results of the case study show that, for children (a sensitive population, with school and residential areas as their major regions of activity), the major pollution sources are the abandoned lead-acid battery plant (ALP), traffic emissions and agricultural activity. The models and results of this research provide effective spatial information and a useful model for quantifying the hazards that source categories pose to human health at complex industrial sites in the future. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Data Acceptance Criteria for Standardized Human-Associated Fecal Source Identification Quantitative Real-Time PCR Methods

    Science.gov (United States)

    There is a growing interest in the application of human-associated fecal sourceidentification quantitative real-time PCR (qPCR) technologies for water quality management. The transition from a research tool to a standardized protocol requires a high degree of confidence in data q...

  9. Combination of qualitative and quantitative sources of knowledge for risk assessment in the framework of possibility theory

    NARCIS (Netherlands)

    Oussalah, M.; Newby, M.J.

    2004-01-01

    This paper focuses on a representation of system reliability in the framework of possibility theory. Particularly, given a (probabilistic) quantitative knowledge pertaining to the time to failure of a system (risk function) and some qualitative knowledge about the degree of pessimism and optimism of

  10. Genotypic and phenotypic diversity of Ralstonia pickettii and Ralstonia insidiosa isolates from clinical and environmental sources including High-purity Water.

    LENUS (Irish Health Repository)

    Ryan, Michael P

    2011-08-30

    Abstract Background Ralstonia pickettii is a nosocomial infectious agent and a significant industrial contaminant. It has been found in many different environments, including clinical situations, soil and industrial high-purity water. This study compares the phenotypic and genotypic diversity of a selection of Ralstonia strains collected from a variety of sources. Results Ralstonia isolates (fifty-nine) of clinical, industrial and environmental origin were compared genotypically using (i) species-specific PCR, (ii) PCR and sequencing of the 16S-23S rRNA interspatial region (ISR), (iii) the fliC gene, (iv) RAPD and BOX-PCR, and (v) phenotypically using biochemical testing. The species-specific PCR identified fifteen of the fifty-nine designated R. pickettii isolates as actually being the closely related species R. insidiosa. PCR-ribotyping of the 16S-23S rRNA ISR indicated few major differences between the isolates. Analysis of all isolates demonstrated different banding patterns for both the RAPD and BOX primers; however, these were found not to vary significantly. Conclusions R. pickettii species isolated from wide geographic and environmental sources appear to be reasonably homogeneous based on genotypic and phenotypic characteristics. R. insidiosa can at present only be distinguished from R. pickettii using species-specific PCR. R. pickettii and R. insidiosa isolates do not differ significantly, phenotypically or genotypically, based on environmental or geographical origin.

  11. High-performance control of a three-phase voltage-source converter including feedforward compensation of the estimated load current

    International Nuclear Information System (INIS)

    Leon, Andres E.; Solsona, Jorge A.; Busada, Claudio; Chiacchiarini, Hector; Valla, Maria Ines

    2009-01-01

    In this paper a new control strategy for voltage-source converters (VSC) is introduced. The proposed strategy consists of a nonlinear feedback controller based on feedback linearization plus feedforward compensation of the estimated load current. In our proposal, an energy function and the direct-axis current are taken as outputs in order to avoid the internal dynamics. In this way, full linearization is obtained via nonlinear transformation and feedback. An estimate of the load current is fed forward to improve the performance of the whole system and to reduce the required capacitor size. This estimation allows a more rugged and cheaper implementation. The estimate is calculated using a nonlinear reduced-order observer. The proposal is validated through different tests, including performance in the presence of switching frequency effects, measurement filter delays, parameter uncertainties and disturbances in the input voltage.

  12. A dozen useful tips on how to minimise the influence of sources of error in quantitative electron paramagnetic resonance (EPR) spectroscopy-A review

    International Nuclear Information System (INIS)

    Mazur, Milan

    2006-01-01

    The principal and most important error sources in quantitative electron paramagnetic resonance (EPR) measurements arising from sample-associated factors are the influence of variation in the sample material (dielectric constant), sample size and shape, sample tube wall thickness, and sample orientation and positioning within the microwave cavity on the EPR signal intensity. Variation in these parameters can cause significant and serious errors in the primary phase of quantitative EPR analysis (i.e., data acquisition). The primary aim of this review is to provide useful suggestions, recommendations and simple procedures to minimise the influence of such primary error sources in quantitative EPR measurements. According to the literature, as well as results obtained in our EPR laboratory, the following recommendations apply to samples that are compared in quantitative EPR studies: (i) the shape of all samples should be identical; (ii) the position of the sample/reference in the cavity should be identical; (iii) a special alignment procedure for precise sample positioning within the cavity should be adopted; (iv) a special/consistent procedure for sample packing of powder material should be used; (v) the wall thickness of sample tubes should be identical; (vi) the shape and wall thickness of quartz Dewars, where used, should be identical; (vii) where possible, a double TE 104 cavity should be used in quantitative EPR spectroscopy; (viii) the dielectric properties of unknown and standard samples should be as close as possible; (ix) a sample length of less than double the cavity length should be used; (x) the optimised sample geometry for the X-band cavity is a 30 mm-long capillary with i.d. less than 1.5 mm; (xi) use of commercially distributed software for post-recording spectra manipulation is a basic necessity; and (xii) the sample and laboratory temperature should be kept constant during measurements. When the above recommendations and procedures were used

  13. Semi-quantitative and simulation analyses of effects of {gamma} rays on determination of calibration factors of PET scanners with point-like {sup 22}Na sources

    Energy Technology Data Exchange (ETDEWEB)

    Hasegawa, Tomoyuki [School of Allied Health Sciences, Kitasato University, 1-15-1, Kitasato, Minamiku, Sagamihara, Kanagawa, 252-0373 (Japan); Sato, Yasushi [National Institute of Advanced Industrial Science and Technology, 1-1-1, Umezono, Tsukuba, Ibaraki, 305-8568 (Japan); Oda, Keiichi [Tokyo Metropolitan Institute of Gerontology, 1-1, Nakamachi, Itabashi, Tokyo, 173-0022 (Japan); Wada, Yasuhiro [RIKEN Center for Molecular Imaging Science, 6-7-3, Minamimachi, Minatoshima, Chuo, Kobe, Hyogo, 650-0047 (Japan); Murayama, Hideo [National Institute of Radiological Sciences, 4-9-1, Anagawa, Inage, Chiba, 263-8555 (Japan); Yamada, Takahiro, E-mail: hasegawa@kitasato-u.ac.jp [Japan Radioisotope Association, 2-28-45, Komagome, Bunkyo-ku, Tokyo, 113-8941 (Japan)

    2011-09-21

    The uncertainty of radioactivity concentrations measured with positron emission tomography (PET) scanners ultimately depends on the uncertainty of the calibration factors. A new practical calibration scheme using point-like {sup 22}Na radioactive sources has been developed. The purpose of this study is to theoretically investigate the effects of the associated 1.275 MeV {gamma} rays on the calibration factors. The physical processes affecting the coincidence data were categorized in order to derive approximate semi-quantitative formulae. Assuming the design parameters of some typical commercial PET scanners, the effects of the {gamma} rays as relative deviations in the calibration factors were evaluated by semi-quantitative formulae and a Monte Carlo simulation. The relative deviations in the calibration factors were less than 4%, depending on the details of the PET scanners. The event losses due to rejecting multiple coincidence events of scattered {gamma} rays had the strongest effect. The results from the semi-quantitative formulae and the Monte Carlo simulation were consistent and were useful in understanding the underlying mechanisms. The deviations are considered small enough to correct on the basis of precise Monte Carlo simulation. This study thus offers an important theoretical basis for the validity of the calibration method using point-like {sup 22}Na radioactive sources.

  14. Getting to the Source: a Survey of Quantitative Data Sources Available to the Everyday Librarian: Part 1: Web Server Log Analysis

    Directory of Open Access Journals (Sweden)

    Lisa Goddard

    2007-03-01

    Full Text Available This is the first part of a two‐part article that provides a survey of data sources which are likely to be immediately available to the typical practitioner who wishes to engage in statistical analysis of collections and services within his or her own library. Part I outlines the data elements which can be extracted from web server logs, and discusses web log analysis tools. Part II looks at logs, reports, and data sources from proxy servers, resource vendors, link resolvers, federated search engines, institutional repositories, electronic reference services, and the integrated library system.
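
    Extracting the data elements the article surveys from a web server log usually begins with parsing the common/combined log format; a sketch assuming that format (the example line is invented):

        import re

        LOG_RE = re.compile(
            r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
            r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+)'
        )

        line = ('192.0.2.1 - - [10/Mar/2007:13:55:36 -0330] '
                '"GET /index.html HTTP/1.1" 200 2326')
        m = LOG_RE.match(line)
        if m:
            print(m.group('host'), m.group('path'), m.group('status'))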

  15. Comparison of PCR and quantitative real-time PCR methods for the characterization of ruminant and cattle fecal pollution sources

    Science.gov (United States)

    The state of California has mandated the preparation of a guidance document on the application of fecal source identification methods for recreational water quality management. California contains the fifth highest population of cattle in the United States, making the inclusio...

  16. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.

  17. Quantitative analysis of histone modifications: formaldehyde is a source of pathological N(6)-formyllysine that is refractory to histone deacetylases.

    Directory of Open Access Journals (Sweden)

    Bahar Edrissi

    Full Text Available Aberrant protein modifications play an important role in the pathophysiology of many human diseases, in terms of both dysfunction of physiological modifications and the formation of pathological modifications by reaction of proteins with endogenous electrophiles. Recent studies have identified a chemical homolog of lysine acetylation, N(6)-formyllysine, as an abundant modification of histone and chromatin proteins, one possible source of which is the reaction of lysine with 3'-formylphosphate residues from DNA oxidation. Using a new liquid chromatography coupled to tandem mass spectrometry method to quantify all N(6)-methyl-, -acetyl- and -formyl-lysine modifications, we now report that endogenous formaldehyde is a major source of N(6)-formyllysine and that this adduct is widespread among cellular proteins in all compartments. N(6)-formyllysine was evenly distributed among different classes of histone proteins from human TK6 cells at 1-4 modifications per 10(4) lysines, which contrasted strongly with lysine acetylation and mono-, di-, and tri-methylation levels of 1.5-380, 5-870, 0-1400, and 0-390 per 10(4) lysines, respectively. While isotope labeling studies revealed that lysine demethylation is not a source of N(6)-formyllysine in histones, formaldehyde exposure was observed to cause a dose-dependent increase in N(6)-formyllysine, with use of [(13)C,(2)H(2)]-formaldehyde revealing unchanged levels of adducts derived from endogenous sources. Inhibitors of class I and class II histone deacetylases did not affect the levels of N(6)-formyllysine in TK6 cells, and the class III histone deacetylase, SIRT1, had minimal activity (<10%) with a peptide substrate containing the formyl adduct. These data suggest that N(6)-formyllysine is refractory to removal by histone deacetylases, which supports the idea that this abundant protein modification could interfere with normal regulation of gene expression if it arises at conserved sites of physiological protein secondary

  18. Development of an open source software of quantitative analysis for radionuclide determination by gamma-ray spectrometry using semiconductor detectors

    International Nuclear Information System (INIS)

    Maduar, Marcelo Francis

    2010-01-01

    Radioactivity quantification of gamma-ray emitting radionuclides in samples measured by HPGe gamma-ray spectrometry relies on the analysis of the photopeaks present in the spectra, especially on the accurate determination of their net areas. Such a task is usually performed with the aid of proprietary software tools. This work presents a methodology, algorithm descriptions and an open-source application, called OpenGamma, for peak search and analysis in order to obtain the relevant peak parameters and radionuclide activities. The computational implementation is released entirely under an open-source license for the main code, with the use of open software packages for interface design and mathematical libraries. The peak-search procedure follows a three-step approach. First, a preliminary search is done using the second-difference method, which generates a derived spectrum in order to find candidate peaks. In the second step, the experimental peak widths are assessed, and well-formed, isolated peaks are chosen to obtain a FWHM vs. channel relationship by application of the Levenberg-Marquardt minimization method for non-linear fitting. Lastly, regions of the spectrum with grouped peaks are marked and a non-linear fit is again applied to each region to obtain baseline and photopeak terms; from these terms, peak net areas are then assessed. (author)
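
    The second-difference peak search described above flags channels where a photopeak produces a strongly negative second difference relative to its Poisson error; a sketch of that test (the smoothing-free form and the threshold are assumptions, not necessarily OpenGamma's choices):

        import numpy as np

        def second_difference_peaks(counts, k_sigma=3.0):
            """Return candidate peak channels from a gamma-ray spectrum."""
            c = np.asarray(counts, dtype=float)
            sd = c[:-2] - 2.0 * c[1:-1] + c[2:]               # second difference
            sigma = np.sqrt(c[:-2] + 4.0 * c[1:-1] + c[2:])   # Poisson error of sd
            return np.where(sd < -k_sigma * sigma)[0] + 1     # center channels

        spectrum = np.full(100, 50.0)
        spectrum[48:53] += [30, 120, 200, 120, 30]            # synthetic photopeak
        print(second_difference_peaks(spectrum))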

  19. Quantitative assessment of source contributions to PM2.5 on the west coast of Peninsular Malaysia to determine the burden of Indonesian peatland fire

    Science.gov (United States)

    Fujii, Yusuke; Tohno, Susumu; Amil, Norhaniza; Latif, Mohd Talib

    2017-12-01

    Almost every dry season, peatland fires occur on Sumatra and Kalimantan islands. Dense smoke haze from Indonesian peatland fires (IPFs) affects health, visibility, transport and regional climate in Southeast Asian countries such as Indonesia, Malaysia, and Singapore. Quantitative knowledge of the IPF source contribution to ambient aerosols in Southeast Asia (SEA) is useful for making appropriate suggestions to policy makers to mitigate IPF-induced haze pollution. However, this quantitative contribution remains unclear. In this study, the source contributions to PM2.5 were determined with the Positive Matrix Factorization (PMF) model using annual comprehensive observation data at Petaling Jaya on the west coast of Peninsular Malaysia, which is downwind of the IPF areas on Sumatra Island during the dry (southwest monsoon: June-September) season. The average PM2.5 mass concentration over the whole sampling period (Aug 2011-Jul 2012), based on the PMF and chemical mass closure models, was determined as 20-21 μg m-3. Over the whole sampling period, IPF contributed (on average) 6.1-7.0 μg m-3 to the PM2.5, or ∼30% of the retrieved PM2.5 concentration. In particular, the PM2.5 was dominantly sourced from IPF during the southwest monsoon season (51-55% of the total PM2.5 concentration on average). Thus, reducing the IPF burden on PM2.5 levels would drastically improve air quality (especially during the southwest monsoon season) around the west coast of Peninsular Malaysia.
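
    PMF factorizes a samples-by-species concentration matrix into non-negative factor contributions and chemical profiles. As a rough, unweighted stand-in for the EPA PMF model used in the study, scikit-learn's NMF can illustrate the decomposition (fake data; a sketch only):

        import numpy as np
        from sklearn.decomposition import NMF

        X = np.abs(np.random.default_rng(0).normal(5, 2, size=(120, 15)))  # fake PM2.5 data
        model = NMF(n_components=4, init='nndsvda', max_iter=500, random_state=0)
        G = model.fit_transform(X)  # factor contributions per sample
        F = model.components_       # chemical profile of each factor
        print(G.shape, F.shape)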

  20. The Council of Ministers decree on the conditions for bringing nuclear materials, radioactive sources and devices containing such sources into the Polish customs area, taking them out of the Polish customs area, and transiting them through this area

    International Nuclear Information System (INIS)

    Miller, L.

    2002-01-01

    The decree sets out the conditions for bringing nuclear materials, radioactive sources and devices containing such sources into the Polish customs area, taking them out of the Polish customs area, and transiting them through this area

  1. Quantitative x-ray absorption imaging with a broadband source: application to high-intensity discharge lamps

    Energy Technology Data Exchange (ETDEWEB)

    Curry, J J [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)], E-mail: jjcurry@nist.gov

    2008-07-21

    The case of x-ray absorption imaging in which the x-ray source is broadband and the detector does not provide spectral resolution is analysed. The specific motivation is observation of the Hg vapour distribution in high-intensity discharge (HID) lamps. When absorption by the vapour is small, the problem can be couched accurately in terms of a mean absorption cross section averaged over the x-ray spectral distribution, weighted by the energy-dependent response of the detector. The method is tested against a Au foil standard and then applied to Hg. The mean absorption cross section for Hg is calculated for a Ag-anode x-ray tube at accelerating voltages of 25, 30 and 35 kV, and for HIDs in fused silica or polycrystalline alumina arc tubes.
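
    The weighted-mean cross section at the heart of this record can be written down from the description alone. In LaTeX notation, a hedged reconstruction (the symbols S(E) for the source spectral distribution, D(E) for the detector response and N for the absorber column density are introduced here for illustration and are not necessarily the paper's):

      \bar{\sigma} = \frac{\int \sigma(E)\, S(E)\, D(E)\, \mathrm{d}E}{\int S(E)\, D(E)\, \mathrm{d}E},
      \qquad
      \frac{I}{I_0} \approx 1 - \bar{\sigma} N \quad (\bar{\sigma} N \ll 1),

    where I/I0 is the detected transmission; the small-absorption condition is what allows a single spectrum-averaged cross section to replace the full energy-resolved problem.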

  2. Study on the quantitative relationship between Agricultural water and fertilization process and non-point source pollution based on field experiments

    Science.gov (United States)

    Wang, H.; Chen, K.; Wu, Z.; Guan, X.

    2017-12-01

    In recent years, with water environment problems becoming prominent and the governance of point source pollution relatively improving, agricultural non-point source pollution caused by the extensive use of fertilizers and pesticides has increasingly aroused people's concern and attention. In order to reveal the quantitative relationship between agricultural water and fertilizer use and non-point source pollution, on the basis of the elm field experiment and combined with an agricultural drainage-irrigation model, the relationships among agricultural irrigation water, fertilization schemes and non-point source pollution were analyzed and calculated using a field emission intensity index. The results show that the displacement varies greatly under different irrigation conditions. When the irrigation water increased from 22 cm to 42 cm, the irrigation water increased by 20 cm while the field displacement increased by 11.92 cm, about 66.22% of the added irrigation water. When the irrigation water then increased from 42 cm to 68 cm, the irrigation water increased by 26 cm and the field displacement increased by 22.48 cm, accounting for 86.46% of the added irrigation water. So there is an "inflection point" between the irrigation water amount and the field displacement amount. The load intensity increases with the increase of irrigation water and shows a significant power correlation. Under different irrigation conditions, the increase in load intensity with increasing irrigation water differs: when the irrigation water is smaller, the load intensity increases relatively little, and when the irrigation water increases to about 42 cm, the load intensity increases considerably. In addition, there was a positive correlation between fertilization and load intensity. The load intensity differed markedly among fertilization modes even at the same fertilization level, in which the fertilizer field unit load intensity
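
    The "significant power correlation" reported between irrigation water and load intensity is the kind of relationship usually fitted as L = a·W^b on log-log axes. A minimal sketch with entirely hypothetical data points (the abstract does not give the raw data):

      import numpy as np

      # Hypothetical (irrigation depth [cm], load intensity) pairs.
      W = np.array([22.0, 32.0, 42.0, 55.0, 68.0])
      L = np.array([0.8, 1.6, 3.1, 6.0, 10.2])

      # Fit L = a * W**b by linear regression in log-log space.
      b, log_a = np.polyfit(np.log(W), np.log(L), 1)
      print(f"L ~ {np.exp(log_a):.3g} * W^{b:.2f}")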

  3. The occurrence and removal of algae (including cyanobacteria) and their related organic compounds from source water in Vaalkop Dam with conventional and advanced drinking water treatment processes

    OpenAIRE

    Swanepoel, A; Du Preez, HH; Cloete, N

    2017-01-01

    Cyanobacterial bloom formation in freshwaters, such as rivers, lakes and dams, is known to occur throughout the world. The Vaalkop Dam, which serves as the source for the Vaalkop drinking water treatment works (DWTW), is no exception. Blooms of cyanobacteria occur annually in Vaalkop Dam as well as in the dams from which Vaalkop is replenished during low-rainfall periods. These blooms during the summer months are associated with the production of cyanotoxins and taste and odour compounds such as geosm...

  4. Improvement of gamma-ray Sn transport calculations including coherent and incoherent scatterings and secondary sources of bremsstrahlung and fluorescence: Determination of gamma-ray buildup factors

    International Nuclear Information System (INIS)

    Kitsos, S.; Diop, C.M.; Assad, A.; Nimal, J.C.; Ridoux, P.

    1996-01-01

    Improvements to gamma-ray transport calculations in Sn codes aim at taking into account the bound-electron effect in Compton (incoherent) scattering, coherent (Rayleigh) scattering, and secondary sources of bremsstrahlung and fluorescence. A computation scheme was developed to take these phenomena into account by modifying the angular and energy transfer matrices; no modification of the transport code itself was needed. The incoherent and coherent scatterings as well as the fluorescence sources can be treated exactly by the transfer matrix change. For bremsstrahlung sources, this is possible if one can neglect the path of the charged particles (electrons and positrons) as they pass through matter, which is applicable in the energy range of interest here (below 10 MeV). These improvements have been carried over to point-kernel attenuation codes through the calculation of new buildup factors. The gamma-ray buildup factors have been computed for 25 natural elements up to 30 mean free paths in the energy range between 15 keV and 10 MeV
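
    For reference, the buildup factor B produced by such calculations enters point-kernel attenuation in the standard way; in LaTeX notation, a hedged restatement of the conventional definition (not quoted from the paper):

      \Phi(r) = B(E, \mu r)\, \frac{S\, e^{-\mu r}}{4 \pi r^{2}},
      \qquad
      B = \frac{\text{total response (collided + uncollided)}}{\text{uncollided response}} \ge 1,

    where \mu is the linear attenuation coefficient at the source energy E, r is the source-detector distance and S is the point-source strength; the Sn-derived buildup factors tabulated in the paper play the role of B.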

  5. Quantitative assessment of left ventricular function with dual-source CT in comparison to cardiac magnetic resonance imaging: initial findings

    Energy Technology Data Exchange (ETDEWEB)

    Busch, S.; Johnson, T.R.C.; Wintersperger, B.J.; Minaifar, N.; Bhargava, A.; Rist, C.; Reiser, M.F.; Becker, C.; Nikolaou, K. [University of Munich, Department of Clinical Radiology, Munich (Germany)

    2008-03-15

    Cardiac magnetic resonance imaging and echocardiography are currently regarded as standard modalities for the quantification of left ventricular volumes and ejection fraction. With the recent introduction of dual-source computed tomography (DSCT), the increased temporal resolution of 83 ms should also improve the assessment of cardiac function in CT. The aim of this study was to evaluate the accuracy of DSCT in the assessment of left ventricular functional parameters with cardiac magnetic resonance imaging (MRI) as the standard of reference. Fifteen patients (two female, 13 male; mean age 50.8 {+-} 19.2 years) underwent CT and MRI examinations on a DSCT (Somatom Definition; Siemens Medical Solutions, Forchheim, Germany) and a 3.0-Tesla MR scanner (Magnetom Trio; Siemens Medical Solutions), respectively. Multiphase axial CT images were analysed with a semiautomatic region-growing algorithm (Syngo Circulation; Siemens Medical Solutions) by two independent blinded observers. In MRI, dynamic cine loops of short-axis slices were evaluated with semiautomatic contour detection software (ARGUS; Siemens Medical Solutions) independently by two readers. End-systolic volume (ESV), end-diastolic volume (EDV), ejection fraction (EF) and stroke volume (SV) were determined for both modalities, and correlation coefficient, systematic error, limits of agreement and inter-observer variability were assessed. In DSCT, EDV and ESV were 135.8 {+-} 41.9 ml and 54.9 {+-} 29.6 ml, respectively, compared with 132.1 {+-} 40.8 ml EDV and 57.6 {+-} 27.3 ml ESV in MRI. Thus, EDV was overestimated by 3.7 ml (limits of agreement -46.1/+53.6), while ESV was underestimated by 2.6 ml (-36.6/+31.4). Mean EF was 61.6 {+-} 12.4% in DSCT and 57.9 {+-} 9.0% in MRI, resulting in an overestimation of EF by 3.8% with limits of agreement at -14.7 and +22.2%. Rank correlation rho values were 0.81 for EDV (P = 0.0024), 0.79 for ESV (P = 0.0031) and 0.64 for EF (P = 0.0168). The kappa value of inter

  6. Quantitative assessment of left ventricular function with dual-source CT in comparison to cardiac magnetic resonance imaging: initial findings

    International Nuclear Information System (INIS)

    Busch, S.; Johnson, T.R.C.; Wintersperger, B.J.; Minaifar, N.; Bhargava, A.; Rist, C.; Reiser, M.F.; Becker, C.; Nikolaou, K.

    2008-01-01

    Cardiac magnetic resonance imaging and echocardiography are currently regarded as standard modalities for the quantification of left ventricular volumes and ejection fraction. With the recent introduction of dual-source computed tomography (DSCT), the increased temporal resolution of 83 ms should also improve the assessment of cardiac function in CT. The aim of this study was to evaluate the accuracy of DSCT in the assessment of left ventricular functional parameters with cardiac magnetic resonance imaging (MRI) as the standard of reference. Fifteen patients (two female, 13 male; mean age 50.8 ± 19.2 years) underwent CT and MRI examinations on a DSCT (Somatom Definition; Siemens Medical Solutions, Forchheim, Germany) and a 3.0-Tesla MR scanner (Magnetom Trio; Siemens Medical Solutions), respectively. Multiphase axial CT images were analysed with a semiautomatic region-growing algorithm (Syngo Circulation; Siemens Medical Solutions) by two independent blinded observers. In MRI, dynamic cine loops of short-axis slices were evaluated with semiautomatic contour detection software (ARGUS; Siemens Medical Solutions) independently by two readers. End-systolic volume (ESV), end-diastolic volume (EDV), ejection fraction (EF) and stroke volume (SV) were determined for both modalities, and correlation coefficient, systematic error, limits of agreement and inter-observer variability were assessed. In DSCT, EDV and ESV were 135.8 ± 41.9 ml and 54.9 ± 29.6 ml, respectively, compared with 132.1 ± 40.8 ml EDV and 57.6 ± 27.3 ml ESV in MRI. Thus, EDV was overestimated by 3.7 ml (limits of agreement -46.1/+53.6), while ESV was underestimated by 2.6 ml (-36.6/+31.4). Mean EF was 61.6 ± 12.4% in DSCT and 57.9 ± 9.0% in MRI, resulting in an overestimation of EF by 3.8% with limits of agreement at -14.7 and +22.2%. Rank correlation rho values were 0.81 for EDV (P = 0.0024), 0.79 for ESV (P = 0.0031) and 0.64 for EF (P = 0.0168). The kappa value of inter-observer variability were
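
    The bias and "limits of agreement" quoted in both versions of this record are the standard Bland-Altman quantities. A minimal sketch of how they are computed from paired measurements (the arrays are illustrative, not the study data):

      import numpy as np

      def bland_altman(a, b):
          """Bias and 95% limits of agreement between paired measurements."""
          diff = a - b
          bias = diff.mean()
          half_width = 1.96 * diff.std(ddof=1)
          return bias, bias - half_width, bias + half_width

      # Hypothetical paired end-diastolic volumes (ml), DSCT vs. MRI.
      edv_ct = np.array([120.0, 150.0, 98.0, 176.0, 133.0])
      edv_mr = np.array([115.0, 148.0, 104.0, 170.0, 130.0])
      print(bland_altman(edv_ct, edv_mr))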

  7. Wavelet-Transform-Based Power Management of Hybrid Vehicles with Multiple On-board Energy Sources Including Fuel Cell, Battery and Ultracapacitor

    Science.gov (United States)

    2008-09-12

    considered to be promising for application as distributed generation sources due to high efficiency and compactness [1-2], [21-24]. The PEMFC is ... also a primary candidate for environment-friendly vehicles. [The PEMFC model nomenclature and governing equations, including equation (10), are garbled in this extract.] The block diagram of the PEMFC model based on the above equations is shown in Fig

  8. Effects of Different Sources of Nutrition on Quantitative and Qualitative Characteristics of Lycopersicon esculentum under Ecological Cropping System

    Directory of Open Access Journals (Sweden)

    M.B. Amiri

    2016-02-01

    Full Text Available Introduction: Increasing usage of chemical fertilizers imposes irreparable damage on the environment. The disadvantages of chemical fertilizers have led to more attention to the application of organic fertilizers and manures. The use of organic and livestock fertilizers, especially in nutrient-poor soils, is necessary to maintain soil quality. Plant growth promoting rhizobacteria (PGPR) occupy the rhizosphere of many plant species and have beneficial effects on the host plant. They may influence the plant in a direct or indirect manner. A direct mechanism is to increase plant growth by supplying the plant with nutrients and hormones. Indirect mechanisms, on the other hand, include reduced susceptibility to diseases and the activation of a form of defense referred to as induced systemic resistance. Examples of bacteria which have been found to enhance plant growth include Pseudomonas, Enterobacter and Arthrobacter. Biofertilizers contain organic compounds that increase soil fertility either directly or as a result of their decay (9, 10). Tomato (Lycopersicon esculentum L.) belongs to the nightshade family, Solanaceae. The plant typically grows 1-3 meters in height and has a weak stem. It is a perennial in its native habitat, although often grown outdoors in temperate climates as an annual. An average common tomato weighs approximately 100 grams. Tomatoes contain the carotene lycopene, one of the most powerful natural antioxidants. In some studies, lycopene, especially in cooked tomatoes, has been found to help prevent prostate cancer. Lycopene has also been shown to improve the skin's ability to protect against harmful UV rays. Tomatoes might help in managing human neurodegenerative diseases. Lycopene has no effect on the risk of developing diabetes, but may help relieve the oxidative stress of people who already have diabetes. The purpose of this study was the possibility of replacing chemical fertilizers with biofertilizers, reducing production

  9. Cellular Phone Towers, Cell tower locations as derived from various sources including the Department of Licenses and Inspections and the Department of Planning and Zoning., Published in 2010, 1:2400 (1in=200ft) scale, Howard County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Cellular Phone Towers dataset current as of 2010. Cell tower locations as derived from various sources including the Department of Licenses and Inspections and the...

  10. Modular design of processing and storage facilities for small volumes of low and intermediate level radioactive waste including disused sealed sources

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-06-15

    A number of IAEA Member States generate relatively small quantities of radioactive waste and/or disused sealed sources in research or in the application of nuclear techniques in medicine and industry. This publication presents a modular approach to the design of waste processing and storage facilities to address the needs of such Member States with a cost-effective and flexible solution that allows easy adjustment to changing needs in terms of capacity and variety of waste streams. The key feature of the publication is the provision of practical guidance to enable users to determine their waste processing and storage requirements, to specify those requirements to allow the procurement of the appropriate processing and storage modules, and to install and eventually operate those modules.

  11. An Online Q-learning Based Multi-Agent LFC for a Multi-Area Multi-Source Power System Including Distributed Energy Resources

    Directory of Open Access Journals (Sweden)

    H. Shayeghi

    2017-12-01

    Full Text Available This paper presents an online two-stage Q-learning based multi-agent (MA) controller for load frequency control (LFC) in an interconnected multi-area multi-source power system integrated with distributed energy resources (DERs). The proposed control strategy consists of two stages. The first stage employs a PID controller whose parameters are designed using the sine cosine optimization (SCO) algorithm and are fixed. The second is a reinforcement learning (RL) based supplementary controller that has a flexible structure and improves the output of the first stage adaptively based on the system's dynamic behavior. Because this strategy integrates the RL paradigm with a PID controller, it is called the RL-PID controller. The primary motivation for the integration of the RL technique with the PID controller is to make the existing local controllers in industry compatible, reducing control effort and system costs. This novel control strategy combines the advantages of the PID controller with the adaptive behavior of MA to achieve the desired level of robust performance under different kinds of uncertainties caused by the stochastic power generation of DERs, plant operating condition changes, and physical nonlinearities of the system. The suggested decentralized controller is composed of autonomous intelligent agents, which learn the optimal control policy from interaction with the system. These agents update their knowledge about the system dynamics continuously to achieve good frequency oscillation damping under various severe disturbances without any prior knowledge of them. This leads to an adaptive control structure to solve the LFC problem in the multi-source power system with stochastic DERs. The performance of the RL-PID controller in comparison to traditional PID and fuzzy-PID controllers is verified in a multi-area power system integrated with DERs through several performance indices.
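
    The supplementary stage described here rests on the standard tabular Q-learning update. A minimal, hypothetical sketch of that update for a discretized frequency-deviation state and a discrete set of supplementary control actions follows; the discretization, reward and hyperparameters are illustrative choices, not the paper's design.

      import numpy as np

      n_states, n_actions = 21, 5          # discretized freq. deviation x actions
      alpha, gamma, eps = 0.1, 0.95, 0.1   # learning rate, discount, exploration
      Q = np.zeros((n_states, n_actions))
      rng = np.random.default_rng(0)

      def choose_action(s):
          """Epsilon-greedy policy over the current Q-table."""
          if rng.random() < eps:
              return int(rng.integers(n_actions))
          return int(np.argmax(Q[s]))

      def q_update(s, a, reward, s_next):
          """One Q-learning step: move Q(s,a) toward the TD target."""
          td_target = reward + gamma * np.max(Q[s_next])
          Q[s, a] += alpha * (td_target - Q[s, a])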

  12. The choice of primary energy source including PV installation for providing electric energy to a public utility building - a case study

    Science.gov (United States)

    Radomski, Bartosz; Ćwiek, Barbara; Mróz, Tomasz M.

    2017-11-01

    The paper presents a multicriteria decision aid analysis of the choice of a PV installation providing electric energy to a public utility building. From the energy management point of view, electricity obtained from solar radiation has become a crucial renewable energy source. Application of PV installations may prove a profitable solution from the energy, economic and ecological points of view for both existing and newly erected buildings. The featured variants of PV installations have been assessed by multicriteria analysis based on the ANP (Analytic Network Process) method. Technical, economic, energy and environmental criteria have been identified as the main decision criteria. The defined set of decision criteria has an open character and can be modified in the dialogue between the decision-maker and the expert - in the present case, an expert in planning the development of energy supply systems. The proposed approach has been used to evaluate three variants of PV installation acceptable for an existing educational building located in Poznań, Poland - the building of the Faculty of Chemical Technology, Poznań University of Technology. Multi-criteria analysis based on the ANP method and the Super Decisions calculation software has proven to be an effective tool for energy planning, leading to the indication of the recommended variant of PV installation in existing and newly erected public buildings. The achieved results show the prospects and possibilities of rational renewable energy usage as a complex solution for public utility buildings.
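
    ANP, like AHP, derives criterion weights as the principal eigenvector of a pairwise-comparison matrix (the full method then assembles these into a supermatrix of interdependencies, which is omitted here). A minimal sketch of the weighting step with hypothetical comparison values on Saaty's 1-9 scale:

      import numpy as np

      # Hypothetical pairwise comparisons of four criteria:
      # technical, economic, energy, environmental.
      A = np.array([
          [1.0, 1/3, 2.0, 1.0],
          [3.0, 1.0, 4.0, 2.0],
          [1/2, 1/4, 1.0, 1/2],
          [1.0, 1/2, 2.0, 1.0],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      w = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
      print(np.round(w / w.sum(), 3))   # criterion weights summing to 1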

  13. Health effects of an increased protein intake on kidney function and colorectal cancer risk factors, including the role of animal and plant protein sources – the PREVIEW project

    DEFF Research Database (Denmark)

    Møller, Grith

    intake, including the role of animal and plant protein in pre-diabetic, overweight or obese individuals, on health outcomes: markers of kidney function and putative risk factors for colorectal cancer, as well as insulin sensitivity and kidney function in healthy individuals. The thesis is based on PREVIEW..., especially plant protein, on insulin sensitivity and kidney function. In paper II, the aim of the study was to assess the effect, after one year, of a higher protein intake on kidney function, measured by creatinine clearance. This was investigated in pre-diabetic older adults based on a sub-group of 310... pre-diabetic individuals included in the PREVIEW RCT. We found that a higher protein intake was associated with a significant increase in urea to creatinine ratio and serum urea after one year. There were no associations between increased protein intake and creatinine clearance, estimated glomerular

  14. In vivo quantitative imaging of point-like bioluminescent and fluorescent sources: Validation studies in phantoms and small animals post mortem

    Science.gov (United States)

    Comsa, Daria Craita

    2008-10-01

    There is a real need for improved small animal imaging techniques to enhance the development of therapies in which animal models of disease are used. Optical methods for imaging have been extensively studied in recent years, due to their high sensitivity and specificity. Methods like bioluminescence and fluorescence tomography report promising results for 3D reconstructions of source distributions in vivo. However, no standard methodology exists for optical tomography, and various groups are pursuing different approaches. In a number of studies on small animals, the bioluminescent or fluorescent sources can be reasonably approximated as point or line sources. Examples include images of bone metastases confined to the bone marrow. Starting with this premise, we propose a simpler, faster, and inexpensive technique to quantify optical images of point-like sources. The technique avoids the computational burden of a tomographic method by using planar images and a mathematical model based on diffusion theory. The model employs in situ optical properties estimated from video reflectometry measurements. Modeled and measured images are compared iteratively using a Levenberg-Marquardt algorithm to improve estimates of the depth and strength of the bioluminescent or fluorescent inclusion. The performance of the technique to quantify bioluminescence images was first evaluated on Monte Carlo simulated data. Simulated data also facilitated a methodical investigation of the effect of errors in tissue optical properties on the retrieved source depth and strength. It was found that, for example, an error of 4 % in the effective attenuation coefficient led to 4 % error in the retrieved depth for source depths of up to 12 mm, while the error in the retrieved source strength increased from 5.5 % at 2 mm depth, to 18 % at 12 mm depth. Experiments conducted on images from homogeneous tissue-simulating phantoms showed that depths up to 10 mm could be estimated within 8 %, and the relative
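
    In the diffusion approximation, the surface signal above an isotropic point source in a semi-infinite homogeneous medium falls off with radial distance in a way that depends on the source depth, which is what makes depth and strength recoverable from a planar image by nonlinear least squares. The sketch below uses a deliberately simplified single-term kernel and assumed optical properties; it illustrates the fitting idea only and does not reproduce the study's full model or calibration.

      import numpy as np
      from scipy.optimize import curve_fit

      MU_EFF = 0.23   # effective attenuation coefficient [1/mm], assumed known
      D = 0.33        # diffusion coefficient [mm], assumed known

      def surface_signal(rho, strength, depth):
          """Simplified diffusion kernel: signal at radial distance rho on the
          surface from a point source of given strength and depth."""
          r = np.sqrt(rho**2 + depth**2)
          return strength * np.exp(-MU_EFF * r) / (4.0 * np.pi * D * r)

      # Synthetic 'measured' profile for a source 6 mm deep, with 3% noise.
      rho = np.linspace(0.0, 15.0, 60)
      data = surface_signal(rho, 1000.0, 6.0) * (1 + 0.03 * np.random.randn(rho.size))

      # Levenberg-Marquardt fit (curve_fit's default for unbounded problems).
      popt, _ = curve_fit(surface_signal, rho, data, p0=[500.0, 3.0])
      print("fitted strength, depth:", popt)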

  15. Quantitative analysis of polyethylene glycol (PEG) and PEGylated proteins in animal tissues by LC-MS/MS coupled with in-source CID.

    Science.gov (United States)

    Gong, Jiachang; Gu, Xiaomei; Achanzar, William E; Chadwick, Kristina D; Gan, Jinping; Brock, Barry J; Kishnani, Narendra S; Humphreys, W Griff; Iyer, Ramaswamy A

    2014-08-05

    The covalent conjugation of polyethylene glycol (PEG, typical MW > 10k) to therapeutic peptides and proteins is a well-established approach to improve their pharmacokinetic properties and diminish the potential for immunogenicity. Even though PEG is generally considered biologically inert and safe in animals and humans, the slow clearance of large PEGs raises concerns about potential adverse effects resulting from PEG accumulation in tissues following chronic administration, particularly in the central nervous system. The key information relevant to the issue is the disposition and fate of the PEG moiety after repeated dosing with PEGylated proteins. Here, we report a novel quantitative method utilizing LC-MS/MS coupled with in-source CID that is highly selective and sensitive to PEG-related materials. Both (40K)PEG and a tool PEGylated protein (ATI-1072) underwent dissociation in the ionization source of the mass spectrometer to generate a series of PEG-specific ions, which were subjected to further dissociation through conventional CID. To demonstrate the potential application of the method to assess PEG biodistribution following PEGylated protein administration, a single-dose study of ATI-1072 was conducted in rats. Plasma and various tissues were collected, and the concentrations of both (40K)PEG and ATI-1072 were determined using the LC-MS/MS method. The presence of (40K)PEG in plasma and tissue homogenates suggests the degradation of PEGylated proteins after dose administration to rats, given that free PEG was absent in the dosing solution. The method enables further studies for a thorough characterization of the disposition and fate of PEGylated proteins.

  16. Quantitative parameters to compare image quality of non-invasive coronary angiography with 16-slice, 64-slice and dual-source computed tomography

    International Nuclear Information System (INIS)

    Burgstahler, Christof; Reimann, Anja; Brodoefel, Harald; Tsiflikas, Ilias; Thomas, Christoph; Heuschmid, Martin; Daferner, Ulrike; Drosch, Tanja; Schroeder, Stephen; Herberts, Tina

    2009-01-01

    Multi-slice computed tomography (MSCT) is a non-invasive modality for visualizing the coronary arteries with overall good image quality. The improved spatial and temporal resolution of 64-slice and dual-source computed tomography (DSCT) scanners is expected to have a positive impact on diagnostic accuracy and image quality. However, quantitative parameters to compare the image quality of 16-slice, 64-slice MSCT and DSCT have been missing. A total of 256 CT examinations were evaluated (Siemens Sensation 16: n=90; Siemens Sensation 64: n=91; Siemens Definition: n=75). Mean Hounsfield units (HU) were measured in the cavum of the left ventricle (LV), the ascending aorta (Ao), the left ventricular myocardium (My) and the proximal parts of the left main (LM), the left anterior descending artery (LAD), the right coronary artery (RCA) and the circumflex artery (CX). Moreover, the ratio of intraluminal attenuation (HU) to myocardial attenuation was assessed for all coronary arteries. Clinical data [body mass index (BMI), gender, heart rate] were available for all patients. Mean attenuation (CA) of the coronary arteries was significantly higher for DSCT than for 64- and 16-slice MSCT within the RCA [347±13 vs. 254±14 (64-MSCT) vs. 233±11 (16-MSCT) HU], LM (362±11/275±12/262±9), LAD (332±17/248±19/219±14) and LCX (310±12/210±13/221±10, all p<0.05), whereas there was no significant difference between DSCT and 64-MSCT for the LV, the Ao and My. Heart rate had a significant impact on the CA ratio in 16-slice and 64-slice CT only (p<0.05). Only in DSCT did BMI have no impact on the CA ratio (p<0.001). The improved spatial and temporal resolution of dual-source CT is associated with better opacification of the coronary arteries and better contrast with the myocardium, independent of heart rate. In comparison to MSCT, opacification of the coronary arteries at DSCT is not affected by BMI. The main advantage of DSCT lies in its heart rate independency, which might have a

  17. Caffeine Intake from Food and Beverage Sources and Trends among Children and Adolescents in the United States: Review of National Quantitative Studies from 1999 to 2011

    Science.gov (United States)

    Ahluwalia, Namanjeet; Herrick, Kirsten

    2015-01-01

    There is increasing concern about potential adverse effects of caffeine in children. Our understanding of caffeine intake relies on studies dating to the late 1990s. This article synthesizes information from national studies since then to describe caffeine consumption, its association with sociodemographic factors, key dietary sources including caffeine-containing energy drinks (CCEDs), and trends in caffeine intake and sources among US children. Findings from the Kanter Worldpanel (KWP) Beverage Consumption Panel and the NHANES showed that caffeine consumption prevalence was generally consistent across studies and over time; more than one-half of 2- to 5-y-olds and ∼75% of older children (>5 y) consumed caffeine. The usual intakes of caffeine were 25 and 50 mg/d for children and adolescents aged 2–11 and 12–17 y, respectively (NHANES 2007–2010). Caffeine consumption correlated with age and was higher in non-Hispanic white children. The key sources of caffeine were soda and tea as well as flavored dairy (for children aged <12 y). A decline in caffeine intake was noted in children overall during the 10- to 12-y period examined; intakes remained stable among older children (≥12 y). A significant increasing trend in CCED and coffee consumption and a decline in soda intake were noted (1999–2010). In 2009–2010, 10% of 12- to 19-y-olds and 10–25% of caffeine consumers (aged 12–19 y) had intakes exceeding Canadian maximal guidelines. Continued monitoring can help better understand changes in caffeine consumption patterns of youth. PMID:25593149

  18. Quantitative analysis of pulmonary artery and pulmonary collaterals in preoperative patients with pulmonary artery atresia using dual-source computed tomography

    International Nuclear Information System (INIS)

    Yin Lei; Lu, Bin; Han Lei; Wu Runze; Johnson, Laura; Xu Zhongying; Jiang Shiliang; Dai Ruping

    2011-01-01

    Objective: To evaluate the value of dual-source computed tomography (DSCT) in quantitatively measuring pulmonary arteries and major aortopulmonary collateral vessels in comparison with conventional angiography (CA) in preoperative patients with pulmonary artery atresia and ventricular septal defect (PAA-VSD). Materials and methods: Twenty PAA-VSD patients who had complete imaging data from DSCT, CA and echocardiography (ECHO) studies were retrospectively analyzed. Using final clinical diagnosis as the standard, the results of DSCT, CA and ECHO on the detection of cardiac malformations, the measurement of diameters of pulmonary arteries and collateral vessels, and the values of the McGoon ratio, pulmonary arterial index (PAI) and total neopulmonary arterial index (TNPAI) were derived and compared. Results: In 20 patients, 51 of 54 (94.4%) cardiac malformations were visualized by DSCT, whereas 42 (77.8%) were visualized by ECHO (p = 0.027). Fourteen cases with aortopulmonary collateral vessels were all (100%) detected by DSCT, whereas 5 cases (35.7%) were detected by ECHO (p = 0.001) and 13 cases (92.9%) by CA (p = 0.995). Sixteen cases with confluence of the native pulmonary arteries were diagnosed by DSCT, whereas 10 cases were diagnosed by CA (p = 0.024). Measurements of the diameters of the pulmonary arteries, collateral vessels, and descending aorta at the level of the diaphragm correlated well between DSCT and CA (r = 0.95-0.99). The McGoon ratio (DSCT = 1.18 ± 0.60, CA = 1.23 ± 0.64), PAI (DSCT = 130.96 ± 99.38 mm²/m², CA = 140.91 ± 107.87 mm²/m²) and TNPAI (DSCT = 160.31 ± 125.62 mm²/m², CA = 169.14 ± 122.81 mm²/m²) were calculated respectively, without significant differences between DSCT and CA by paired t-tests (all p > 0.05). Conclusion: DSCT was efficient for evaluating and measuring the native pulmonary artery and aortopulmonary collateral vessels prior to surgical procedures in PAA-VSD patients. Combined with echocardiography, DSCT showed potential to replace CA for evaluating pulmonary artery
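
    The indices compared here have standard definitions: the McGoon ratio sums the diameters of the two branch pulmonary arteries and normalizes by the descending aorta at the diaphragm, while the pulmonary arterial (Nakata) index normalizes the combined branch cross-sectional area by body surface area. A small sketch under those standard definitions, with illustrative values only:

      import math

      def mcgoon_ratio(d_rpa_mm, d_lpa_mm, d_dao_mm):
          """(RPA diameter + LPA diameter) / descending aorta diameter."""
          return (d_rpa_mm + d_lpa_mm) / d_dao_mm

      def pai_mm2_per_m2(d_rpa_mm, d_lpa_mm, bsa_m2):
          """Nakata index: combined RPA + LPA cross-sectional area / BSA."""
          area = math.pi * ((d_rpa_mm / 2) ** 2 + (d_lpa_mm / 2) ** 2)
          return area / bsa_m2

      print(mcgoon_ratio(8.0, 7.5, 13.0))    # ~1.19, same scale as the paper
      print(pai_mm2_per_m2(8.0, 7.5, 0.8))   # ~118 mm2/m2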

  19. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiency, quantitative measurements of reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used for the calculation of an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross-correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2 D%) and patients with lymphedema (2.05 ± 2.5 D%) was highly significant using praefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)

  20. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department have begun

  1. Canine toys and training devices as sources of exposure to phthalates and bisphenol A: quantitation of chemicals in leachate and in vitro screening for endocrine activity.

    Science.gov (United States)

    Wooten, Kimberly J; Smith, Philip N

    2013-11-01

    Chewing and mouthing behaviors exhibited by pet dogs are likely to lead to oral exposures to a variety of environmental chemicals. Products intended for chewing and mouthing uses include toys and training devices that are often made of plastics. The goal of the current study was to determine whether a subset of phthalates and bisphenol A (BPA), endocrine disrupting chemicals commonly found in plastics, leach out of dog toys and training devices (bumpers) into synthetic canine saliva. In vitro assays were used to screen leachates for endocrine activity. Bumper leachates were dominated by di-2-ethylhexyl phthalate (DEHP) and BPA, with concentrations reaching the low μg mL-1 range following short immersions in synthetic saliva. Simulated chewing of bumpers during immersion in synthetic saliva increased concentrations of phthalates and BPA compared with new bumpers, while outdoor storage had variable effects on concentrations (increased DEHP; decreased BPA). Toys leached substantially lower concentrations of phthalates and BPA, with the exception of one toy which leached considerable amounts of diethyl phthalate. In vitro assays indicated anti-androgenic activity of bumper leachates, and estrogenic activity of both bumper and toy leachates. These results confirm that toys and training devices are potential sources of exposure to endocrine disrupting chemicals in pet dogs. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Assessing the Impact of Security Behavior on the Awareness of Open-Source Intelligence: A Quantitative Study of IT Knowledge Workers

    Science.gov (United States)

    Daniels, Daniel B., III

    2014-01-01

    There is a lack of literature linking end-user behavior to the availability of open-source intelligence (OSINT). Most OSINT literature has been focused on the use and assessment of open-source intelligence, not the proliferation of personally or organizationally identifiable information (PII/OII). Additionally, information security studies have…

  3. Quantitative Thermochronology

    Science.gov (United States)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach with a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate it to Earth processes. Essential background material aids understanding and use of thermochronological data. The book provides a thorough treatise on numerical modeling of heat transport in the Earth's crust, and is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.

  4. Application of a compact diode pumped solid-state laser source for quantitative laser-induced breakdown spectroscopy analysis of steel

    Science.gov (United States)

    Tortschanoff, Andreas; Baumgart, Marcus; Kroupa, Gerhard

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) technology holds the potential for onsite real-time measurements of steel products. However, for a mobile and robust LIBS measurement system, an adequately small and ruggedized laser source is a key requirement. In this contribution, we present tests with our compact high-power laser source, which was initially developed for ignition applications. The CTR HiPoLas® laser is a robust diode-pumped solid-state laser with a passive Q-switch and dimensions of less than 10 cm3. The laser generates 2.5-ns pulses with 30 mJ at a maximum continuous repetition rate of about 30 Hz. The feasibility of LIBS experiments with the laser source was verified experimentally with steel samples. The results show that the laser with its current optical output parameters is very well suited for LIBS measurements. We believe that the miniaturized laser presented here will enable very compact and robust portable high-performance LIBS systems.

  5. Health risk assessment of polycyclic aromatic hydrocarbons in the source water and drinking water of China: Quantitative analysis based on published monitoring data.

    Science.gov (United States)

    Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei

    2011-12-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in the source water and drinking water of China was conducted using probabilistic techniques from a national perspective. Published monitoring data for PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify the uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, the carcinogenic risks of PAHs in the source water and drinking water of China were mostly acceptable. However, specific regions, such as the Yellow River at the Lanzhou reach and the Qiantang River, should be paid more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in the source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
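
    The probabilistic assessment described here follows the usual pattern: draw exposure parameters from distributions, compute an incremental lifetime cancer risk (ILCR) per draw, and read off the fraction of draws above 1.00E-05. A minimal sketch for the drinking-water ingestion pathway; the distributions and parameter values are illustrative assumptions, not the paper's fitted inputs, apart from the widely used oral slope factor for BaP.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      c_bap = rng.lognormal(np.log(5e-6), 0.8, n)   # BaP-eq conc., mg/L (assumed)
      ir = rng.normal(2.0, 0.4, n).clip(0.5)        # ingestion rate, L/day
      bw = rng.normal(60.0, 10.0, n).clip(30.0)     # body weight, kg
      ef, ed, at = 365.0, 30.0, 70.0 * 365.0        # days/yr, years, averaging days
      sf = 7.3                                      # BaP oral slope factor, (mg/kg-day)^-1

      ilcr = c_bap * ir * ef * ed * sf / (bw * at)
      print("P(ILCR > 1e-5) =", (ilcr > 1e-5).mean())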

  6. Multi-element sewer slime impact pattern--a quantitative characteristic enabling identification of the source of heavy metal discharges into sewer systems.

    Science.gov (United States)

    Kintrup, J; Wünsch, G

    2001-11-01

    The capability of sewer slime to accumulate heavy metals from municipal wastewater can be exploited to identify the sources of sewage sludge pollution. Former investigations of sewer slime looked for a few elements only and could, therefore, not account for deviations in the enrichment efficiency of the slime or for irregularities from sampling. Results of ICP-MS multi-element determinations were analyzed by multivariate statistical methods. A new dimensionless characteristic, the "sewer slime impact", is proposed, which is zero for unloaded samples. Patterns expressed in this data format specifically extract the information required to identify the type of pollution and the polluter more quickly and with less effort and cost than hitherto.

  7. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  8. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.

  9. Multi-element sewer slime impact pattern - a quantitative characteristic enabling identification of the source of heavy metal discharges into sewer systems

    Energy Technology Data Exchange (ETDEWEB)

    Kintrup, J. [Chemistry Lab., Wessling GmbH, Hannover (Germany); Wuensch, G. [Lehrgebiet Analytische Chemie, Univ. Hannover (Germany)

    2001-11-01

    The capability of sewer slime to accumulate heavy metals from municipal wastewater can be exploited to identify the sources of sewage sludge pollution. Former investigations of sewer slime looked for a few elements only and could, therefore, not account for deviations in the enrichment efficiency of the slime or for irregularities from sampling. Results of ICP-MS multi-element determinations were analyzed by multivariate statistical methods. A new dimensionless characteristic, the "sewer slime impact", is proposed, which is zero for unloaded samples. Patterns expressed in this data format specifically extract the information required to identify the type of pollution and the polluter more quickly and with less effort and cost than hitherto. (orig.)

  10. Olfaction-based Detection Distance: A Quantitative Analysis of How Far Away Dogs Recognize Tortoise Odor and Follow It to Source

    Directory of Open Access Journals (Sweden)

    Cindee Valentin

    2008-03-01

    Full Text Available The use of detector dogs has been demonstrated to be effective and safe for finding Mojave desert tortoises and provides certain advantages over humans in field surveys. Unlike humans, who rely on visual cues for target identification, dogs use primarily olfactory cues and can therefore locate targets that are not visually obvious. One of the key benefits of surveying with dogs is their efficiency at covering ground and their ability to detect targets from long distances. Dogs may investigate potential targets using visual cues but confirm the presence of a target based on scent. Everything that emits odor does so via vapor-phase molecules, and the components comprising a particular scent are carried primarily through bulk movement of the atmosphere. It is the ability to search for target odor and then go to its source that makes dogs ideal for rapid target recognition in the field setting. Using tortoises as targets, we quantified the distances at which dogs detected tortoise scent, followed it to source, and correctly identified tortoises as targets. Detection distance data were collected during experimental trials with advanced global positioning system (GPS) technology and then analyzed using geographic information system (GIS) modeling techniques. Detection distances ranged from 0.5 m to 62.8 m for tortoises on the surface. We did not observe bias with tortoise size, age class, sex or the degree to which tortoises were handled prior to being found by the dogs. The methodology we developed to quantify olfaction-based detection distance using dogs can be applied to other targets that dogs are trained to find.

  11. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  12. Laser-induced plasmas as an analytical source for quantitative analysis of gaseous and aerosol systems: Fundamentals of plasma-particle interactions

    Science.gov (United States)

    Diwakar, Prasoon K.

    2009-11-01

    Laser-induced Breakdown Spectroscopy (LIBS) is a relatively new analytical diagnostic technique which has gained serious attention in the recent past due to its simplicity, robustness, portability and multi-element analysis capabilities. LIBS has been used successfully for the analysis of elements in different media including solids, liquids and gases. From 1963, when the first breakdown study was reported, to 1983, when the first LIBS experiments were reported, the technique came a long way, but the majority of the fundamental understanding of the processes that occur has been developed in the last few years, which has propelled LIBS toward being a well-established analytical technique. This study, which mostly focuses on LIBS involving aerosols, has been able to unravel some of the mysteries and provide knowledge that will be valuable to the LIBS community as a whole. LIBS processes can be broken down into three basic steps, namely, plasma formation, analyte introduction, and plasma-analyte interactions. In this study, these three steps have been investigated in laser-induced plasma, focusing mainly on the plasma-particle interactions. Understanding plasma-particle interactions and the fundamental processes involved is important in advancing laser-induced breakdown spectroscopy as a reliable and accurate analytical technique. Critical understanding of plasma-particle interactions includes study of the plasma evolution, analyte atomization, and particle dissociation and diffusion. In this dissertation, temporal and spatial studies have been performed to understand the fundamentals of the LIBS processes, including the breakdown of gases by the laser pulse, plasma inception mechanisms, plasma evolution, analyte introduction and plasma-particle interactions and their influence on the LIBS signal. Spectral measurements were performed in a laser-induced plasma and the results reveal localized perturbations in the plasma properties in the vicinity of the analyte species, for

  13. High-resolution measurements of elemental mercury in surface water for an improved quantitative understanding of the Baltic Sea as a source of atmospheric mercury

    Science.gov (United States)

    Kuss, Joachim; Krüger, Siegfried; Ruickoldt, Johann; Wlost, Klaus-Peter

    2018-03-01

    Marginal seas are directly subjected to anthropogenic and natural influences from land in addition to receiving inputs from the atmosphere and open ocean. Together these lead to pronounced gradients and strong dynamic changes. However, in the case of mercury emissions from these seas, estimates often fail to adequately account for the spatial and temporal variability of the elemental mercury concentration in surface water (Hg0wat). In this study, a method to measure Hg0wat at high resolution was devised and subsequently validated. The better-resolved Hg0wat dataset, consisting of about one measurement per nautical mile, yielded insight into the sea's small-scale variability and thus improved the quantification of the sea's Hg0 emission. This is important because global marine Hg0 emissions constitute a major source of atmospheric mercury. Research campaigns in the Baltic Sea were carried out between 2011 and 2015 during which Hg0 both in surface water and in ambient air were measured. For the former, two types of equilibrators were used. A membrane equilibrator enabled continuous equilibration and a bottle equilibrator assured that equilibrium was reached for validation. The measurements were combined with data obtained in the Baltic Sea in 2006 from a bottle equilibrator only. The Hg0 sea-air flux was newly calculated with the combined dataset based on current knowledge of the Hg0 Schmidt number, Henry's law constant, and a widely used gas exchange transfer velocity parameterization. By using a newly developed pump-CTD with increased pumping capability in the Hg0 equilibrator measurements, Hg0wat could also be characterized in deeper water layers. A process study carried out near the Swedish island Øland in August 2015 showed that the upwelling of Hg0-depleted water contributed to Hg0 emissions of the Baltic Sea. However, a delay of a few days after contact between the upwelled water and light was apparently necessary before the biotic and abiotic transformations
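
    The flux computation sketched in this abstract (Schmidt number, Henry's law constant, wind-speed gas-transfer parameterization) has a standard form: F = k (Cw - Ca/H'), with the transfer velocity k scaled by (Sc/660)^(-1/2) from a quadratic wind-speed relation. The sketch below assumes a Wanninkhof-type coefficient and round example values for illustration; the paper's exact parameterization is not specified in the abstract.

      def hg0_flux(c_w, c_a, u10, sc_hg, h_prime, a=0.251):
          """Sea-air Hg0 flux [ng m-2 h-1]: quadratic wind-speed gas-transfer
          velocity (cm/h), Schmidt-number scaled, times the concentration
          difference across the interface (concentrations in ng m-3)."""
          k = a * u10**2 * (sc_hg / 660.0) ** -0.5   # cm/h
          return k * 0.01 * (c_w - c_a / h_prime)    # convert k to m/h

      # Illustrative values: Hg0 in water/air [ng m-3], 7 m/s wind.
      print(hg0_flux(c_w=15.0, c_a=1.5, u10=7.0, sc_hg=500.0, h_prime=0.3))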

  14. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  15. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly

  16. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  17. New seismograph includes filters

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-02

    The new Nimbus ES-1210 multichannel signal enhancement seismograph from EG&G Geometrics has recently been redesigned to include multimode signal filters on each amplifier. The ES-1210F is a shallow-exploration seismograph for near-subsurface exploration such as depth-to-bedrock determination, geological hazard location, mineral exploration, and landslide investigations.

  18. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  19. Saskatchewan resources. [including uranium

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    The production of chemicals and minerals for the chemical industry in Saskatchewan are featured, with some discussion of resource taxation. The commodities mentioned include potash, fatty amines, uranium, heavy oil, sodium sulfate, chlorine, sodium hydroxide, sodium chlorate and bentonite. Following the successful outcome of the Cluff Lake inquiry, the uranium industry is booming. Some developments and production figures for Gulf Minerals, Amok, Cenex and Eldorado are mentioned.

  20. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural...... politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded and included from...... community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men and their newly gained education is politically devalued when compared to the informal education that men gain through mobility, but on the other hand, schooling strengthens...

  1. Alternative Energy Sources

    CERN Document Server

    Michaelides, Efstathios E (Stathis)

    2012-01-01

    Alternative Energy Sources is designed to give the reader a clear view of the role each form of alternative energy may play in supplying the energy needs of human society in the near and intermediate future (20-50 years). The first two chapters, on energy demand and supply and on environmental effects, set the tone as to why the widespread use of alternative energy is essential for the future of human society. The third chapter exposes the reader to the laws of energy conversion processes, as well as the limitations of converting one energy form to another. The sections on exergy give a succinct, quantitative background on the capability/potential of each energy source to produce power on a global scale. The fourth, fifth and sixth chapters are expositions of fission and fusion nuclear energy. The following five chapters (seventh to eleventh) include detailed descriptions of the most common renewable energy sources - wind, solar, geothermal, biomass, hydroelectric - and some of the less common sources...

  2. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

The quantitative imaging of a phase object using 16 keV x rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  3. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  4. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat...

  5. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives---solar in particular---that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative---especially with nonscience students or the general public---is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.
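
    As a worked instance of the "back-of-the-envelope" style advocated here, the Python sketch below checks the roughly 10,000-fold claim. Every input figure (solar constant, albedo, total human power use) is a rough assumption chosen for illustration, not data taken from the talk.

```python
# Back-of-the-envelope check: solar input vs. human energy consumption.
# All figures are order-of-magnitude assumptions, not precise data.
import math

SOLAR_CONSTANT = 1361.0   # W/m^2 at top of atmosphere (approx.)
EARTH_RADIUS = 6.371e6    # m
ALBEDO = 0.30             # fraction of sunlight reflected to space (approx.)
HUMAN_POWER = 18e12       # W, rough global primary energy consumption rate

# Sunlight is intercepted over Earth's cross-sectional disk, pi * R^2.
intercepted = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2
absorbed = intercepted * (1.0 - ALBEDO)

print(f"Solar power absorbed: {absorbed:.2e} W")
print(f"Human consumption:    {HUMAN_POWER:.2e} W")
print(f"Ratio:                {absorbed / HUMAN_POWER:,.0f}x")  # ~10^4
```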

  6. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative and where it misleads, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  7. Sources of polarized neutrons

    International Nuclear Information System (INIS)

    Walter, L.

    1983-01-01

Various sources of polarized neutrons are reviewed. Monoenergetic sources produced with unpolarized or polarized beams, white sources of polarized neutrons, production by transmission through polarized hydrogen targets, and polarized thermal neutrons are discussed, with appropriate applications included. (U.K.)

  8. Pacemaker power sources

    International Nuclear Information System (INIS)

    Greatbatch, W.

    1984-01-01

Energy sources for cardiac pacing, including radioisotope sources, are considered in a broad conceptual and historical framework. The main guidelines for future development of energy sources are assessed.

  9. Elements of a thermic method of preparing beta-sources with fused carriers, including strontium-90

    Energy Technology Data Exchange (ETDEWEB)

    Bogdanov, N I; Zakharova, K P; Zimakov, P V; Kulichenko, V V

    1962-01-15

Sources of ionizing radiation based on the radioisotope Sr{sup 90} are widely used in apparatus and systems of automatic control and regulation of industrial processes. The technology of the preparation of sources is based on dehydration of a mixture of a radioactive solution of strontium nitrate with components such as boric anhydride, silica, and alumina. Thermal treatment of the dehydrated mixture at a high temperature produces a very mobile melt. This cools to a vitreous mass containing the required quantity of the radioisotope Sr{sup 90}. The paper gives data and discusses the results of dehydration of the system SrO - B{sub 2}O{sub 3} - SiO{sub 2} within a temperature range of 100 - 1000{sup o}C and justifies the choice of the main parameters of the technological process. It summarizes a method of mounting a vitreous preparation containing the required quantity of the radioisotope Sr{sup 90} on bases of various shapes and sizes made of steel, ceramic and other materials. The authors discuss the main parameters ensuring that various types of sources shall be reliable and safe in operation, and give data on Sr{sup 90} sources prepared by the thermic method. (author)

  10. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  11. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent time in the analyses, and statistical approaches to allow for the many factors of importance in predicting the probability of tumor control or of normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are tumor volume, histopathologic differentiation and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)
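
    To make the 'direct' (maximum-likelihood) analysis mentioned above concrete, here is a minimal Python sketch that fits a logistic tumor-control-probability curve to binary outcome data. The D50/γ50 parameterization is one common choice and all dose-outcome values are invented; this is a sketch of the idea, not the paper's method.

```python
# Direct (maximum-likelihood) fit of a logistic dose-response curve.
# Doses in Gy; outcome 1 = local tumor control. Data are invented.
import numpy as np
from scipy.optimize import minimize

doses = np.array([50, 50, 55, 55, 60, 60, 65, 65, 70, 70], dtype=float)
control = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1], dtype=float)

def neg_log_likelihood(params):
    d50, gamma50 = params  # dose giving 50% control, normalized slope
    p = 1.0 / (1.0 + np.exp(4.0 * gamma50 * (1.0 - doses / d50)))
    p = np.clip(p, 1e-9, 1.0 - 1e-9)  # guard the log terms
    return -np.sum(control * np.log(p) + (1 - control) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[60.0, 1.5], method="Nelder-Mead")
print("D50 = %.1f Gy, gamma50 = %.2f" % tuple(fit.x))
```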

  12. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  13. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  14. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  15. Allometric trajectories and "stress": a quantitative approach

    Directory of Open Access Journals (Sweden)

    Tommaso Anfodillo

    2016-11-01

Full Text Available The term stress is an important but vague term in plant biology. We show situations in which thinking in terms of stress is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, too little leaf area (e.g. due to herbivory or disease) per unit of active stem mass would be expected to incur low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to stress, without need for recourse to this term. Our approach contrasts with traditional approaches for studying stress, e.g. revealing that small stressed plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as stress, plasticity, adaptation, and acclimation.

  16. Allometric Trajectories and "Stress": A Quantitative Approach.

    Science.gov (United States)

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between sources and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to incur to low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.

  17. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  18. Neutron source

    International Nuclear Information System (INIS)

    Cason, J.L. Jr.; Shaw, C.B.

    1975-01-01

    A neutron source which is particularly useful for neutron radiography consists of a vessel containing a moderating media of relatively low moderating ratio, a flux trap including a moderating media of relatively high moderating ratio at the center of the vessel, a shell of depleted uranium dioxide surrounding the moderating media of relatively high moderating ratio, a plurality of guide tubes each containing a movable source of neutrons surrounding the flux trap, a neutron shield surrounding one part of each guide tube, and at least one collimator extending from the flux trap to the exterior of the neutron source. The shell of depleted uranium dioxide has a window provided with depleted uranium dioxide shutters for each collimator. Reflectors are provided above and below the flux trap and on the guide tubes away from the flux trap

  19. Crowd Sourcing.

    Science.gov (United States)

    Baum, Neil

    2016-01-01

    The Internet has contributed new words and slang to our daily vernacular. A few terms, such as tweeting, texting, sexting, blogging, and googling, have become common in most vocabularies and in many languages, and are now included in the dictionary. A new buzzword making the rounds in industry is crowd sourcing, which involves outsourcing an activity, task, or problem by sending it to people or groups outside a business or a practice. Crowd sourcing allows doctors and practices to tap the wisdom of many instead of relying only on the few members of their close-knit group. This article defines "crowd sourcing," offers examples, and explains how to get started with this approach that can increase your ability to finish a task or solve problems that you don't have the time or expertise to accomplish.

  20. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, acid-base titration (outline and example experiments), chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.

  1. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative...... equational theory whose free algebras correspond to well known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...
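
    For orientation, the flavor of axioms such an indexed equality satisfies can be sketched in LaTeX as below; this is a schematic rendering of quantitative equational logic and may differ in detail from the paper's exact formulation.

```latex
% Schematic core axioms of a quantitative equational theory.
\begin{align*}
&\text{(Refl)}   && \vdash a =_0 a\\
&\text{(Symm)}   && a =_\varepsilon b \vdash b =_\varepsilon a\\
&\text{(Triang)} && a =_\varepsilon b,\; b =_{\varepsilon'} c \vdash a =_{\varepsilon+\varepsilon'} c\\
&\text{(Max)}    && a =_\varepsilon b \vdash a =_{\varepsilon'} b
                    \quad\text{whenever } \varepsilon' \ge \varepsilon
\end{align*}
```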

  2. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms

  3. Quantitative and comparative visualization applied to cosmological simulations

    International Nuclear Information System (INIS)

    Ahrens, James; Heitmann, Katrin; Habib, Salman; Ankeny, Lee; McCormick, Patrick; Inman, Jeff; Armstrong, Ryan; Ma, Kwan-Liu

    2006-01-01

Cosmological simulations follow the formation of nonlinear structure in dark and luminous matter. The associated simulation volumes and dynamic range are very large, making visualization both a necessary and challenging aspect of the analysis of these datasets. Our goal is to understand sources of inconsistency between different simulation codes that are started from the same initial conditions. Quantitative visualization supports the definition and reasoning about analytically defined features of interest. Comparative visualization supports the ability to visually study, side by side, multiple related visualizations of these simulations. For instance, a scientist can visually distinguish that there are fewer halos (localized lumps of tracer particles) in low-density regions for one simulation code out of a collection. This qualitative result will enable the scientist to develop a hypothesis, such as loss of halos in low-density regions due to limited resolution, to explain the inconsistency between the different simulations. Quantitative support then allows one to confirm or reject the hypothesis. If the hypothesis is rejected, this step may lead to new insights and a new hypothesis, not available from the purely qualitative analysis. We will present methods to significantly improve the scientific analysis process by incorporating quantitative analysis as the driver for visualization. Aspects of this work are included as part of two visualization tools, ParaView, an open-source large data visualization tool, and Scout, an analysis-language based, hardware-accelerated visualization tool

  4. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.

    Science.gov (United States)

    Bulczak, David; Lambers, Martin; Kolb, Andreas

    2017-12-22

In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
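
    For context, the central quantity an AMCW ToF simulator must reproduce is the depth recovered from four phase-stepped correlation samples (the standard "four-bucket" reconstruction). The Python sketch below shows that computation; the modulation frequency and sample values are illustrative assumptions, not parameters from the paper.

```python
# Standard four-phase AMCW ToF depth reconstruction.
# a0..a3 are correlation samples at 0, 90, 180, 270 degrees (values invented).
import math

C = 299_792_458.0   # speed of light, m/s
F_MOD = 20e6        # modulation frequency, Hz (typical order for ToF cameras)

def tof_depth(a0, a1, a2, a3):
    phase = math.atan2(a3 - a1, a0 - a2)  # phase shift of the returned signal
    if phase < 0.0:
        phase += 2.0 * math.pi
    return C * phase / (4.0 * math.pi * F_MOD)  # distance in meters

# Unambiguous range at 20 MHz is c / (2 * F_MOD) = 7.5 m.
print(f"depth = {tof_depth(0.9, 1.0, 0.6, 0.5):.2f} m")
```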

  5. Study on the plasma proteins of A-bomb survived patients including those suffered by the remained radioactivities. Report 2. Quantitative observation of the plasma protein fractions by electrophoretic test and to solve the problems for physiological clinical significance of its patterns

    Energy Technology Data Exchange (ETDEWEB)

    Makidono, J; Takanashi, S; Yoshimoto, T; Kai, T; Yoshimoto, K; Matsutani, M; Miura, M

    1963-10-01

The plasma proteins of A-bomb survivors, healthy persons, people handling x-ray equipment long term (for instance, radiologists and x-ray technicians), cancer patients, and tumor-irradiated cancer patients were examined by the electrophoretic test. It was found that the electrophoretic patterns of plasma proteins could be divided into normal (N-pattern) and abnormal (β- and γ-patterns) when classified according to the accents of each fraction. The patterns of healthy persons and long-term x-ray handlers showed the normal (N) pattern; however, abnormal patterns were found in 43% of A-bomb survivors and 48% of cancer patients. Furthermore, the patterns could be changed by radiotherapy of cancer, i.e., from N to β or vice versa. Quantitative observation of individual patterns showed accents of β-globulins in β-patterns and γ-globulins in γ-patterns. The globulins increased in A-bomb survivors and long-term x-ray handlers, and this increase was also seen in cancer patients, 85% of whom were found on clinical examination to be affected with ulcers (self-disintegrated). The physiological and clinical significance of these abnormal patterns (β and γ) in the plasma proteins is that they indicate disorders in the body and carry an important immunological meaning. Abnormal patterns in those affected by the residual radioactivity of the A-bomb reached 70%, an average much higher than that of directly exposed survivors. It is pointed out that in recent days there has been a trend toward a further gradual increase in malignant neoplasms rather than in the disorders of directly exposed A-bomb survivors.

  6. THE CHANDRA SOURCE CATALOG

    International Nuclear Information System (INIS)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Harbo, Peter N.; He Xiangqun; Karovska, Margarita; Kashyap, Vinay L.; Davis, John E.; Houck, John C.; Hall, Diane M.

    2010-01-01

The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  7. The Chandra Source Catalog

    Science.gov (United States)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2010-07-01

The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a
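
    The catalog's stated inclusion rule (flux at least 3 times its 1σ uncertainty in at least one energy band) is easy to express directly. The tiny Python sketch below does so with invented field names and values, purely for illustration of the rule.

```python
# Sketch of the CSC inclusion rule: keep a source whose flux is >= 3x its
# 1-sigma uncertainty in at least one energy band. Records are invented.
sources = [
    {"name": "src1", "flux": [4.1e-14, 9.0e-15], "err": [1.2e-14, 4.0e-15]},
    {"name": "src2", "flux": [2.0e-15, 1.1e-15], "err": [1.5e-15, 9.0e-16]},
]

def significant(src, threshold=3.0):
    return any(f >= threshold * e for f, e in zip(src["flux"], src["err"]))

print([s["name"] for s in sources if significant(s)])  # -> ['src1']
```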

  8. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  9. 2011 NATA - Emissions Sources

    Data.gov (United States)

U.S. Environmental Protection Agency — This dataset includes all emissions sources that were modeled in the 2011 National Air Toxics Assessment (NATA), including point, nonpoint, and mobile sources, and...

  10. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water-ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  11. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  12. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
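
    The abundance computation described above boils down to dividing each protein's observed spectral count by its predicted detectability Oi and renormalizing. A minimal Python sketch of that idea follows; the counts, Oi values and total-molecule scale are invented, and this is a simplification of what the tool itself does.

```python
# APEX-style abundance estimate: correct observed spectral counts by each
# protein's expected spectra per molecule (Oi), then normalize. Data invented.
observed_counts = {"protA": 120, "protB": 45, "protC": 45}
oi = {"protA": 4.0, "protB": 1.5, "protC": 3.0}  # predicted detectability

TOTAL_MOLECULES = 1e6  # assumed total protein molecules in the sample

corrected = {p: n / oi[p] for p, n in observed_counts.items()}
norm = sum(corrected.values())
abundance = {p: TOTAL_MOLECULES * c / norm for p, c in corrected.items()}

for protein, molecules in abundance.items():
    print(f"{protein}: ~{molecules:,.0f} molecules")
```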

  13. Electric Power Monthly, August 1990. [Glossary included

    Energy Technology Data Exchange (ETDEWEB)

    1990-11-29

    The Electric Power Monthly (EPM) presents monthly summaries of electric utility statistics at the national, Census division, and State level. The purpose of this publication is to provide energy decisionmakers with accurate and timely information that may be used in forming various perspectives on electric issues that lie ahead. Data includes generation by energy source (coal, oil, gas, hydroelectric, and nuclear); generation by region; consumption of fossil fuels for power generation; sales of electric power, cost data; and unusual occurrences. A glossary is included.

  14. Source rock

    Directory of Open Access Journals (Sweden)

    Abubakr F. Makky

    2014-03-01

Full Text Available West Beni Suef Concession is located at the western part of Beni Suef Basin, which is a relatively under-explored basin and lies about 150 km south of Cairo. The major goal of this study is to evaluate the source rock by using different techniques such as Rock-Eval pyrolysis, vitrinite reflectance (%Ro), and well log data of some Cretaceous sequences including the Abu Roash (E, F and G members), Kharita and Betty formations. The BasinMod 1D program is used in this study to construct the burial history and calculate the levels of thermal maturity of the Fayoum-1X well based on calibration of measured %Ro and Tmax against the calculated %Ro model. The calculated Total Organic Carbon (TOC) content from well log data, compared with the measured TOC from Rock-Eval pyrolysis in the Fayoum-1X well, is shown to match against the shale source rock but gives high values against the limestone source rock. For that reason, a new model is derived from well log data to calculate accurately the TOC content against the limestone source rock in the study area. The organic matter in the Abu Roash F member is fair to excellent and capable of generating a significant amount of hydrocarbons (oil prone), produced from mixed type I/II kerogen. The generation potential of kerogen in the Abu Roash E and G members and the Betty Formation ranges from poor to fair, generating oil- and gas-prone hydrocarbons (mixed type II/III kerogen). Eventually, kerogen (type III) of the Kharita Formation has poor to very good generation potential and mainly produces gas. Thermal maturation from the measured %Ro, calculated %Ro model, Tmax and production index (PI) indicates that the Abu Roash F member sits at the onset of oil generation, whereas the Abu Roash E and G members, Kharita and Betty formations have entered the peak of oil generation.

  15. The determination by irradiation with a pulsed neutron generator and delayed neutron counting of the amount of fissile material present in a sample

    Energy Technology Data Exchange (ETDEWEB)

    Beliard, L; Janot, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

A preliminary study was conducted to determine the amount of fissile material present in a sample. The method used consisted in irradiating the sample by means of a pulsed neutron generator and delayed neutron counting. Results show the validity of this method provided some experimental precautions are taken. Checking on the residual proportion of fissile material in leached hulls seems possible. (authors)
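
    A comparative measurement of the kind implied here can be sketched very simply: the fissile mass of the unknown is inferred from its net delayed-neutron count relative to a reference sample of known mass irradiated and counted under identical conditions. The Python sketch below illustrates the arithmetic with invented counts; the real procedure depends on the experimental precautions the authors mention.

```python
# Comparative delayed-neutron assay: unknown mass from count ratio against a
# reference of known fissile mass, identically irradiated. Counts invented.
def fissile_mass(counts_sample, counts_ref, background, mass_ref_g):
    net_sample = counts_sample - background  # background-subtracted counts
    net_ref = counts_ref - background
    return mass_ref_g * net_sample / net_ref

m = fissile_mass(counts_sample=5230, counts_ref=10140,
                 background=140, mass_ref_g=1.00)
print(f"estimated fissile mass: {m:.3f} g")  # ~0.509 g
```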

  16. (including travel dates) Proposed itinerary

    Indian Academy of Sciences (India)

    Ashok

    31 July to 22 August 2012 (including travel dates). Proposed itinerary: Arrival in Bangalore on 1 August. 1-5 August: Bangalore, Karnataka. Suggested institutions: Indian Institute of Science, Bangalore. St Johns Medical College & Hospital, Bangalore. Jawaharlal Nehru Centre, Bangalore. 6-8 August: Chennai, TN.

  17. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

Quantitative Secondary Electron Detection (QSED) using an array of solid-state-device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. Methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid state detectors: the array senses secondary electrons with a plurality of solid state detectors, and the number of secondary electrons is counted with a time-to-digital converter circuit in counter mode.

  18. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

In modern science, proteomic analysis is inseparable from other fields of systems biology. Commanding huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods employing both chemical modifications of proteins and also metabolic and enzymatic methods of isotope labeling.

  19. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

In a radiograph the value of each pixel is related to the material thickness crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, the quantitative thickness map of an object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process generates some biases which prevent obtaining the quantitative information: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In a first section, we propose a complete model of the radiographic image formation process taking account of these biases. In a second section, we present an inversion scheme of this model for a single-material object, which enables the thickness map of the object crossed by the x-rays to be obtained. (author)
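
    The offset and gain corrections described above, combined with Beer-Lambert attenuation, turn a raw radiograph into a thickness map. The Python sketch below shows that basic pipeline under a single effective attenuation coefficient; all array values and the coefficient are invented, and the paper's full model additionally handles the biases listed above.

```python
# Raw radiograph -> thickness map via dark/flat (offset/gain) correction and
# Beer-Lambert inversion. Values and attenuation coefficient are invented.
import numpy as np

MU = 0.5  # assumed effective attenuation coefficient, 1/cm

def thickness_map(raw, dark, flat):
    corrected = (raw - dark) / np.maximum(flat - dark, 1e-9)  # offset + gain
    corrected = np.clip(corrected, 1e-9, None)                # keep log finite
    return -np.log(corrected) / MU                            # cm of material

raw  = np.array([[800.0, 600.0], [400.0, 300.0]])
dark = np.full((2, 2), 100.0)   # detector offset frame
flat = np.full((2, 2), 1100.0)  # open-beam gain frame
print(thickness_map(raw, dark, flat))
```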

  20. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.
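
    To give a flavor of the stylometric machinery involved, the Python sketch below compares two invented Latin snippets by their relative function-word frequencies, one classic authorship signal. It is a toy illustration of the feature type, not the authors' pipeline.

```python
# Toy stylometry: compare texts by function-word frequency profiles.
from collections import Counter
import math

FUNCTION_WORDS = ["et", "in", "non", "cum", "sed", "ut", "ad", "per"]

def profile(text):
    words = text.lower().split()
    counts = Counter(w for w in words if w in FUNCTION_WORDS)
    total = sum(counts.values()) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

text_a = "arma virumque cano et in bello non cum rege sed ut ad urbem per mare"
text_b = "in principio et non cum sed ut per ad in et non"
print(f"style similarity: {cosine(profile(text_a), profile(text_b)):.3f}")
```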

  1. AMS at the ANU including biomedical applications

    Energy Technology Data Exchange (ETDEWEB)

    Fifield, L K; Allan, G L; Cresswell, R G; Ophel, T R [Australian National Univ., Canberra, ACT (Australia); King, S J; Day, J P [Manchester Univ. (United Kingdom). Dept. of Chemistry

    1994-12-31

    An extensive accelerator mass spectrometry program has been conducted on the 14UD accelerator at the Australian National University since 1986. In the two years since the previous conference, the research program has expanded significantly to include biomedical applications of {sup 26}Al and studies of landform evolution using isotopes produced in situ in surface rocks by cosmic ray bombardment. The system is now used for the measurement of {sup 10}Be, {sup 14}C, {sup 26}Al, {sup 36}Cl, {sup 59}Ni and {sup 129}I, and research is being undertaken in hydrology, environmental geochemistry, archaeology and biomedicine. On the technical side, a new test system has permitted the successful off-line development of a high-intensity ion source. A new injection line to the 14UD has been established and the new source is now in position and providing beams to the accelerator. 4 refs.

  2. AMS at the ANU including biomedical applications

    Energy Technology Data Exchange (ETDEWEB)

    Fifield, L.K.; Allan, G.L.; Cresswell, R.G.; Ophel, T.R. [Australian National Univ., Canberra, ACT (Australia); King, S.J.; Day, J.P. [Manchester Univ. (United Kingdom). Dept. of Chemistry

    1993-12-31

    An extensive accelerator mass spectrometry program has been conducted on the 14UD accelerator at the Australian National University since 1986. In the two years since the previous conference, the research program has expanded significantly to include biomedical applications of {sup 26}Al and studies of landform evolution using isotopes produced in situ in surface rocks by cosmic ray bombardment. The system is now used for the measurement of {sup 10}Be, {sup 14}C, {sup 26}Al, {sup 36}Cl, {sup 59}Ni and {sup 129}I, and research is being undertaken in hydrology, environmental geochemistry, archaeology and biomedicine. On the technical side, a new test system has permitted the successful off-line development of a high-intensity ion source. A new injection line to the 14UD has been established and the new source is now in position and providing beams to the accelerator. 4 refs.

  3. Digital intelligence sources transporter

    International Nuclear Information System (INIS)

    Zhang Zhen; Wang Renbo

    2011-01-01

    It presents from the collection of particle-ray counting, infrared data communication, real-time monitoring and alarming, GPRS and other issues start to realize the digital management of radioactive sources, complete the real-time monitoring of all aspects, include the storing of radioactive sources, transporting and using, framing intelligent radioactive sources transporter, as a result, achieving reliable security supervision of radioactive sources. (authors)

  4. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
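
    As one concrete instance of the kind of bias model discussed, the Python sketch below corrects a 2x2 exposure table for nondifferential misclassification given assumed sensitivity and specificity, then recomputes the odds ratio. All counts and bias parameters are invented; this is the textbook back-calculation, not a method specific to this paper.

```python
# Simple quantitative bias analysis: correct exposure misclassification in a
# 2x2 table using assumed sensitivity/specificity, then recompute the OR.
def corrected_exposed(observed_exposed, total, se, sp):
    # Standard back-calculation: true = (observed - total*(1-Sp)) / (Se+Sp-1)
    return (observed_exposed - total * (1.0 - sp)) / (se + sp - 1.0)

SE, SP = 0.85, 0.95  # assumed exposure classification parameters (invented)

a_obs, b_obs = 45, 255   # cases: exposed / unexposed (observed, invented)
c_obs, d_obs = 25, 275   # controls: exposed / unexposed (observed, invented)

a = corrected_exposed(a_obs, a_obs + b_obs, SE, SP)
c = corrected_exposed(c_obs, c_obs + d_obs, SE, SP)
b, d = (a_obs + b_obs) - a, (c_obs + d_obs) - c

print(f"observed OR:  {(a_obs * d_obs) / (b_obs * c_obs):.2f}")  # ~1.94
print(f"corrected OR: {(a * d) / (b * c):.2f}")                  # ~3.29
```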

  5. Theory including future not excluded

    DEFF Research Database (Denmark)

    Nagao, K.; Nielsen, H.B.

    2013-01-01

We study a complex action theory (CAT) whose path runs over not only past but also future. We show that, if we regard a matrix element defined in terms of the future state at time T and the past state at time T_A as an expectation value in the CAT, then we are allowed to have the Heisenberg equation......, Ehrenfest's theorem, and the conserved probability current density. In addition, we show that the expectation value at the present time t of a future-included theory for large T − t and large t − T_A corresponds to that of a future-not-included theory with a proper inner product for large t − T_A. Hence, the CAT...

  6. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

The notes in this compilation address the pros and cons associated with the extension of the ECB's quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  7. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
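
    The gist of the method, measuring absorption areas at several temperatures and extrapolating away the temperature-dependent part of ⟨x²⟩, can be sketched as below. In the high-temperature Debye regime ln(area) falls roughly linearly with T and the linear trend extrapolates to the ⟨x²⟩ = 0 value at T = 0; all data values here are invented and the real analysis is more careful.

```python
# Sketch: extrapolate ln(absorption area) vs T linearly to T = 0 so sites
# with different <x^2> can be compared on equal footing. Data invented.
import numpy as np

temps = np.array([80.0, 150.0, 220.0, 295.0])   # K
area_fe2 = np.array([0.92, 0.78, 0.66, 0.55])   # Fe2+ site areas (a.u.)
area_fe3 = np.array([0.97, 0.90, 0.83, 0.77])   # Fe3+ site areas (a.u.)

def area_at_zero(t, areas):
    slope, intercept = np.polyfit(t, np.log(areas), 1)
    return np.exp(intercept)  # high-T line extrapolated to T = 0

a2, a3 = area_at_zero(temps, area_fe2), area_at_zero(temps, area_fe3)
print(f"Fe2+/Fe3+ area ratio at <x^2> = 0: {a2 / a3:.2f}")
```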

  8. Enhancement of a Virtual Reality Wheelchair Simulator to Include Qualitative and Quantitative Performance Metrics

    Science.gov (United States)

    Harrison, C. S.; Grant, P. M.; Conway, B. A.

    2010-01-01

    The increasing importance of inclusive design and in particular accessibility guidelines established in the U.K. 1996 Disability Discrimination Act (DDA) has been a prime motivation for the work on wheelchair access, a subset of the DDA guidelines, described in this article. The development of these guidelines mirrors the long-standing provisions…

  9. The History of Cartographic Sources Development

    Directory of Open Access Journals (Sweden)

    L. Volkotrub

    2016-07-01

Full Text Available Cartographic sources are a variety of descriptive source. They include historical and geographical maps and circuit maps. Map images are a special kind of model of real phenomena, conveying their quantitative and qualitative characteristics, structure, interconnections and dynamics in graphic form. The prototypes of maps appeared as a way of transmitting information about the world; people began to use this way of communication long before the appearance of writing. The quality of map images kept pace with the evolution of techniques and methods of mapping and publishing. The general development of cartographic sources is determined primarily by three factors: the development of science and technology, the needs of society for different cartographic works, and the political and economic situation of the country. Given this, the map is a self-sufficient phenomenon, and its expert source study rests on an understanding of the invariance of its perception. Modern theoretical concepts show the invariance of maps. Specifically, a map is viewed in the following aspects: (1) it is one of the universal models of land and of existing natural and social processes; (2) it is one of the tools of research and forecasting; (3) it is a specific language formation; (4) it is a method of transferring information. As a source, a map may contain important information about the physical geography, geology, hydrology, political-administrative division, population, flora and fauna of a particular area in a particular period. Mostly, cartographic sources are complex, because they contain a lot of cognitive and historical information.

  10. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  11. Device including a contact detector

    DEFF Research Database (Denmark)

    2011-01-01

The present invention relates to a probe for determining an electrical property of an area of a surface of a test sample, the probe being intended to be in a specific orientation relative to the test sample. The probe may comprise a supporting body defining a first surface. A plurality of cantilever arms (12) may extend from the supporting body in co-planar relationship with the first surface. The plurality of cantilever arms (12) may extend substantially parallel to each other, and each of the plurality of cantilever arms (12) may include an electrically conductive tip for contacting the area of the test sample by movement of the probe relative to the surface of the test sample into the specific orientation. The probe may further comprise a contact detector (14) extending from the supporting body arranged so as to contact the surface of the test sample prior to any one of the plurality

  12. Neoclassical transport including collisional nonlinearity.

    Science.gov (United States)

    Candy, J; Belli, E A

    2011-06-10

    In the standard δf theory of neoclassical transport, the zeroth-order (Maxwellian) solution is obtained analytically via the solution of a nonlinear equation. The first-order correction δf is subsequently computed as the solution of a linear, inhomogeneous equation that includes the linearized Fokker-Planck collision operator. This equation admits analytic solutions only in extreme asymptotic limits (banana, plateau, Pfirsch-Schlüter), and so must be solved numerically for realistic plasma parameters. Recently, numerical codes have appeared which attempt to compute the total distribution f more accurately than in the standard ordering by retaining some nonlinear terms related to finite-orbit width, while simultaneously reusing some form of the linearized collision operator. In this work we show that higher-order corrections to the distribution function may be unphysical if collisional nonlinearities are ignored.

  13. Addressing Stillbirth in India Must Include Men.

    Science.gov (United States)

    Roberts, Lisa; Montgomery, Susanne; Ganesh, Gayatri; Kaur, Harinder Pal; Singh, Ratan

    2017-07-01

    Millennium Development Goal 4, to reduce child mortality, can only be achieved by reducing stillbirths globally. A confluence of medical and sociocultural factors contributes to the high stillbirth rates in India. The psychosocial aftermath of stillbirth is a well-documented public health problem, though less is known of the experience of men, particularly outside the Western context. Therefore, men's perceptions and knowledge regarding reproductive health, as well as maternal-child health, are important. Key informant interviews (n = 5) were analyzed and 28 structured interviews were conducted using a survey based on the qualitative themes. Qualitative themes included men's dual burden and their right to medical and reproductive decision-making power. Wives were discouraged from expressing grief and pushed to conceive again. If this was not successful, particularly if a son was not conceived, a second wife was considered a solution. Quantitative data revealed that men with a history of stillbirths had greater anxiety and depression and perceived less social support, but had more egalitarian views towards women than men without stillbirth experience. At the same time, fathers of stillborn children were more likely to be emotionally or physically abusive. Predictors of mental health, attitudes towards women, and perceived support are discussed. Patriarchal societal values, son preference, deficient women's autonomy, and sex-selective abortion perpetuate the risk of future poor infant outcomes, including stillbirth, and compound the already higher risk of stillbirth for males. Grief interventions should explore and take into account men's perceptions, attitudes, and behaviors towards reproductive decision making.

  14. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with 99mTc pyrophosphate and 99mTc methylene diphosphonate between 1977 and 1979. Follow-up examinations were carried out in 12 patients. The patient collective presented with primary skeletal tumours, metastases, and inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the "region of interest" technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as far as diagnosis, course control, and differential diagnosis are concerned. Quantitative skeletal scintiscanning by means of the "region of interest" technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG) [de]

  15. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET]

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 µmol/100 g/min; right = 25.6 ± 7.0 µmol/100 g/min) was slightly reduced compared to the ipsilateral hemispheric rate (left = 30.4 ± 6.8 µmol/100 g/min; right = 29.5 ± 7.2 µmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  17. Quantitative FDG in depression

    International Nuclear Information System (INIS)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D.

    1998-01-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 µmol/100 g/min; right = 25.6 ± 7.0 µmol/100 g/min) was slightly reduced compared to the ipsilateral hemispheric rate (left = 30.4 ± 6.8 µmol/100 g/min; right = 29.5 ± 7.2 µmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  18. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
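
    To make the model concrete, the following sketch simulates a trait-dependent birth-death process of the kind QuaSSE assumes: a trait evolves by Brownian motion while the speciation rate is a sigmoidal function of the trait. This is a forward simulation for intuition only, not FitzJohn's likelihood method, and all rate parameters are invented for illustration.

        import math
        import random

        def speciation_rate(x, lam0=0.1, lam1=0.4, x_mid=0.0, k=2.0):
            # Sigmoidal trait-dependent speciation rate lambda(x) (illustrative values).
            return lam0 + (lam1 - lam0) / (1.0 + math.exp(-k * (x - x_mid)))

        def simulate(t_max=50.0, dt=0.1, mu=0.05, sigma=0.1):
            # Discrete-time birth-death process; each lineage carries a diffusing trait.
            lineages = [0.0]  # one root lineage with trait value 0
            t = 0.0
            while t < t_max and 0 < len(lineages) < 5000:
                survivors = []
                for x in lineages:
                    x += random.gauss(0.0, sigma * math.sqrt(dt))  # Brownian step
                    u = random.random()
                    if u < speciation_rate(x) * dt:            # speciation: two daughters
                        survivors += [x, x]
                    elif u < (speciation_rate(x) + mu) * dt:   # extinction
                        continue
                    else:
                        survivors.append(x)
                lineages = survivors
                t += dt
            return lineages

        tips = simulate()
        if tips:
            print(len(tips), "surviving lineages; mean trait =", sum(tips) / len(tips))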

  19. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter of which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed which support the fast development of biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.

  20. Source Water Protection Contaminant Sources

    Data.gov (United States)

    Iowa State University GIS Support and Research Facility — Simplified aggregation of potential contaminant sources used for Source Water Assessment and Protection. The data is derived from IDNR, IDALS, and US EPA program...

  1. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement, including the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among the available recording techniques, EOG is a major technique for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement are proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while five subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve the diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standardized real-time quantitative analysis would improve detection and provide a better understanding of the pathology of these disorders.
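
    The abstract does not give the analysis pipeline, but a common first step in quantifying EOG traces is velocity-threshold detection of rapid eye movements. A minimal sketch, with a synthetic trace and an arbitrary threshold (all numbers are assumptions, not the study's parameters):

        import numpy as np

        def detect_fast_segments(eog, fs, v_thresh=30.0):
            # Flag samples whose absolute velocity exceeds v_thresh (amplitude units/s)
            # and pair the rising/falling edges into candidate saccade intervals.
            v = np.gradient(eog) * fs
            fast = np.abs(v) > v_thresh
            edges = np.flatnonzero(np.diff(fast.astype(int))) + 1
            if fast[0]:
                edges = np.r_[0, edges]
            if fast[-1]:
                edges = np.r_[edges, fast.size]
            return list(zip(edges[::2], edges[1::2]))

        # synthetic trace at 250 Hz: fixation, a rapid jump, fixation
        fs = 250.0
        trace = np.concatenate([np.zeros(100), np.linspace(0.0, 1.0, 5), np.ones(100)])
        print(detect_fast_segments(trace, fs))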

  2. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields

  3. Quantitative cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Thelen, M.; Dueber, C.; Wolff, P.; Erbel, R.; Hoffmann, T.

    1985-06-01

    The scope and limitations of quantitative cardiac CT have been evaluated in a series of experimental and clinical studies. The left ventricular muscle mass was estimated by computed tomography in 19 dogs (using volumetric methods, measurements in two axes and planes, and a reference volume). There was good correlation with anatomical findings. The end-diastolic volume of the left ventricle was estimated in 22 patients with cardiomyopathies; using angiography as a reference, CT led to systematic under-estimation. It is also shown that ECG-triggered magnetic resonance tomography results in improved visualisation and may be expected to improve measurements of cardiac morphology.

  4. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  5. Multi-photon absorption limits to heralded single photon sources

    Science.gov (United States)

    Husko, Chad A.; Clark, Alex S.; Collins, Matthew J.; De Rossi, Alfredo; Combrié, Sylvain; Lehoucq, Gaëlle; Rey, Isabella H.; Krauss, Thomas F.; Xiong, Chunle; Eggleton, Benjamin J.

    2013-01-01

    Single photons are of paramount importance to future quantum technologies, including quantum communication and computation. Nonlinear photonic devices using parametric processes offer a straightforward route to generating photons; however, additional nonlinear processes may come into play and interfere with these sources. Here we analyse spontaneous four-wave mixing (SFWM) sources in the presence of multi-photon processes. We conduct experiments in silicon and gallium indium phosphide photonic crystal waveguides, which display inherently different nonlinear absorption processes, namely two-photon absorption (TPA) and three-photon absorption (ThPA), respectively. We develop a novel model capturing these diverse effects which is in excellent quantitative agreement with measurements of brightness, coincidence-to-accidental ratio (CAR) and second-order correlation function g(2)(0), showing that TPA imposes an intrinsic limit on heralded single photon sources. We build on these observations to devise a new metric, the quantum utility (QMU), enabling further optimisation of single photon sources. PMID:24186400

  6. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  7. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and the identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility, allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, enabling low-cost, small-size point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  8. Sealed radioactive source management

    International Nuclear Information System (INIS)

    2005-01-01

    Sealed radioactive sources have been used in a wide range of applications in medicine, agriculture, geology, industry and other fields. Over time many sources have fallen out of use and become waste without proper management. This has led to many accidents causing deaths and serious radiation injuries worldwide. The use of sealed sources keeps expanding, but the management of spent sources has seen little improvement. Sealed radioactive sources have become a security risk calling for prompt action. Source management helps to maintain sources in good physical condition and provides means of source tracking and control. It also provides a well-documented record of the sources, making any future management options safe, secure and cost-effective. Last but not least, good source management substantially reduces the risk of accidents and eliminates the risk of malicious use. The International Atomic Energy Agency assists Member States in building the infrastructure to properly manage sealed radioactive sources. The assistance includes training of national experts to handle, condition and properly store the sources. For Member States that do not have proper facilities, we provide technical assistance to design a facility for proper management and storage of radioactive sources. For Member States that need to condition their sources but lack the required infrastructure, we provide direct assistance with source recovery and an international expert team to condition the sources and render them safe and secure. We offer software (the Radioactive Waste Management Registry) to keep complete records on the sources and provide for efficient tracking. This also helps with proper planning and decision making for long-term management.

  9. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  10. Positron sources

    International Nuclear Information System (INIS)

    Chehab, R.

    1994-01-01

    A tentative survey of positron sources is given. Physical processes on which positron generation is based are indicated and analyzed. Explanation of the general features of electromagnetic interactions and nuclear β + decay makes it possible to predict the yield and emittance for a given optical matching system between the positron source and the accelerator. Some kinds of matching systems commonly used - mainly working with solenoidal field - are studied and the acceptance volume calculated. Such knowledge is helpful in comparing different matching systems. Since for large machines, a significant distance exists between the positron source and the experimental facility, positron emittance has to be preserved during beam transfer over large distances and methods used for that purpose are indicated. Comparison of existing positron sources leads to extrapolation to sources for future linear colliders. Some new ideas associated with these sources are also presented. (orig.)

  11. Chronic Obstructive Pulmonary Disease (COPD) Includes: Chronic Bronchitis and Emphysema

    Science.gov (United States)

    Percent of visits to office-based physicians with COPD indicated on the medical record: 3.2%. Source: ...

  12. Sources management

    International Nuclear Information System (INIS)

    Mansoux, H.; Gourmelon; Scanff, P.; Fournet, F.; Murith, Ch.; Saint-Paul, N.; Colson, P.; Jouve, A.; Feron, F.; Haranger, D.; Mathieu, P.; Paycha, F.; Israel, S.; Auboiroux, B.; Chartier, P.

    2005-01-01

    Organized by the technical protection section of the French Society for Radiation Protection (S.F.R.P.), these two days aimed to review the evolution of the regulations governing sources of ionising radiation (sealed and unsealed radioactive sources, electrical generators). They addressed all the actors concerned by the implementation of the new regulatory system in the different sectors of activity (research, medicine and industry): authorities, manufacturers and suppliers of sources, holders and users, bodies involved in the approval of sources, and carriers. (N.C.)

  13. Radioisotopic heat source

    Science.gov (United States)

    Jones, G.J.; Selle, J.E.; Teaney, P.E.

    1975-09-30

    Disclosed is a radioisotopic heat source and method for a long-life electrical generator. The source includes plutonium dioxide shards and yttrium or hafnium in a container of tantalum-tungsten-hafnium alloy, all being in a nickel alloy outer container, and subjected to heat treatment from about 1570°F to about 1720°F for about one hour. (auth)

  14. Using Primary Source Documents.

    Science.gov (United States)

    Mintz, Steven

    2003-01-01

    Explores the use of primary sources when teaching about U.S. slavery. Includes primary sources from the Gilder Lehrman Documents Collection (New York Historical Society) to teach about the role of slaves in the Revolutionary War, such as a proclamation from Lord Dunmore offering freedom to slaves who joined his army. (CMK)

  15. Quantitative Activities for Introductory Astronomy

    Science.gov (United States)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune"; "The Temperature on Titan"; "Rocks in the Early Solar System"; "Comets Hitting Planets"; "Ages of Meteorites"; "How Flat are Saturn's Rings?"; "Tides of the Sun and Moon on the Earth"; "The Gliese 581 Solar System"; "Buckets in the Rain"; "How Hot, Bright and Big is Betelgeuse?"; "Bombs and the Sun"; "What Forms Stars?"; "Lifetimes of Cars and Stars"; "The Mass of the Milky Way"; "How Old is the Universe?"; and "Is The Universe Speeding up or Slowing Down?"
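
    As an example of the ratio-and-proportionality style of these activities, the "Mass of Neptune" exercise can be done with Kepler's third law applied to Triton's orbit. The sketch below is our reconstruction, not the authors' worked solution; the orbital values are standard reference figures.

        import math

        G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
        a = 3.548e8            # Triton's orbital radius, m
        T = 5.877 * 86400.0    # Triton's orbital period, s

        # Kepler's third law solved for the central mass: M = 4*pi^2*a^3 / (G*T^2)
        M = 4.0 * math.pi**2 * a**3 / (G * T**2)
        print(f"Mass of Neptune ~ {M:.2e} kg")  # about 1.0e26 kg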

  16. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  17. Quantitative Psychology Research : The 80th Annual Meeting of the Psychometric Society, Beijing, 2015

    NARCIS (Netherlands)

    van der Ark, L.A.; Bolt, D.M.; Wang, W.-C.; Douglas, J.A.; Wiberg, M.

    2016-01-01

    The research articles in this volume cover timely quantitative psychology topics, including new methods in item response theory, computerized adaptive testing, cognitive diagnostic modeling, and psychological scaling. Topics within general quantitative methodology include structural equation

  18. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  19. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  20. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  1. Quantitative Penetration Testing with Item Response Theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle Ida Antoinette

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including
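
    Although the record is truncated, the underlying idea can be sketched: item response theory models the probability that an attacker of ability theta succeeds against a control of difficulty b via a logistic curve. A minimal two-parameter logistic (2PL) sketch; the parameter values are illustrative, not taken from the paper.

        import math

        def p_success(theta, a=1.5, b=2.0):
            # 2PL item response model: a = discrimination, b = difficulty.
            return 1.0 / (1.0 + math.exp(-a * (theta - b)))

        # success probability of attackers of increasing ability against one control
        for theta in (-1.0, 0.0, 1.0, 2.0, 3.0):
            print(f"theta={theta:+.1f}  P(success)={p_success(theta):.3f}")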

  2. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  3. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system and its application to the volumetric percentage of pearlite in nodular cast irons, porosity and average grain size in high-density sintered UO2 pellets, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with those of the corresponding direct counting processes: counting of systematic points (grid) to measure volume, and the intersection method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron resulted from the small difference in optical reflectivity of graphite and pearlite. Porosity evaluation of sintered UO2 pellets is also analyzed. [pt]
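
    The grid-based direct counting used here as the reference rests on the stereological identity that the point fraction estimates the volume fraction (V_V = P_P). A minimal sketch on a synthetic binary micrograph (toy data, not the paper's images):

        import numpy as np

        rng = np.random.default_rng(0)
        # synthetic micrograph: 1 = phase of interest, 0 = matrix (toy data)
        image = (rng.random((512, 512)) < 0.3).astype(int)

        # systematic grid: one test point every 16 pixels in each direction
        grid_points = image[8::16, 8::16]
        print(f"estimated volume fraction: {grid_points.mean():.3f}")  # ~0.3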

  4. Environmental Data Sources

    Data.gov (United States)

    Kansas Data Access and Support Center — This database includes gauging stations, climatic data centers, and storet sites. The accuracy of the locations is dependent on the source data for each of the...

  5. Sourcing Excellence

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi

    2011-01-01

    Sourcing Excellence is one of the key performance indicators (KPIs) in this world of ever changing sourcing strategies. Manufacturing companies need to assess and diagnose the reliability and competencies of existing suppliers in order to coordinate and develop them. This would help in managing...

  6. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    Science.gov (United States)

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher
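
    Sensitivity and specificity as quoted above follow directly from the confusion counts of each culture technique against the CR-BSI reference standard. A minimal sketch; the counts below are hypothetical, since the abstract reports only the resulting percentages.

        def sensitivity_specificity(tp, fn, tn, fp):
            # sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)
            return tp / (tp + fn), tn / (tn + fp)

        # hypothetical counts for a technique evaluated against confirmed CR-BSI
        sens, spec = sensitivity_specificity(tp=21, fn=8, tn=531, fp=24)
        print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")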

  7. Emission sources and quantities

    International Nuclear Information System (INIS)

    Heinen, B.

    1991-01-01

    The paper examines emission sources and quantities for SO 2 and NO x . Natural SO 2 is released from volcanic sources and to a much lower extent from marsh gases. In nature NO x is mainly produced in the course of the chemical and bacterial denitrification processes going on in the soil. Manmade pollutants are produced in combustion processes. The paper concentrates on manmade pollution. Aspects discussed include: mechanism of pollution development; manmade emission sources (e.g. industry, traffic, power plants and domestic sources); and emission quantities and forecasts. 11 refs., 2 figs., 5 tabs

  8. Positron sources

    International Nuclear Information System (INIS)

    Chehab, R.

    1989-01-01

    A tentative survey of positron sources is given. Physical processes on which positron generation is based are indicated and analyzed. Explanation of the general features of electromagnetic interactions and nuclear β + decay makes it possible to predict the yield and emittance for a given optical matching system between the positron source and the accelerator. Some kinds of matching systems commonly used - mainly working with solenoidal fields - are studied and the acceptance volume calculated. Such knowledge is helpful in comparing different matching systems. Since for large machines, a significant distance exists between the positron source and the experimental facility, positron emittance has to be preserved during beam transfer over large distances and methods used for that purpose are indicated. Comparison of existing positron sources leads to extrapolation to sources for future linear colliders

  9. Lunar neutron source function

    International Nuclear Information System (INIS)

    Kornblum, J.J.

    1974-01-01

    The search for a quantitative neutron source function for the lunar surface region is justified because it contributes to our understanding of the history of the lunar surface and of nuclear processes occurring on the moon since its formation. A knowledge of the neutron source function and neutron flux distribution is important for the interpretation of many experimental measurements. This dissertation uses the available pertinent experimental measurements together with theoretical calculations to obtain an estimate of the lunar neutron source function below 15 MeV. Based upon reasonable assumptions a lunar neutron source function having adjustable parameters is assumed for neutrons below 15 MeV. The lunar neutron source function is composed of several components resulting from the action of cosmic rays on lunar material. A comparison with previous neutron calculations is made and significant differences are discussed. Application of the results to the problem of lunar soil histories is examined using the statistical model for soil development proposed by Fireman. The conclusion is drawn that the moon is losing mass.

  10. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include: reliability; probability of detection (ultrasonics and eddy currents); weldments; closure effects in fatigue cracks; technology transfer; ultrasonic scattering theory; acoustic emission; ultrasonic scattering, reliability and penetrating radiation; metal matrix composites; ultrasonic scattering from near-surface flaws; and ultrasonic multiple scattering.

  11. Nonradioactive Environmental Emissions Chemical Source Term for the Double-Shell Tank (DST) Vapor Space During Waste Retrieval Operations

    International Nuclear Information System (INIS)

    MAY, T.H.

    2000-01-01

    A nonradioactive chemical vapor space source term for tanks on the Phase 1 and the extended Phase 1 delivery, storage, and disposal mission was determined. Operations modeled included mixer pump operation and DST waste transfers. Concentrations of ammonia, specific volatile organic compounds, and quantitative volumes of aerosols were estimated

  12. CG/MS quantitation of diamondoid compounds in crude oils and petroleum products

    International Nuclear Information System (INIS)

    Yang, C.; Wang, Z.D.; Hollebone, B.P.; Fingas, M.; Peng, X.; Landriault, M.

    2006-01-01

    Diamondoids are a class of saturated hydrocarbons that consist of 3-dimensionally fused cyclohexane rings. Diamondoid compounds in petroleum are the result of carbonium ion rearrangements of cyclic precursors on clay superacids in the source rock during oil generation. They are considered to be a problem due to their deposition during production of reservoir fluids and transportation of natural gas, gas condensates and light crude oils. At high concentrations, and with changes in pressure and temperature, diamondoid compounds can segregate out of reservoir fluids during production. Environmental scientists have considered fingerprinting the diamondoid hydrocarbons as a forensic method for oil spill studies. Since diamondoid compounds are thermodynamically stable, they have potential applications in oil-source correlation and differentiation for cases where traditional biomarker terpanes and steranes are absent because of environmental weathering or refining of petroleum products. Although there is increased awareness of possible use of diamondoid compounds for source identification, there is no systematic approach for using these compounds. Quantitative surveys of the abundances of diamondoids are not available. Therefore, this study developed a reliable analytical method for quantitative diamondoid analysis. Gas chromatography/mass spectrometry (GC/MS) was used to quantitatively determine diamondoid compounds (adamantane, diamantane and their alkylated homologues) in 14 fresh crude oils and 23 refined petroleum products, including light and mid-range distillate fuels, residual fuels and lubricating oils collected from different sources. Results were compared with 2 types of biomarker compounds in oil saturated hydrocarbon fractions. Several diagnostic ratios of diamondoids were developed based on their concentrations. Their potential use for forensic oil spill source identification was evaluated. 24 refs., 8 tabs., 4 figs

  13. Quantitative myocardial perfusion by O-15-water PET

    DEFF Research Database (Denmark)

    Thomassen, Anders; Petersen, Henrik; Johansen, Allan

    2015-01-01

    AIMS: Reporting of quantitative myocardial blood flow (MBF) is typically performed in standard coronary territories. However, coronary anatomy and myocardial vascular territories vary among individuals, and a coronary artery may erroneously be deemed stenosed or not if territorial demarcation...... disease (CAD). METHODS AND RESULTS: Forty-four patients with suspected CAD were included prospectively and underwent coronary CT-angiography and quantitative MBF assessment with O-15-water PET followed by invasive, quantitative coronary angiography, which served as reference. MBF was calculated...

  14. Orphan sources in Slovenia

    International Nuclear Information System (INIS)

    Janzekovic, H.; Cesarek, J.

    2005-01-01

    For decades international standards and requirements have postulated strict control over all life-cycle phases of radioactive sources in order to prevent risks associated with exposure of people and the environment. Despite this, orphan sources have become a serious problem as a consequence of the growth of economic transactions in many countries in Europe as well as in the world. Countries and international organisations, aware of this emerging problem, are trying to gain control over orphan sources using different approaches. These approaches include control over sources before they could become orphan sources. In addition, countries are also developing action plans for cases in which an orphan source is found. The problems related to orphan sources in Slovenia are discussed based on case studies from recent years. While in the nineties of the last century just a few cases of orphan sources were identified, their number has increased substantially since 2003. The paper discusses the general reasons for the phenomenon of orphan sources as well as the experience related to regaining control over orphan sources. (author)

  15. Environmental protection law of the European Community (EU). Source index and content index including the jurisdiction of the European Court of Justice with actual jurisdiction service and special literature according to the individual legal regulations. 34. ed.; Umweltschutzrecht der Europaeischen Union (EU). Fundstellen- und Inhaltsnachweis, einschliesslich der Rechtsprechung des Europaeischen Gerichtshofes - EuGH; mit aktuellem Rechtsprechungsdienst und Spezialliteratur zu den einzelnen Rechtsvorschriften

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Bernd

    2009-07-01

    The 34th edition of the source index of the environmental law of the European Union contains references to the complete case law of the European Court of Justice (Luxembourg) with respect to the following topics: (a) general infrastructure / integrated environmental law; (b) nature protection, landscape protection and protection of species; (c) dangerous substances and preparations; (d) waste management law; (e) water legislation; (f) environmental traffic law; (g) law of air pollution control and climate protection; (h) noise control; (i) environmental commercial law; (j) environmental energy law.

  16. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope using the transport of intensity equation method based on a smartphone. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone based quantitative phase microscope is a promising tool that may in future be adopted in remote healthcare and medical diagnosis.
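
    Under a near-uniform intensity assumption, the transport of intensity equation reduces to a Poisson equation, laplacian(phi) = -(k/I0) * dI/dz, solvable with FFTs from two defocused images. The sketch below illustrates that route; it is our reconstruction, not the authors' Android implementation, and the toy inputs are arbitrary.

        import numpy as np

        def tie_phase(i_minus, i_plus, dz, wavelength, pixel):
            # Solve laplacian(phi) = -(k/I0) * dI/dz via an FFT Poisson solver,
            # assuming near-uniform intensity I0 (a common TIE simplification).
            k = 2.0 * np.pi / wavelength
            i0 = 0.5 * (i_minus + i_plus).mean()
            didz = (i_plus - i_minus) / (2.0 * dz)
            rhs = -(k / i0) * didz

            ny, nx = rhs.shape
            fx = np.fft.fftfreq(nx, d=pixel)
            fy = np.fft.fftfreq(ny, d=pixel)
            FX, FY = np.meshgrid(fx, fy)
            lap = -(2.0 * np.pi) ** 2 * (FX**2 + FY**2)  # Fourier symbol of the Laplacian
            lap[0, 0] = 1.0                              # avoid 0/0; pins the mean phase
            phi = np.fft.ifft2(np.fft.fft2(rhs) / lap).real
            return phi - phi.mean()

        # toy defocus pair (illustration only)
        rng = np.random.default_rng(1)
        img = rng.random((64, 64))
        phi = tie_phase(img, img + 1e-3, dz=1e-6, wavelength=530e-9, pixel=1.1e-6)
        print(phi.shape, float(phi.std()))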

  17. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  18. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
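
    One of the core equations such models employ is the linear hillslope diffusion equation, dz/dt = kappa * d2z/dx2. A minimal explicit finite-difference sketch in one dimension (parameter values arbitrary; the scheme is stable while kappa*dt/dx**2 <= 0.5):

        import numpy as np

        def diffuse_hillslope(z, kappa=0.01, dx=1.0, dt=10.0, steps=1000):
            # Explicit scheme for dz/dt = kappa * d2z/dx2 with fixed-end boundaries.
            z = z.astype(float).copy()
            for _ in range(steps):
                lap = (np.roll(z, -1) - 2.0 * z + np.roll(z, 1)) / dx**2
                lap[0] = lap[-1] = 0.0
                z += kappa * dt * lap
            return z

        # initial scarp: a 1 m elevation step that relaxes into a smooth ramp
        x = np.arange(100)
        z0 = np.where(x < 50, 1.0, 0.0)
        print(diffuse_hillslope(z0)[45:55].round(3))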

  19. Inversion of Atmospheric Tracer Measurements, Localization of Sources

    Science.gov (United States)

    Issartel, J.-P.; Cabrit, B.; Hourdin, F.; Idelkadi, A.

    When abnormal concentrations of a pollutant are observed in the atmosphere, the question of its origin arises immediately. The radioactivity from Tchernobyl was detected in Sweden before the accident was announced. This situation emphasizes the psychological, political and medical stakes of a rapid identification of sources. In technical terms, most industrial sources can be modeled as a fixed point at ground level with undetermined duration. The classical method of identification involves the calculation of a backtrajectory departing from the detector with an upstream integration of the wind field. We were first involved in such questions as we evaluated the efficiency of the international monitoring network planned in the frame of the Comprehensive Test Ban Treaty. We propose a new approach to backtracking based upon the use of retroplumes associated with available measurements. Firstly, the retroplume is related to inverse transport processes, describing quantitatively how the air in a sample originates from regions that are all the more extended and diffuse the further we go back in the past. Secondly, it clarifies the sensitivity of the measurement with respect to all potential sources. It is therefore calculated by adjoint equations, including of course diffusive processes. Thirdly, the statistical interpretation, valid as far as single particles are concerned, should not be used to investigate the position and date of a macroscopic source. In that case, the retroplume rather induces a straightforward constraint between the intensity of the source and its position. When more than one measurement is available, including zero-valued measurements, the source satisfies the same number of linear relations, tightly related to the retroplumes. This system of linear relations can be handled through the simplex algorithm in order to make the above intensity-position correlation more restrictive. This method enables to manage in a quantitative manner the
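
    The system of linear relations described above can be sketched as a feasibility problem: for a candidate source position with retroplume sensitivities a_i, a constant release rate q >= 0 must reproduce every measurement m_i within its tolerance, and a simplex-type solver either brackets q or rules the position out. All numbers below are invented for illustration; scipy's linprog stands in for the simplex algorithm.

        import numpy as np
        from scipy.optimize import linprog

        def feasible_rate(a, m, tol):
            # Bracket release rates q >= 0 satisfying |a_i*q - m_i| <= tol_i,
            # or return None if the candidate source position is infeasible.
            a, m, tol = (np.asarray(v, dtype=float) for v in (a, m, tol))
            A_ub = np.concatenate([a, -a])[:, None]
            b_ub = np.concatenate([m + tol, -(m - tol)])
            lo = linprog([1.0], A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)], method="highs")
            hi = linprog([-1.0], A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)], method="highs")
            return (lo.x[0], hi.x[0]) if lo.success and hi.success else None

        # three detectors: sensitivities (s/m^3), measurements (Bq/m^3), tolerances
        print(feasible_rate(a=[2e-9, 5e-9, 1e-9],
                            m=[2.1e-3, 5.2e-3, 0.9e-3],
                            tol=[5e-4, 5e-4, 5e-4]))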

  20. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects

    Directory of Open Access Journals (Sweden)

    David Bulczak

    2017-12-01

    Full Text Available In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, the automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single-bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials, and quantitatively compare the range sensor data.
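
    For orientation, the single-bounce depth measurement that such a simulator must reproduce follows from four-bucket AMCW demodulation: the phase of the modulated return gives the distance. A minimal sketch under common conventions (sample ordering and sign conventions vary between sensors):

        import math

        C = 299_792_458.0  # speed of light, m/s

        def depth_from_buckets(a0, a1, a2, a3, f_mod):
            # Four correlation samples at 0/90/180/270 degrees -> phase -> distance.
            # The unambiguous range is C / (2 * f_mod).
            phase = math.atan2(a1 - a3, a0 - a2) % (2.0 * math.pi)
            return C * phase / (4.0 * math.pi * f_mod)

        # synthesize the buckets for a target at 2.5 m with 20 MHz modulation
        f_mod, true_d = 20e6, 2.5
        phi = 4.0 * math.pi * f_mod * true_d / C
        buckets = [math.cos(phi - k * math.pi / 2.0) for k in range(4)]
        print(depth_from_buckets(*buckets, f_mod=f_mod))  # ~2.5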

  1. Sources of groundwater contamination

    International Nuclear Information System (INIS)

    Assaf, H.; Al-Masri, M. S.

    2007-09-01

    In spite of the importance of water for life, whether for drinking, irrigation, industry or other uses in many fields, human beings continue to contaminate it and make it unsuitable for human use, largely through the disposal of untreated wastes in the environment. Population increase, building expansion, higher living standards and industrial and economic growth all cause an increase in water consumption. All of these factors have placed increasing pressure on our water environment, both quantitatively and qualitatively. In addition, potential risks to the water environment are increased by the disposal of domestic and industrial wastewater in areas near water sources. Moreover, the use of unsuitable irrigation systems may increase soil salinity and evaporation rates. The present report discusses some groundwater sources and problems, including hot and mineral waters, which have become very important to our life and health due to their chemical and radioactivity characteristics. (authors)

  2. Photon sources for absorptiometric measurements

    International Nuclear Information System (INIS)

    Witt, R.M.; Sandrik, J.M.; Cameron, J.R.

    1976-01-01

    Photon absorptiometry is defined and the requirements of photon sources for these measurements are described. Both x-ray tubes and radionuclide sources are discussed, including the advantages of each in absorptiometric systems

  3. Cigarette makers pioneered many of our black arts of disinformation, including the funding of research to distract from the hazards of smoking. Ten Nobel prizes were the result. By funding distraction research, the cigarette industry became an important source of academic corruption, helping also to forge other forms of denialism on a global scale.

    Science.gov (United States)

    Proctor, R. N.

    2014-12-01

    Cigarette Disinformation: Origins and Global Impact Robert N. Proctor The cigarette is the deadliest artifact in the history of human civilization. And whereas "only" a hundred million people died in the 20th century from smoking, we are presently on a pace to have several times that toll in the present century. Much of that catastrophe would not be possible without a massive campaign of disinformation. The cigarette industry pioneered many of the black arts of disinformation, cleverly exploiting the inherent skepticism of science to claim that "more research" was needed to resolve a purported "cigarette controversy." Cigarette makers funded hundreds of millions of dollars worth of "distraction research," most of which was solid empirical science but off topic, focusing on basic biology and biochemistry, viral and genetic causes of disease, and other "cigarette friendly" topics. At least ten Nobel prizes were the result. Cigarette skepticism was thus more complex than we normally imagine: the tobacco industry corrupted science by funding "alternative causation," meaning anything that could be used to draw attention away from cigarettes as a source of disease. The cigarette industry by this means became the most important source of academic corruption since the Nazi era. That corruption has also helped forge other forms of denialism and corruption on a global scale.

  4. Advancing the Fork detector for quantitative spent nuclear fuel verification

    Science.gov (United States)

    Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.

    2018-04-01

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms.
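    The go/no-go idea described above reduces to comparing each measured Fork signal against its ORIGEN-based prediction; the Python sketch below illustrates that comparison with an invented tolerance and invented count rates, not the module's actual criteria:

        def verify_assembly(measured_rate, predicted_rate, tolerance=0.10):
            """Flag an assembly when measurement and declaration-based
            prediction disagree by more than the (assumed) tolerance."""
            ratio = measured_rate / predicted_rate
            return abs(ratio - 1.0) <= tolerance, ratio

        ok, r = verify_assembly(measured_rate=1.52e4, predicted_rate=1.47e4)
        print(f"ratio={r:.3f} -> {'consistent' if ok else 'investigate'}")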

  5. Problems of standardized handling and quantitative evaluation of autoradiograms

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1985-01-01

    In the last years autoradiography has gained increasing importance as a quantitative method of measuring radioactivity or element concentration. Mostly relative measurements are carried out. The optical density of the photographic emulsion produced by a calibrated radiation source is compared with that produced by a sample. The influences of different parameters, such as beta particle energy, backscattering, fading of the latent image, developing conditions, matrix effects and others on the results are described and the errors of the quantitative evaluation of autoradiograms are assessed. The performance of the method is demonstrated taking the quantitative determination of gold in silicon as an example

  6. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
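    Converting a measured inter-probe distance into kilobase pairs then uses the homogeneous stretching factor quoted above; a one-line Python helper, with the example distance invented:

        STRETCH_KB_PER_UM = 2.3  # stretching factor quoted in the abstract, kb per micrometer

        def fiber_distance_to_kb(distance_um):
            """Convert a distance measured on a stretched DNA fiber into kb."""
            return distance_um * STRETCH_KB_PER_UM

        print(fiber_distance_to_kb(14.8))  # 14.8 um between signals -> ~34 kb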

  7. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  8. Schools K-12, School locations within Sedgwick County. This layer is maintained interactively by GIS staff. Primary attributes include school name, class, funding source, address, and parochial status. Published to scschoop.shp., Published in 2008, 1:1200 (1in=100ft) scale, Sedgwick County Government.

    Data.gov (United States)

    NSGIC Local Govt | GIS Inventory — Schools K-12 dataset current as of 2008. School locations within Sedgwick County. This layer is maintained interactively by GIS staff. Primary attributes include...

  9. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger. INTRODUCTION. Animal molecular sexing ...

  10. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills that are required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  11. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    Science.gov (United States)

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  12. Quantitative Approaches to Group Research: Suggestions for Best Practices

    Science.gov (United States)

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  13. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model based methodology. This approach has been found to be very flexible and provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
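    One way to picture the calibration step described here, mapping semi-quantitative likelihood classes onto failure frequencies drawn from peer pipeline systems, is the Python sketch below; the class boundaries, rates, and consequence cost are invented for illustration:

        # Hypothetical calibration: likelihood class -> failures per km-year,
        # as might be fitted from a peer-system failure rate distribution.
        CLASS_TO_FREQUENCY = {
            "very low": 1e-5,
            "low":      1e-4,
            "medium":   1e-3,
            "high":     1e-2,
        }

        def quantitative_risk(likelihood_class, consequence_cost):
            """Risk as expected cost per km-year: frequency x consequence."""
            return CLASS_TO_FREQUENCY[likelihood_class] * consequence_cost

        print(quantitative_risk("medium", consequence_cost=2.5e6))  # -> 2500.0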

  14. New generation quantitative x-ray microscopy encompassing phase-contrast

    International Nuclear Information System (INIS)

    Wilkins, S.W.; Mayo, S.C.; Gureyev, T.E.; Miller, P.R.; Pogany, A.; Stevenson, A.W.; Gao, D.; Davis, T.J.; Parry, D.J.; Paganin, D.

    2000-01-01

    Full text: We briefly outline a new approach to X-ray ultramicroscopy using projection imaging in a scanning electron microscope (SEM). Compared to earlier approaches, the new approach offers spatial resolution of ≤0.1 micron and includes novel features such as: i) phase contrast to give additional sample information over a wide energy range; ii) rapid phase/amplitude extraction algorithms to enable new real-time modes of microscopic imaging. Widespread applications are envisaged in fields such as materials science, biomedical research, and microelectronics device inspection. Some illustrative examples are presented. The quantitative methods described here are also very relevant to X-ray projection microscopy using synchrotron sources.

  15. Design database for quantitative trait loci (QTL) data warehouse, data mining, and meta-analysis.

    Science.gov (United States)

    Hu, Zhi-Liang; Reecy, James M; Wu, Xiao-Lin

    2012-01-01

    A database can be used to warehouse quantitative trait loci (QTL) data from multiple sources for comparison, genomic data mining, and meta-analysis. A robust database design involves sound data structure logistics, meaningful data transformations, normalization, and proper user interface designs. This chapter starts with a brief review of relational database basics and concentrates on issues associated with curation of QTL data into a relational database, with emphasis on the principles of data normalization and structure optimization. In addition, some simple examples of QTL data mining and meta-analysis are included. These examples are provided to help readers better understand the potential and importance of sound database design.
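    A toy normalized schema in the spirit of this chapter, keeping QTL rows separate from their source publications and trait vocabulary, can be sketched with sqlite3; the table and column names are illustrative assumptions, not the chapter's actual design:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            -- Publications and traits are stored once and referenced by QTL
            -- rows, so citation details are never repeated (normalization).
            CREATE TABLE publication (pub_id INTEGER PRIMARY KEY, citation TEXT NOT NULL);
            CREATE TABLE trait (trait_id INTEGER PRIMARY KEY, name TEXT UNIQUE NOT NULL);
            CREATE TABLE qtl (
                qtl_id     INTEGER PRIMARY KEY,
                trait_id   INTEGER NOT NULL REFERENCES trait(trait_id),
                pub_id     INTEGER REFERENCES publication(pub_id),
                chromosome TEXT NOT NULL,
                start_bp   INTEGER,
                end_bp     INTEGER
            );
        """)
        # A meta-analysis-style query: how many mapped QTL per trait?
        conn.execute("INSERT INTO trait(name) VALUES ('average daily gain')")
        print(conn.execute(
            "SELECT t.name, COUNT(q.qtl_id) FROM trait t "
            "LEFT JOIN qtl q ON q.trait_id = t.trait_id GROUP BY t.name").fetchall())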

  16. Including gauge corrections to thermal leptogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Huetig, Janine

    2013-05-17

    . Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge corrected result for the Majorana neutrino production rate. Finally in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to just include diagrams up to three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. Therefore we have been able to derive a complete expression for the integrated lepton number matrix including all leading order corrections. The numerical analysis of this lepton number matrix needs a great computational effort since for the resulting eight-dimensional integral two ordinary differential equations have to be computed for each point the routine evaluates. Thus the result remains yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to gain a value, which can be used for comparison to earlier results such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative number, which contains leading order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. Further corrections by means of including washout effects

  17. Including gauge corrections to thermal leptogenesis

    International Nuclear Information System (INIS)

    Huetig, Janine

    2013-01-01

    . Furthermore, we have computed the Majorana neutrino production rate itself in chapter 6 to test our numerical procedure. In this context we have calculated the tree-level result as well as the gauge corrected result for the Majorana neutrino production rate. Finally in chapter 7, we have implemented the Majorana neutrino ladder rung diagram into our setup for leptogenesis: As a first consideration, we have collected all gauge corrected diagrams up to three-loop order for the asymmetry-causing two-loop diagrams. However, the results of chap. 5 showed that it is not sufficient to just include diagrams up to three-loop level. Due to the necessity of resumming all n-loop diagrams, we have constructed a cylindrical diagram that fulfils this condition. This diagram is the link between the Majorana neutrino ladder rung diagram calculated before on the one hand and the lepton asymmetry on the other. Therefore we have been able to derive a complete expression for the integrated lepton number matrix including all leading order corrections. The numerical analysis of this lepton number matrix needs a great computational effort since for the resulting eight-dimensional integral two ordinary differential equations have to be computed for each point the routine evaluates. Thus the result remains yet inaccessible. Research perspectives: Summarising, this thesis provides the basis for a systematic inclusion of gauge interactions in thermal leptogenesis scenarios. As a next step, one should evaluate the expression for the integrated lepton number numerically to gain a value, which can be used for comparison to earlier results such as the solutions of the Boltzmann equations as well as the Kadanoff-Baym ansatz with the implemented Standard Model widths. This numerical result would be the first quantitative number, which contains leading order corrections due to all interactions of the Majorana neutrino with the Standard Model particles. Further corrections by means of including washout effects

  18. Photonic crystal light source

    Science.gov (United States)

    Fleming, James G [Albuquerque, NM; Lin, Shawn-Yu [Albuquerque, NM; Bur, James A [Corrales, NM

    2004-07-27

    A light source is provided by a photonic crystal having an enhanced photonic density-of-states over a band of frequencies and wherein at least one of the dielectric materials of the photonic crystal has a complex dielectric constant, thereby producing enhanced light emission at the band of frequencies when the photonic crystal is heated. The dielectric material can be a metal, such as tungsten. The spectral properties of the light source can be easily tuned by modification of the photonic crystal structure and materials. The photonic crystal light source can be heated electrically or by other heating means. The light source can further include additional photonic crystals that exhibit enhanced light emission at a different band of frequencies to provide for color mixing. The photonic crystal light source may have applications in optical telecommunications, information displays, energy conversion, sensors, and other optical applications.

  19. Neutron sources and applications

    Energy Technology Data Exchange (ETDEWEB)

    Price, D.L. [ed.] [Argonne National Lab., IL (United States); Rush, J.J. [ed.] [National Inst. of Standards and Technology, Gaithersburg, MD (United States)

    1994-01-01

    Review of Neutron Sources and Applications was held at Oak Brook, Illinois, during September 8--10, 1992. This review involved some 70 national and international experts in different areas of neutron research, sources, and applications. Separate working groups were asked to (1) review the current status of advanced research reactors and spallation sources; and (2) provide an update on scientific, technological, and medical applications, including neutron scattering research in a number of disciplines, isotope production, materials irradiation, and other important uses of neutron sources such as materials analysis and fundamental neutron physics. This report summarizes the findings and conclusions of the different working groups involved in the review, and contains some of the best current expertise on neutron sources and applications.

  20. Neutron sources and applications

    International Nuclear Information System (INIS)

    Price, D.L.; Rush, J.J.

    1994-01-01

    Review of Neutron Sources and Applications was held at Oak Brook, Illinois, during September 8--10, 1992. This review involved some 70 national and international experts in different areas of neutron research, sources, and applications. Separate working groups were asked to (1) review the current status of advanced research reactors and spallation sources; and (2) provide an update on scientific, technological, and medical applications, including neutron scattering research in a number of disciplines, isotope production, materials irradiation, and other important uses of neutron sources such as materials analysis and fundamental neutron physics. This report summarizes the findings and conclusions of the different working groups involved in the review, and contains some of the best current expertise on neutron sources and applications

  1. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single-photon emitters. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  2. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. To keep up with the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  3. Energy sources

    International Nuclear Information System (INIS)

    Vajda, Gy.

    1998-01-01

    A comprehensive review of the available sources of energy in the world is presented. About 80 percent of primary energy utilization is based on fossil fuels, and their dominant role is not expected to change in the foreseeable future. Data are given on petroleum, natural gas and coal based power production. The role and economic aspects of nuclear power are analyzed. A brief summary of renewable energy sources is presented. The future prospects of the world's energy resources are discussed, and the special position of Hungary regarding fossil, nuclear and renewable energy and the country's energy potential is evaluated. (R.P.)

  4. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  5. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  6. Radiation sources working group summary

    International Nuclear Information System (INIS)

    Fazio, M.V.

    1998-01-01

    The Radiation Sources Working Group addressed advanced concepts for the generation of RF energy to power advanced accelerators. The focus of the working group included advanced sources and technologies above 17 GHz. The topics discussed included RF sources above 17 GHz, pulse compression techniques to achieve extreme peak power levels, components technology, technology limitations and physical limits, and other advanced concepts. RF sources included gyroklystrons, magnicons, free-electron masers, two beam accelerators, and gyroharmonic and traveling wave devices. Technology components discussed included advanced cathodes and electron guns, high temperature superconductors for producing magnetic fields, RF breakdown physics and mitigation, and phenomena that impact source design such as fatigue in resonant structures due to RF heating. New approaches for RF source diagnostics located internal to the source were discussed for detecting plasma and beam phenomena existing in high energy density electrodynamic systems in order to help elucidate the reasons for performance limitations

  7. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Full Text Available Abstract Background Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software. This process has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness. They also included patient specific spreadsheets for tracking of target populations and for evaluation of the impact of interventions. Conclusions The study demonstrated that quantitative tools including the development of definitions of readmissions, estimation of the risk of readmission, and patient specific spreadsheets could contribute to the improvement of patient outcomes in hospitals.
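    The risk-estimation tools described amount to stratified readmission rates; a minimal Python sketch by age band and severity of illness, with the admission records invented:

        from collections import defaultdict

        # (age_band, severity, readmitted) per discharge -- invented records.
        admissions = [
            ("65+", "major", True), ("65+", "major", False),
            ("65+", "minor", False), ("<65", "major", True),
        ]

        counts = defaultdict(lambda: [0, 0])  # stratum -> [readmits, discharges]
        for age, severity, readmitted in admissions:
            counts[(age, severity)][0] += int(readmitted)
            counts[(age, severity)][1] += 1

        for stratum, (readmits, total) in sorted(counts.items()):
            print(stratum, f"risk = {readmits / total:.2f}")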

  8. Ion source

    International Nuclear Information System (INIS)

    1977-01-01

    The specifications of a set of point-shape electrodes of non-corrodable material that can hold a film of liquid material of equal thickness is described. Contained in a jacket, this set forms an ion source. The electrode is made of tungsten with a glassy carbon layer for insulation and an outer layer of aluminium-oxide ceramic material

  9. Vacuum Arc Ion Sources

    CERN Document Server

    Brown, I.

    2013-12-16

    The vacuum arc ion source has evolved into a more or less standard laboratory tool for the production of high-current beams of metal ions, and is now used in a number of different embodiments at many laboratories around the world. Applications include primarily ion implantation for material surface modification research, and good performance has been obtained for the injection of high-current beams of heavy-metal ions, in particular uranium, into particle accelerators. As the use of the source has grown, so also have the operational characteristics been improved in a variety of different ways. Here we review the principles, design, and performance of vacuum arc ion sources.

  10. Chandra Source Catalog: User Interface

    Science.gov (United States)

    Bonaventura, Nina; Evans, Ian N.; Rots, Arnold H.; Tibbetts, Michael S.; van Stone, David W.; Zografou, Panagoula; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Winkelman, Sherry L.

    2009-09-01

    The Chandra Source Catalog (CSC) is intended to be the definitive catalog of all X-ray sources detected by Chandra. For each source, the CSC provides positions and multi-band fluxes, as well as derived spatial, spectral, and temporal source properties. Full-field and source region data products are also available, including images, photon event lists, light curves, and spectra. The Chandra X-ray Center CSC website (http://cxc.harvard.edu/csc/) is the place to visit for high-level descriptions of each source property and data product included in the catalog, along with other useful information, such as step-by-step catalog tutorials, answers to FAQs, and a thorough summary of the catalog statistical characterization. Eight categories of detailed catalog documents may be accessed from the navigation bar on most of the 50+ CSC pages; these categories are: About the Catalog, Creating the Catalog, Using the Catalog, Catalog Columns, Column Descriptions, Documents, Conferences, and Useful Links. There are also prominent links to CSCview, the CSC data access GUI, and related help documentation, as well as a tutorial for using the new CSC/Google Earth interface. Catalog source properties are presented in seven scientific categories, within two table views: the Master Source and Source Observations tables. Each X-ray source has one "master source" entry and one or more "source observation" entries, the details of which are documented on the CSC "Catalog Columns" pages. The master source properties represent the best estimates of the properties of a source; these are extensively described on the following pages of the website: Position and Position Errors, Source Flags, Source Extent and Errors, Source Fluxes, Source Significance, Spectral Properties, and Source Variability. The eight tutorials ("threads") available on the website serve as a collective guide for accessing, understanding, and manipulating the source properties and data products provided by the catalog.

  11. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, generally these five: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole-body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location, apply to most programs. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs.
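    The first of these indices, the MIDE, is simply an arithmetic mean over monitored workers, and the cumulative index is the corresponding sum; a minimal Python sketch with invented dose records:

        import statistics

        # Annual whole-body dose equivalents (mSv) per monitored worker -- invented.
        doses_msv = {"worker_01": 1.8, "worker_02": 0.4, "worker_03": 3.1}

        mide = statistics.mean(doses_msv.values())  # index (1): MIDE
        cumulative = sum(doses_msv.values())        # index (3): cumulative dose equivalent
        print(f"MIDE = {mide:.2f} mSv, cumulative = {cumulative:.1f} person-mSv")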

  12. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  13. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  14. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
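    In number-average-length analyses of this kind, the lesion frequency follows from the change in number-average molecular length between treated and untreated DNA, assuming randomly placed lesions; a minimal Python sketch with invented lengths:

        def lesions_per_kb(ln_treated_kb, ln_control_kb):
            """Lesion frequency from number-average molecular lengths (kb),
            assuming random (Poisson) lesion placement:
            phi = 1/Ln(treated) - 1/Ln(control)."""
            return 1.0 / ln_treated_kb - 1.0 / ln_control_kb

        # e.g. control averages 48 kb, treated sample 12 kb (invented numbers)
        print(f"{lesions_per_kb(12.0, 48.0):.4f} lesions/kb")  # -> 0.0625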

  15. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Erah

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ...

  16. Quantitative Assays for RAS Pathway Proteins and Phosphorylation States

    Science.gov (United States)

    The NCI CPTAC program is applying its expertise in quantitative proteomics to develop assays for RAS pathway proteins. Targets include key phosphopeptides that should increase our understanding of how the RAS pathway is regulated.

  17. Quantitative maps of groundwater resources in Africa

    International Nuclear Information System (INIS)

    MacDonald, A M; Bonsor, H C; Dochartaigh, B É Ó; Taylor, R G

    2012-01-01

    In Africa, groundwater is the major source of drinking water and its use for irrigation is forecast to increase substantially to combat growing food insecurity. Despite this, there is little quantitative information on groundwater resources in Africa, and groundwater storage is consequently omitted from assessments of freshwater availability. Here we present the first quantitative continent-wide maps of aquifer storage and potential borehole yields in Africa based on an extensive review of available maps, publications and data. We estimate total groundwater storage in Africa to be 0.66 million km³ (0.36–1.75 million km³). Not all of this groundwater storage is available for abstraction, but the estimated volume is more than 100 times estimates of annual renewable freshwater resources in Africa. Groundwater resources are unevenly distributed: the largest groundwater volumes are found in the large sedimentary aquifers in the North African countries Libya, Algeria, Egypt and Sudan. Nevertheless, for many African countries appropriately sited and constructed boreholes can support handpump abstraction (yields of 0.1–0.3 l s⁻¹), and contain sufficient storage to sustain abstraction through inter-annual variations in recharge. The maps show further that the potential for higher yielding boreholes (>5 l s⁻¹) is much more limited. Therefore, strategies for increasing irrigation or supplying water to rapidly urbanizing cities that are predicated on the widespread drilling of high yielding boreholes are likely to be unsuccessful. As groundwater is the largest and most widely distributed store of freshwater in Africa, the quantitative maps are intended to lead to more realistic assessments of water security and water stress, and to promote a more quantitative approach to mapping of groundwater resources at national and regional level. (letter)

  18. Quantitative PET of liver functions.

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance Kmet (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. Standard-uptake-value (SUV) from a static liver 18F-FDGal PET/CT scan can replace Kmet and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET.
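    The SUV mentioned here is a simple normalization of the measured activity concentration by injected dose per unit body weight; a minimal Python sketch, with all values invented and a tissue density of 1 g/mL assumed:

        def suv(tissue_kbq_per_ml, injected_mbq, weight_kg):
            """Standard uptake value: tissue concentration over injected dose
            per body weight; MBq/kg equals kBq/g, so units cancel at 1 g/mL."""
            return tissue_kbq_per_ml / (injected_mbq / weight_kg)

        print(f"SUV = {suv(tissue_kbq_per_ml=5.2, injected_mbq=200.0, weight_kg=75.0):.2f}")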

  19. Quantitative PET of liver functions

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance Kmet (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. Standard-uptake-value (SUV) from a static liver 18F-FDGal PET/CT scan can replace Kmet and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET. PMID:29755841

  20. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  1. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  2. Orphan sources

    International Nuclear Information System (INIS)

    Pust, R.; Urbancik, L.

    2008-01-01

    The presentation describes how stable detection systems (hereinafter referred to as SDS) have contributed to revealing uncontrolled sources of ionizing radiation on the territory of the State Office for Nuclear Safety (SONS) Brno Regional Centre (RC Brno). It also describes the emergencies which were solved by, or in which, workers from the Brno Regional Centre participated. The contribution is divided into the following chapters: A. SDS systems installed on the territory of SONS RC Brno; B. Selected unusual emergencies; C. Comments to individual emergencies; D. Aspects of SDS operation in terms of their users; E. Aspects of SDS operation and related activities in terms of radiation protection; F. Current state of orphan sources. (authors)

  3. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.
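    One member of the family of modularity measures discussed, the share of dependencies that stay inside a module, can be computed directly from a dependency list; the Python sketch below uses an invented metric and file layout, not the actual metric introduced by the Carleton toolkit:

        # Each edge is (source_file, target_file); module = top-level directory.
        deps = [
            ("core/a.java", "core/b.java"), ("core/b.java", "util/x.java"),
            ("util/x.java", "util/y.java"), ("web/w.java", "core/a.java"),
        ]

        def module(path):
            return path.split("/", 1)[0]

        intra = sum(module(s) == module(t) for s, t in deps)
        print(f"intra-module dependency ratio = {intra / len(deps):.2f}")  # -> 0.50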

  4. Krakow conference on low emissions sources: Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Pierce, B.L.; Butcher, T.A. [eds.

    1995-12-31

    The Krakow Conference on Low Emission Sources presented the information produced and analytical tools developed in the first phase of the Krakow Clean Fossil Fuels and Energy Efficiency Program. This phase included: field testing to provide quantitative data on emissions and efficiencies as well as on opportunities for building energy conservation; engineering analysis to determine the costs of implementing pollution control; and incentives analysis to identify actions required to create a market for equipment, fuels, and services needed to reduce pollution. Collectively, these Proceedings contain reports that summarize the above phase one information, present the status of energy system management in Krakow, provide information on financing pollution control projects in Krakow and elsewhere, and highlight the capabilities and technologies of Polish and American companies that are working to reduce pollution from low emission sources. It is intended that the US reader will find in these Proceedings useful results and plans for control of pollution from low emission sources that are representative of heating systems in central and Eastern Europe. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  5. Tritium sources

    International Nuclear Information System (INIS)

    Glodic, S.; Boreli, F.

    1993-01-01

    Tritium is the only radioactive isotope of hydrogen. It directly follows the metabolism of water and it can be bound into genetic material, so it is very important to control levels of contamination. In order to define the state of contamination it is necessary to establish 'zero level', i.e. actual global inventory. The importance of tritium contamination monitoring increases with the development of fusion power installations. Different sources of tritium are analyzed and summarized in this paper. (author)

  6. Radioactive source

    International Nuclear Information System (INIS)

    Drabkina, L.E.; Mazurek, V.; Myascedov, D.N.; Prokhorov, P.; Kachalov, V.A.; Ziv, D.M.

    1976-01-01

    A radioactive layer in a radioactive source is sealed by the application of a sealing layer on the radioactive layer. The sealing layer can consist of a film of oxide of titanium, tin, zirconium, aluminum, or chromium. Preferably, the sealing layer is pure titanium dioxide. The radioactive layer is embedded in a finish enamel which, in turn, is on a priming enamel which surrounds a substrate

  7. Isotopic and molecular fractionation in combustion; three routes to molecular marker validation, including direct molecular 'dating' (GC/AMS)

    Science.gov (United States)

    Currie, L. A.; Klouda, G. A.; Benner, B. A.; Garrity, K.; Eglinton, T. I.

    The identification of unique isotopic, elemental, and molecular markers for sources of combustion aerosol has growing practical importance because of the potential effects of fine particle aerosol on health, visibility and global climate. It is urgent, therefore, that substantial efforts be directed toward the validation of assumptions involving the use of such tracers for source apportionment. We describe here three independent routes toward carbonaceous aerosol molecular marker identification and validation: (1) tracer regression and multivariate statistical techniques applied to field measurements of mixed source, carbonaceous aerosols; (2) a new development in aerosol 14C metrology: direct, pure compound accelerator mass spectrometry (AMS) by off-line GC/AMS ('molecular dating'); and (3) direct observation of isotopic and molecular source emissions during controlled laboratory combustion of specific fuels. Findings from the combined studies include: independent support for benzo(ghi)perylene as a motor vehicle tracer from the first (statistical) and second (direct 'dating') studies; a new indication, from the third (controlled combustion) study, of a relation between 13C isotopic fractionation and PAH molecular fractionation, also linked with fuel and stage of combustion; and quantitative data showing the influence of both fuel type and combustion conditions on the yields of such species as elemental carbon and PAH, reinforcing the importance of exercising caution when applying presumed conservative elemental or organic tracers to fossil or biomass burning field data as in the first study.

  8. Maths meets myths quantitative approaches to ancient narratives

    CERN Document Server

    MacCarron, Máirín; MacCarron, Pádraig

    2017-01-01

    With an emphasis on exploring measurable aspects of ancient narratives, Maths Meets Myths sets out to investigate age-old material with new techniques. This book collects, for the first time, novel quantitative approaches to studying sources from the past, such as chronicles, epics, folktales, and myths. It contributes significantly to recent efforts in bringing together natural scientists and humanities scholars in investigations aimed at achieving greater understanding of our cultural inheritance. Accordingly, each contribution reports on a modern quantitative approach applicable to narrative sources from the past, or describes those which would be amenable to such treatment and why they are important. This volume is a unique state-of-the-art compendium on an emerging research field which also addresses anyone with interests in quantitative approaches to humanities.

  9. Muon sources

    International Nuclear Information System (INIS)

    Parsa, Z.

    2001-01-01

    A full high energy muon collider may take considerable time to realize. However, intermediate steps in its direction are possible and could help facilitate the process. Employing an intense muon source to carry out forefront low energy research, such as the search for muon-number non-conservation, represents one interesting possibility. For example, the MECO proposal at BNL aims for 2 × 10⁻¹⁷ sensitivity in their search for coherent muon-electron conversion in the field of a nucleus. To reach that goal requires the production, capture and stopping of muons at an unprecedented 10¹¹ μ/sec. If successful, such an effort would significantly advance the state of muon technology. More ambitious ideas for utilizing high intensity muon sources are also being explored. Building a muon storage ring for the purpose of providing intense high energy neutrino beams is particularly exciting. We present an overview of muon sources and an example of a muon storage ring based Neutrino Factory at BNL with various detector location possibilities

  10. Quantitative chemogenomics: machine-learning models of protein-ligand interaction.

    Science.gov (United States)

    Andersson, Claes R; Gustafsson, Mats G; Strömbergsson, Helena

    2011-01-01

    Chemogenomics is an emerging interdisciplinary field that lies in the interface of biology, chemistry, and informatics. Most of the currently used drugs are small molecules that interact with proteins. Understanding protein-ligand interaction is therefore central to drug discovery and design. In the subfield of chemogenomics known as proteochemometrics, protein-ligand-interaction models are induced from data matrices that consist of both protein and ligand information along with some experimentally measured variable. The two general aims of this quantitative multi-structure-property-relationship modeling (QMSPR) approach are to exploit sparse/incomplete information sources and to obtain more general models covering larger parts of the protein-ligand space, than traditional approaches that focuses mainly on specific targets or ligands. The data matrices, usually obtained from multiple sparse/incomplete sources, typically contain series of proteins and ligands together with quantitative information about their interactions. A useful model should ideally be easy to interpret and generalize well to new unseen protein-ligand combinations. Resolving this requires sophisticated machine-learning methods for model induction, combined with adequate validation. This review is intended to provide a guide to methods and data sources suitable for this kind of protein-ligand-interaction modeling. An overview of the modeling process is presented including data collection, protein and ligand descriptor computation, data preprocessing, machine-learning-model induction and validation. Concerns and issues specific for each step in this kind of data-driven modeling will be discussed. © 2011 Bentham Science Publishers
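    In proteochemometric modeling as described, each row of the data matrix concatenates protein and ligand descriptors for one measured interaction; the Python sketch below illustrates the idea with scikit-learn, using randomly generated descriptors and affinities, and a random forest standing in for whichever learner a study might choose:

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        protein_desc = rng.normal(size=(40, 8))     # invented protein descriptors
        ligand_desc = rng.normal(size=(40, 12))     # invented ligand descriptors
        X = np.hstack([protein_desc, ligand_desc])  # one row per protein-ligand pair
        y = rng.normal(size=40)                     # invented interaction measurements

        model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
        print(model.predict(X[:3]))  # predictions for three known pairs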

  11. Spectral confocal reflection microscopy using a white light source

    Science.gov (United States)

    Booth, M.; Juškaitis, R.; Wilson, T.

    2008-08-01

    We present a reflection confocal microscope incorporating a white light supercontinuum source and spectral detection. The microscope provides images resolved spatially in three dimensions, in addition to spectral resolution covering the wavelength range 450–650 nm. Images and reflection spectra of artificial and natural specimens are presented, showing features that are not normally revealed in conventional microscopes or confocal microscopes using discrete line lasers. The specimens include thin film structures on semiconductor chips, iridescent structures in Papilio blumei butterfly scales, nacre from abalone shells and opal gemstones. Quantitative size and refractive index measurements of transparent beads are derived from spectral interference bands.

  12. Quantitative trait loci associated with anthracnose resistance in sorghum

    Science.gov (United States)

    With an aim to develop a durable resistance to the fungal disease anthracnose, two unique genetic sources of resistance were selected to create genetic mapping populations to identify regions of the sorghum genome that encode anthracnose resistance. A series of quantitative trait loci were identifi...

  13. Early Quantitative Assessment of Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormandjieva, O.

    2007-01-01

    Non-functional requirements (NFRs) of software systems are a well-known source of uncertainty in effort estimation. Yet, quantitatively approaching NFRs early in a project is hard. This paper makes a step towards reducing the impact of uncertainty due to NFRs. It offers a solution that incorporates

  14. Application of an image processing software for quantitative autoradiography

    International Nuclear Information System (INIS)

    Sobeslavsky, E.; Bergmann, R.; Kretzschmar, M.; Wenzel, U.

    1993-01-01

    The present communication deals with the utilization of an image processing device for quantitative whole-body autoradiography, cell counting and also for interpretation of chromatograms. It is shown that the system parameters allow an adequate and precise determination of optical density values. Also shown are the main error sources limiting the applicability of the system. (orig.)

  15. A potential quantitative method for assessing individual tree performance

    Science.gov (United States)

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  16. Toward uniform implementation of parametric map Digital Imaging and Communication in Medicine standard in multisite quantitative diffusion imaging studies.

    Science.gov (United States)

    Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C

    2018-01-01

    This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
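
    For orientation, the ADC referenced above is conventionally obtained from a mono-exponential decay of signal with b-value; a minimal two-point sketch follows (NumPy arrays standing in for DWI volumes; whether this matches either of the study's two models is an assumption of ours):

        import numpy as np

        def adc_map(s0, sb, b):
            """Mono-exponential ADC (mm^2/s) from signals at b = 0 and b = b."""
            eps = 1e-12  # guard against division by zero and log of zero
            ratio = np.clip(sb / np.maximum(s0, eps), eps, None)
            return -np.log(ratio) / b

        # Synthetic example: true ADC of 1.1e-3 mm^2/s at b = 1000 s/mm^2.
        s0 = np.full((4, 4), 1000.0)
        sb = s0 * np.exp(-1000 * 1.1e-3)
        print(adc_map(s0, sb, b=1000))  # recovers ~1.1e-3 everywhere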

  17. Approaches to quantitative risk assessment with applications to PP

    International Nuclear Information System (INIS)

    Geiger, G.; Schaefer, A.

    2002-01-01

    Full text: Experience with accidents such as Goiania in Brazil and indications of a considerable number of orphan sources suggest that improved protection would be desirable for some types of radioactive material of wide-spread use such as radiation sources for civil purposes. Regarding large potential health and economic consequences (in particular, if terrorists attacks cannot be excluded), significant costs of preventive actions, and large uncertainties about both the likelihood of occurrence and the potential consequences of PP safety and security incidents, an optimum relationship between preventive and mitigative efforts is likely to be a key issue for successful risk management in this field. Thus, possible violations of physical protection combined with threats of misuse of nuclear materials, including terrorist attack, pose considerable challenges to global security from various perspectives. In view of these challenges, recent advance in applied risk and decision analysis suggests methodological and procedural improvements in quantitative risk assessment, the demarcation of acceptable risk, and risk management. Advance is based on a recently developed model of optimal risky choice suitable for assessing and comparing the cumulative probability distribution functions attached to safety and security risks. Besides quantification of risk (e. g., in economic terms), the standardization of various risk assessment models frequently used in operations research can be approached on this basis. The paper explores possible applications of these improved methods to the safety and security management of nuclear materials, cost efficiency of risk management measures, and the establishment international safety and security standards of PP. Examples will be presented that are based on selected scenarios of misuse involving typical radioactive sources. (author)

  18. Treatment planning source assessment

    International Nuclear Information System (INIS)

    Calzetta Larrieu, O.; Blaumann, H.; Longhino, J.

    2000-01-01

    The reactor RA-6 NCT system was improved during the last year mainly in two aspects: the facility itself getting lower contamination factors and using better measurements techniques to obtain lower uncertainties in its characterization. In this job we show the different steps to get the source to be used in the treatment planning code representing the NCT facility. The first one was to compare the dosimetry in a water phantom between the calculation using the entire facility including core, filter and shields and a surface source at the end of the beam. The second one was to transform this particle by particle source in a distribution one regarding the minimum spatial, energy and angular resolution to get similar results. Finally we compare calculation and experimental values with and without the water phantom to adjust the distribution source. The results are discussed. (author)

  19. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m^2) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging (T1w, T2w, FIESTA) and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) showed statistically significant differences in quantitative parenchymal measures, including lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2). Good reproducibility of the quantitative measurements was obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE, indicating the potential for detecting and following renal disease at an earlier stage than with conventional qualitative imaging techniques.
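
    The test-retest figures quoted above are mean ± SD absolute percent differences between paired scans; a minimal sketch of that computation (hypothetical per-subject values, not the study's data; the visit mean as denominator is one common convention):

        import numpy as np

        def abs_percent_diff(visit1, visit2):
            """Mean and SD of absolute percent difference, relative to the visit mean."""
            apd = 100 * np.abs(visit1 - visit2) / ((visit1 + visit2) / 2)
            return apd.mean(), apd.std()

        adc_v1 = np.array([1.10, 1.05, 1.20, 1.15])  # hypothetical ADC values (x1e-3 mm^2/s)
        adc_v2 = np.array([1.04, 1.12, 1.16, 1.22])
        print("ADC reproducibility: %.1f%% +/- %.1f%%" % abs_percent_diff(adc_v1, adc_v2))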

  20. Establishment of a new method to quantitatively evaluate hyphal fusion ability in Aspergillus oryzae.

    Science.gov (United States)

    Tsukasaki, Wakako; Maruyama, Jun-Ichi; Kitamoto, Katsuhiko

    2014-01-01

    Hyphal fusion is involved in the formation of an interconnected colony in filamentous fungi, and it is the first process in sexual/parasexual reproduction. However, it has been difficult to evaluate hyphal fusion efficiency in Aspergillus oryzae, in spite of its industrial significance, due to the low frequency of fusion events. Here, we established a method to quantitatively evaluate the hyphal fusion ability of A. oryzae using a mixed culture of two different auxotrophic strains, in which the ratio of heterokaryotic conidia growing without the auxotrophic requirements reflects the hyphal fusion efficiency. By employing this method, it was demonstrated that AoSO and AoFus3 are required for hyphal fusion, and that the hyphal fusion efficiency of A. oryzae was increased by depleting the nitrogen source, including large amounts of carbon source, and adjusting the pH to 7.0.

  1. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and tools for clinical decision making. The members of QIN are addressing a wide variety of cancers (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  2. Quantitative multiphoton imaging

    Science.gov (United States)

    König, Karsten; Weinigel, Martin; Breunig, Hans Georg; Uchugonova, Aisada

    2014-02-01

    Certified clinical multiphoton tomographs for label-free multidimensional high-resolution in vivo imaging were introduced to the market several years ago. Novel tomographs include a flexible 360° scan head attached to a mechano-optical arm for autofluorescence and SHG imaging, as well as a CARS module. Non-fluorescent lipids and water, mitochondrial fluorescent NAD(P)H, fluorescent elastin, keratin, and melanin, as well as SHG-active collagen, can be imaged in vivo with submicron resolution in human skin. Sensitive and rapid detectors allow single photon counting and the construction of 3D maps in which the number of detected photons per voxel is depicted. Intratissue concentration profiles of endogenous as well as exogenous substances can be generated when the number of detected photons can be correlated with the number of molecules with respect to binding and scattering behavior. Furthermore, the skin ageing index SAAID, based on the elastin/collagen ratio, as well as the epidermis depth, based on the onset of SHG generation, can be determined.

  3. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  4. Sources of Free and Open Source Spatial Data for Natural Disasters and Principles for Use in Developing Country Contexts

    Science.gov (United States)

    Taylor, Faith E.; Malamud, Bruce D.; Millington, James D. A.

    2016-04-01

    Access to reliable spatial and quantitative datasets (e.g., infrastructure maps, historical observations, environmental variables) at regional and site-specific scales can be a limiting factor for understanding hazards and risks in developing country settings. Here we present a 'living database' of >75 freely available data sources relevant to hazard and risk in Africa (and more globally). Data sources include national scientific foundations, non-governmental bodies, crowd-sourced efforts, academic projects, special interest groups and others. The database is available at http://tinyurl.com/africa-datasets and is continually being updated, particularly in the context of our broader natural hazards research in Malawi and Kenya. For each data source, we review the spatiotemporal resolution and extent and make our own assessments of the reliability and usability of datasets. Although such freely available datasets are sometimes presented as a panacea for improving our understanding of hazards and risk in developing countries, there are both pitfalls and opportunities unique to using this type of freely available data. These include factors such as resolution, homogeneity, uncertainty, access to metadata and training for usage. Based on our experience, use in the field and grey/peer-reviewed literature, we present a suggested set of guidelines for using these free and open source data in developing country contexts.

  5. Designing a Quantitative Structure-Activity Relationship for the ...

    Science.gov (United States)

    Toxicokinetic models serve a vital role in risk assessment by bridging the gap between chemical exposure and potentially toxic endpoints. While intrinsic metabolic clearance rates have a strong impact on toxicokinetics, limited data is available for environmentally relevant chemicals including nearly 8000 chemicals tested for in vitro bioactivity in the Tox21 program. To address this gap, a quantitative structure-activity relationship (QSAR) for intrinsic metabolic clearance rate was developed to offer reliable in silico predictions for a diverse array of chemicals. Models were constructed with curated in vitro assay data for both pharmaceutical-like chemicals (ChEMBL database) and environmentally relevant chemicals (ToxCast screening) from human liver microsomes (2176 from ChEMBL) and human hepatocytes (757 from ChEMBL and 332 from ToxCast). Due to variability in the experimental data, a binned approach was utilized to classify metabolic rates. Machine learning algorithms, such as random forest and k-nearest neighbor, were coupled with open source molecular descriptors and fingerprints to provide reasonable estimates of intrinsic metabolic clearance rates. Applicability domains defined the optimal chemical space for predictions, which covered environmental chemicals well. A reduced set of informative descriptors (including relative charge and lipophilicity) and a mixed training set of pharmaceuticals and environmentally relevant chemicals provided the best intr
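
    A hedged sketch of the binned modeling strategy described above (synthetic stand-ins for the descriptor matrices and clearance rates; random forest is one of the algorithms the abstract names, and the tertile binning here is our illustrative choice):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(500, 20))            # hypothetical molecular descriptors
        clearance = rng.lognormal(size=500)       # hypothetical intrinsic clearance rates

        # Bin the noisy continuous rates into slow / medium / fast classes.
        bins = np.quantile(clearance, [0.33, 0.66])
        y = np.digitize(clearance, bins)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())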

  6. Land Streamer Surveying Using Multiple Sources

    KAUST Repository

    Mahmoud, Sherif

    2014-12-11

    Various examples are provided for land streamer seismic surveying using multiple sources. In one example, among others, a method includes disposing a land streamer in-line with first and second shot sources. The first shot source is at a first source location adjacent to a proximal end of the land streamer and the second shot source is at a second source location separated by a fixed length corresponding to a length of the land streamer. Shot gathers can be obtained when the shot sources are fired. In another example, a system includes a land streamer including a plurality of receivers, a first shot source located adjacent to the proximal end of the land streamer, and a second shot source located in-line with the land streamer and the first shot source. The second shot source is separated from the first shot source by a fixed overall length corresponding to the land streamer.

  7. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  8. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  9. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  10. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  11. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [3H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against 3H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [14C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparisons of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)

  12. Absolute quantitative total-body small-animal SPECT with focusing pinholes

    NARCIS (Netherlands)

    Wu, C.; Van der Have, F.; Vastenhouw, B.; Dierckx, R.A.J.O.; Paans, A.M.J.; Beekman, F.J.

    2010-01-01

    Purpose: In pinhole SPECT, attenuation of the photon flux on trajectories between source and pinholes affects quantitative accuracy of reconstructed images. Previously we introduced iterative methods that compensate for image degrading effects of detector and pinhole blurring, pinhole sensitivity

  13. 42 CFR 410.100 - Included services.

    Science.gov (United States)

    2010-10-01

    ... service; however, maintenance therapy itself is not covered as part of these services. (c) Occupational... increase respiratory function, such as graded activity services; these services include physiologic... rehabilitation plan of treatment, including physical therapy services, occupational therapy services, speech...

  14. 77 FR 7 - Revisions to Labeling Requirements for Blood and Blood Components, Including Source Plasma

    Science.gov (United States)

    2012-01-03

    ... uniform container label for blood and blood components and recommended labels that incorporated barcode... Protein Fraction (part 640, subpart I), and Immune Globulin (part 640, subpart J)). The comment noted that...

  15. Completed Research in Health, Physical Education, Recreation & Dance; Including International Sources. Volume 27. 1985 Edition.

    Science.gov (United States)

    Freedson, Patty S., Ed.

    This compilation lists research completed in the areas of health, physical education, recreation, dance, and allied areas during 1984. The document is arranged in two parts. In the index, references are arranged under the subject headings in alphabetical order. Abstracts of master's and doctor's theses from institutions offering graduate programs…

  16. Nutrition in pregnancy: the argument for including a source of choline

    OpenAIRE

    Zeisel, Steven H

    2013-01-01

    Steven H Zeisel Nutrition Research Institute at Kannapolis, Department of Nutrition, University of North Carolina at Chapel Hill, Kannapolis, NC, USA Abstract: Women, during pregnancy and lactation, should eat foods that contain adequate amounts of choline. A mother delivers large amounts of choline across the placenta to the fetus, and after birth she delivers large amounts of choline in milk to the infant; this greatly increases the demand on the choline stores of the mother. Adequate inta...

  17. Nuclear medicine and imaging research (instrumentation and quantitative methods of evaluation)

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.; Chen, C.T.

    1992-07-01

    This document is the annual progress report for the project entitled 'Instrumentation and Quantitative Methods of Evaluation.' Progress is reported in separate sections individually abstracted and indexed for the database. Subject areas reported include theoretical studies of imaging systems and methods, hardware developments, quantitative methods of evaluation, and knowledge transfer: education in quantitative nuclear medicine imaging

  18. Static, Lightweight Includes Resolution for PHP

    NARCIS (Netherlands)

    M.A. Hills (Mark); P. Klint (Paul); J.J. Vinju (Jurgen)

    2014-01-01

    Dynamic languages include a number of features that are challenging to model properly in static analysis tools. In PHP, one of these features is the include expression, where an arbitrary expression provides the path of the file to include at runtime. In this paper we present two

  19. Article Including Environmental Barrier Coating System

    Science.gov (United States)

    Lee, Kang N. (Inventor)

    2015-01-01

    An enhanced environmental barrier coating for a silicon containing substrate. The enhanced barrier coating may include a bond coat doped with at least one of an alkali metal oxide and an alkali earth metal oxide. The enhanced barrier coating may include a composite mullite bond coat including BSAS and another distinct second phase oxide applied over said surface.

  20. Rare thoracic cancers, including peritoneum mesothelioma

    NARCIS (Netherlands)

    Siesling, Sabine; van der Zwan, Jan Maarten; Izarzugaza, Isabel; Jaal, Jana; Treasure, Tom; Foschi, Roberto; Ricardi, Umberto; Groen, Harry; Tavilla, Andrea; Ardanaz, Eva

    Rare thoracic cancers include those of the trachea, thymus and mesothelioma (including peritoneum mesothelioma). The aim of this study was to describe the incidence, prevalence and survival of rare thoracic tumours using a large database, which includes cancer patients diagnosed from 1978 to 2002,

  1. Rare thoracic cancers, including peritoneum mesothelioma

    NARCIS (Netherlands)

    Siesling, Sabine; Zwan, J.M.V.D.; Izarzugaza, I.; Jaal, J.; Treasure, T.; Foschi, R.; Ricardi, U.; Groen, H.; Tavilla, A.; Ardanaz, E.

    2012-01-01

    Rare thoracic cancers include those of the trachea, thymus and mesothelioma (including peritoneum mesothelioma). The aim of this study was to describe the incidence, prevalence and survival of rare thoracic tumours using a large database, which includes cancer patients diagnosed from 1978 to 2002,

  2. Quantitative fuel motion determination with the CABRI fast neutron hodoscope

    International Nuclear Information System (INIS)

    Baumung, K.; Augier, G.

    1991-01-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized

  3. Quantitative nature of overexpression experiments

    Science.gov (United States)

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  4. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS...

  5. Production of sealed sources

    International Nuclear Information System (INIS)

    Bandi, L.N.

    2016-01-01

    Radioisotope production has been an ongoing activity in India since the sixties. Radioisotopes find wide-ranging applications in various fields, including industry, research, agriculture and medicine. The Board of Radiation and Isotope Technology (BRIT), an industrial unit of the Department of Atomic Energy, is involved in the fabrication and supply of a wide variety of sealed sources. The main radioisotopes fabricated and supplied by BRIT are Cobalt-60 and Iridium-192. These isotopes are employed in industrial and laboratory irradiators, teletherapy machines, radiography exposure devices, and nucleonic gauges. The source fabrication facilities of BRIT are located at the Rajasthan Atomic Power Project Cobalt-60 Facility (RAPPCOF), Kota, the Radiological Laboratories Group (RLG) and the High Intensity Radiation Utilization Project (HIRUP) at Trombay

  6. Quantitative fluorescence angiography for neurosurgical interventions.

    Science.gov (United States)

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a software tool that quantifies perfusion. For this purpose, the fluorescent dye indocyanine green is given intravenously, and after excitation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
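
    To illustrate the kind of perfusion parameter that can be read off a fluorescence intensity-time curve (a synthetic inflow curve, and time-to-peak and rise time as two example parameters of our choosing, not the authors' algorithm):

        import numpy as np

        t = np.linspace(0, 30, 300)                  # seconds after ICG injection
        curve = (t / 8) ** 2 * np.exp(-(t - 8) / 4)  # synthetic inflow curve, peak at t = 8 s
        curve /= curve.max()

        ttp = t[curve.argmax()]                      # time to peak
        t10 = t[np.argmax(curve >= 0.1)]             # first time 10% of peak is reached
        t90 = t[np.argmax(curve >= 0.9)]             # first time 90% of peak is reached
        print(f"time to peak: {ttp:.1f} s, rise time (10-90%): {t90 - t10:.1f} s")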

  7. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
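
    A toy, hedged version of such a rank-based weight-fraction model (our illustrative assumptions: geometric decay with list rank and a fixed unreported total; not the EPA's actual probabilistic model):

        import numpy as np

        def weight_fractions(n_reported, unreported_total=0.05, decay=0.6):
            """Toy estimate of weight fractions from an ingredient list's rank order."""
            w = decay ** np.arange(n_reported)        # geometric decay with rank
            w = w / w.sum() * (1 - unreported_total)  # reported mass sums to 95% here
            return w

        print(weight_fractions(5).round(3))  # descending fractions for a 5-ingredient list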

  8. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies, however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice

  9. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
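
    As one concrete instance of the normalization step discussed above, a minimal sketch of total-sum normalization of a peak-intensity matrix (synthetic data; the review surveys several methods, of which this is only the simplest):

        import numpy as np

        rng = np.random.default_rng(2)
        # Rows = samples, columns = metabolite features (synthetic peak intensities).
        X = rng.lognormal(mean=2.0, size=(6, 100))
        X[3] *= 2.5  # sample 3 was loaded at 2.5x the amount of the others

        # Total-sum normalization: scale each sample so its intensities sum to 1.
        X_norm = X / X.sum(axis=1, keepdims=True)
        print(X.sum(axis=1).round(0))  # unequal totals before
        print(X_norm.sum(axis=1))      # equal totals after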

  10. A Targeted LC-MS/MS Method for the Simultaneous Detection and Quantitation of Egg, Milk, and Peanut Allergens in Sugar Cookies.

    Science.gov (United States)

    Boo, Chelsea C; Parker, Christine H; Jackson, Lauren S

    2018-01-01

    Food allergy is a growing public health concern, with many individuals reporting allergies to multiple food sources. Compliance with food labeling regulations and prevention of inadvertent cross-contact in manufacturing requires the use of reliable methods for the detection and quantitation of allergens in processed foods. In this work, a novel liquid chromatography-tandem mass spectrometry multiple-reaction monitoring method for multiallergen detection and quantitation of egg, milk, and peanut was developed and evaluated in an allergen-incurred baked sugar cookie matrix. A systematic evaluation of method parameters, including sample extraction, concentration, and digestion, were optimized for candidate allergen peptide markers. The optimized method enabled the reliable detection and quantitation of egg, milk, and peanut allergens in sugar cookies, with allergen concentrations as low as 5 ppm allergen-incurred ingredient.

  11. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book will present original research papers on the quantitative methods and techniques for the evaluation of the sustainability of business operations and organizations' overall environmental performance. The book contributions will describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and will help to fulfil generic frameworks presented in the literature with the specific quantitative techniques so needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter will provide sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  12. Shedding quantitative fluorescence light on novel regulatory mechanisms in skeletal biomedicine and biodentistry.

    Science.gov (United States)

    Lee, Ji-Won; Iimura, Tadahiro

    2017-02-01

    Digitalized fluorescence images contain numerical information such as color (wavelength), fluorescence intensity and spatial position. However, quantitative analyses of acquired data and their validation remained to be established. Our research group has applied quantitative fluorescence imaging on tissue sections and uncovered novel findings in skeletal biomedicine and biodentistry. This review paper includes a brief background of quantitative fluorescence imaging and discusses practical applications by introducing our previous research. Finally, the future perspectives of quantitative fluorescence imaging are discussed.

  13. Monitoring alert and drowsy states by modeling EEG source nonstationarity

    Science.gov (United States)

    Hsu, Sheng-Hsiou; Jung, Tzyy-Ping

    2017-10-01

    Objective. As the human brain performs various cognitive functions within ever-changing environments, states of the brain characterized by recorded brain activities such as electroencephalogram (EEG) are inevitably nonstationary. The challenges of analyzing the nonstationary EEG signals include finding neurocognitive sources that underlie different brain states and using EEG data to quantitatively assess the state changes. Approach. This study hypothesizes that brain activities under different states, e.g. levels of alertness, can be modeled as distinct compositions of statistically independent sources using independent component analysis (ICA). This study presents a framework to quantitatively assess the EEG source nonstationarity and estimate levels of alertness. The framework was tested against EEG data collected from 10 subjects performing a sustained-attention task in a driving simulator. Main results. Empirical results illustrate that EEG signals under alert versus drowsy states, indexed by reaction speeds to driving challenges, can be characterized by distinct ICA models. By quantifying the goodness-of-fit of each ICA model to the EEG data using the model deviation index (MDI), we found that MDIs were significantly correlated with the reaction speeds (r = -0.390 with alertness models and r = 0.449 with drowsiness models), and the opposite correlations indicated that the two models accounted for sources in the alert and drowsy states, respectively. Based on the observed source nonstationarity, this study also proposes an online framework using a subject-specific ICA model trained with an initial (alert) state to track the level of alertness. For classification of alert against drowsy states, the proposed online framework achieved an average area under the curve of 0.745 and compared favorably with a classic power-based approach. Significance. This ICA-based framework provides a new way to study changes of brain states and can be applied to
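
    To convey the flavor of fitting a state-specific ICA model and scoring how well it explains another state's data, here is a schematic sketch (synthetic signals; scikit-learn's FastICA; a Laplace-likelihood score as a crude stand-in for the paper's model deviation index, which it does not reproduce exactly):

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(3)
        mix_a = rng.normal(size=(8, 8))
        mix_d = rng.normal(size=(8, 8))                # different mixing = different "state"
        alert = rng.laplace(size=(4000, 8)) @ mix_a.T  # synthetic alert-state channels
        drowsy = rng.laplace(size=(4000, 8)) @ mix_d.T # synthetic drowsy-state channels

        ica = FastICA(random_state=0).fit(alert)
        W = ica.components_                            # unmixing matrix of the alert model

        def nll(X, W, mean):
            """Negative log-likelihood per sample under a Laplace-source ICA model."""
            S = (X - mean) @ W.T
            return np.abs(S).sum(axis=1).mean() - np.log(np.abs(np.linalg.det(W)))

        print("NLL alert :", nll(alert, W, ica.mean_))
        print("NLL drowsy:", nll(drowsy, W, ica.mean_))  # typically higher: worse model fit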

  14. Chandra Source Catalog: User Interfaces

    Science.gov (United States)

    Bonaventura, Nina; Evans, I. N.; Harbo, P. N.; Rots, A. H.; Tibbetts, M. S.; Van Stone, D. W.; Zografou, P.; Anderson, C. S.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Glotfelty, K. J.; Grier, J. D.; Hain, R.; Hall, D. M.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Winkelman, S. L.

    2010-03-01

    The CSCview data mining interface is available for browsing the Chandra Source Catalog (CSC) and downloading tables of quality-assured source properties and data products. Once the desired source properties and search criteria are entered into the CSCview query form, the resulting source matches are returned in a table along with the values of the requested source properties for each source. (The catalog can be searched on any source property, not just position.) At this point, the table of search results may be saved to a text file, and the available data products for each source may be downloaded. CSCview save files are output in RDB-like and VOTable format. The available CSC data products include event files, spectra, lightcurves, and images, all of which are processed with the CIAO software. CSC data may also be accessed non-interactively with Unix command-line tools such as cURL and Wget, using ADQL 2.0 query syntax. In fact, CSCview features a separate ADQL query form for those who wish to specify this type of query within the GUI. Several interfaces are available for learning if a source is included in the catalog (in addition to CSCview): 1) the CSC interface to Sky in Google Earth shows the footprint of each Chandra observation on the sky, along with the CSC footprint for comparison (CSC source properties are also accessible when a source within a Chandra field-of-view is clicked); 2) the CSC Limiting Sensitivity online tool indicates if a source at an input celestial location was too faint for detection; 3) an IVOA Simple Cone Search interface locates all CSC sources within a specified radius of an R.A. and Dec.; and 4) the CSC-SDSS cross-match service returns the list of sources common to the CSC and SDSS, either all such sources or a subset based on search criteria.

  15. A Hybrid, Current-Source/Voltage-Source Power Inverter Circuit

    DEFF Research Database (Denmark)

    Trzynadlowski, Andrzej M.; Patriciu, Niculina; Blaabjerg, Frede

    2001-01-01

    A combination of a large current-source inverter and a small voltage-source inverter circuit is analyzed. The resultant hybrid inverter inherits certain operating advantages from both the constituent converters. In comparison with the popular voltage-source inverter, these advantages include reduced switching losses, improved quality of output current waveforms, and faster dynamic response to current control commands. Description of the operating principles and characteristics of the hybrid inverter is illustrated with results of an experimental investigation of a laboratory model.

  16. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    To identify proteins involved in Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide

  17. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  18. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  19. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based

  20. La quantité en islandais moderne

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    Full Text Available The phonetic realization of quantity in accented syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of modern Icelandic. From the phonological point of view, it seems that nothing new can be hoped for, the theoretical possibilities having been practically exhausted, as we recalled in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is without doubt the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is nevertheless still premature to speak of true quantitative zones, since neither their limits nor their geographical extent are known.

  1. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  2. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  3. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
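
    For reference, the summary measures named above derive directly from the extracted confusion-matrix counts; a minimal sketch (hypothetical counts, and naive pooling by simple summation rather than a formal meta-analytic model):

        def diagnostic_accuracy(tp, fp, tn, fn):
            """Sensitivity and specificity from confusion-matrix counts."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return sensitivity, specificity

        # Hypothetical per-study (TP, FP, TN, FN) counts, pooled by summation.
        studies = [(45, 10, 80, 5), (60, 12, 90, 9), (30, 6, 55, 7)]
        tp, fp, tn, fn = (sum(s[i] for s in studies) for i in range(4))
        sens, spec = diagnostic_accuracy(tp, fp, tn, fn)
        print(f"pooled sensitivity {sens:.2f}, specificity {spec:.2f}")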

  4. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  5. Quantitative Communication Research: Review, Trends, and Critique

    Directory of Open Access Journals (Sweden)

    Timothy R. Levine

    2013-01-01

    Full Text Available Trends in quantitative communication research are reviewed. A content analysis of 48 articles reporting original communication research published in 1988-1991 and 2008-2011 is reported. Survey research and self-report measurement remain common approaches to research. Null hypothesis significance testing remains the dominant approach to statistical analysis. Reporting the shapes of distributions, estimates of statistical power, and confidence intervals remain uncommon. Trends over time include the increased popularity of health communication and computer mediated communication as topics of research, and increased attention to mediator and moderator variables. The implications of these practices for scientific progress are critically discussed, and suggestions for the future are provided.

  6. Optimization of atomic beam sources for polarization experiments

    Energy Technology Data Exchange (ETDEWEB)

    Gaisser, Martin; Nass, Alexander; Stroeher, Hans [IKP, Forschungszentrum Juelich (Germany)

    2013-07-01

    For experiments with spin-polarized protons and neutrons a dense target is required. In current atomic beam sources an atomic hydrogen or deuterium beam is expanded through a cold nozzle and a system of sextupole magnets and RF-transition units selects a certain hyperfine state. The achievable flux seems to be limited to about 10^17 particles per second with a high nuclear polarization. A lot of experimental and theoretical effort has been undertaken to understand all effects and to increase the flux. However, improvements have remained marginal. Now, a Monte Carlo simulation based on the DSMC part of the open source C++ library OpenFOAM is set up in order to get a better understanding of the flow and to optimize the various elements. It is intended to include important effects like deflection from magnetic fields, recombination on the walls and spin exchange collisions in the simulation and make quantitative predictions of changes in the experimental setup. The goal is to get a tool that helps to further increase the output of an atomic beam source. So far, a new binary collision model, magnetic fields, RF-transition units and a tool to measure the collision age are included. The next step will be to couple the whole simulation with an optimization algorithm implementing Adaptive Simulated Annealing (ASA) in order to automatically optimize the atomic beam source.
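
    A sketch of the optimization loop such a coupling might take. The objective below is a toy surrogate standing in for a full DSMC run (a real evaluation would launch the OpenFOAM simulation and return the simulated beam intensity), and the cooling schedule is a generic simulated-annealing choice rather than the specific ASA variant:

    ```python
    import math
    import random

    def beam_intensity(params):
        """Toy surrogate objective; a real run would launch a DSMC simulation."""
        nozzle_temp, skimmer_dist = params
        return -((nozzle_temp - 70.0) ** 2 + (skimmer_dist - 15.0) ** 2)

    def anneal(x0, steps=2000, t0=50.0):
        x, fx = list(x0), beam_intensity(x0)
        best, fbest = list(x), fx
        for k in range(steps):
            t = t0 / (1.0 + k)                        # generic cooling schedule
            cand = [xi + random.gauss(0.0, 1.0) for xi in x]
            fc = beam_intensity(cand)
            # Accept improvements always; accept worse moves with Boltzmann probability.
            if fc > fx or random.random() < math.exp((fc - fx) / t):
                x, fx = cand, fc
                if fx > fbest:
                    best, fbest = list(x), fx
        return best, fbest

    print(anneal([60.0, 10.0]))
    ```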

  7. Radiation sources and technical services

    International Nuclear Information System (INIS)

    Stonek, K.; Satorie, Z.; Vyskocil, I.

    1981-01-01

    The work of the Institute's department for sealed source production is briefly described, including leak and surface contamination testing of sealed sources. The department also provides technical services, including the inspection of sealed sources used in medicine and geology and the repair of damaged sources. It carries out research on the mechanical and thermal strength of sealed sources and on the possibility of reprocessing used 226Ra sources. The despatch department is responsible for supplying the entire country with home-produced and imported radionuclides. The department of technical services is responsible for testing imported radionuclides, assembling materials testing, industrial and medical irradiation devices, and for the collection and storage of low-level wastes on a national scale. (M.D.)

  8. MRMer, an interactive open source and cross-platform system for data extraction and visualization of multiple reaction monitoring experiments.

    Science.gov (United States)

    Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin

    2008-11-01

    Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high-throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit the quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
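
    After peak integration, heavy/light quantitation of the kind MRMer supports reduces to ratios of integrated transition intensities. A minimal sketch with hypothetical intensities (this is not MRMer's code or API, just the underlying arithmetic):

    ```python
    # Hypothetical integrated transition intensities for one peptide,
    # as reported after chromatographic peak integration.
    light = {"y7": 1.82e6, "y8": 9.4e5, "y9": 3.1e5}   # endogenous peptide
    heavy = {"y7": 2.05e6, "y8": 1.1e6, "y9": 3.6e5}   # isotope-labeled standard

    # Per-transition light/heavy ratios serve as a consistency check;
    # the summed-intensity ratio gives the relative abundance.
    ratios = {t: light[t] / heavy[t] for t in light}
    summed = sum(light.values()) / sum(heavy.values())

    print(ratios)
    print(f"L/H = {summed:.3f}")
    ```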

  9. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship, which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove … the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars…

  10. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  11. Global inventory of NOx sources

    International Nuclear Information System (INIS)

    Delmas, R.; Serca, D.; Jambert, C.

    1997-01-01

    Nitrogen oxides are key compounds for the oxidation capacity of the troposphere. Their concentration depends on the proximity of sources because of their short atmospheric lifetime. An accurate knowledge of the distribution of their sources and sinks is therefore crucial. At the global scale, the dominant sources of nitrogen oxides - combustion of fossil fuel (about 50%) and biomass burning (about 20%) - are basically anthropogenic. Natural sources, including lightning and microbial activity in soils, represent therefore less than 30% of total emissions. Fertilizer use in agriculture constitutes an anthropogenic perturbation to the microbial source. The methods to estimate the magnitude and distribution of these dominant sources of nitrogen oxides are discussed. Some minor sources which may play a specific role in tropospheric chemistry, such as NOx emission from aircraft in the upper troposphere or input from production in the stratosphere from N2O photodissociation, are also considered.

  12. Radioisotopic neutron transmission spectrometry: Quantitative analysis by using partial least-squares method

    International Nuclear Information System (INIS)

    Kim, Jong-Yun; Choi, Yong Suk; Park, Yong Joon; Jung, Sung-Hee

    2009-01-01

    Neutron spectrometry, based on the scattering of high-energy fast neutrons from a radioisotope and their slowing-down by light hydrogen atoms, is a useful technique for the non-destructive, quantitative measurement of hydrogen content because it has a large measuring volume and is not affected by temperature, pressure, pH value or color. The most common choices for a radioisotope neutron source are 252Cf and 241Am-Be. In this study, 252Cf with a neutron flux of 6.3x10^6 n/s was used as an attractive neutron source because of its high neutron flux and weak radioactivity. Pulse-height neutron spectra were obtained using an in-house built radioisotopic neutron spectrometric system equipped with a 3He detector and multi-channel analyzer, including a neutron shield. As a preliminary study, a polyethylene block (density of ∼0.947 g/cc and area of 40 cm x 25 cm) was used for the determination of hydrogen content using multivariate calibration models, depending on the thickness of the block. Compared with the results obtained from a simple linear calibration model, the partial least-squares regression (PLSR) method offered better performance in the quantitative data analysis. It also revealed that the PLSR method in a neutron spectrometric system can be promising for the real-time, online monitoring of powder processes to determine the content of any type of molecule containing hydrogen nuclei.
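
    A hedged sketch of such a multivariate calibration using scikit-learn's PLSRegression; the spectra and hydrogen contents below are random stand-ins, not the paper's data, and the number of latent variables is an arbitrary choice:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Stand-in calibration set: rows are pulse-height spectra (256 channels),
    # y is the known hydrogen content of each calibration block (arbitrary units).
    rng = np.random.default_rng(0)
    X = rng.poisson(100, size=(20, 256)).astype(float)
    y = np.linspace(2.0, 14.0, 20)

    pls = PLSRegression(n_components=4)              # latent variables; tune by CV
    y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV = {rmsecv:.2f}")

    pls.fit(X, y)
    print(pls.predict(X[:1]).ravel())                # predict H content of one spectrum
    ```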

  13. Inline CBET Model Including SRS Backscatter

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, David S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-06-26

    Cross-beam energy transfer (CBET) has been used as a tool on the National Ignition Facility (NIF) since the first energetics experiments in 2009 to control the energy deposition in ignition hohlraums and tune the implosion symmetry. As large amounts of power are transferred between laser beams at the entrance holes of NIF hohlraums, the presence of many overlapping beat waves can lead to stochastic ion heating in the regions where laser beams overlap [P. Michel et al., Phys. Rev. Lett. 109, 195004 (2012)]. Using the CBET gains derived in this paper, we show how to implement these equations in a ray-based laser source for a rad-hydro code.

  14. Field nonuniformity correction for quantitative analysis of digitized mammograms

    International Nuclear Information System (INIS)

    Pawluczyk, Olga; Yaffe, Martin J.

    2001-01-01

    Several factors, including the heel effect, variation in distance from the x-ray source to points in the image, and path obliquity contribute to the signal nonuniformity of mammograms. To best use digitized mammograms for quantitative image analysis, these field nonuniformities must be corrected. An empirically based correction method, which uses a bowl-shaped calibration phantom, has been developed. Due to the annular spherical shape of the phantom, its attenuation is constant over the entire image. Remaining nonuniformities are due only to the heel and inverse square effects as well as the variable path through the beam filter, compression plate and image receptor. In logarithmic space, a normalized image of the phantom can be added to mammograms to correct for these effects. Then, an analytical correction for path obliquity in the breast can be applied to the images. It was found that the correction reduces the errors associated with field nonuniformity from 14% to 2% for a 4 cm block of material corresponding to a combination of 50% fibroglandular and 50% fatty breast tissue. A repeatability study has been conducted to show that in regions as far as 20 cm away from the chest wall, variations due to imaging conditions and phantom alignment contribute <2% of the overall corrected signal.
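
    The log-space correction idea can be sketched in a few lines of NumPy; the images are random placeholders and the exact normalization used by the authors may differ:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-ins: digitized mammogram and phantom image, same geometry, linear signal.
    mammo = rng.uniform(0.2, 1.0, (512, 512))
    phantom = rng.uniform(0.5, 1.0, (512, 512))

    log_mammo = np.log(mammo)
    log_phantom = np.log(phantom)

    # Normalize the phantom so its mean correction is zero, then add in log space;
    # heel effect, inverse-square falloff and filter-path variation cancel out.
    correction = log_phantom.mean() - log_phantom
    corrected = np.exp(log_mammo + correction)
    ```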

  15. 3D quantitative phase imaging of neural networks using WDT

    Science.gov (United States)

    Kim, Taewoo; Liu, S. C.; Iyer, Raj; Gillette, Martha U.; Popescu, Gabriel

    2015-03-01

    White-light diffraction tomography (WDT) is a recently developed 3D imaging technique based on a quantitative phase imaging system called spatial light interference microscopy (SLIM). The technique has achieved a sub-micron resolution in all three directions with high sensitivity granted by the low-coherence of a white-light source. Demonstrations of the technique on single cell imaging have been presented previously; however, imaging on any larger sample, including a cluster of cells, has not been demonstrated using the technique. Neurons in an animal body form a highly complex and spatially organized 3D structure, which can be characterized by neuronal networks or circuits. Currently, the most common method of studying the 3D structure of neuron networks is by using a confocal fluorescence microscope, which requires fluorescence tagging with either transient membrane dyes or after fixation of the cells. Therefore, studies on neurons are often limited to samples that are chemically treated and/or dead. WDT presents a solution for imaging live neuron networks with a high spatial and temporal resolution, because it is a 3D imaging method that is label-free and non-invasive. Using this method, a mouse or rat hippocampal neuron culture and a mouse dorsal root ganglion (DRG) neuron culture have been imaged in order to see the extension of processes between the cells in 3D. Furthermore, the tomogram is compared with a confocal fluorescence image in order to investigate the 3D structure at synapses.

  16. Open Source Telecommunication Companies

    Directory of Open Access Journals (Sweden)

    Peter Liu

    2007-08-01

    Full Text Available Little is known about companies whose core business is selling telecommunications products that leverage open source projects. Open source telecommunications (OST) companies operate in markets that are very different from typical software product markets. The telecommunications market is regulated, vertically integrated, and proprietary designs and special chips are widely used. For a telecommunications product to be useful, it must interact with both access network products and core network products. Due to penalties specified in Service Level Agreements, the cost of failures of telecommunications products is very high. This article shares information that is not widely known, including a list of OST companies and the open source projects on which they depend, the size and diversity of venture capital investment in OST companies, the nature of the commercial product-open source software and company-project relationships, the ways in which OST companies make money, the benefits and risks of OST companies, and competition between OST companies. Analysis of this information provides insights into the ways in which companies can build business models around open source software. These findings will be of interest to entrepreneurs, top management teams of incumbent companies that sell telecommunications products, and those who care about Ontario's ability to compete globally.

  17. Optically pumped terahertz sources

    Institute of Scientific and Technical Information of China (English)

    ZHONG Kai; SHI Wei; XU DeGang; LIU PengXiang; WANG YuYe; MEI JiaLin; YAN Chao; FU ShiJie; YAO JianQuan

    2017-01-01

    High-power terahertz (THz) generation in the frequency range of 0.1-10 THz has been a fast-developing research area ever since the beginning of the THz boom two decades ago, enabling new technological breakthroughs in spectroscopy, communication, imaging, etc. By using optical (laser) pumping methods with near- or mid-infrared (IR) lasers, flexible and practical THz sources covering the whole THz range can be realized to overcome the shortage of electronic THz sources, and now they are playing important roles in THz science and technology. This paper overviews various optically pumped THz sources, including femtosecond laser based ultrafast broadband THz generation, monochromatic widely tunable THz generation, single-mode on-chip THz source from photomixing, and the traditional powerful THz gas lasers. Full descriptions from basic principles to the latest progress are presented and their advantages and disadvantages are discussed as well. It is expected that this review gives a comprehensive reference to researchers in this area and additionally helps newcomers to quickly gain understanding of optically pumped THz sources.

  18. Aggregated Demand Modelling Including Distributed Generation, Storage and Demand Response

    OpenAIRE

    Marzooghi, Hesamoddin; Hill, David J.; Verbic, Gregor

    2014-01-01

    It is anticipated that the penetration of renewable energy sources (RESs) in power systems will increase further in the next decades, mainly due to environmental issues. In the long term of several decades, which we refer to as the future grid (FG), balancing between supply and demand will become dependent on demand actions including demand response (DR) and energy storage. So far, FG feasibility studies have not considered these new demand-side developments for modelling future demand. I...

  19. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in U.S. colleges' and universities' undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, quantitative management techniques, and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. universities' systems engineering and computing science degree programs, gaps that exist in the literature, and a comparative analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  20. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of the detective game or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  1. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial counts, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods.
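
    Combining independent relative-uncertainty components mathematically is conventionally done by root-sum-of-squares; a sketch with hypothetical component values (the paper's ANOVA-derived components are not reproduced here):

    ```python
    import math

    # Illustrative relative-uncertainty components (hypothetical values, in %).
    components = {
        "microorganism type": 18.0,
        "product matrix": 15.0,
        "reading/interpretation": 12.0,
        "repeatability": 10.0,
    }

    # Combined relative uncertainty: root-sum-of-squares of independent components,
    # the usual GUM-style combination (a sketch, not the paper's exact model).
    u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
    print(f"combined relative uncertainty ~ {u_combined:.0f}%")  # ~28% here, under 35%
    ```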

  2. Global anthropogenic emissions of particulate matter including black carbon

    Science.gov (United States)

    Klimont, Zbigniew; Kupiainen, Kaarle; Heyes, Chris; Purohit, Pallav; Cofala, Janusz; Rafaj, Peter; Borken-Kleefeld, Jens; Schöpp, Wolfgang

    2017-07-01

    This paper presents a comprehensive assessment of historical (1990-2010) global anthropogenic particulate matter (PM) emissions including the consistent and harmonized calculation of mass-based size distribution (PM1, PM2.5, PM10), as well as primary carbonaceous aerosols including black carbon (BC) and organic carbon (OC). The estimates were developed with the integrated assessment model GAINS, where source- and region-specific technology characteristics are explicitly included. This assessment includes a number of previously unaccounted or often misallocated emission sources, i.e. kerosene lamps, gas flaring, diesel generators, refuse burning; some of them were reported in the past for selected regions or in the context of a particular pollutant or sector but not included as part of a total estimate. Spatially, emissions were calculated for 172 source regions (as well as international shipping), presented for 25 global regions, and allocated to 0.5° × 0.5° longitude-latitude grids. No independent estimates of emissions from forest fires and savannah burning are provided and neither windblown dust nor unpaved roads emissions are included. We estimate that global emissions of PM have not changed significantly between 1990 and 2010, showing a strong decoupling from the global increase in energy consumption and, consequently, CO2 emissions, but there are significantly different regional trends, with a particularly strong increase in East Asia and Africa and a strong decline in Europe, North America, and the Pacific region. This in turn resulted in important changes in the spatial pattern of PM burden, e.g. European, North American, and Pacific contributions to global emissions dropped from nearly 30 % in 1990 to well below 15 % in 2010, while Asia's contribution grew from just over 50 % to nearly two-thirds of the global total in 2010. For all PM species considered, Asian sources represented over 60 % of the global anthropogenic total, and residential combustion

  3. Global anthropogenic emissions of particulate matter including black carbon

    Directory of Open Access Journals (Sweden)

    Z. Klimont

    2017-07-01

    Full Text Available This paper presents a comprehensive assessment of historical (1990–2010) global anthropogenic particulate matter (PM) emissions including the consistent and harmonized calculation of mass-based size distribution (PM1, PM2.5, PM10), as well as primary carbonaceous aerosols including black carbon (BC) and organic carbon (OC). The estimates were developed with the integrated assessment model GAINS, where source- and region-specific technology characteristics are explicitly included. This assessment includes a number of previously unaccounted or often misallocated emission sources, i.e. kerosene lamps, gas flaring, diesel generators, refuse burning; some of them were reported in the past for selected regions or in the context of a particular pollutant or sector but not included as part of a total estimate. Spatially, emissions were calculated for 172 source regions (as well as international shipping), presented for 25 global regions, and allocated to 0.5° × 0.5° longitude–latitude grids. No independent estimates of emissions from forest fires and savannah burning are provided and neither windblown dust nor unpaved roads emissions are included. We estimate that global emissions of PM have not changed significantly between 1990 and 2010, showing a strong decoupling from the global increase in energy consumption and, consequently, CO2 emissions, but there are significantly different regional trends, with a particularly strong increase in East Asia and Africa and a strong decline in Europe, North America, and the Pacific region. This in turn resulted in important changes in the spatial pattern of PM burden, e.g. European, North American, and Pacific contributions to global emissions dropped from nearly 30 % in 1990 to well below 15 % in 2010, while Asia's contribution grew from just over 50 % to nearly two-thirds of the global total in 2010. For all PM species considered, Asian sources represented over 60 % of the global

  4. Effects of formic acid hydrolysis on the quantitative analysis of radiation-induced DNA base damage products assayed by gas chromatography/mass spectrometry

    International Nuclear Information System (INIS)

    Swarts, S.G.; Smith, G.S.; Miao, L.; Wheeler, K.T.

    1996-01-01

    Gas chromatography/mass spectrometry (GC/MS-SIM) is an excellent technique for performing both qualitative and quantitative analysis of DNA base damage products that are formed by exposure to ionizing radiation or by the interaction of intracellular DNA with activated oxygen species. This technique commonly uses a hot formic acid hydrolysis step to degrade the DNA to individual free bases. However, due to the harsh nature of this degradation procedure, the quantitation of DNA base damage products may be adversely affected. Consequently, we examined the effects of various formic acid hydrolysis procedures on the quantitation of a number of DNA base damage products and identified several factors that can influence this quantitation. These factors included (1) the inherent acid stabilities of both the lesions and the internal standards; (2) the hydrolysis temperature; (3) the source and grade of the formic acid; and (4) the sample mass during hydrolysis. Our data also suggested that the N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) derivatization efficiency can be adversely affected, presumably by trace contaminants either in the formic acid or from the acid-activated surface of the glass derivatization vials. Where adverse effects were noted, modifications were explored in an attempt to improve the quantitation of these DNA lesions. Although experimental steps could be taken to minimize the influence of these factors on the quantitation of some base damage products, no single procedure solved the quantitation problem for all base lesions. However, a significant improvement in the quantitation was achieved if the relative molecular response factor (RMRF) values for these lesions were generated with authentic DNA base damage products that had been treated exactly like the experimental samples. (orig.)
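
    The RMRF-based quantitation mentioned at the end reduces to ratio arithmetic once peak areas are measured. A sketch with hypothetical areas and amounts (values and units are illustrative only):

    ```python
    # Sketch of response-factor-based quantitation (illustrative, not the paper's code).
    # The RMRF relates the analyte's response to the internal standard's response.
    area_analyte_std, area_is_std = 4.2e5, 5.0e5    # calibration run peak areas
    amt_analyte_std, amt_is_std = 10.0, 12.0        # known amounts (pmol) in that run

    rmrf = (area_analyte_std / area_is_std) / (amt_analyte_std / amt_is_std)

    # Unknown sample: the same internal standard amount is spiked before hydrolysis,
    # so losses and derivatization effects are shared by analyte and standard.
    area_analyte, area_is, amt_is = 2.1e5, 4.8e5, 12.0
    amt_analyte = (area_analyte / area_is) / rmrf * amt_is
    print(f"analyte ~ {amt_analyte:.1f} pmol")      # ~5.2 pmol with these numbers
    ```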

  5. Individual patient dosimetry using quantitative SPECT imaging

    International Nuclear Information System (INIS)

    Gonzalez, J.; Oliva, J.; Baum, R.; Fisher, S.

    2002-01-01

    An approach is described to provide individual patient dosimetry for routine clinical use. Accurate quantitative SPECT imaging was achieved using appropriate methods. The volume of interest (VOI) was defined semi-automatically using a fixed threshold value obtained from phantom studies. The calibration factor to convert the voxel counts from SPECT images into activity values was determined from a calibrated point source using the same threshold value as in the phantom studies. For the selected radionuclide, the dose within and outside a sphere of voxel dimensions at different distances was computed through dose point-kernels to obtain a discrete absorbed-dose kernel representation around a volume source with uniform activity distribution. The spatial activity distribution from SPECT imaging was convolved with this kernel representation using the discrete Fourier transform method to yield a three-dimensional absorbed dose rate distribution. The accuracy of the dose rate calculation was validated with software phantoms. The absorbed dose was determined by integration of the dose rate distribution for each volume of interest (VOI). Parameters for treatment optimization such as dose rate volume histograms and dose rate statistics are provided. A patient example was used to illustrate our dosimetric calculations.
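
    The kernel-convolution step can be sketched with an FFT-based convolution. The activity map and the 1/r^2 kernel below are toy placeholders, not a clinically valid dose-point-kernel:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    # Toy 3D activity map from quantitative SPECT (Bq per voxel).
    activity = np.zeros((64, 64, 64))
    activity[30:34, 30:34, 30:34] = 1.0e6            # stand-in hot region

    # Toy dose-point-kernel: 1/r^2 falloff on a 21^3 voxel grid (Gy/s per Bq).
    idx = np.indices((21, 21, 21)) - 10
    r = np.sqrt((idx ** 2).sum(axis=0)) + 0.5        # offset avoids division by zero
    kernel = 1.0e-9 / r ** 2

    dose_rate = fftconvolve(activity, kernel, mode="same")   # 3D dose-rate map
    print(f"max dose rate ~ {dose_rate.max():.3e} Gy/s")
    ```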

  6. Status of spallation neutron source

    Energy Technology Data Exchange (ETDEWEB)

    Oyama, Yukio [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1997-03-01

    Existing and planned facilities using proton-accelerator-driven spallation neutron sources are reviewed. These include a new project on neutron science proposed by the Japan Atomic Energy Research Institute. The present status of facility requirements and accelerator technology leads us to a new era of neutron science, such as neutron scattering research and nuclear transmutation studies using very intense neutron sources. (author)

  7. Online Sources for Competitor Information.

    Science.gov (United States)

    Weiss, Arthur

    Competitor information gathering is a key aspect of business planning. Information can be collected from either published or unpublished sources. Unpublished information will often be verified based on material from published sources. Published information is more likely to be factual and includes financial, stockmarket, press, market and…

  8. Births and deaths including fetal deaths

    Data.gov (United States)

    U.S. Department of Health & Human Services — Access to a variety of United States birth and death files including fetal deaths: Birth Files, 1968-2009; 1995-2005; Fetal death file, 1982-2005; Mortality files,...

  9. Including Indigenous Minorities in Decision-Making

    DEFF Research Database (Denmark)

    Pristed Nielsen, Helene

    Based on theories of public sphere participation and deliberative democracy, this book presents empirical results from a study of experiences with including Aboriginal and Maori groups in political decision-making in respectively Western Australia and New Zealand…

  10. Gas storage materials, including hydrogen storage materials

    Science.gov (United States)

    Mohtadi, Rana F; Wicks, George G; Heung, Leung K; Nakamura, Kenji

    2013-02-19

    A material for the storage and release of gases comprises a plurality of hollow elements, each hollow element comprising a porous wall enclosing an interior cavity, the interior cavity including structures of a solid-state storage material. In particular examples, the storage material is a hydrogen storage material such as a solid state hydride. An improved method for forming such materials includes the solution diffusion of a storage material solution through a porous wall of a hollow element into an interior cavity.

  11. Quantitative (real-time) PCR

    International Nuclear Information System (INIS)

    Denman, S.E.; McSweeney, C.S.

    2005-01-01

    Many nucleic acid-based probe and PCR assays have been developed for the detection and tracking of specific microbes within the rumen ecosystem. Conventional PCR assays detect PCR products at the end stage of each PCR reaction, where exponential amplification is no longer being achieved. This approach can result in different end product (amplicon) quantities being generated. In contrast, using quantitative, or real-time, PCR, quantification of the amplicon is performed not at the end of the reaction, but rather during exponential amplification, where theoretically each cycle will result in a doubling of the product being created. For real-time PCR, the cycle at which fluorescence is deemed to be detectable above the background during the exponential phase is termed the cycle threshold (Ct). The Ct values obtained are then used for quantitation, which will be discussed later.
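
    Ct values feed quantitation through exponential back-calculation. One common instance is the 2^-ΔΔCt relative-quantification method, sketched here with hypothetical Ct values (the chapter may use a different scheme):

    ```python
    # Sketch of relative quantification from Ct values (the common 2^-ddCt method;
    # an application of Ct-based quantitation, not necessarily this chapter's example).
    ct_target_sample, ct_ref_sample = 24.1, 18.3   # gene of interest vs. reference gene
    ct_target_ctrl, ct_ref_ctrl = 26.0, 18.1       # same pair in the control condition

    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    ddct = d_ct_sample - d_ct_ctrl

    fold_change = 2 ** (-ddct)   # assumes ~100% amplification efficiency
    print(f"fold change = {fold_change:.2f}")
    ```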

  12. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    Full Text Available This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.

  13. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  14. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  15. Qualitative discussion of quantitative radiography

    International Nuclear Information System (INIS)

    Berger, H.; Motz, J.W.

    1975-01-01

    Since radiography yields an image that can be easily related to the tested object, it is superior to many nondestructive testing techniques in revealing the size, shape, and location of certain types of discontinuities. The discussion is limited to a description of the radiographic process, examination of some of the quantitative aspects of radiography, and an outline of some of the new ideas emerging in radiography. The advantages of monoenergetic x-ray radiography and neutron radiography are noted

  16. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a method of nondestructive testing that furnishes quantitative information, permitting the detection and accurate localization of defects, internal dimension measurement, and the measurement and mapping of density distributions. CT technology is very versatile, presenting no restrictions with respect to the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  17. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author deduces an equation relating the coupler frequency deviation Δf and the coupling coefficient β, instead of only giving the adjustment direction in the process of matching the coupler. According to this equation, automatic measurement and quantitative display are realized in a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems.

  18. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method for the quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits the evaluation of teaching efficiency within one group of students and the comparison of teaching efficiency in two or more groups. Heterogeneity, stability and total variability indices, both for a single group and for comparing different groups, are used as the basic characteristics of teaching efficiency. The method is easy to use and permits the ranking of teaching review results which...

  19. Assigning Significance in Label-Free Quantitative Proteomics to Include Single-Peptide-Hit Proteins with Low Replicates

    OpenAIRE

    Li, Qingbo

    2010-01-01

    When sample replicates are limited in a label-free proteomics experiment, selecting differentially regulated proteins with an assignment of statistical significance remains difficult for proteins with a single-peptide hit or a small fold-change. This paper aims to address this issue. An important component of the approach employed here is to utilize the rule of Minimum number of Permuted Significant Pairings (MPSP) to reduce false positives. The MPSP rule generates permuted sample pairings fr...

  20. Megafaunal community structure of Andaman seamounts including the back-arc basin - A quantitative exploration from the Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Sautya, S.; Ingole, B.S.; Ray, D.; Stohr, S.; Samudrala, K.; KameshRaju, K.A.; Mudholkar, A.V.


  1. Source-space ICA for MEG source imaging.

    Science.gov (United States)

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography (EEG)/magnetoencephalography (MEG) source imaging is the application of an inverse technique (such as dipole modelling or sLORETA) to the components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer a high spatial resolution. However, in order to have both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA on both simulated and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in the spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli was also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of the minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.
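
    Conceptually, the proposed pipeline is spatial filtering into source space, dimensionality reduction by SVD, and then ICA unmixing. A hedged sketch with random stand-ins for the MEG data and the beamformer weights (the weight-normalized LCMV construction itself is not shown):

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    n_sensors, n_times, n_voxels = 204, 1000, 500
    meg = rng.standard_normal((n_sensors, n_times))       # sensor time series
    weights = rng.standard_normal((n_voxels, n_sensors))  # stand-in spatial filters

    src = weights @ meg                                   # project into source space

    # Reduce dimensionality with SVD, then unmix the reduced data with ICA.
    u, s, vt = np.linalg.svd(src, full_matrices=False)
    k = 20
    reduced = (u[:, :k] * s[:k]) @ vt[:k]                 # rank-k approximation

    ica = FastICA(n_components=k, random_state=0)
    comp_ts = ica.fit_transform(reduced.T)                # component time courses
    comp_maps = ica.mixing_                               # spatial maps over voxels
    print(comp_ts.shape, comp_maps.shape)                 # (1000, 20), (500, 20)
    ```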

  2. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    Science.gov (United States)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather on how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, in keeping with the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in the existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature include: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  3. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The size of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  4. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, the observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. Quantitative measurement with the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The movement model of a mapping tracer for central nervous receptors is discussed. The quantitative analysis using a steady movement model, the measurement of dopamine receptors by the reference method, the measurement of D2 receptors using 11C-raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  5. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendation for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  6. A quantitative reading of competences documents of Law new degrees.

    OpenAIRE

    Leví Orta, Genoveva del Carmen; Ramos Méndez, Eduardo

    2014-01-01

    Documents formulating the competences of degrees are key sources for the analysis, evaluation and comparison of the training profiles currently offered by different university degrees. This work aims to make a quantitative reading of the competence documents of the Law degree from various Spanish universities, based on the ideas of Content Analysis. The methodology has two phases. Firstly, a dictionary of concepts related to the components of competences is identified in the documentary corpus. Next, the corpus...

  7. Compact ion accelerator source

    Science.gov (United States)

    Schenkel, Thomas; Persaud, Arun; Kapadia, Rehan; Javey, Ali

    2014-04-29

    An ion source includes a conductive substrate, the substrate including a plurality of conductive nanostructures with free-standing tips formed on the substrate. A conductive catalytic coating is formed on the nanostructures and substrate for dissociation of a molecular species into an atomic species, the molecular species being brought in contact with the catalytic coating. A target electrode is placed apart from the substrate, the target electrode being biased relative to the substrate with a first bias voltage to ionize the atomic species in proximity to the free-standing tips and attract the ionized atomic species from the substrate in the direction of the target electrode.

  8. INEEL Source Water Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sehlke, Gerald

    2003-03-01

    The Idaho National Engineering and Environmental Laboratory (INEEL) covers approximately 890 mi² and includes 12 public water systems that must be evaluated for source water protection purposes under the Safe Drinking Water Act. Because of its size and location, six watersheds and five aquifers could potentially affect the INEEL's drinking water sources. Based on a preliminary evaluation of the available information, it was determined that the Big Lost River, Birch Creek, and Little Lost River Watersheds and the eastern Snake River Plain Aquifer needed to be assessed. These watersheds were delineated using the United States Geological Survey's hydrologic unit scheme. Well capture zones were originally estimated using the RESSQC module of the Environmental Protection Agency's Well Head Protection Area model, and the initial modeling assumptions and results were checked by running several scenarios using MODFLOW modeling. After a technical review, the resulting capture zones were expanded to account for the uncertainties associated with changing groundwater flow directions, a thick vadose zone, and other data uncertainties. Finally, all well capture zones at a given facility were merged to a single wellhead protection area at each facility. A contaminant source inventory was conducted, and the results were integrated with the well capture zones, watershed and aquifer information, and facility information using geographic information system technology to complete the INEEL's Source Water Assessment. Of the INEEL's 12 public water systems, three systems rated as low susceptibility (EBR-I, Main Gate, and Gun Range), and the remainder rated as moderate susceptibility. No INEEL public water system rated as high susceptibility. We are using this information to develop a source water management plan from which we will subsequently implement an INEEL-wide source water management program. The results are a very robust set of wellhead protection areas that will

  9. Intense fusion neutron sources

    International Nuclear Information System (INIS)

    Kuteev, B. V.; Goncharov, P. R.; Sergeev, V. Yu.; Khripunov, V. I.

    2010-01-01

    The review describes physical principles underlying efficient production of free neutrons, up-to-date possibilities and prospects of creating fission and fusion neutron sources with intensities of 10^15-10^21 neutrons/s, and schemes of production and application of neutrons in fusion-fission hybrid systems. The physical processes and parameters of high-temperature plasmas are considered at which optimal conditions for producing the largest number of fusion neutrons in systems with magnetic and inertial plasma confinement are achieved. The proposed plasma methods for neutron production are compared with other methods based on fusion reactions in nonplasma media, fission reactions, spallation, and muon catalysis. At present, intense neutron fluxes are mainly used in nanotechnology, biotechnology, material science, and military and fundamental research. In the near future (10-20 years), it will be possible to apply high-power neutron sources in fusion-fission hybrid systems for producing hydrogen, electric power, and technological heat, as well as for manufacturing synthetic nuclear fuel and closing the nuclear fuel cycle. Neutron sources with intensities approaching 10^20 neutrons/s may radically change the structure of power industry and considerably influence the fundamental and applied science and innovation technologies. Along with utilizing the energy produced in fusion reactions, the achievement of such high neutron intensities may stimulate wide application of subcritical fast nuclear reactors controlled by neutron sources. Superpower neutron sources will allow one to solve many problems of neutron diagnostics, monitor nano- and biological objects, and carry out radiation testing and modification of volumetric properties of materials at the industrial level. Such sources will considerably (up to 100 times) improve the accuracy of neutron physics experiments and will provide a better understanding of the structure of matter, including that of the neutron itself.

  10. Intense fusion neutron sources

    Science.gov (United States)

    Kuteev, B. V.; Goncharov, P. R.; Sergeev, V. Yu.; Khripunov, V. I.

    2010-04-01

    The review describes physical principles underlying efficient production of free neutrons, up-to-date possibilities and prospects of creating fission and fusion neutron sources with intensities of 10^15-10^21 neutrons/s, and schemes of production and application of neutrons in fusion-fission hybrid systems. The physical processes and parameters of high-temperature plasmas are considered at which optimal conditions for producing the largest number of fusion neutrons in systems with magnetic and inertial plasma confinement are achieved. The proposed plasma methods for neutron production are compared with other methods based on fusion reactions in nonplasma media, fission reactions, spallation, and muon catalysis. At present, intense neutron fluxes are mainly used in nanotechnology, biotechnology, material science, and military and fundamental research. In the near future (10-20 years), it will be possible to apply high-power neutron sources in fusion-fission hybrid systems for producing hydrogen, electric power, and technological heat, as well as for manufacturing synthetic nuclear fuel and closing the nuclear fuel cycle. Neutron sources with intensities approaching 10^20 neutrons/s may radically change the structure of power industry and considerably influence the fundamental and applied science and innovation technologies. Along with utilizing the energy produced in fusion reactions, the achievement of such high neutron intensities may stimulate wide application of subcritical fast nuclear reactors controlled by neutron sources. Superpower neutron sources will allow one to solve many problems of neutron diagnostics, monitor nano- and biological objects, and carry out radiation testing and modification of volumetric properties of materials at the industrial level. Such sources will considerably (up to 100 times) improve the accuracy of neutron physics experiments and will provide a better understanding of the structure of matter, including that of the neutron itself.

  11. A radiochemical separation of spallogenic 88Zr in the carrier-free state for radioisotopic photoneutron sources

    International Nuclear Information System (INIS)

    Whipple, R.E.; Grant, P.M.; Daniels, R.J.; Daniels, W.R.; O'Brien, H.A.Jr.

    1976-01-01

    As the precursor of its 88Y daughter, 88Zr could be advantageously included in the active component of the 88Y-Be photoneutron source for several reasons. 88Zr is produced by the spallation of Mo targets with medium-energy protons at LAMPF, and a procedure has been developed to separate radiozirconium from the target material and various spallogenic impurities. 88Zr can consequently be obtained carrier-free and in quantitative yield. (author)

  12. An Integrated Biochemistry Laboratory, Including Molecular Modeling

    Science.gov (United States)

    Wolfson, Adele J.; Hall, Mona L.; Branham, Thomas R.

    1996-11-01

    The dilemma of designing an advanced undergraduate laboratory lies in the desire to teach and reinforce basic principles and techniques while at the same time exposing students to the excitement of research. We report here on a one-semester, project-based biochemistry laboratory that combines the best features of a cookbook approach (high success rate, achievement of defined goals) with those of an investigative, discovery-based approach (student involvement in the experimental design, excitement of real research). Individual modules may be selected and combined to meet the needs of different courses and different institutions. The central theme of this lab is protein purification and design. This laboratory accompanies the first semester of biochemistry (Structure and Function of Macromolecules, a course taken mainly by junior and senior chemistry and biological chemistry majors). The protein chosen as the object of study is the enzyme lysozyme, which is utilized in all projects. It is suitable for a student lab because it is easily and inexpensively obtained from egg white and is extremely stable, and its high isoelectric point (pI = 11) allows for efficient separation from other proteins by ion-exchange chromatography. Furthermore, a literature search conducted by the resourceful student reveals a wealth of information, since lysozyme has been the subject of numerous studies. It was the first enzyme whose structure was determined by crystallography (1). Hendrickson et al. (2) have previously described an intensive one-month laboratory course centered around lysozyme, although their emphasis is on protein stability rather than purification and engineering. Lysozyme continues to be the focus of much exciting new work on protein folding and dynamics, structure and activity (3 - 5). This lab course includes the following features: (i) reinforcement of basic techniques, such as preparation of buffers, simple enzyme kinetics, and absorption spectroscopy; (ii

  13. Electrochemical cell structure including an ionomeric barrier

    Science.gov (United States)

    Lambert, Timothy N.; Hibbs, Michael

    2017-06-20

    An apparatus includes an electrochemical half-cell comprising: an electrolyte, an anode; and an ionomeric barrier positioned between the electrolyte and the anode. The anode may comprise a multi-electron vanadium-phosphorus alloy, such as VPx, wherein x is 1-5. The electrochemical half-cell is configured to oxidize the vanadium-phosphorus alloy to release electrons. A method of mitigating corrosion in an electrochemical cell includes disposing an ionomeric barrier in a path of electrolyte or ion flow to an anode and mitigating anion accumulation on the surface of the anode.

  14. Isolators Including Main Spring Linear Guide Systems

    Science.gov (United States)

    Goold, Ryan (Inventor); Buchele, Paul (Inventor); Hindle, Timothy (Inventor); Ruebsamen, Dale Thomas (Inventor)

    2017-01-01

    Embodiments of isolators, such as three parameter isolators, including a main spring linear guide system are provided. In one embodiment, the isolator includes first and second opposing end portions, a main spring mechanically coupled between the first and second end portions, and a linear guide system extending from the first end portion, across the main spring, and toward the second end portion. The linear guide system expands and contracts in conjunction with deflection of the main spring along the working axis, while restricting displacement and rotation of the main spring along first and second axes orthogonal to the working axis.

  15. 28 CFR 20.32 - Includable offenses.

    Science.gov (United States)

    2010-07-01

    ... Exchange of Criminal History Record Information § 20.32 Includable offenses. (a) Criminal history record... vehicular manslaughter, driving under the influence of drugs or liquor, and hit and run), when unaccompanied by a § 20.32(a) offense. These exclusions may not be applicable to criminal history records...

  16. Including Students with Visual Impairments: Softball

    Science.gov (United States)

    Brian, Ali; Haegele, Justin A.

    2014-01-01

    Research has shown that while students with visual impairments are likely to be included in general physical education programs, they may not be as active as their typically developing peers. This article provides ideas for equipment modifications and game-like progressions for one popular physical education unit, softball. The purpose of these…

  17. Extending flood damage assessment methodology to include ...

    African Journals Online (AJOL)

    Optimal and sustainable flood plain management, including flood control, can only be achieved when the impacts of flood control measures are considered for both the man-made and natural environments, and the sociological aspects are fully considered. Until now, methods/models developed to determine the influences ...

  18. BIOLOGIC AND ECONOMIC EFFECTS OF INCLUDING DIFFERENT ...

    African Journals Online (AJOL)

    The biologic and economic effects of including three agro-industrial by-products as ingredients in turkey poult diets were investigated using 48 turkey poults in a completely randomised design experiment. Diets were formulated to contain the three by-products – wheat offal, rice husk and palm kernel meal, each at 20% level ...

  19. Including Children Dependent on Ventilators in School.

    Science.gov (United States)

    Levine, Jack M.

    1996-01-01

    Guidelines for including ventilator-dependent children in school are offered, based on experience with six such students at a New York State school. Guidelines stress adherence to the medical management plan, the school-family partnership, roles of the social worker and psychologist, orientation, transportation, classroom issues, and steps toward…

  20. The application of IBA techniques to air pollution source fingerprinting and source apportionment

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D., E-mail: dcz@ansto.gov.au; Stelcer, E.; Atanacio, A.; Crawford, J.

    2014-01-01

    IBA techniques have been used to measure elemental concentrations of more than 20 different elements found in fine particle (PM2.5) air pollution. These data together with their errors and minimum detectable limits were used in Positive Matrix Factorisation (PMF) analyses to quantitatively determine source fingerprints and their contributions to the total measured fine mass. Wind speed and direction back trajectory data from the global HYSPLIT codes were then linked to these PMF fingerprints to quantitatively identify the location of the sources.
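
    A minimal sketch may make the factorization step concrete. True PMF weights every matrix entry by its measurement uncertainty; the sketch below substitutes plain non-negative matrix factorization (NMF) for simplicity, so it approximates the approach rather than reproducing the authors' analysis. The sample count, element count, and factor number are hypothetical.

    import numpy as np
    from sklearn.decomposition import NMF

    # Hypothetical data: rows = daily PM2.5 samples, columns = elemental
    # concentrations (ng/m^3) for 20 IBA-measured elements.
    rng = np.random.default_rng(0)
    X = rng.gamma(shape=2.0, scale=5.0, size=(365, 20))

    # Factorize X ~ G @ F: G holds source contributions per sample,
    # F holds the elemental fingerprint of each source.
    model = NMF(n_components=5, init="nndsvd", max_iter=500, random_state=0)
    G = model.fit_transform(X)          # (365 samples x 5 sources)
    F = model.components_               # (5 sources x 20 elements)

    # Normalize fingerprints so each row sums to 1, easing comparison.
    F_norm = F / F.sum(axis=1, keepdims=True)
    print("mean contribution of each source:", G.mean(axis=0))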

  1. Impedance Source Power Electronic Converters

    DEFF Research Database (Denmark)

    Liu, Yushan; Abu-Rub, Haitham; Ge, Baoming

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding...

  2. Impedance source power electronic converters

    CERN Document Server

    Liu, Yushan; Ge, Baoming; Blaabjerg, Frede; Ellabban, Omar; Loh, Poh Chiang

    2016-01-01

    Impedance Source Power Electronic Converters brings together state of the art knowledge and cutting edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding control methods. Presents the latest power conversion solutions that aim to advance the role of pow...

  3. Californium source transfer

    International Nuclear Information System (INIS)

    Wallace, C.R.

    1995-01-01

    In early 1995, the receipt of four sealed californium-252 sources from Oak Ridge National Lab was successfully accomplished by a team of Radiological Engineering, Radiological Operations and Health Physics Instrumentation personnel. A procedure was developed and walked down by the participants during a dry run evolution. Several special tools were developed during the pre-planning phases of the project, which reduced individual and job dose to minimal levels. These included a mobile lifting device for attachment of a transfer ball valve assembly to the undercarriage of the Cannonball Carrier, a transfer tube elbow to ensure the proper angle of the source transfer tube, and several tools used during emergency response for remote retrieval and handling of an unshielded source. Lessons were learned in the areas of contamination control and emergency preparedness, the benefits of thorough pre-planning, the effectiveness of locally designing and building special tools to reduce worker dose, and methods for successfully accomplishing source receipt evolutions during extreme or inclement weather

  4. 1st International Congress on Actuarial Science and Quantitative Finance

    CERN Document Server

    Garrido, José; Hernández-Hernández, Daniel; ICASQF

    2015-01-01

    Featuring contributions from industry and academia, this volume includes chapters covering a diverse range of theoretical and empirical aspects of actuarial science and quantitative finance, including portfolio management, derivative valuation, risk theory and the economics of insurance. Developed from the First International Congress on Actuarial Science and Quantitative Finance, held at the Universidad Nacional de Colombia in Bogotá in June 2014, this volume highlights different approaches to issues arising from industries in the Andean and Caribbean regions. Contributions address topics such as reverse mortgage schemes and urban dynamics, modeling spot price dynamics in the electricity market, and optimizing calibration and pricing with SABR models.

  5. Massively parallel data processing for quantitative total flow imaging with optical coherence microscopy and tomography

    Science.gov (United States)

    Sylwestrzak, Marcin; Szlag, Daniel; Marchand, Paul J.; Kumar, Ashwin S.; Lasser, Theo

    2017-08-01

    We present an application of massively parallel processing of quantitative flow measurement data acquired using spectral optical coherence microscopy (SOCM). The need for massive signal processing of these particular datasets has been a major hurdle for many applications based on SOCM. In view of this difficulty, we implemented and adapted quantitative total flow estimation algorithms on graphics processing units (GPU) and achieved a 150-fold reduction in processing time when compared to a former CPU implementation. As SOCM constitutes the microscopy counterpart to spectral optical coherence tomography (SOCT), the developed processing procedure can be applied to both imaging modalities. We present the developed DLL library integrated in MATLAB (with an example) and have included the source code for adaptations and future improvements.
    Catalogue identifier: AFBT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AFBT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: GNU GPLv3
    No. of lines in distributed program, including test data, etc.: 913552
    No. of bytes in distributed program, including test data, etc.: 270876249
    Distribution format: tar.gz
    Programming language: CUDA/C, MATLAB.
    Computer: Intel x64 CPU, GPU supporting CUDA technology.
    Operating system: 64-bit Windows 7 Professional.
    Has the code been vectorized or parallelized?: Yes, CPU code has been vectorized in MATLAB, CUDA code has been parallelized.
    RAM: Dependent on user's parameters, typically between several gigabytes and several tens of gigabytes
    Classification: 6.5, 18.
    Nature of problem: Speed-up of data processing in optical coherence microscopy
    Solution method: Utilization of GPU for massively parallel data processing
    Additional comments: Compiled DLL library with source code and documentation, example of utilization (MATLAB script with raw data)
    Running time: 1.8 s for one B-scan (150 × faster in comparison to the CPU
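
    For readers without access to the DLL, the core idea of moving per-A-scan spectral reconstruction onto a GPU can be sketched in a few lines. The snippet below uses CuPy as a stand-in for the authors' CUDA/C implementation (it is not their pipeline), with a NumPy fallback when no GPU is present; array sizes are arbitrary.

    import numpy as np

    try:
        import cupy as xp               # GPU arrays via the NumPy-like API
        ON_GPU = True
    except ImportError:
        xp = np                         # CPU fallback
        ON_GPU = False

    def reconstruct_bscan(spectra):
        """spectra: (n_ascans, n_samples) raw spectral interferograms."""
        data = xp.asarray(spectra, dtype=xp.float32)
        data = data - data.mean(axis=1, keepdims=True)   # remove DC term
        ascan = xp.fft.ifft(data, axis=1)                # depth profiles
        mag = xp.abs(ascan)[:, : data.shape[1] // 2]     # positive depths only
        return mag.get() if ON_GPU else mag              # back to host RAM

    bscan = reconstruct_bscan(np.random.rand(1000, 2048))
    print(bscan.shape, "processed on", "GPU" if ON_GPU else "CPU")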

  6. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enable this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that the automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072
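
    As one concrete example of the kind of correction the authors describe, the standard dark-frame and flat-field procedure below restores proportionality between scene brightness and pixel value. This is a generic sketch, not the paper's exact procedure; the frame sizes and vignetting profile are hypothetical.

    import numpy as np

    def flat_field_correct(raw, dark, flat):
        """Remove fixed-pattern offset and vignetting from a raw frame.

        dark: lens-capped exposure; flat: image of a uniform bright field.
        Generic correction, not the specific procedure from the paper.
        """
        gain = flat.astype(float) - dark.astype(float)
        gain /= gain.mean()                        # unity-gain normalization
        return (raw.astype(float) - dark) / np.clip(gain, 1e-6, None)

    # Hypothetical frames with a parabolic vignetting profile.
    profile = np.fromfunction(lambda y, x: 1 - 0.3 * ((x - 320) / 320) ** 2,
                              (480, 640))
    dark = np.full((480, 640), 5.0)
    flat = 200.0 * profile + dark
    raw = 0.5 * (flat - dark) + dark               # scene at half brightness
    print(flat_field_correct(raw, dark, flat).std())  # ~0: uniform after correction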

  7. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enable this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that the automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  8. Quantitative safety goals for the regulatory process

    International Nuclear Information System (INIS)

    Joksimovic, V.; O'Donnell, L.F.

    1981-01-01

    The paper offers a brief summary of the current regulatory background in the USA, emphasizing the nuclear field, related to the establishment of quantitative safety goals as a way to respond to the key issue of 'how safe is safe enough'. General Atomic has taken a leading role in advocating the use of probabilistic risk assessment techniques in the regulatory process. This has led to an understanding of the importance of quantitative safety goals. The approach developed by GA is discussed in the paper. It is centred on the definition of quantitative safety regions, termed the design basis, safety margin (or design capability) and safety research regions. The design basis region is bounded by a frequency of 10^-4/reactor-year and consequences of no identifiable public injury. 10^-4/reactor-year is associated with the total projected lifetime of the commercial US nuclear power programme. Events which have a 50% chance of happening are included in the design basis region. In the safety margin region, which extends below the design basis region, protection is provided against some events whose probability of not happening during the expected course of the US nuclear power programme is within the range of 50 to 90%. Setting the lower mean frequency of this region to 10^-5/reactor-year is equivalent to offering 90% assurance that an accident of given severity will not happen. Rare events with a mean frequency below 10^-5 can still be predicted to occur. However, accidents predicted to have a probability of less than 10^-6 are 99% certain not to happen at all, and are thus not anticipated to affect public health and safety. The area between 10^-5 and 10^-6 defines the frequency portion of the safety research region. Safety goals associated with individual risk to a maximum-exposed member of the public, general societal risk and property risk are proposed in the paper
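
    The quoted assurance levels follow directly from Poisson statistics once the programme exposure is fixed. The short check below assumes a total exposure of 10^4 reactor-years (the figure implied by tying 10^-4/reactor-year to the programme lifetime) and reproduces the 90% and 99% statements.

    import math

    # P(no occurrence) = exp(-frequency x exposure) for Poisson events.
    exposure = 1.0e4        # reactor-years, assumed programme lifetime
    for freq in (1e-4, 1e-5, 1e-6):
        p_none = math.exp(-freq * exposure)
        print(f"{freq:.0e}/reactor-year -> {p_none:.0%} assurance of no occurrence")
    # 1e-04 -> 37%, 1e-05 -> 90%, 1e-06 -> 99%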

  9. Photoactive devices including porphyrinoids with coordinating additives

    Science.gov (United States)

    Forrest, Stephen R; Zimmerman, Jeramy; Yu, Eric K; Thompson, Mark E; Trinh, Cong; Whited, Matthew; Diev, Vlacheslav

    2015-05-12

    Coordinating additives are included in porphyrinoid-based materials to promote intermolecular organization and improve one or more photoelectric characteristics of the materials. The coordinating additives are selected from fullerene compounds and organic compounds having free electron pairs. Combinations of different coordinating additives can be used to tailor the characteristic properties of such porphyrinoid-based materials, including porphyrin oligomers. Bidentate ligands are one type of coordinating additive that can form coordination bonds with a central metal ion of two different porphyrinoid compounds to promote porphyrinoid alignment and/or pi-stacking. The coordinating additives can shift the absorption spectrum of a photoactive material toward higher wavelengths, increase the external quantum efficiency of the material, or both.

  10. 26 CFR 31.3402(e)-1 - Included and excluded wages.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 15 2010-04-01 2010-04-01 false Included and excluded wages. 31.3402(e)-1... SOURCE Collection of Income Tax at Source § 31.3402(e)-1 Included and excluded wages. (a) If a portion of... not more than 31 consecutive days constitutes wages, and the remainder does not constitute wages, all...

  11. Towards a Quantitative Framework for Evaluating Vulnerability of Drinking Water Wells to Contamination from Unconventional Oil & Gas Development

    Science.gov (United States)

    Soriano, M., Jr.; Deziel, N. C.; Saiers, J. E.

    2017-12-01

    The rapid expansion of unconventional oil and gas (UO&G) production, made possible by advances in hydraulic fracturing (fracking), has triggered concerns over the risks this extraction poses to water resources and public health. Concerns are particularly acute within communities that host UO&G development and rely heavily on shallow aquifers as sources of drinking water. This research aims to develop a quantitative framework to evaluate the vulnerability of drinking water wells to contamination from UO&G activities. The concept of well vulnerability is explored through application of backward travel time probability modeling to estimate the likelihood that capture zones of drinking water wells circumscribe source locations of UO&G contamination. Sources of UO&G contamination considered in this analysis include gas well pads and documented sites of UO&G wastewater and chemical spills. The modeling approach is illustrated for a portion of Susquehanna County, Pennsylvania, where more than one thousand shale gas wells have been completed since 2005. Data from a network of eight multi-level groundwater monitoring wells installed in the study site in 2015 are used to evaluate the model. The well vulnerability concept is proposed as a physically based quantitative tool for policy-makers dealing with the management of contamination risks to drinking water wells. In particular, the model can be used to identify adequate setback distances of UO&G activities from drinking water wells and other critical receptors.
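
    The backward travel time idea can be illustrated with a toy reverse particle-tracking calculation: release particles at the well, advect them backward (up-gradient) with random dispersion, and count the fraction that passes near a documented source location. All parameters below are invented for illustration and bear no relation to the Susquehanna County site.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical 2-D setting: well at the origin, gas pad 500 m up-gradient.
    well = np.array([0.0, 0.0])
    pad = np.array([500.0, 50.0])
    v_back = np.array([1.0, 0.0])    # m/day, velocity reversed for backward tracking
    D = 5.0                           # m^2/day dispersion coefficient
    dt, n_steps, n_particles = 1.0, 1000, 5000
    capture_radius = 50.0             # m, pad treated as a disk source

    x = np.tile(well, (n_particles, 1))
    hit = np.zeros(n_particles, dtype=bool)
    for _ in range(n_steps):
        x += v_back * dt + rng.normal(0.0, np.sqrt(2 * D * dt), x.shape)
        hit |= np.linalg.norm(x - pad, axis=1) < capture_radius

    print(f"P(backward path from well reaches pad) ~ {hit.mean():.2f}")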

  12. Electric power monthly, September 1990. [Glossary included]

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-17

    The purpose of this report is to provide energy decision makers with accurate and timely information that may be used in forming various perspectives on electric issues. The power plants considered include coal, petroleum, natural gas, hydroelectric, and nuclear power plants. Data are presented for power generation, fuel consumption, fuel receipts and cost, sales of electricity, and unusual occurrences at power plants. Data are compared at the national, Census division, and state levels. 4 figs., 52 tabs. (CK)

  13. Nuclear reactor shield including magnesium oxide

    International Nuclear Information System (INIS)

    Rouse, C.A.; Simnad, M.T.

    1981-01-01

    An improvement is described for nuclear reactor shielding of a type used in reactor applications involving significant amounts of fast neutron flux. The reactor shielding includes means providing structural support, neutron moderator material, neutron absorber material and other components, wherein at least a portion of the neutron moderator material is magnesium in the form of magnesium oxide either alone or in combination with other moderator materials such as graphite and iron

  14. Model for safety reports including descriptive examples

    International Nuclear Information System (INIS)

    1995-12-01

    Several safety reports will be produced in the process of planning and constructing the system for disposal of high-level radioactive waste in Sweden. The present report gives a model, with detailed examples, of how these reports should be organized and what steps they should include. In the near future safety reports will deal with the encapsulation plant and the repository. Later reports will treat operation of the handling systems and the repository

  15. Jet-calculus approach including coherence effects

    International Nuclear Information System (INIS)

    Jones, L.M.; Migneron, R.; Narayanan, K.S.S.

    1987-01-01

    We show how integrodifferential equations typical of jet calculus can be combined with an averaging procedure to obtain jet-calculus-based results including the Mueller interference graphs. Results in longitudinal-momentum fraction x for physical quantities are higher at intermediate x and lower at large x than with the conventional "incoherent" jet calculus. These results resemble those of Marchesini and Webber, who used a Monte Carlo approach based on the same dynamics

  16. The Case for Infusing Quantitative Literacy into Introductory Geoscience Courses

    Directory of Open Access Journals (Sweden)

    Jennifer M. Wenner

    2009-01-01

    We present the case for introductory geoscience courses as model venues for increasing the quantitative literacy (QL) of large numbers of the college-educated population. The geosciences provide meaningful context for a number of fundamental mathematical concepts that are revisited several times in a single course. Using some best practices from the mathematics education community surrounding problem solving, calculus reform, pre-college mathematics and five geoscience/math workshops, geoscience and mathematics faculty have identified five pedagogical ideas to increase the QL of the students who populate introductory geoscience courses. These five ideas include techniques such as: place mathematical concepts in context, use multiple representations, use technology appropriately, work in groups, and do multiple-day, in-depth problems that place quantitative skills in multiple contexts. We discuss the pedagogical underpinnings of these five ideas and illustrate some ways that the geosciences represent ideal places to use these techniques. However, the inclusion of QL in introductory courses is often met with resistance at all levels. Faculty who wish to include quantitative content must use creative means to break down barriers of public perception of geoscience as qualitative, administrative worry that enrollments will drop, and faculty resistance to change. Novel ways to infuse QL into geoscience classrooms include use of web-based resources, shadow courses, setting clear expectations, and promoting quantitative geoscience to the general public. In order to help faculty increase the QL of geoscience students, a community-built, faculty-centered web resource (Teaching Quantitative Skills in the Geosciences) houses multiple examples that implement the five best practices of QL throughout the geoscience curriculum. We direct faculty to three portions of the web resource: Teaching Quantitative Literacy, QL activities, and the 2006 workshop website

  17. Pesticide Information Sources in the United States.

    Science.gov (United States)

    Alston, Patricia Gayle

    1992-01-01

    Presents an overview of electronic and published sources on pesticides. Includes sources such as databases, CD-ROMs, books, journals, brochures, pamphlets, fact sheets, hotlines, courses, electronic mail, and electronic bulletin boards. (MCO)

  18. The Chandra Source Catalog : Automated Source Correlation

    Science.gov (United States)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed a different number of times at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).
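
    A toy version of the positional-correlation step illustrates the idea: detections from different observations are merged into a master source whenever they fall within a matching radius. The real pipeline also weighs positional errors, source extent, and off-axis PSF size, so this greedy sketch is only a schematic.

    import numpy as np

    def merge_detections(detections, radius_arcsec=2.0):
        """Greedily merge per-observation detections into master sources.

        detections: list of (ra_deg, dec_deg). Toy illustration only; the
        CSC pipeline also weighs positional errors and off-axis PSF size.
        """
        masters = []                      # each entry: [sum_ra, sum_dec, count]
        r = radius_arcsec / 3600.0
        for ra, dec in detections:
            for m in masters:
                mra, mdec = m[0] / m[2], m[1] / m[2]
                if np.hypot((ra - mra) * np.cos(np.radians(dec)), dec - mdec) < r:
                    m[0] += ra; m[1] += dec; m[2] += 1
                    break
            else:                         # no match: start a new master source
                masters.append([ra, dec, 1])
        return [(m[0] / m[2], m[1] / m[2], m[2]) for m in masters]

    obs = [(150.0012, 2.2001), (150.0013, 2.2002), (150.0500, 2.2500)]
    print(merge_detections(obs))          # two masters, the first seen twice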

  19. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Building on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches, as well as the consequences of the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle, are discussed. (orig.)
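
    The calibration against a normal database typically amounts to normalizing the patient's uptake map to its own maximum and expressing each segment as a z-score against normal-population statistics. The sketch below shows that arithmetic with hypothetical six-segment arrays; it is not any specific commercial package's algorithm.

    import numpy as np

    def uptake_zscores(patient_counts, normal_mean, normal_sd):
        """Percent-of-peak normalization followed by z-scores against a
        normal database. One value per myocardial segment; all hypothetical."""
        rel = 100.0 * patient_counts / patient_counts.max()
        return (rel - normal_mean) / normal_sd

    patient = np.array([82.0, 95.0, 100.0, 88.0, 64.0, 90.0])  # raw counts
    db_mean = np.array([85.0, 92.0, 97.0, 90.0, 84.0, 91.0])   # % of peak
    db_sd = np.array([5.0, 4.0, 3.0, 5.0, 6.0, 4.0])
    print(np.round(uptake_zscores(patient, db_mean, db_sd), 1))
    # segment 5 stands out at z ~ -3.3, flagging reduced uptake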

  20. Quantitative Trait Loci in Inbred Lines

    NARCIS (Netherlands)

    Jansen, R.C.

    2001-01-01

    Quantitative traits result from the influence of multiple genes (quantitative trait loci) and environmental factors. Detecting and mapping the individual genes underlying such 'complex' traits is a difficult task. Fortunately, populations obtained from crosses between inbred lines are relatively

  1. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  2. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  3. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Background: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point-grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
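
    The two-wavelength quantitation can be caricatured as a per-pixel classification followed by area-fraction bookkeeping. The thresholds and class rules below are hypothetical placeholders, not the study's actual algorithm.

    import numpy as np

    def quantify_components(ch1, ch2, t1=0.35, t2=0.45):
        """Classify pixels from two autofluorescence channels into tissue
        components and return area fractions. Thresholds and class rules
        are hypothetical stand-ins for the study's algorithm."""
        myocyte = (ch1 >= t1) & (ch2 < t2)
        fibrosis = (ch1 >= t1) & (ch2 >= t2)
        lipofuscin = (ch1 < t1) & (ch2 >= t2)
        extracellular = ~(myocyte | fibrosis | lipofuscin)
        n = ch1.size
        return {"myocyte": myocyte.sum() / n,
                "fibrosis": fibrosis.sum() / n,
                "lipofuscin": lipofuscin.sum() / n,
                "extracellular": extracellular.sum() / n}

    rng = np.random.default_rng(7)
    print(quantify_components(rng.random((512, 512)), rng.random((512, 512))))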

  4. Cold source economic study

    International Nuclear Information System (INIS)

    Fuster, Serge.

    1975-01-01

    This computer code is intended to establish the overall economic balance resulting from the use of a given cold source. The balance includes the investments needed for constructing the various components, as well as the production balances resulting from their utilization. The code can treat either an open-circuit condenser on sea or river water, or closed-circuit air-cooling systems used alone or as auxiliaries. The program can be used to optimize the characteristics of the various parts of the cold source. The performance of the various components can be evaluated for a given situation by means of very complete and precise economic balances, and the components can also be classified according to their possible uses, taking external constraints into account (limits on heat disposal into rivers or seas, water temperature, air temperature). Technical choices whose economic consequences are important are thus clarified [fr]

  5. Filtered cathodic arc source

    International Nuclear Information System (INIS)

    Falabella, S.; Sanders, D.M.

    1994-01-01

    A continuous cathodic arc ion source coupled to a macro-particle filter capable of separating or eliminating macro-particles from the ion flux produced by the cathodic arc discharge is described. The ion source employs an axial magnetic field on a cathode (target) having tapered sides to confine the arc, thereby providing high target material utilization. A bent magnetic field is used to guide the metal ions from the target to the part to be coated. The macro-particle filter consists of two straight solenoids, end to end, but placed at 45 degrees to one another, which prevents line-of-sight from the arc spot on the target to the parts to be coated, yet provides a path for ions and electrons to flow, and includes a series of baffles for trapping the macro-particles. 3 figures

  6. Fundamental quantitative security in quantum key generation

    International Nuclear Information System (INIS)

    Yuen, Horace P.

    2010-01-01

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation, including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance, a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K, when the attacker just gets at K before it is used in a cryptographic context, and its composition security, when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an interpretation of d which cannot be correct, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee, and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
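
    The operational point about the statistical distance d can be made numerically. If the attacker's distribution on an n-bit key is within distance d of uniform, her one-shot probability of guessing the whole key obeys p_guess <= 2^-n + d, so for any realistic d it is the d term, not the exponentially small 2^-n, that sets the guarantee, which is the paper's point. Values below are illustrative.

    # Bound on the attacker's one-shot guessing probability for an n-bit key
    # whose distribution is within statistical distance d of uniform:
    #     p_guess <= 2**-n + d
    n = 256
    for d in (1e-6, 1e-9, 1e-12):
        print(f"d = {d:.0e}:  p_guess <= {2.0**-n + d:.3e}")
    # A truly uniform 256-bit key would give p_guess = 2**-256 ~ 1e-77;
    # for any realistic d the bound is dominated by d itself.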

  7. Quantitative fluorescence nanoscopy for cancer biomedicine

    Science.gov (United States)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in large part due to our incomplete understanding of how key processes including oncogenesis and drug response are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  8. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan
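
    The toxic-equivalence step mentioned above is simple arithmetic: each chemical's concentration is scaled by its potency relative to the reference compound and summed. The relative potency value below is a made-up placeholder, not a measured number from the study.

    def toxic_equivalents(concentrations, relative_potencies):
        """TEQ = sum_i C_i * RPF_i, expressing a mixture as an equivalent
        concentration of the reference aromatase inhibitor (fadrozole)."""
        return sum(c * relative_potencies[name]
                   for name, c in concentrations.items())

    rpf = {"fadrozole": 1.0, "iprodione": 0.004}   # RPF value is hypothetical
    mix = {"fadrozole": 0.0, "iprodione": 50.0}    # exposure in ug/L
    print(f"fadrozole-equivalent exposure: "
          f"{toxic_equivalents(mix, rpf):.2f} ug/L")
    # The equivalent value can then drive the fadrozole-calibrated models.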

  9. MR Fingerprinting for Rapid Quantitative Abdominal Imaging.

    Science.gov (United States)

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas

    2016-04-01

    To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue
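
    Although the paper's dictionary is generated by Bloch simulation of the FISP sequence, the matching step itself is generic: normalize the dictionary entries, take inner products with the measured signal evolution, and report the (T1, T2) of the best match. The sketch below uses fabricated signal shapes purely to show the mechanics.

    import numpy as np

    def match_fingerprint(signal, dictionary, params):
        """Return the (T1, T2) whose simulated evolution best matches the
        measured signal by normalized inner product."""
        d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
        s = signal / np.linalg.norm(signal)
        return params[int(np.argmax(np.abs(d @ s)))]

    # Fabricated "dictionary" of signal evolutions, one row per (T1, T2) pair.
    t = np.linspace(0.0, 5.0, 200)
    params = [(t1, t2) for t1 in (500, 750, 1700) for t2 in (30, 45, 60)]
    dictionary = np.array([np.exp(-t / (t2 / 10.0)) * (1 - np.exp(-t / (t1 / 1000.0)))
                           for t1, t2 in params])
    signal = dictionary[4] + 0.01 * np.random.default_rng(3).normal(size=t.size)
    print(match_fingerprint(signal, dictionary, params))   # -> (750, 45)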

  10. Health impacts of different energy sources

    International Nuclear Information System (INIS)

    1982-01-01

    Energy is needed to sustain the economy, health and welfare of nations. As a consequence of this, energy consumption figures are frequently used as an index of a nation's advancement. As a result of the global energy crisis, almost every nation has had to develop all its available energy resources and plan its future use of energy. The planners of national and international energy policies are, however, often faced with a problem of 'public acceptance' arising from the potential health and environmental impacts of developing energy resources. The public's desire to preserve the quality of man's health and his environment frequently results in opposition to many industrial innovations, including the generation and use of energy. Reliable, quantitative data and information are needed on the risks to health and the environment of different contemporary energy systems, to improve public understanding and to serve as the basis from which national planners can choose between different energy supply options. With the exception of nuclear energy, even in technologically advanced countries little systematic research and development has been done on the quantitative assessment of the effects on health and the environment of the conventional energy sources. The need for this information has only been realized over the past decade, as the climate and environment in many regions of the world have deteriorated with the unabated release of pollutants from factories and energy generating plants in particular. A number of countries have started national environmental health research programmes to monitor and regulate toxic emissions from industry and energy plants. Energy-related environmental health research has been supported and co-ordinated by various international organizations such as the International Atomic Energy Agency (IAEA), World Health Organization (WHO) and United Nations Environment Programme (UNEP). WHO has supported expert reviews on the potential health risks posed

  11. [Renal patient's diet: Can fish be included?].

    Science.gov (United States)

    Castro González, M I; Maafs Rodríguez, A G; Galindo Gómez, C

    2012-01-01

    Medical and nutritional treatment for renal disease, now a major public health issue, is highly complicated. Nutritional therapy must seek to retard renal dysfunction, maintain an optimal nutritional status and prevent the development of underlying pathologies. To analyze ten fish species to identify those that, because of their low phosphorus content, high biological value protein and elevated content of the n-3 fatty acids EPA and DHA, could be included in the renal patient's diet. The following fish species (Little tunny, Red drum, Spotted eagle ray, Escolar, Swordfish, Big-scale pomfret, Cortez flounder, Largemouth black bass, Periche mojarra, Florida pompano) were analyzed according to the AOAC and Keller techniques to determine their protein, phosphorus, sodium, potassium, cholesterol, vitamins D(3) and E, and n-3 EPA+DHA content. These results were used to calculate relations between nutrients. The protein in the analyzed species ranged from 16.5 g/100 g of fillet (Largemouth black bass) to 27.2 g/100 g (Red drum); the lowest phosphorus value was 28.6 mg/100 g (Periche mojarra) and the highest 216.3 mg/100 g (Spotted eagle ray). 80% of the fish presented > 100 mg EPA+DHA in 100 g of fillet. On the basis of their phosphorus-to-protein ratio (mg P/g protein), Escolar and Swordfish could not be included in the renal diet; Little tunny, Escolar, Big-scale pomfret, Largemouth black bass, Periche mojarra and Florida pompano presented the lower phosphorus/(EPA+DHA) ratios. Florida pompano is the most recommended species for renal patients, due to its optimal nutrient relations. However, all analyzed species, except Escolar and Swordfish, could be included in renal diets.

  12. MOS modeling hierarchy including radiation effects

    International Nuclear Information System (INIS)

    Alexander, D.R.; Turfler, R.M.

    1975-01-01

    A hierarchy of modeling procedures has been developed for MOS transistors, circuit blocks, and integrated circuits which include the effects of total dose radiation and photocurrent response. The models were developed for use with the SCEPTRE circuit analysis program, but the techniques are suitable for other modern computer aided analysis programs. The modeling hierarchy permits the designer or analyst to select the level of modeling complexity consistent with circuit size, parametric information, and accuracy requirements. Improvements have been made in the implementation of important second order effects in the transistor MOS model, in the definition of MOS building block models, and in the development of composite terminal models for MOS integrated circuits

  13. Drug delivery device including electrolytic pump

    KAUST Repository

    Foulds, Ian G.; Buttner, Ulrich; Yi, Ying

    2016-01-01

    Systems and methods are provided for a drug delivery device and use of the device for drug delivery. In various aspects, the drug delivery device combines a “solid drug in reservoir” (SDR) system with an electrolytic pump. In various aspects an improved electrolytic pump is provided including, in particular, an improved electrolytic pump for use with a drug delivery device, for example an implantable drug delivery device. A catalytic reformer can be incorporated in a periodically pulsed electrolytic pump to provide stable pumping performance and reduced actuation cycle.

  14. Drug delivery device including electrolytic pump

    KAUST Repository

    Foulds, Ian G.

    2016-03-31

    Systems and methods are provided for a drug delivery device and use of the device for drug delivery. In various aspects, the drug delivery device combines a “solid drug in reservoir” (SDR) system with an electrolytic pump. In various aspects an improved electrolytic pump is provided including, in particular, an improved electrolytic pump for use with a drug delivery device, for example an implantable drug delivery device. A catalytic reformer can be incorporated in a periodically pulsed electrolytic pump to provide stable pumping performance and reduced actuation cycle.

  15. Energy principle with included boundary conditions

    International Nuclear Information System (INIS)

    Lehnert, B.

    1994-01-01

    Earlier comments by the author on the limitations of the classical form of the extended energy principle are supported by a complementary analysis on the potential energy change arising from free-boundary displacements of a magnetically confined plasma. In the final formulation of the extended principle, restricted displacements, satisfying pressure continuity by means of plasma volume currents in a thin boundary layer, are replaced by unrestricted (arbitrary) displacements which can give rise to induced surface currents. It is found that these currents contribute to the change in potential energy, and that their contribution is not taken into account by such a formulation. A general expression is further given for surface currents induced by arbitrary displacements. The expression is used to reformulate the energy principle for the class of displacements which satisfy all necessary boundary conditions, including that of the pressure balance. This makes a minimization procedure of the potential energy possible, for the class of all physically relevant test functions which include the constraints imposed by the boundary conditions. Such a procedure is also consistent with a corresponding variational calculus. (Author)

  16. Aerosol simulation including chemical and nuclear reactions

    International Nuclear Information System (INIS)

    Marwil, E.S.; Lemmon, E.C.

    1985-01-01

    The numerical simulation of aerosol transport, including the effects of chemical and nuclear reactions, presents a challenging dynamic accounting problem. Particles of different sizes agglomerate and settle out due to various mechanisms, such as diffusion, diffusiophoresis, thermophoresis, gravitational settling, turbulent acceleration, and centrifugal acceleration. Particles also change size, due to the condensation and evaporation of materials on the particle. Heterogeneous chemical reactions occur at the interface between a particle and the suspending medium, or a surface and the gas in the aerosol. Homogeneous chemical reactions occur within the aerosol suspending medium, within a particle, and on a surface. These reactions may include a phase change. Nuclear reactions occur in all locations. These spontaneous transmutations from one element form to another occur at greatly varying rates and may result in phase or chemical changes which complicate the accounting process. This paper presents an approach for inclusion of these effects on the transport of aerosols. The accounting system is very complex and results in a large set of stiff ordinary differential equations (ODEs). The techniques for numerical solution of these ODEs require special attention to achieve their solution in an efficient and affordable manner. 4 refs
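
    Stiff systems of this kind are normally handled with implicit integrators. The sketch below applies SciPy's BDF method to a toy two-species agglomeration/settling model with widely separated rate constants; it illustrates the numerical issue, not the accounting system described in the paper.

    from scipy.integrate import solve_ivp

    def aerosol_rhs(t, y):
        """Toy stiffness: small particles agglomerate quickly into large
        ones, which settle out slowly. Rate constants are illustrative."""
        n_small, n_large = y
        k_agg, k_settle = 1.0e3, 1.0e-2        # widely separated time scales
        d_small = -k_agg * n_small ** 2
        d_large = 0.5 * k_agg * n_small ** 2 - k_settle * n_large
        return [d_small, d_large]

    sol = solve_ivp(aerosol_rhs, (0.0, 100.0), [1.0, 0.0],
                    method="BDF", rtol=1e-8, atol=1e-12)
    print(f"{sol.nfev} RHS evaluations, final state: {sol.y[:, -1]}")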

  17. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 John Wiley & Sons, Inc.
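
    The isotope-dilution calculation itself is straightforward: a fixed amount of labeled internal standard is spiked into calibrators and samples, and concentration is read from a calibration line fitted to analyte-to-internal-standard peak-area ratios. All numbers below are hypothetical.

    import numpy as np

    # Calibrators: known creatinine (mg/dL), each spiked with the same fixed
    # amount of d3-creatinine internal standard. Numbers are hypothetical.
    cal_conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    cal_ratio = np.array([0.26, 0.51, 1.01, 2.55, 5.02])  # area(analyte)/area(IS)

    slope, intercept = np.polyfit(cal_ratio, cal_conc, 1)

    sample_ratio = 1.84            # measured peak-area ratio for the sample
    print(f"creatinine = {slope * sample_ratio + intercept:.2f} mg/dL")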

  18. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  19. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality

  20. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object's surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body's surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing the temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in subjects who are healthy with regard to spinal problems.
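
    A minimal sketch of first-order statistical indices of the kind described above, computed for two mirror-image ROIs; the data are synthetic and the exact index definitions used in the study may differ:

```python
# Hypothetical sketch: first-order statistics for two symmetric
# thermogram ROIs, plus a simple left-right asymmetry measure.
import numpy as np
from scipy import stats

def roi_indices(roi: np.ndarray) -> dict:
    """First-order statistics of a temperature ROI (deg C)."""
    t = roi.ravel()
    return {
        "mean": float(np.mean(t)),
        "std": float(np.std(t, ddof=1)),
        "skewness": float(stats.skew(t)),
        "kurtosis": float(stats.kurtosis(t)),
    }

def asymmetry(left: np.ndarray, right: np.ndarray) -> float:
    """Absolute mean-temperature difference between mirror ROIs."""
    return abs(float(np.mean(left)) - float(np.mean(right)))

# example with synthetic 20x20 ROIs around 33 deg C
rng = np.random.default_rng(0)
left = 33.0 + 0.4 * rng.standard_normal((20, 20))
right = 33.5 + 0.4 * rng.standard_normal((20, 20))
print(roi_indices(left), asymmetry(left, right))
```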

  1. Developing HYDMN code to include the transient of MNSR

    International Nuclear Information System (INIS)

    Al-Barhoum, M.

    2000-11-01

    A description of the programs added to the HYDMN code (a code for the thermal-hydraulic steady state of MNSR) to include the transient behaviour of the same MNSR is presented. The code asks for the initial conditions: the power (in kW) and the cold initial core inlet temperature (in degrees centigrade). A time-dependent study of the coolant inlet and outlet temperatures, the coolant speed, and the pool and tank temperatures is done for MNSR in general and for the Syrian MNSR in particular. The study solves the differential equations taken from reference (1) using numerical methods found in reference (3). In this way the code becomes independent of any external information source. (Author)
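
    For illustration only, a transient of this kind can be integrated with a simple forward-Euler scheme. The lumped-parameter model and all coefficients below are hypothetical stand-ins, not the HYDMN equations from the cited references:

```python
# Illustrative sketch: a lumped-parameter energy balance integrated with
# forward Euler, in the spirit of the transient coolant-temperature
# calculation described above. Model and coefficients are hypothetical.
P = 30e3            # reactor power, W (30 kW)
m_core = 5.0        # coolant mass in the core channel, kg
m_pool = 2.0e4      # pool water mass, kg
cp = 4186.0         # specific heat of water, J/(kg K)
UA = 800.0          # core-to-pool heat transfer coefficient * area, W/K

T_core, T_pool = 25.0, 25.0    # initial temperatures, deg C
dt, t_end = 0.5, 3600.0        # time step and duration, s

for step in range(int(t_end / dt)):
    q_loss = UA * (T_core - T_pool)              # heat lost to the pool, W
    T_core += dt * (P - q_loss) / (m_core * cp)  # core coolant heat-up
    T_pool += dt * q_loss / (m_pool * cp)        # slow pool heat-up

print(f"core outlet ~ {T_core:.1f} C, pool ~ {T_pool:.1f} C after 1 h")
```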

  2. Modification of SKYSHINE-III to include cask array shadowing

    Energy Technology Data Exchange (ETDEWEB)

    Hertel, N.E. [Georgia Institute of Technology, Atlanta, GA (United States); Pfeifer, H.J. [NAC International, Norcross, GA (United States); Napolitano, D.G. [NISYS Corporation, Duluth, GA (United States)

    2000-03-01

    The NAC International version of SKYSHINE-III has been expanded to represent the radiation emissions from ISFSI (Interim Spent Fuel Storage Installations) dry storage casks using surface source descriptions. In addition, this modification includes a shadow shielding algorithm of the casks in the array. The resultant code is a flexible design tool which can be used to rapidly assess the impact of various cask loadings and arrangements. An example of its use in calculating dose rates for a 10x8 cask array is presented. (author)
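
    The shadow-shielding idea can be sketched geometrically: a point on a cask surface is shadowed if its line of sight to the detector passes through another cask's footprint. A hedged 2-D sketch with circular footprints and invented coordinates, not the code's actual algorithm:

```python
# Geometric sketch of cask shadowing: a surface-source point is shadowed
# if the 2-D line of sight to the detector passes within one cask radius
# of another cask's centre. Purely illustrative assumptions throughout.
import numpy as np

def blocked(src, det, centers, radius):
    """True if the segment src -> det passes through any cask footprint."""
    src, det = np.asarray(src, float), np.asarray(det, float)
    d = det - src
    for c in centers:
        c = np.asarray(c, float)
        t = np.clip(np.dot(c - src, d) / np.dot(d, d), 0.0, 1.0)
        closest = src + t * d                 # nearest point on the segment
        if np.linalg.norm(c - closest) < radius:
            return True
    return False

# source point on the cask at (0, 0); neighbouring casks on a 3 m pitch
others = [(3.0, 0.0), (0.0, 3.0), (3.0, 3.0)]
print(blocked(src=(1.0, 0.0), det=(20.0, 0.0), centers=others, radius=1.0))
```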

  3. Negative ion sources for tandem accelerator

    International Nuclear Information System (INIS)

    Minehara, Eisuke

    1980-08-01

    Four kinds of negative ion sources (direct extraction Duoplasmatron ion source, radial extraction Penning ion source, lithium charge exchange ion source and Middleton-type sputter ion source) have been installed in the JAERI tandem accelerator. The ion sources can generate many negative ions ranging from hydrogen to uranium with the exception of Ne, Ar, Kr, Xe and Rn. Discussions presented in this report include mechanisms of negative ion formation, electron affinity and stability of negative ions, performance of the ion sources and materials used for negative ion production. Finally, the author discusses difficult problems to be overcome in order to obtain any negative ion in sufficient quantity. (author)

  4. Use of radioactive indicators for the quantitative determination of non-metall inclusions in steel

    International Nuclear Information System (INIS)

    Rewienska-Kosciuk, B.; Michalik, J.

    1979-01-01

    Methods of determining and investigating the sources of non-metal inclusions in steel are presented together with some results of radiometric investigations. The experience of several years of research in industries as well as profound studies of world literature were used as a basis for systematic and critical discussion of the methods used. Optimum methods have been chosen for the quantitative determination of oxide inclusions and for the identification of their origin (e.g. from the refractory furnace lining, the tap-hole, the runner, the ladle or mold slag). Problems of tracers (type, quantity, condition, activity), of the labelling method suitable for the various origins of inclusions, of sampling, of chemical processing of the material sampled, as well as of radiometric measuring techniques (including possible activation) are discussed. Finally, a method for the determination of inclusions resulting from the deoxidation of steel is briefly outlined. (author)

  5. The surgery of peripheral nerves (including tumors)

    DEFF Research Database (Denmark)

    Fugleholm, Kåre

    2013-01-01

    Surgical pathology of the peripheral nervous system includes traumatic injury, entrapment syndromes, and tumors. The recent significant advances in the understanding of the pathophysiology and cellular biology of peripheral nerve degeneration and regeneration has yet to be translated into improved...... surgical techniques and better outcome after peripheral nerve injury. Decision making in peripheral nerve surgery continues to be a complex challenge, where the mechanism of injury, repeated clinical evaluation, neuroradiological and neurophysiological examination, and detailed knowledge of the peripheral...... nervous system response to injury are prerequisite to obtain the best possible outcome. Surgery continues to be the primary treatment modality for peripheral nerve tumors and advances in adjuvant oncological treatment has improved outcome after malignant peripheral nerve tumors. The present chapter...

  6. CERN Technical Training: LABVIEW courses include RADE

    CERN Multimedia

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution to developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". " LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course pr...

  7. CERN Technical Training: LABVIEW courses include RADE

    CERN Multimedia

    HR Department

    2009-01-01

    The contents of the "LabView Basic I" and "LabView Intermediate II" courses have recently been changed to include, respectively, an introduction to and expert training in the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution to developing expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". " LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course prepares participants to develop test and measurement, da...

  8. CERN Technical Training: LABVIEW courses include RADE

    CERN Multimedia

    HR Department

    2009-01-01

    The contents of "LabView Basic I" and "LabView Intermediate II" trainings have been recently changed to include, respectively, an introduction and an expert training on the Rapid Application Development Environment (RADE). RADE is a LabView-based application developed at CERN to integrate LabView in the accelerator and experiment control infrastructure. It is a suitable solution to develop expert tools, machine development analysis and independent test facilities. The course names have also been changed to "LabVIEW Basics I with RADE Introduction" and "LabVIEW Intermediate II with Advanced RADE Application". " LabVIEW Basics I with RADE Introduction" is designed for: Users preparing to develop applications using LabVIEW, or NI Developer Suite; users and technical managers evaluating LabVIEW or NI Developer Suite in purchasing decisions; users pursuing the Certified LabVIEW Developer certification. The course prepare...

  9. Critical point anomalies include expansion shock waves

    Energy Technology Data Exchange (ETDEWEB)

    Nannan, N. R., E-mail: ryan.nannan@uvs.edu [Mechanical Engineering Discipline, Anton de Kom University of Suriname, Leysweg 86, PO Box 9212, Paramaribo, Suriname and Process and Energy Department, Delft University of Technology, Leeghwaterstraat 44, 2628 CA Delft (Netherlands); Guardone, A., E-mail: alberto.guardone@polimi.it [Department of Aerospace Science and Technology, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Colonna, P., E-mail: p.colonna@tudelft.nl [Propulsion and Power, Delft University of Technology, Kluyverweg 1, 2629 HS Delft (Netherlands)

    2014-02-15

    From first-principle fluid dynamics, complemented by a rigorous state equation accounting for critical anomalies, we discovered that expansion shock waves may occur in the vicinity of the liquid-vapor critical point in the two-phase region. Due to universality of near-critical thermodynamics, the result is valid for any common pure fluid in which molecular interactions are only short-range, namely, for so-called 3-dimensional Ising-like systems, and under the assumption of thermodynamic equilibrium. In addition to rarefaction shock waves, diverse non-classical effects are admissible, including composite compressive shock-fan-shock waves, due to the change of sign of the fundamental derivative of gasdynamics.
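
    The admissibility condition behind these non-classical waves is conventionally stated via the fundamental derivative of gasdynamics:

```latex
% Fundamental derivative of gasdynamics; classical fluids have \Gamma > 0,
% while expansion (rarefaction) shocks become admissible where \Gamma < 0.
\Gamma \equiv 1 + \frac{\rho}{c}\left(\frac{\partial c}{\partial \rho}\right)_{s}
        = \frac{v^{3}}{2 c^{2}}\left(\frac{\partial^{2} p}{\partial v^{2}}\right)_{s}
```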

  10. CLIC expands to include the Southern Hemisphere

    CERN Multimedia

    Roberto Cantoni

    2010-01-01

    Australia has recently joined the CLIC collaboration: the enlargement will bring new expertise and resources to the project, and is especially welcome in the wake of CERN budget redistributions following the recent adoption of the Medium Term Plan.   The countries involved in the CLIC collaboration With the signing of a Memorandum of Understanding on 26 August 2010, the ACAS network (Australian Collaboration for Accelerator Science) became the 40th member of the multilateral CLIC collaboration, making Australia the 22nd country to join the collaboration. “The new MoU was signed by the ACAS network, which includes the Australian Synchrotron and the University of Melbourne”, explains Jean-Pierre Delahaye, CLIC Study Leader. “Thanks to their expertise, the Australian institutes will contribute greatly to the CLIC damping rings and the two-beam test modules." Institutes from any country wishing to join the CLIC collaboration are invited to assume responsibility o...

  11. Should Broca's area include Brodmann area 47?

    Science.gov (United States)

    Ardila, Alfredo; Bernal, Byron; Rosselli, Monica

    2017-02-01

    Understanding the brain organization of speech production has been a principal goal of neuroscience. Historically, speech production has been associated with so-called Broca's area (Brodmann areas [BA] 44 and 45); however, modern neuroimaging developments suggest speech production is associated with networks rather than with areas. The purpose of this paper was to analyze the connectivity of BA47 (pars orbitalis) in relation to language. A meta-analysis was conducted to assess the language network in which BA47 is involved. The BrainMap database was used. Twenty papers corresponding to 29 experimental conditions with a total of 373 subjects were included. Our results suggest that BA47 participates in a "frontal language production system" (or extended Broca's system). The BA47 connectivity found is also concordant with a minor role in language semantics. BA47 plays a central role in the language production system.

  12. Musculoskeletal ultrasound including definitions for ultrasonographic pathology

    DEFF Research Database (Denmark)

    Wakefield, RJ; Balint, PV; Szkudlarek, Marcin

    2005-01-01

    Ultrasound (US) has great potential as an outcome in rheumatoid arthritis trials for detecting bone erosions, synovitis, tendon disease, and enthesopathy. It has a number of distinct advantages over magnetic resonance imaging, including good patient tolerability and ability to scan multiple joints...... in a short period of time. However, there are scarce data regarding its validity, reproducibility, and responsiveness to change, making interpretation and comparison of studies difficult. In particular, there are limited data describing standardized scanning methodology and standardized definitions of US...... pathologies. This article presents the first report from the OMERACT ultrasound special interest group, which has compared US against the criteria of the OMERACT filter. Also proposed for the first time are consensus US definitions for common pathological lesions seen in patients with inflammatory arthritis....

  13. Grand unified models including extra Z bosons

    International Nuclear Information System (INIS)

    Li Tiezhong

    1989-01-01

    The grand unified theories (GUTs) of the simple Lie groups including extra Z bosons are discussed. Under the author's hypothesis there are only the SU(5+m), SO(6+4n) and E6 groups. A general discussion of SU(5+m) is given, then SU(6) and SU(7) are considered. In SU(6) the 15 + 6* + 6* fermion representations are used, which differ from the others in fermion content, Yukawa couplings and breaking scales. A conception of clans of particles, which are not families, is suggested. These clans consist of extra Z bosons and the corresponding fermions of that scale. All of the fermions in the clans are down quarks, except for the standard-model clan, which consists of Z bosons and 15 fermions; therefore the spectrum of the hadrons composed of these down quarks differs from that of the hadrons known at present

  14. Including climate change in energy investment decisions

    International Nuclear Information System (INIS)

    Ybema, J.R.; Boonekamp, P.G.M.; Smit, J.T.J.

    1995-08-01

    To properly take climate change into account in the analysis of energy investment decisions, it is required to apply decision analysis methods that are capable of considering the specific characteristics of climate change (large uncertainties, long time horizon). Such decision analysis methods do exist. They can explicitly include evolving uncertainties, multi-stage decisions, cumulative effects and risk-averse attitudes. Various methods are considered in this report and two of these methods have been selected: hedging calculations and sensitivity analysis. These methods are applied to illustrative examples, and their limitations are discussed. The examples are (1a) space heating and hot water for new houses from a private investor perspective and (1b) as example (1a) but from a government perspective, (2) electricity production with an integrated coal gasification combined cycle (ICGCC) with or without CO2 removal, and (3) national energy strategy to hedge for climate change. 9 figs., 21 tabs., 42 refs., 1 appendix
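
    A minimal sketch of a hedging calculation of the kind referred to above, with invented numbers and a single two-outcome climate-policy scenario instead of the report's full multi-stage treatment:

```python
# Two-outcome hedging illustration: choose a first-stage heating
# investment before knowing whether a stringent climate policy arrives,
# then compare expected lifetime costs. All numbers are invented.
options = {             # first-stage choice: (capex, opex_low, opex_high)
    "gas boiler": (10.0, 4.0, 9.0),   # cheap now, exposed to a CO2 price
    "heat pump":  (18.0, 3.0, 3.5),   # hedge: robust to the CO2 price
}
p_high = 0.4            # probability of a stringent climate policy
years = 15

for name, (capex, opex_low, opex_high) in options.items():
    expected = capex + years * ((1 - p_high) * opex_low + p_high * opex_high)
    print(f"{name:>10}: expected cost {expected:.1f} kEUR")
```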

  15. Education Program on Fossil Resources Including Coal

    Science.gov (United States)

    Usami, Masahiro

    Fossil fuels including coal play a key role as crucial energy sources contributing to economic development in Asia. On the other hand, their limited quantity and the environmental problems caused by their usage have become serious global issues, and countermeasures to solve such problems are much in demand. Along with the pursuit of sustainable development, environmentally friendly use of highly efficient fossil resources should therefore be promoted. Kyushu University's sophisticated research, built on long years of accumulated experience in the fossil resources and environmental sectors, together with advanced large-scale commercial and empirical equipment, enables us to foster cooperative research and provide an internship program for future researchers. This program is executed as a consignment business from the Ministry of Economy, Trade and Industry from fiscal year 2007 to fiscal year 2009. A lecture course using the textbooks developed by this program is scheduled to start in fiscal year 2010.

  16. Urban PM in Eastern Germany: Source apportionment and contributions from different spatial scales

    Science.gov (United States)

    van Pinxteren, D.; Fomba, K. W.; Mothes, F.; Spindler, G.; Herrmann, H.

    2017-12-01

    Understanding the contributions of particulate matter (PM) sources and the source areas impacting total PM levels in a city are important requirements for further developing clean air policies and efficient abatement strategies. This presentation reports on two studies in Eastern Germany providing a detailed picture of present-day urban PM sources and discriminating contributions of local, regional and long-range sources. The "Leipzig Aerosol 2013-15" study yielded contributions of 12 sources to coarse, fine, and ultrafine particles, resolved by Positive Matrix Factorization (PMF) from comprehensive chemical speciation of 5-stage Berner impactor samples at 4 different sites in the Leipzig area. Dominant winter-time sources were traffic exhaust and non-exhaust emissions, secondary aerosol formation, and combustion emissions from both biomass and coal burning with different relative importance in different particle size ranges. Local sources dominated PM levels in ultrafine and coarse particles (60% - 80%) while high mass concentrations in accumulation mode particles mainly resulted from regional import into the city (70%). The "PM-East" study compiled PM10 mass and constituents' concentrations at 10 urban and rural sites in Eastern Germany during winter 2016/17, which included a 3-week episode of frequent exceedances of the PM10 limit value. PMF source apportionment is performed for a subset of the sites, including the city of Berlin. Contributions from short-, mid-, and long-range sources, including trans-boundary pollution import from neighbouring countries, are quantitatively assessed by advanced back trajectory statistical methods. Data analysis in PM-East is ongoing and final results will be available by November. Funding is acknowledged from 4 federal states of Germany: Berlin Senate Department for Environment, Transport and Climate Protection; Saxon State Office for Environment, Agriculture and Geology; State Agency for Environment, Nature Conservation and
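
    The factorization at the core of PMF can be sketched with plain non-negative matrix factorization. Note that real PMF (e.g. EPA PMF) additionally weights residuals by measurement uncertainties, which the plain NMF below does not:

```python
# Minimal sketch of the factorization idea behind PMF: approximate the
# (samples x species) concentration matrix X as G @ F with non-negative
# factors G (source contributions) and F (source profiles).
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
G_true = rng.uniform(0, 1, (200, 3))     # contributions of 3 sources
F_true = rng.uniform(0, 1, (3, 12))      # profiles over 12 species
X = G_true @ F_true + 0.01 * rng.uniform(0, 1, (200, 12))  # noisy data

model = NMF(n_components=3, init="nndsvda", max_iter=500, random_state=0)
G = model.fit_transform(X)               # estimated contributions
F = model.components_                    # estimated profiles
print("reconstruction error:", model.reconstruction_err_)
```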

  17. The Chandra Source Catalog: Algorithms

    Science.gov (United States)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.
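
    A toy illustration of the encircled-energy correction step, with a Gaussian PSF standing in for the ray-trace model and invented numbers:

```python
# Counts in a finite aperture are divided by the fraction of the model
# PSF falling inside that aperture. The Gaussian PSF here is only a
# stand-in for the ray-traced point spread function.
import numpy as np

def ee_fraction_gaussian(r_aperture, sigma):
    """Encircled energy of a 2-D Gaussian PSF within radius r."""
    return 1.0 - np.exp(-r_aperture**2 / (2.0 * sigma**2))

counts, exposure = 480.0, 10_000.0                       # net counts, s
frac = ee_fraction_gaussian(r_aperture=2.0, sigma=1.2)   # radii in arcsec
corrected_rate = counts / exposure / frac
print(f"EE fraction {frac:.3f}, corrected rate {corrected_rate:.4f} ct/s")
```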

  18. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain
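
    The quantity underlying feature-tracking strain can be sketched as the Lagrangian strain of a tracked contour length relative to its end-diastolic reference; the values below are synthetic:

```python
# Illustrative only: percent strain from tracked contour lengths, the
# basic quantity behind feature-tracking strain analysis. Real software
# tracks myocardial contours frame by frame.
import numpy as np

def lagrangian_strain(length_t: np.ndarray, length_0: float) -> np.ndarray:
    """Percent strain relative to the end-diastolic reference length."""
    return 100.0 * (length_t - length_0) / length_0

t = np.linspace(0.0, 1.0, 25)             # fraction of the R-R interval
circ = 18.0 - 3.0 * np.sin(np.pi * t)     # mid-wall circumference, cm
strain = lagrangian_strain(circ, circ[0])
strain_rate = np.gradient(strain, t)      # % per cycle fraction
print(f"peak circumferential strain: {strain.min():.1f} %")
```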

  19. Physical and Chemical Barriers in Root Tissues Contribute to Quantitative Resistance to Fusarium oxysporum f. sp. pisi in Pea

    Directory of Open Access Journals (Sweden)

    Moustafa Bani

    2018-02-01

    Fusarium wilt caused by Fusarium oxysporum f. sp. pisi (Fop) is one of the most destructive diseases of pea worldwide. Control of this disease is difficult and is mainly based on the use of resistant cultivars. While monogenic resistance has been successfully used in the field, it is at risk of breakdown by the constant evolution of the pathogen. New sources of quantitative resistance have recently been identified from a wild relative Pisum spp. collection. Here, we characterize histologically the resistance mechanisms occurring in these sources of quantitative resistance. Detailed comparison, at the cellular level, of the reactions of eight pea accessions with differential responses to Fop race 2 showed that resistant accessions established several barriers at the epidermis, exodermis, cortex, endodermis and vascular stele, efficiently impeding fungal progression. The main components of these different barriers were carbohydrates and phenolic compounds including lignin. We found that these barriers were mainly based on three defense mechanisms: cell wall strengthening, formation of papilla-like structures at penetration sites, and accumulation of different substances within and between cells. These defense reactions varied in intensity and localization between resistant accessions. Our results also clarify some steps of the infection process of F. oxysporum in planta and support the important role of cell wall-degrading enzymes in F. oxysporum pathogenicity.

  20. Qualitative and Quantitative Sentiment Proxies

    DEFF Research Database (Denmark)

    Zhao, Zeyan; Ahmad, Khurshid

    2015-01-01

    Sentiment analysis is a content-analytic investigative framework for researchers, traders and the general public involved in financial markets. This analysis is based on carefully sourced and elaborately constructed proxies for market sentiment and has emerged as a basis for analysing movements...

  1. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  2. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region of interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation
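
    For reference, the Fick principle underlying such global flow measurements states that flow equals tracer uptake divided by the arterio-venous concentration difference:

```latex
% Fick principle: blood flow from tracer uptake and the
% arterio-venous concentration difference.
Q = \frac{\dot{m}_{\mathrm{uptake}}}{C_{a} - C_{v}}
```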

  3. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
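
    A hedged sketch of the risk-adjusted NPV comparison described above, charging each design its expected annual fire loss from the risk analysis; all figures are invented:

```python
# Compare a design without a fire safety system to one with it, using a
# risk-adjusted NPV that subtracts the expected annual fire loss.
def risk_adjusted_npv(investment, annual_benefit, expected_annual_loss,
                      rate, years):
    npv = -investment
    for t in range(1, years + 1):
        npv += (annual_benefit - expected_annual_loss) / (1 + rate) ** t
    return npv

# sprinkler system: costs 100 kEUR, cuts expected fire loss 12 -> 3 kEUR/yr
without = risk_adjusted_npv(0.0, 0.0, 12.0, 0.05, 20)
with_sys = risk_adjusted_npv(100.0, 0.0, 3.0, 0.05, 20)
print(f"intrinsic value of the system: {with_sys - without:.1f} kEUR")
```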

  4. Material-specific Conversion Factors for Different Solid Phantoms Used in the Dosimetry of Different Brachytherapy Sources

    Directory of Open Access Journals (Sweden)

    Sedigheh Sina

    2015-07-01

    Introduction: Based on Task Group No. 43 (TG-43U1) recommendations, water is proposed as the reference phantom for the dosimetry of brachytherapy sources. The experimental determination of TG-43 parameters is usually performed in water-equivalent solid phantoms. The purpose of this study was to determine conversion factors for equalizing solid phantoms to water. Materials and Methods: TG-43 parameters of low- and high-energy brachytherapy sources (i.e., Pd-103, I-125 and Cs-137) were obtained in different phantoms, using Monte Carlo simulations. The brachytherapy sources were simulated at the center of different phantoms including water, solid water, poly(methyl methacrylate), polystyrene and polyethylene. Dosimetric parameters such as the dose rate constant, radial dose function and anisotropy function of each source were compared in the different phantoms. Then, conversion factors were obtained to make the phantom parameters equivalent to those of water. Results: Polynomial coefficients of conversion factors were obtained for all sources to quantitatively compare g(r) values in different phantom materials with the radial dose function in water. Conclusion: Polynomial coefficients of conversion factors were obtained for all sources to quantitatively compare g(r) values in different phantom materials and the radial dose function in water.
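
    The conversion-factor fit can be sketched as a polynomial regression on the ratio of radial dose functions; the g(r) values below are made up for illustration:

```python
# Fit a polynomial to g_water(r) / g_phantom(r) so that radial dose
# functions measured in a solid phantom can be converted to water.
import numpy as np

r = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0])   # distance, cm
g_water = np.array([1.04, 1.00, 0.91, 0.81, 0.62, 0.46, 0.29])
g_solid = np.array([1.05, 1.00, 0.90, 0.79, 0.59, 0.43, 0.26])

ratio = g_water / g_solid               # conversion factor vs distance
coeffs = np.polyfit(r, ratio, deg=3)    # polynomial coefficients
to_water = np.polyval(coeffs, r)        # evaluate the fitted factor
print(np.round(coeffs, 5), np.max(np.abs(to_water * g_solid - g_water)))
```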

  5. Analysis of Smart Composite Structures Including Debonding

    Science.gov (United States)

    Chattopadhyay, Aditi; Seeley, Charles E.

    1997-01-01

    Smart composite structures with distributed sensors and actuators have the capability to actively respond to a changing environment while offering significant weight savings and additional passive controllability through ply tailoring. Piezoelectric sensing and actuation of composite laminates is the most promising concept due to the static and dynamic control capabilities. Essential to the implementation of these smart composites are the development of accurate and efficient modeling techniques and experimental validation. This research addresses each of these important topics. A refined higher order theory is developed to model composite structures with surface bonded or embedded piezoelectric transducers. These transducers are used as both sensors and actuators for closed loop control. The theory accurately captures the transverse shear deformation through the thickness of the smart composite laminate while satisfying stress free boundary conditions on the free surfaces. The theory is extended to include the effect of debonding at the actuator-laminate interface. The developed analytical model is implemented using the finite element method utilizing an induced strain approach for computational efficiency. This allows general laminate geometries and boundary conditions to be analyzed. The state space control equations are developed to allow flexibility in the design of the control system. Circuit concepts are also discussed. Static and dynamic results of smart composite structures, obtained using the higher order theory, are correlated with available analytical data. Comparisons, including debonded laminates, are also made with a general purpose finite element code and available experimental data. Overall, very good agreement is observed. Convergence of the finite element implementation of the higher order theory is shown with exact solutions. Additional results demonstrate the utility of the developed theory to study piezoelectric actuation of composite

  6. Quantitative utilization of prior biological knowledge in the Bayesian network modeling of gene expression data

    Directory of Open Access Journals (Sweden)

    Gao Shouguo

    2011-08-01

    Background: Bayesian networks (BN) are a powerful approach to reconstructing genetic regulatory networks from gene expression data. However, expression data by itself suffers from high noise and lack of power. Incorporating prior biological knowledge can improve the performance. As each type of prior knowledge on its own may be incomplete or limited by quality issues, integrating multiple sources of prior knowledge to utilize their consensus is desirable. Results: We introduce a new method to incorporate the quantitative information from multiple sources of prior knowledge. It first uses a Naïve Bayesian classifier to assess the likelihood of functional linkage between gene pairs based on prior knowledge. In this study we included co-citation in PubMed and semantic similarity in Gene Ontology annotation. A candidate network edge reservoir is then created in which the copy number of each edge is proportional to the estimated likelihood of linkage between the two corresponding genes. In network simulation the Markov chain Monte Carlo sampling algorithm is adopted, which samples from this reservoir at each iteration to generate new candidate networks. We evaluated the new algorithm using both simulated and real gene expression data, including that from a yeast cell cycle and a mouse pancreas development/growth study. Incorporating prior knowledge led to a ~2-fold increase in the number of known transcription regulations recovered, without significant change in the false positive rate. In contrast, without the prior knowledge BN modeling is not always better than a random selection, demonstrating the necessity in network modeling to supplement the gene expression data with additional information. Conclusion: Our new development provides a statistical means to utilize the quantitative information in prior biological knowledge in the BN modeling of gene expression data, which significantly improves the performance.
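
    A conceptual sketch of the edge-reservoir construction follows. The likelihood values are invented; the paper derives them from co-citation and Gene Ontology similarity with a naive Bayes classifier:

```python
# Each candidate edge is replicated in proportion to its prior likelihood
# of functional linkage; MCMC proposals then draw edges from the reservoir.
import random

prior_likelihood = {            # P(functional linkage | prior evidence)
    ("geneA", "geneB"): 0.80,
    ("geneA", "geneC"): 0.30,
    ("geneB", "geneC"): 0.05,
}

SCALE = 100                     # reservoir copies per unit likelihood
reservoir = []
for edge, p in prior_likelihood.items():
    reservoir.extend([edge] * int(round(p * SCALE)))

random.seed(0)
proposal = random.choice(reservoir)  # edge proposed for the next BN move
print("proposed edge:", proposal)
```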

  7. Ion source and injector development

    International Nuclear Information System (INIS)

    Curtis, C.D.

    1976-01-01

    This is a survey of low-energy accelerators which inject into proton linacs. Laboratories covered include Argonne, Brookhaven, CERN, Chalk River, Fermi, ITEP, KEK, Rutherford, and Saclay. This paper emphasizes complete injector systems, comparing significant hardware features and beam performance data, including recent additions. There is increased activity now in the acceleration of polarized protons, H+ and H-, and of unpolarized H-. New source development and programs for these ion beams are outlined at the end of the report. Heavy-ion sources are not included

  8. Data source handbook

    CERN Document Server

    Warden, Pete

    2011-01-01

    If you're a developer looking to supplement your own data tools and services, this concise ebook covers the most useful sources of public data available today. You'll find useful information on APIs that offer broad coverage, tie their data to the outside world, and are either accessible online or feature downloadable bulk data. You'll also find code and helpful links. This guide organizes APIs by the subjects they cover, such as websites, people, or places, so you can quickly locate the best resources for augmenting the data you handle in your own service. Categories include: Website tools

  9. GPC Single Source Letter

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  10. Sources of tritium

    International Nuclear Information System (INIS)

    Phillips, J.E.; Easterly, C.E.

    1980-12-01

    A review of tritium sources is presented. The tritium production and release rates are discussed for light water reactors (LWRs), heavy water reactors (HWRs), high temperature gas cooled reactors (HTGRs), liquid metal fast breeder reactors (LMFBRs), and molten salt breeder reactors (MSBRs). In addition, release rates are discussed for tritium production facilities, fuel reprocessing plants, weapons detonations, and fusion reactors. A discussion of the chemical form of the release is included. The energy producing facilities are ranked in order of increasing tritium production and release. The ranking is: HTGRs, LWRs, LMFBRs, MSBRs, and HWRs. The majority of tritium has been released in the form of tritiated water

  11. Zγ production at NNLO including anomalous couplings

    Science.gov (United States)

    Campbell, John M.; Neumann, Tobias; Williams, Ciaran

    2017-11-01

    In this paper we present a next-to-next-to-leading order (NNLO) QCD calculation of the processes pp → l+l-γ and pp → νν̄γ that we have implemented in MCFM. Our calculation includes QCD corrections at NNLO both for the Standard Model (SM) and additionally in the presence of Zγγ and ZZγ anomalous couplings. We compare our implementation, obtained using the jettiness slicing approach, with a previous SM calculation and find broad agreement. Focusing on the sensitivity of our results to the slicing parameter, we show that using our setup we are able to compute NNLO cross sections with numerical uncertainties of about 0.1%, which is small compared to residual scale uncertainties of a few percent. We study potential improvements using two different jettiness definitions and the inclusion of power corrections. At √s = 13 TeV we present phenomenological results and consider Zγ as a background to H → Zγ production. We find that, with typical cuts, the inclusion of NNLO corrections represents a small effect and loosens the extraction of limits on anomalous couplings by about 10%.

  12. Alternating phase focussing including space charge

    International Nuclear Information System (INIS)

    Cheng, W.H.; Gluckstern, R.L.

    1992-01-01

    Longitudinal stability can be obtained in a non-relativistic drift tube accelerator by traversing each gap as the rf accelerating field rises. However, the rising accelerating field leads to a transverse defocusing force which is usually overcome by magnetic focusing inside the drift tubes. The radio frequency quadrupole is one way of providing simultaneous longitudinal and transverse focusing without the use of magnets. One can also avoid the use of magnets by traversing alternate gaps between drift tubes as the field is rising and falling, thus providing an alternation of focusing and defocusing forces in both the longitudinal and transverse directions. The stable longitudinal phase space area is quite small, but recent efforts suggest that alternating phase focusing (APF) may permit low velocity acceleration of currents in the 100-300 mA range. This paper presents a study of the parameter space and a test of crude analytic predictions by adapting the code PARMILA, which includes space charge, to APF. 6 refs., 3 figs

  13. Probabilistic production simulation including CHP plants

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.V.; Palsson, H.; Ravn, H.F.

    1997-04-01

    A probabilistic production simulation method is presented for an energy system containing combined heat and power plants. The method permits incorporation of stochastic failures (forced outages) of the plants and is well suited for analysis of the dimensioning of the system, that is, for finding the appropriate types and capacities of production plants in relation to expansion planning. The method is in the tradition of similar approaches for the analysis of power systems, based on the load duration curve. The present method extends on this by considering a two-dimensional load duration curve where the two dimensions represent heat and power. The method permits the analysis of a combined heat and power system which includes all the basic relevant types of plants, viz., condensing plants, back pressure plants, extraction plants and heat plants. The focus of the method is on the situation where the heat side has priority. This implies that on the power side there may be imbalances between demand and production. The method permits quantification of the expected power overflow, the expected unserviced power demand, and the expected unserviced heat demand. It is shown that a discretization method as well as double Fourier series may be applied in algorithms based on the method. (au) 1 tab., 28 ills., 21 refs.
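
    The classical one-dimensional building block of such probabilistic production costing is the convolution of the load duration curve with each unit's forced outage rate; the paper extends this to a two-dimensional heat/power curve. A toy sketch with invented numbers:

```python
# Represent the load duration curve as F(x) = P(load > x) and convolve
# in one unit with capacity C and forced outage rate q:
#     F_new(x) = (1 - q) * F(x + C) + q * F(x)
import numpy as np

x = np.arange(0.0, 201.0, 1.0)                 # load axis, MW
F = np.clip(1.0 - x / 150.0, 0.0, 1.0)         # toy load duration curve

def convolve_unit(F, x, capacity, outage_rate):
    """Equivalent load duration curve after committing one unit."""
    shifted = np.interp(x + capacity, x, F, right=0.0)  # F(x + C)
    return (1.0 - outage_rate) * shifted + outage_rate * F

F1 = convolve_unit(F, x, capacity=50.0, outage_rate=0.1)
unserved = np.sum(F1) * (x[1] - x[0])          # area under the curve, MW
print(f"expected demand left for the remaining units: {unserved:.1f} MW")
```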

  14. Langevin simulations of QCD, including fermions

    International Nuclear Information System (INIS)

    Kronfeld, A.S.

    1986-02-01

    We encounter critical slow down in updating when ξ/a → ∞ and in matrix inversion (needed to include fermions) when m_q·a → 0. A simulation that purports to solve QCD numerically will encounter these limits, so to face the challenge in the title of this workshop, we must cure the disease of critical slow down. Physically, this critical slow down is due to the reluctance of changes at short distances to propagate to large distances. Numerically, the stability of an algorithm at short wavelengths requires a (moderately) small step size; critical slow down occurs when the effective long wavelength step size becomes tiny. The remedy for this disease is an algorithm that propagates signals quickly throughout the system; i.e. one whose effective step size is not reduced for the long wavelength components of the fields. (Here the effective ''step size'' is essentially an inverse decorrelation time.) To do so one must resolve various wavelengths of the system and modify the dynamics (in CPU time) of the simulation so that all modes evolve at roughly the same rate. This can be achieved by introducing Fourier transforms. I show how to implement Fourier acceleration for Langevin updating and for conjugate gradient matrix inversion. The crucial feature of these algorithms that lends them to Fourier acceleration is that they update the lattice globally; hence the Fourier transforms are computed once per sweep rather than once per hit. (orig./HSI)
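
    A pedagogical sketch of Fourier-accelerated Langevin updating for a free scalar field on a 1-D periodic lattice, not the QCD implementation discussed above: each momentum mode gets a step size proportional to 1/(k̂² + m²), so all modes decorrelate at roughly the same rate:

```python
# Fourier-accelerated Langevin for a free scalar field: the drift
# -dS/dphi is applied in momentum space with a mode-dependent step
# eps(k) = tau / (khat^2 + m^2).
import numpy as np

N, m2, tau, n_steps = 64, 0.01, 0.5, 2000
k = 2.0 * np.pi * np.fft.fftfreq(N)
khat2 = 4.0 * np.sin(k / 2.0) ** 2          # lattice momentum squared
eps = tau / (khat2 + m2)                    # accelerated step sizes

rng = np.random.default_rng(0)
phi = np.zeros(N)
for _ in range(n_steps):
    phi_k = np.fft.fft(phi)
    drift_k = -(khat2 + m2) * phi_k         # -dS/dphi for the free action
    noise_k = np.fft.fft(rng.standard_normal(N))
    phi_k += eps * drift_k + np.sqrt(2.0 * eps) * noise_k
    phi = np.fft.ifft(phi_k).real

print(f"field variance after {n_steps} steps: {phi.var():.3f}")
```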

  15. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.

  16. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed

  17. Quantitative description of woody plant communities: Part II ...

    African Journals Online (AJOL)

    These procedures are divided into primary and secondary calculations. The former is then divided into the calculation of spatial tree volume and preliminary calculations regarding the complete quantitative description. The latter include the calculation of the evapotranspiration tree equivalent (ETTE), browse tree equivalent ...

  18. Superconducting ECR ion source system

    International Nuclear Information System (INIS)

    Sharma, S.C.; Gore, J.A.; Gupta, A.K.; Saxena, A.

    2017-01-01

    In order to cover the entire mass range of the elements across the periodic table, an ECR based heavy ion accelerator programme, consisting of a superconducting ECR (Electron Cyclotron Resonance) source and a room temperature RFQ (Radio Frequency Quadrupole) followed by low and high beta superconducting resonator cavities, has been proposed. The 18 GHz superconducting ECR ion source system has already been commissioned and is being operated periodically at the FOTIA beam hall. This source is capable of delivering ion beams from protons to uranium with high currents and high charge states over a wide mass range (1/7 ≤ q/m ≤ 1/2) across the periodic table, including U34+ (q/m ~ 1/7) with 100 pnA yield. The normalized transverse beam emittance from the ECR source is expected to be < 1.0 π mm mrad. ECR ion sources are quite robust, making them suitable for operating for weeks continuously without any interruption

  19. Incidents with hazardous radiation sources

    International Nuclear Information System (INIS)

    Schoenhacker, Stefan

    2016-01-01

    Incidents with hazardous radiation sources can occur in any country, even those without nuclear facilities. Preparedness for such incidents is supposed to fulfill globally agreed minimum standards. Incidents are categorized into incidents during licensed handling of radiation sources, e.g. for material testing; transport accidents involving hazardous radiation sources; incidents with radionuclide batteries; incidents with satellites containing radioactive inventory; and incidents during unlicensed handling of illegally acquired hazardous radiation sources. Emergency planning in Austria also differentiates according to the consequences: incidents with release of radioactive materials resulting in restricted contamination, incidents with release of radioactive materials resulting in local contamination, and incidents with the hazard of enhanced exposure due to the radiation source.

  20. [Origin of sennosides in health teas including Malva leaves].

    Science.gov (United States)

    Kojima, T; Kishi, M; Sekita, S; Satake, M

    2001-06-01

    The aim of this study is to clarify whether sennosides are contained in the leaf of Malva verticillata L., and then to clarify the source of sennosides in health teas including malva leaves. The identification and determination of sennosides were performed with thin layer chromatography and high performance liquid chromatography. The leaf of Malva verticillata L. did not contain sennosides A or B and could be easily distinguished from senna leaf. Our previous report showed that sennosides are contained in weight-reducing herbal teas including malva leaves, and that senna leaf is a herbal component in some teas. Furthermore, in 10 samples of health tea including malva leaves that were bought last year, the smallest amount of sennosides was 6.1 mg/bag, and all health teas including malva leaves contained the leaf and midrib of senna. We suggest that sennosides A and B are not contained in the leaf of Malva verticillata L., and that the sennosides in health teas including malva leaves are not derived from malva leaf but from senna leaf.