WorldWideScience

Sample records for surveys allowed quantification

  1. Photochemical Microscale Electrophoresis Allows Fast Quantification of Biomolecule Binding.

    Science.gov (United States)

    Möller, Friederike M; Kieß, Michael; Braun, Dieter

    2016-04-27

    Intricate spatiotemporal patterns emerge when chemical reactions couple to physical transport. We induce electrophoretic transport by a confined photochemical reaction and use it to infer the binding strength of a second, biomolecular binding reaction under physiological conditions. To this end, we use the photoactive compound 2-nitrobenzaldehyde, which releases a proton upon 375 nm irradiation. The charged photoproducts locally perturb electroneutrality due to differential diffusion, giving rise to an electric potential Φ in the 100 μV range on the micrometer scale. Electrophoresis of biomolecules in this field is counterbalanced by back-diffusion within seconds. The biomolecule concentration is measured by fluorescence and settles in proportion to exp(−μΦ/D). Typically, binding alters either the diffusion coefficient D or the electrophoretic mobility μ. Hence, the local biomolecule fluorescence directly reflects the binding state. A fit to the law of mass action reveals the dissociation constant of the binding reaction. We apply this approach to quantify the binding of the aptamer TBA15 to its protein target human-α-thrombin and to probe the hybridization of DNA. Dissociation constants in the nanomolar regime were determined and match results both from the literature and from control experiments using microscale thermophoresis. As our approach is all-optical, isothermal and requires only nanoliter volumes at nanomolar concentrations, it will allow for the fast screening of biomolecule binding in low-volume multiwell formats.
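
    The final fitting step lends itself to a short numerical sketch. Below is a minimal Python illustration of extracting a dissociation constant by fitting the quadratic law-of-mass-action binding isotherm to depletion amplitudes; the concentrations, noise level and signal model are hypothetical stand-ins rather than the authors' exact analysis.

        import numpy as np
        from scipy.optimize import curve_fit

        def fraction_bound(c_target, c_probe, kd):
            # Law of mass action for probe + target <-> complex, quadratic solution.
            s = c_probe + c_target + kd
            return (s - np.sqrt(s**2 - 4.0 * c_probe * c_target)) / (2.0 * c_probe)

        def signal(c_target, kd, s_free, s_bound, c_probe=5e-9):
            # Measured amplitude interpolates between free and bound states.
            x = fraction_bound(c_target, c_probe, kd)
            return s_free + (s_bound - s_free) * x

        rng = np.random.default_rng(1)
        c_titration = np.logspace(-10, -6, 12)             # thrombin titration (M)
        amplitudes = signal(c_titration, 20e-9, 1.0, 0.6)  # synthetic "data"
        amplitudes += rng.normal(0.0, 0.01, c_titration.size)

        popt, _ = curve_fit(signal, c_titration, amplitudes, p0=(1e-8, 1.0, 0.5))
        print(f"fitted Kd = {popt[0]:.2e} M")   # ~2e-8 M, i.e. nanomolar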

  2. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  3. Stereotypical Escape Behavior in Caenorhabditis elegans Allows Quantification of Effective Heat Stimulus Level.

    Directory of Open Access Journals (Sweden)

    Kawai Leung

    2016-12-01

    A goal of many sensorimotor studies is to quantify the stimulus-behavioral response relation for specific organisms and specific sensory stimuli. This is especially important to do in the context of painful stimuli, since most animals in these studies cannot easily communicate to us their perceived levels of such noxious stimuli. Thus progress on studies of nociception and pain-like responses in animal models depends crucially on our ability to quantitatively and objectively infer the sensed levels of these stimuli from animal behaviors. Here we develop a quantitative model to infer the perceived level of heat stimulus from the stereotyped escape response of individual nematodes Caenorhabditis elegans stimulated by an IR laser. The model provides a method for quantification of analgesic-like effects of chemical stimuli or genetic mutations in C. elegans. We test ibuprofen-treated worms and a TRPV (transient receptor potential) mutant, and we show that the perception of heat stimuli for the ibuprofen-treated worms is lower than in the wild-type. At the same time, our model shows that the mutation changes the worm's behavior beyond affecting the thermal sensory system. Finally, we determine the stimulus level that best distinguishes the analgesic-like effects and the minimum number of worms that allows for a statistically significant identification of these effects.

  4. Quantification of risk considering external events on the change of allowed outage time and the preventive maintenance during power operation

    Energy Technology Data Exchange (ETDEWEB)

    Kang, D. J.; Kim, K. Y.; Yang, J. E.

    2001-03-01

    In this study, for the major safety systems of Ulchin Units 3/4, we quantify the risk associated with changes of the AOT and with PM during power operation, to identify how the results of the external events PSA are affected when plant changes such as a change of allowed outage time are requested. The systems for which the risks of a change of allowed outage time are quantified are the High Pressure Safety Injection System (HPSIS), the Containment Spray System (CSS) and the Emergency Diesel Generator (EDG). The systems for which the risks of PM during power operation are quantified are the Low Pressure Safety Injection System (LPSIS), the CSS, the EDG and the Essential Service Water System (ESWS). The following conclusions can be drawn from this study: (1) The increase of core damage frequency (ΔCDF) for the change of AOT and the conditional core damage probability (CCDP) for the on-line PM of each system are quantified differently depending on whether only internal events or only external events are considered. (2) Quantifying the risk including both internal and external events is expected to be advantageous for the licensee of an NPP if the regulatory acceptance criteria for technical specification changes are set up in relative terms, but disadvantageous if the acceptance criteria are set up in absolute terms. (3) Quantifying only the fire event is expected to be sufficient when quantification of the external events PSA model is required for the plant changes of Korea Standard NPPs. (4) Quantification of the increase of core damage frequency and of the incremental conditional core damage probability for technical specification changes is expected to be unnecessary if the quantification results considering only internal events are below the regulatory acceptance criteria and the external events PSA results are not greatly affected by the system availability. However, it is expected that the quantification of risk considering external events
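
    For orientation, the risk metrics behind such AOT evaluations reduce to simple arithmetic. The sketch below computes an incremental conditional core damage probability (ICCDP) for one AOT entry; the frequencies and the screening threshold (in the style of the RG 1.177 acceptance guideline) are hypothetical placeholders, not values from the study.

        # Hypothetical point estimates, /yr:
        cdf_base = 5.0e-6        # baseline CDF with all equipment available
        cdf_cond = 4.0e-5        # conditional CDF with the component out of service
        aot_hours = 72.0         # requested allowed outage time

        # ICCDP: conditional risk increase integrated over the outage duration.
        iccdp = (cdf_cond - cdf_base) * aot_hours / 8760.0
        print(f"ICCDP for one AOT entry = {iccdp:.2e}")

        # Screening against an RG 1.177-style guideline (assumed value):
        print("acceptable" if iccdp < 1.0e-6 else "needs further justification")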

  5. Quantification of risk considering external events on the change of allowed outage time and the preventive maintenance during power operation

    International Nuclear Information System (INIS)

    Kang, D. J.; Kim, K. Y.; Yang, J. E.

    2001-03-01

    In this study, for the major safety systems of Ulchin Units 3/4, we quantify the risk associated with changes of the AOT and with PM during power operation, to identify how the results of the external events PSA are affected when plant changes such as a change of allowed outage time are requested. The systems for which the risks of a change of allowed outage time are quantified are the High Pressure Safety Injection System (HPSIS), the Containment Spray System (CSS) and the Emergency Diesel Generator (EDG). The systems for which the risks of PM during power operation are quantified are the Low Pressure Safety Injection System (LPSIS), the CSS, the EDG and the Essential Service Water System (ESWS). The following conclusions can be drawn from this study: (1) The increase of core damage frequency (ΔCDF) for the change of AOT and the conditional core damage probability (CCDP) for the on-line PM of each system are quantified differently depending on whether only internal events or only external events are considered. (2) Quantifying the risk including both internal and external events is expected to be advantageous for the licensee of an NPP if the regulatory acceptance criteria for technical specification changes are set up in relative terms, but disadvantageous if the acceptance criteria are set up in absolute terms. (3) Quantifying only the fire event is expected to be sufficient when quantification of the external events PSA model is required for the plant changes of Korea Standard NPPs. (4) Quantification of the increase of core damage frequency and of the incremental conditional core damage probability for technical specification changes is expected to be unnecessary if the quantification results considering only internal events are below the regulatory acceptance criteria and the external events PSA results are not greatly affected by the system availability. However, it is expected that the quantification of risk considering external events on

  6. The Vital Role of Administrative Cost Allowances to Student Financial Aid Offices: Key Findings from NASFAA's Administrative Cost Allowance Survey, July 2011

    Science.gov (United States)

    National Association of Student Financial Aid Administrators (NJ1), 2011

    2011-01-01

    The National Association of Student Financial Aid Administrators (NASFAA) recently conducted a survey on the 2009-10 award year Administrative Cost Allowances (ACA), which are funds used by colleges and universities to support operations and professional development. Specifically, ACA is often used in essential areas that support the day-to-day…

  7. Fibrin-Targeted Magnetic Resonance Imaging Allows In Vivo Quantification of Thrombus Fibrin Content and Identifies Thrombi Amenable for Thrombolysis

    Science.gov (United States)

    Jenkins, Julia; Modarai, Bijan; Wiethoff, Andrea J.; Phinikaridou, Alkystis; Grover, Steven P.; Patel, Ashish S.; Schaeffter, Tobias; Smith, Alberto; Botnar, Rene M.

    2014-01-01

    Objective Deep venous thrombosis is a major health problem. Thrombolytic therapies are effective in recanalizing the veins and preventing post-thrombotic complications, but there is no consensus on selection criteria. The aim of this study was to investigate a fibrin-specific MRI contrast agent (EP-2104R) for the accurate quantification of thrombus fibrin content in vivo and for the identification of thrombus suitable for thrombolysis. Approach and Results Venous thrombosis was induced in the inferior vena cava of 8- to 10-week-old male BALB/C mice and MRI performed 2, 4, 7, 10, 14, and 21 days later. Eighteen mice were scanned at each time point pre and 2 hours post injection of EP-2104R (8.0 μmol/kg), with 12 mice at each time point used to correlate fibrin contrast uptake with thrombus histological stage and fibrin content. Six mice at each time point were immediately subjected to intravascular thrombolytic therapy (10 mg/kg of tissue-type plasminogen activator). Mice were imaged to assess response to lytic therapy 24 hours after thrombolytic treatment. Two mice at each time point were scanned post injection of 0.2 mmol/kg of Gd-DTPA (gadolinium with diethylenetriaminepentacetate, Magnevist, Schering AG, Berlin, Germany) for control purposes. Contrast uptake was correlated positively with the fibrin content of the thrombus measured by Western blotting (R2=0.889; P<…). Thrombus relaxation rate (R1) post contrast and the change in visualized thrombus size on late gadolinium enhancement inversion recovery MRI pre-EP-2104R and post-EP-2104R injection were the best predictors for successful thrombolysis (area under the curve, 0.989 [95% confidence interval, 0.97-1.00] and 0.994 [95% confidence interval, 0.98-1.00], respectively). Conclusions MRI with a fibrin-specific contrast agent accurately estimates thrombus fibrin content in vivo and identifies thrombi that are amenable for thrombolysis. PMID:24723557

  8. Predicting medical professionals' intention to allow family presence during resuscitation: A cross sectional survey.

    Science.gov (United States)

    Lai, Meng-Kuan; Aritejo, Bayu Aji; Tang, Jing-Shia; Chen, Chien-Liang; Chuang, Chia-Chang

    2017-05-01

    Family presence during resuscitation is an emerging trend, yet it remains controversial, even in countries with relatively high acceptance of family presence during resuscitation among medical professionals. Family presence during resuscitation is not common in many countries, and medical professionals in these regions are unfamiliar with it. Therefore, this study predicted medical professionals' intention to allow family presence during resuscitation by applying the theory of planned behaviour. A cross-sectional survey. A single medical centre in southern Taiwan. Medical staff, including physicians and nurses, in a single medical centre (n=714). A questionnaire was constructed to measure the theory of planned behaviour constructs of attitudes, subjective norms, perceived behavioural control, and behavioural intentions, as well as awareness of family presence during resuscitation and demographics. In total, 950 questionnaires were distributed to doctors and nurses in a medical centre. Among the 714 valid questionnaires, only 11 participants were aware of any association in Taiwan that promotes family presence during resuscitation; 94.7% replied that they were unsure (30.4%) or that their unit did not have a family presence during resuscitation policy (74.8%). Regression analysis was performed to predict medical professionals' intention to allow family presence during resuscitation. The results indicated that only positive attitudes and subjective norms regarding family presence during resuscitation and clinical tenure could predict the intention to allow family presence during resuscitation. Because family presence during resuscitation practice is not common in Taiwan and only 26.19% of the participants agreed to both items measuring the intention to allow family presence during resuscitation, we recommend the implementation of a family presence during resuscitation education program that will enhance the positive beliefs

  9. The quantification of free Amadori compounds and amino acids allows to model the bound Maillard reaction products formation in soybean products

    NARCIS (Netherlands)

    Troise, Antonio Dario; Wiltafsky, Markus; Fogliano, Vincenzo; Vitaglione, Paola

    2018-01-01

    The quantification of protein bound Maillard reaction products (MRPs) is still a challenge in food chemistry. Protein hydrolysis is the bottleneck step: it is time consuming and the protein degradation is not always complete. In this study, the quantitation of free amino acids and Amadori products

  10. Hepatitis B virus DNA quantification with the three-in-one (3io) method allows accurate single-step differentiation of total HBV DNA and cccDNA in biopsy-size liver samples.

    Science.gov (United States)

    Taranta, Andrzej; Tien Sy, Bui; Zacher, Behrend Johan; Rogalska-Taranta, Magdalena; Manns, Michael Peter; Bock, Claus Thomas; Wursthorn, Karsten

    2014-08-01

    Hepatitis B virus (HBV) replicates via reverse transcription, converting its partially double-stranded genome into the covalently closed circular DNA (cccDNA). The long-lasting cccDNA serves as a replication intermediate in the nuclei of hepatocytes. It is an excellent, though evasive, parameter for monitoring the course of liver disease and treatment efficiency. The aim was to develop and test a new approach for HBV DNA quantification in serum and small-size liver samples. The p3io plasmid contains an HBV fragment and the human β-actin gene (hACTB) as a standard. The respective TaqMan probes were labeled with different fluorescent dyes. A triplex real-time PCR for simultaneous quantification of total HBV DNA, cccDNA and hACTB could be established. The three-in-one method allows simultaneous analysis of 3 targets with a lower limit of quantification of 48 copies per 20 μl PCR reaction and a wide range of linearity (R(2)>0.99, p<…) in DNA samples from HBV infected patients. Total HBV DNA and cccDNA could be quantified in 32 and 22 of 33 FFPE-preserved liver specimens, respectively. Total HBV DNA concentrations quantified by the 3io method remained comparable with the Cobas TaqMan HBV Test v2.0. The three-in-one protocol allows the single-step quantification of viral DNA in samples from different sources. Lower sample input, faster data acquisition, a lower error and significantly lower costs are therefore the advantages of the method. Copyright © 2014 Elsevier B.V. All rights reserved.
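
    As a reminder of how absolute quantification against a plasmid standard such as p3io works in practice, the sketch below derives copy numbers from Ct values via a standard curve; the dilution series and Ct values are invented for illustration.

        import numpy as np

        log_copies_std = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])  # log10 copies/reaction
        ct_std = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])    # measured Ct values

        slope, intercept = np.polyfit(log_copies_std, ct_std, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 would mean 100 % per cycle

        def copies_from_ct(ct):
            # Invert the standard curve: Ct = slope * log10(copies) + intercept.
            return 10.0 ** ((ct - intercept) / slope)

        print(f"PCR efficiency ~ {efficiency:.0%}")
        print(f"sample at Ct 27.5 ~ {copies_from_ct(27.5):.0f} copies/reaction")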

  11. The quantification of free Amadori compounds and amino acids allows to model the bound Maillard reaction products formation in soybean products.

    Science.gov (United States)

    Troise, Antonio Dario; Wiltafsky, Markus; Fogliano, Vincenzo; Vitaglione, Paola

    2018-05-01

    The quantification of protein-bound Maillard reaction products (MRPs) is still a challenge in food chemistry. Protein hydrolysis is the bottleneck step: it is time consuming and the protein degradation is not always complete. In this study, the quantitation of free amino acids and Amadori products (APs) was compared to the percentage of blocked lysine by using chemometric tools. Eighty thermally treated soybean samples were analyzed by mass spectrometry to measure the concentration of free amino acids, free APs and the protein-bound markers of the Maillard reaction (furosine, Nε-(carboxymethyl)-l-lysine, Nε-(carboxyethyl)-l-lysine, total lysine). Results demonstrated that Discriminant Analysis (DA) and Correlated Component Regression (CCR) correctly estimated the percentage of blocked lysine in a validation and a prediction set. These findings indicate that the measurement of free markers reflects the extent of protein damage in soybean samples, suggesting the possibility of obtaining rapid information on the quality of industrial processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
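
    A compact stand-in for the discriminant-analysis step is sketched below: free-marker concentrations are used to classify samples into blocked-lysine classes with cross-validation. The data are random placeholders (so the accuracy hovers near chance); with real measurements the same workflow would reproduce the classification part of the study, while CCR would need a dedicated implementation.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.lognormal(size=(80, 6))    # 80 samples x 6 free markers (placeholder)
        y = rng.integers(0, 3, size=80)    # low / medium / high blocked lysine

        lda = LinearDiscriminantAnalysis()
        scores = cross_val_score(lda, X, y, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f}")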

  12. Inference of pain stimulus level from stereotypical behavioral response of C.elegans allows quantification of effects of anesthesia and mutation

    Science.gov (United States)

    Leung, Kawai; Mohammadi, Aylia; Ryu, William; Nemenman, Ilya

    In animals, we must infer the pain level from experimental characterization of behavior. This is not trivial, since behaviors are very complex and multidimensional. To establish C. elegans as a model for pain research, we propose for the first time a quantitative model that allows inference of a thermal nociceptive stimulus level from the behavior of an individual worm. We apply controlled levels of pain by locally heating worms with an infrared laser and capturing the subsequent behavior. We discover that the behavioral response is a product of a stereotypical behavior and a nonlinear function of the strength of the stimulus. The same stereotypical behavior is observed in normal, anesthetized and mutated worms. From this result we build a Bayesian model to infer the strength of the laser stimulus from the behavior. This model allows us to measure the efficacy of anesthetization and mutation by comparing the inferred strength of the stimulus. Based on the measured nociceptive escape of over 200 worms, our model is able to significantly differentiate normal, anesthetized and mutated worms with samples of 40 worms. This work was partially supported by NSF Grant No. IOS/1208126 and HFSP Grant No. RGY0084/.
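
    The Bayesian step can be illustrated in a few lines: given a stereotyped response amplitude and an assumed stimulus-response curve with Gaussian noise, the stimulus strength is recovered as a posterior over a grid. The saturating curve, noise level and observed amplitude below are hypothetical, not the fitted values from the study.

        import numpy as np

        def response(s, a_max=1.0, s_half=0.5):
            # Assumed saturating stimulus-response function.
            return a_max * s / (s + s_half)

        s_grid = np.linspace(0.01, 2.0, 400)     # candidate stimulus strengths
        prior = np.full(s_grid.size, 1.0 / s_grid.size)
        sigma = 0.05                             # behavioral noise (assumption)
        a_obs = 0.55                             # observed stereotyped amplitude

        likelihood = np.exp(-0.5 * ((a_obs - response(s_grid)) / sigma) ** 2)
        posterior = prior * likelihood
        posterior /= posterior.sum()
        print(f"MAP stimulus estimate: {s_grid[np.argmax(posterior)]:.2f}")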

  13. Electronic cigarette use in restaurants and workplaces where combustible tobacco smoking is not allowed: an Internet survey in Japan.

    Science.gov (United States)

    Kiyohara, Kosuke; Tabuchi, Takahiro

    2018-05-01

    The present study aimed to examine the experience of actual electronic cigarette (e-cigarette) use in smoke-free areas of restaurants and workplaces and to explore the determinants associated with such use among Japanese adults who reported any experience using e-cigarettes (e-cigarette ever-users). An Internet-based self-reported questionnaire survey was conducted in 2015 on Japanese e-cigarette ever-users. The proportion of the respondents who had ever used or frequently used e-cigarettes in smoke-free restaurants and/or workplaces was calculated. Potential factors associated with e-cigarette use in those smoke-free areas were also examined by using multivariable logistic regression analyses. In total, 1243 e-cigarette ever-users (662 current and 581 former e-cigarette users) were analysed. The majority of them (1020/1243, 82.1%) were male and their mean age ± SD was 47.0±10.4 years. The proportion of those who had ever used e-cigarettes in smoke-free restaurants was 28.8% (358/1243) and in smoke-free workplaces 25.5% (317/1243). The proportion of those who had frequently used e-cigarettes in smoke-free restaurants was 18.5% (230/1243) and in smoke-free workplaces 16.3% (202/1243). In general, the proportion of e-cigarette use in those smoke-free areas was higher among those having a higher educational level than among those having a lower educational level. Among adult Japanese e-cigarette ever-users, approximately 26%-29% had ever used and 16%-19% had frequently used e-cigarettes in restaurants and/or workplaces where combustible tobacco smoking is not allowed. Policy-makers may need to establish explicit rules on e-cigarette use in smoke-free environments. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Allowing Physicians to Choose the Value of Compensation for Participation in a Web-Based Survey: Randomized Controlled Trial.

    Science.gov (United States)

    Turnbull, Alison E; O'Connor, Cristi L; Lau, Bryan; Halpern, Scott D; Needham, Dale M

    2015-07-29

    Survey response rates among physicians are declining, and determining an appropriate level of compensation to motivate participation poses a major challenge. To estimate the effect of permitting intensive care physicians to select their preferred level of compensation for completing a short Web-based survey on physician (1) response rate, (2) survey completion rate, (3) time to response, and (4) time spent completing the survey. A total of 1850 US intensivists from an existing database were randomized to receive a survey invitation email with or without an Amazon.com incentive available to the first 100 respondents. The incentive could be instantly redeemed for an amount chosen by the respondent, up to a maximum of US $50. The overall response rate was 35.90% (630/1755). Among the 35.4% (111/314) of eligible participants choosing the incentive, 80.2% (89/111) selected the maximum value. Among intensivists offered an incentive, the response was 6.0% higher (95% CI 1.5-10.5, P=.01), survey completion was marginally greater (807/859, 94.0% vs 892/991, 90.0%; P=.06), and the median number of days to survey response was shorter (0.8, interquartile range [IQR] 0.2-14.4 vs 6.6, IQR 0.3-22.3; P=.001), with no difference in time spent completing the survey. Permitting intensive care physicians to determine compensation level for completing a short Web-based survey modestly increased response rate and substantially decreased response time without decreasing the time spent on survey completion.

  15. An Alternative to the Carlson-Parkin Method for the Quantification of Qualitative Inflation Expectations: Evidence from the Ifo World Economic Survey

    OpenAIRE

    Henzel, Steffen; Wollmershäuser, Timo

    2005-01-01

    This paper presents a new methodology for the quantification of qualitative survey data. Traditional conversion methods, such as the probability approach of Carlson and Parkin (1975) or the time-varying parameters model of Seitz (1988), require very restrictive assumptions concerning the expectations formation process of survey respondents. Above all, the unbiasedness of expectations, which is a necessary condition for rationality, is imposed. Our approach avoids these assumptions. The novelty...
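
    For context, the probability approach of Carlson and Parkin that the paper departs from can be stated compactly (textbook form; notation ours). With A_t the share of respondents expecting prices to rise, B_t the share expecting a fall, aggregate expectations assumed normal, and an indifference interval ±δ:

        a_t = \Phi^{-1}(1 - A_t), \qquad b_t = \Phi^{-1}(B_t),
        \qquad
        \pi^{e}_{t} = \delta\,\frac{a_t + b_t}{b_t - a_t}, \qquad
        \sigma_t = \frac{2\delta}{a_t - b_t},

    where Φ⁻¹ is the standard normal quantile function and δ is conventionally calibrated by imposing that expectations are unbiased over the sample, which is exactly the restriction the authors' alternative avoids.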

  16. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data is incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. This method enables the researcher to quantify the knowledge on plant use that was preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  17. CHILD ALLOWANCE

    CERN Multimedia

    Human Resources Division

    2001-01-01

    HR Division wishes to clarify to members of the personnel that the allowance for a dependent child continues to be paid during all training courses ('stages'), apprenticeships, 'contrats de qualification', sandwich courses or other courses of similar nature. Any payment received for these training courses, including apprenticeships, is however deducted from the amount reimbursable as school fees. HR Division would also like to draw the attention of members of the personnel to the fact that any contract of employment will lead to the suppression of the child allowance and of the right to reimbursement of school fees.

  18. Detection, quantification and genotyping of Herpes Simplex Virus in cervicovaginal secretions by real-time PCR: a cross sectional survey

    Directory of Open Access Journals (Sweden)

    Natividad-Sancho Angels

    2005-08-01

    Background: Herpes Simplex Virus (HSV) Genital Ulcer Disease (GUD) is an important public health problem, whose interaction with HIV results in mutually enhancing epidemics. Conventional methods for detecting HSV tend to be slow and insensitive. We designed a rapid PCR-based assay to quantify and type HSV in cervicovaginal lavage (CVL) fluid of subjects attending a Genito-Urinary Medicine (GUM) clinic. Vaginal swabs, CVL fluid and venous blood were collected. Quantitative detection of HSV was conducted using real time PCR with HSV specific primers and SYBR Green I. Fluorogenic TaqMan Minor Groove Binder (MGB) probes designed around a single base mismatch in the HSV DNA polymerase I gene were used to type HSV in a separate reaction. The Kalon test was used to detect anti-HSV-2 IgG antibodies in serum. Testing for HIV, other Sexually Transmitted Infections (STI) and related infections was based on standard clinical and laboratory methods. Results: Seventy consecutive GUM clinic attendees were studied. Twenty-seven subjects (39%) had detectable HSV DNA in CVL fluid; HSV-2 alone was detected in 19 (70%) subjects, HSV-1 alone was detected in 4 (15%) subjects and both HSV types were detected in 4 (15%) subjects. Eleven out of 27 subjects (41%) with anti-HSV-2 IgG had detectable HSV-2 DNA in CVL fluid. Seven subjects (10%) were HIV-positive. Three of seven (43%) HIV-infected subjects and two of five subjects with GUD (40%) were secreting HSV-2. None of the subjects in whom HSV-1 was detected had GUD. Conclusion: Quantitative real-time PCR and TaqMan MGB probes specific for HSV-1 or -2 were used to develop an assay for quantification and typing of HSV. The majority of subjects in which HSV was detected had low levels of CVL fluid HSV, with no detectable HSV-2 antibodies and were asymptomatic.

  19. Validation of a food quantification picture book targeting children of 0-10 years of age for pan-European and national dietary surveys.

    Science.gov (United States)

    Trolle, Ellen; Vandevijvere, Stefanie; Ruprich, Jiří; Ege, Majken; Dofková, Marcela; de Boer, Evelien; Ocké, Marga

    2013-12-01

    The aim of the present study was to validate thirty-eight picture series of six pictures each, developed within the PANCAKE (Pilot study for the Assessment of Nutrient intake and food Consumption Among Kids in Europe) project, for portion size estimation of foods consumed by infants, toddlers and children for future pan-European and national dietary surveys. Identical validation sessions were conducted in three European countries. In each country, forty-five foods were evaluated; thirty-eight foods were the same as the depicted foods, and seven foods were different, but meant to be quantified by the use of one of the thirty-eight picture series. Each single picture within a picture series was evaluated six times by means of predefined portions. Therefore, thirty-six pre-weighed portions of each food were evaluated by convenience samples of parents having children aged from 3 months to 10 years. The percentages of participants choosing the correct picture, the picture adjacent to the correct picture or a distant picture were calculated, and the performance of individual pictures within the series was assessed. For twenty foods, the picture series performed acceptably (mean difference between the estimated portion number and the served portion number less than 0.4 (SD <…)); these picture series were acceptable for inclusion in the PANCAKE picture book. However, the picture series of baby food, salads and cakes either can only be used for foods that are very similar to those depicted or need to be substituted by another quantification tool.

  20. Atmospheric characterization through fused mobile airborne and surface in situ surveys: methane emissions quantification from a producing oil field

    Science.gov (United States)

    Leifer, Ira; Melton, Christopher; Fischer, Marc L.; Fladeland, Matthew; Frash, Jason; Gore, Warren; Iraci, Laura T.; Marrero, Josette E.; Ryoo, Ju-Mee; Tanaka, Tomoaki; Yates, Emma L.

    2018-03-01

    Methane (CH4) inventory uncertainties are large, requiring robust emission derivation approaches. We report on a fused airborne-surface data collection approach to derive emissions from an active oil field near Bakersfield, central California. The approach characterizes the atmosphere from the surface to above the planetary boundary layer (PBL) and combines downwind trace gas concentration anomaly (plume) above background with normal winds to derive flux. This approach does not require a well-mixed PBL; allows explicit, data-based, uncertainty evaluation; and was applied to complex topography and wind flows. In situ airborne (collected by AJAX - the Alpha Jet Atmospheric eXperiment) and mobile surface (collected by AMOG - the AutoMObile trace Gas - Surveyor) data were collected on 19 August 2015 to assess source strength. Data included an AMOG and AJAX intercomparison transect profiling from the San Joaquin Valley (SJV) floor into the Sierra Nevada (0.1-2.2 km altitude), validating a novel surface approach for atmospheric profiling by leveraging topography. The profile intercomparison found good agreement in multiple parameters for the overlapping altitude range from 500 to 1500 m for the upper 5 % of surface winds, which accounts for wind-impeding structures, i.e., terrain, trees, buildings, etc. Annualized emissions from the active oil fields were 31.3 ± 16 Gg methane and 2.4 ± 1.2 Tg carbon dioxide. Data showed the PBL was not well mixed at distances of 10-20 km downwind, highlighting the importance of the experimental design.
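
    The core flux computation (excess concentration times the wind component normal to the transect, integrated over the plume cross-section) is compact enough to sketch. The grid, wind speed and unit conversion below are placeholders, not the campaign's numbers.

        import numpy as np

        dy, dz = 100.0, 25.0                        # cell sizes (m)
        rng = np.random.default_rng(0)
        excess_ppb = rng.random((20, 40)) * 50.0    # CH4 anomaly above background (ppb)
        u_normal = 4.0                              # wind normal to transect (m/s)

        ppb_to_kg_m3 = 6.6e-10   # ~1 ppb CH4 in kg/m3 near the surface (assumption)
        flux_kg_s = (excess_ppb * ppb_to_kg_m3 * u_normal * dy * dz).sum()
        print(f"emission ~ {flux_kg_s * 3.15e7 / 1e6:.1f} Gg CH4/yr")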

  1. Atmospheric characterization through fused mobile airborne and surface in situ surveys: methane emissions quantification from a producing oil field

    Directory of Open Access Journals (Sweden)

    I. Leifer

    2018-03-01

    Methane (CH4) inventory uncertainties are large, requiring robust emission derivation approaches. We report on a fused airborne–surface data collection approach to derive emissions from an active oil field near Bakersfield, central California. The approach characterizes the atmosphere from the surface to above the planetary boundary layer (PBL) and combines downwind trace gas concentration anomaly (plume) above background with normal winds to derive flux. This approach does not require a well-mixed PBL; allows explicit, data-based, uncertainty evaluation; and was applied to complex topography and wind flows. In situ airborne (collected by AJAX – the Alpha Jet Atmospheric eXperiment) and mobile surface (collected by AMOG – the AutoMObile trace Gas Surveyor) data were collected on 19 August 2015 to assess source strength. Data included an AMOG and AJAX intercomparison transect profiling from the San Joaquin Valley (SJV) floor into the Sierra Nevada (0.1–2.2 km altitude), validating a novel surface approach for atmospheric profiling by leveraging topography. The profile intercomparison found good agreement in multiple parameters for the overlapping altitude range from 500 to 1500 m for the upper 5 % of surface winds, which accounts for wind-impeding structures, i.e., terrain, trees, buildings, etc. Annualized emissions from the active oil fields were 31.3 ± 16 Gg methane and 2.4 ± 1.2 Tg carbon dioxide. Data showed the PBL was not well mixed at distances of 10–20 km downwind, highlighting the importance of the experimental design.

  2. Quantification of physical activity using the QAPACE Questionnaire: a two stage cluster sample design survey of children and adolescents attending urban school.

    Science.gov (United States)

    Barbosa, Nicolas; Sanchez, Carlos E; Patino, Efrain; Lozano, Benigno; Thalabard, Jean C; LE Bozec, Serge; Rieu, Michel

    2016-05-01

    Quantification of physical activity as energy expenditure is important from youth onwards for the prevention of chronic non-communicable diseases in adulthood. The aim was to quantify physical activity, expressed as daily energy expenditure (DEE), in school children and adolescents aged 8-16 years, by age, gender and socioeconomic level (SEL) in Bogotá. This is a two-stage cluster survey sample. From a universe of 4700 schools and 760,000 students across the three socioeconomic levels existing in Bogotá (low, medium and high), a random sample of 20 schools and 1840 students (904 boys and 936 girls) was drawn. Anticipating participant dropout and inconsistency in the questionnaire responses, the sample size was increased: 6 individuals of each gender for each of the nine age groups were selected, resulting in a total sample of 2160 individuals. Selected students filled in the QAPACE questionnaire under supervision. The data were analyzed by comparing means with a multivariate general linear model. The fixed factors were gender (boys and girls), age (8 to 16 years old) and tri-strata SEL (low, medium and high); the independent variables were height, weight and leisure time (hours/day); the dependent variables were daily energy expenditure DEE (kJ.kg-1.day-1) during leisure time (DEE-LT), during school time (DEE-ST), during vacation time (DEE-VT), and total mean DEE per year (DEEm-TY). RESULTS: In boys, differences in DEE were significant for LT and all DEE variables, and all variables were significant with SEL, but the age-SEL interaction was only significant for DEE-VT. In girls, all variables were significant with SEL. The post hoc multiple comparisons using Fisher's Least Significant Difference (LSD) test were significant with age for all variables. For both genders and all SELs the girls had the higher values, except in SEL high (5-6). The boys had higher values in DEE-LT, DEE-ST and DEE-VT, except for DEEm-TY in SEL (5-6). In SEL (5-6) all DEEs for both genders are highest. For SEL

  3. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002, the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182
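
    The abstract does not reproduce the paper's figure of merit, but a widely used quantifier in the same spirit is the l1-norm of coherence, which may serve as a concrete reference point (related to, though not necessarily identical with, the measure introduced in the paper):

        C_{\ell_1}(\rho) = \sum_{i \neq j} \lvert \rho_{ij} \rvert ,
        \qquad
        C_{\ell_1}\bigl(\lvert\psi\rangle\langle\psi\rvert\bigr)
          = \Bigl(\sum_i \lvert c_i \rvert\Bigr)^{2} - 1
        \quad \text{for } \lvert\psi\rangle = \sum_i c_i \lvert i \rangle ,

    which vanishes exactly on states diagonal in the reference basis and grows with the spread of the superposition.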

  4. Trading sulfur dioxide allowances

    International Nuclear Information System (INIS)

    Goldburg, C.B.; Lave, L.B.

    1992-01-01

    The 1990 Clean Air Act is aimed at generators larger than 25 MW, as these are the largest polluters. Market incentives give each source an emissions allocation but also flexibility. If a plant has lower emissions than the target, it can sell the 'surplus' emissions as allowances to plants that fail to meet the target. Only a few trades have occurred to date. Market-based incentives should lower the costs of improving environmental quality significantly. However, institutional difficulties currently hamper implementation

  5. 40 CFR 35.2025 - Allowance and advance of allowance.

    Science.gov (United States)

    2010-07-01

    ... advance of allowance. (a) Allowance. Step 2+3 and Step 3 grant agreements will include an allowance for facilities planning and design of the project and Step 7 agreements will include an allowance for facility... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Allowance and advance of allowance. 35...

  6. Quantification procedures in micro X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Kanngiesser, Birgit

    2003-01-01

    For quantification in micro X-ray fluorescence analysis, standard-free quantification procedures have become especially important. An introduction to the basic concepts of these quantification procedures is given, followed by a short survey of the procedures which are available now and of the experimental situations and analytical problems they address. The last point is extended by the description of our own development of the fundamental parameter method, which makes the inclusion of non-parallel beam geometries possible. Finally, open problems for the quantification procedures are discussed

  7. Validation of a food quantification picture book targeting children of 0–10 years of age for pan-European and national dietary surveys

    DEFF Research Database (Denmark)

    Trolle, Ellen; Vandevijvere, Stefanie; Ruprich, Jiří

    2013-01-01

    and children for future pan-European and national dietary surveys. Identical validation sessions were conducted in three European countries. In each country, forty-five foods were evaluated; thirty-eight foods were the same as the depicted foods, and seven foods were different, but meant to be quantified......The aim of the present study was to validate thirty-eight picture series of six pictures each developed within the PANCAKE (Pilot study for the Assessment of Nutrient intake and food Consumption Among Kids in Europe) project for portion size estimation of foods consumed by infants, toddlers...... by the use of one of the thirty-eight picture series. Each single picture within a picture series was evaluated six times by means of predefined portions. Therefore, thirty-six pre-weighed portions of each food were evaluated by convenience samples of parents having children aged from 3 months to 10 years...

  8. Quantification in emission tomography

    International Nuclear Information System (INIS)

    Buvat, Irene

    2011-11-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) and positron emission tomography (PET) images, and to identify the conditions to be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: quantification in emission tomography, definition and challenges; quantification-biasing phenomena. 2 - Main problems impacting quantification in PET and SPECT: problems, consequences, correction methods and results (attenuation, scattering, partial volume effect, movement, non-stationary spatial resolution in SPECT, random coincidences in PET, standardization in PET). 3 - Synthesis: achievable performance, know-how, precautions, beyond the activity measurement

  9. Quantification of local mobilities

    DEFF Research Database (Denmark)

    Zhang, Y. B.

    2018-01-01

    A new method for quantification of mobilities of local recrystallization boundary segments is presented. The quantification is based on microstructures characterized using electron microscopy and on determination of migration velocities and driving forces for local boundary segments. Pure aluminium...... is investigated and the results show that even for a single recrystallization boundary, different boundary segments migrate differently, and the differences can be understood based on variations in mobilities and local deformed microstructures. The present work has important implications for understanding...
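
    The quantification rests on the standard linear migration relation: per boundary segment i, with migration velocity v_i measured from time-resolved microscopy and local driving force F_i estimated from the stored energy of the deformed microstructure, the segment mobility follows directly (schematic form, not necessarily the paper's notation):

        v_i = M_i\,F_i \quad\Longrightarrow\quad M_i = \frac{v_i}{F_i},

    so segment-to-segment differences in measured v_i at comparable F_i translate into the mobility variations the study reports.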

  10. Fluorescent quantification of melanin.

    Science.gov (United States)

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  11. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very ...

  12. FRANX. Application for analysis and quantification of the APS fire

    International Nuclear Information System (INIS)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-01-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantification and updating of the fire PSA (it also covers floods and earthquakes). With the application, fire scenarios are quantified at the plant, integrating the tasks performed during the fire PSA. This paper describes the main features of the program that allow the quantification of a fire PSA. (Author)

  13. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Joachim, E-mail: Joachim.Berger@Monash.edu [Australian Regenerative Medicine Institute, EMBL Australia, Monash University, Clayton (Australia); Sztal, Tamar; Currie, Peter D. [Australian Regenerative Medicine Institute, EMBL Australia, Monash University, Clayton (Australia)

    2012-07-13

    Highlights: ► Report of an unbiased quantification of the birefringence of muscle of fish larvae. ► Quantification method readily identifies level of overall muscle damage. ► Compare zebrafish muscle mutants for level of phenotype severity. ► Proposed tool to survey treatments that aim to ameliorate muscular dystrophy. -- Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including dystrophin-deficient zebrafish mutants dmd that model Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.
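
    In practice the densitometric readout amounts to simple image arithmetic. The sketch below scores one polarized-light image as the mean intensity of the thresholded bright (birefringent) region; the file name and threshold rule are hypothetical.

        import numpy as np
        from skimage import io

        img = io.imread("larva_polarized.tif", as_gray=True).astype(float)
        mask = img > img.mean() + 2.0 * img.std()   # bright birefringent muscle
        score = img[mask].mean() if mask.any() else 0.0
        print(f"birefringence score: {score:.3f}")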

  14. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    International Nuclear Information System (INIS)

    Berger, Joachim; Sztal, Tamar; Currie, Peter D.

    2012-01-01

    Highlights: ► Report of an unbiased quantification of the birefringence of muscle of fish larvae. ► Quantification method readily identifies level of overall muscle damage. ► Compare zebrafish muscle mutants for level of phenotype severity. ► Proposed tool to survey treatments that aim to ameliorate muscular dystrophy. -- Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including dystrophin-deficient zebrafish mutants dmd that model Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.

  15. 76 FR 70883 - Clothing Allowance

    Science.gov (United States)

    2011-11-16

    ... prescription skin cream for the "face, neck, hands, arms, or any area not covered by clothing may come into... the clothing or outergarment due to a second appliance or medication." This language will clarify that a second clothing allowance may be paid when a second appliance and/or medication increases the...

  16. Surfaces allowing for fractional statistics

    International Nuclear Information System (INIS)

    Aneziris, Charilaos.

    1992-07-01

    In this paper we give a necessary condition in order for a geometrical surface to allow for Abelian fractional statistics. In particular, we show that such statistics is possible only for two-dimensional oriented surfaces of genus zero, namely the sphere S^2, the plane R^2 and the cylindrical surface R^1 × S^1, and in general the connected sum of n planes R^2-R^2-R^2-...-R^2. (Author)

  17. Emission allowances stall in marketplace

    International Nuclear Information System (INIS)

    Malec, W.F.

    1993-01-01

    Misinformation and public misunderstanding have given emissions trading a bad reputation in the public marketplace, says William F. Malec, executive vice president of the Tennessee Valley Authority (TVA), in Knoxville, Tennessee. Media coverage of a May 1992 emissions-allowance trade between TVA and Wisconsin Power and Light "focused on the agreement's pollution-trading aspects, not its overall potential economic and environmental benefits," Malec says. Such negative portrayal of TVA's transaction sparked severe public criticism and charges that emissions trading gives utilities the right to pollute. "The fact is that TVA sought the emissions-trading agreement as a means to reduce overall emissions in the most cost-effective way," Malec explains. Emissions trading allows a company with emission levels lower than clean-air standards to earn "credits." These credits then may be purchased by a company with emission levels that exceed federal standards. Under this arrangement, the environment is protected and companies that buy credits save money because they do not have to purchase expensive emissions-control devices or reduce their production levels. Malec says TVA decided to enter into the emissions-allowance market, not only to cut costs, but also to publicize the existence and benefits of emissions trading. However, TVA's experience proves that "people will not accept what they do not understand," concludes Malec, "especially when complex environmental issues are involved."

  18. Transparent soil microcosms allow 3D spatial quantification of soil microbiological processes in vivo.

    Science.gov (United States)

    Downie, Helen F; Valentine, Tracy A; Otten, Wilfred; Spiers, Andrew J; Dupuy, Lionel X

    2014-01-01

    The recently developed transparent soil consists of particles of Nafion, a polymer with a low refractive index (RI), which is prepared by milling and chemical treatment for use as a soil analog. After the addition of a RI-matched solution, confocal imaging can be carried out in vivo and without destructive sampling. In a previous study, we showed that the new substrate provides a good approximation of plant growth conditions found in natural soils. In this paper, we present further development of the techniques for detailed quantitative analysis of images of root-microbe interactions in situ. Using this system it was possible for the first time to analyze bacterial distribution along the roots and in the bulk substrate in vivo. These findings indicate that the coupling of transparent soil with light microscopy is an important advance toward the discovery of the mechanisms of microbial colonisation of the rhizosphere.

  19. Novel stretch-sensor technology allows quantification of adherence and quality of home-exercises

    DEFF Research Database (Denmark)

    Rathleff, Michael Skovdal; Bandholm, Thomas Quaade; Ahrendt, Peter

    2014-01-01

    , from exercises not performed as prescribed. METHODS: 10 participants performed four different shoulder-abduction exercises in two rounds (80 exercise scenarios in total). The scenarios were (1) low contraction speed, full range of motion (0-90°), (2) high contraction speed, full range of motion (0-90...

  20. Extending the Rayleigh equation to allow competing isotope fractionating pathways to improve quantification of biodegradation

    NARCIS (Netherlands)

    van Breukelen, B.M.

    2007-01-01

    The Rayleigh equation relates the change in isotope ratio of an element in a substrate to the extent of substrate consumption via a single kinetic isotopic fractionation factor (α). Substrate consumption is, however, commonly distributed over several metabolic pathways each potentially having a
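
    For reference, the classical Rayleigh equation relates the isotope ratio R to the remaining substrate fraction f through a single fractionation factor α; one natural form of the extension the title describes, under the assumption that consumption is split over pathways i with fixed flux fractions F_i, replaces α by a flux-weighted composite (our paraphrase, not the paper's exact notation):

        \ln\frac{R_t}{R_0} = (\alpha - 1)\,\ln f,
        \qquad
        \alpha_{\mathrm{bulk}} = \sum_i F_i\,\alpha_i ,
        \quad \sum_i F_i = 1 .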

  1. Accident sequence quantification with KIRAP

    International Nuclear Information System (INIS)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong.

    1997-01-01

    The tasks of a probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logics, the analysis of reliability data and, finally, the accident sequence quantification. In the PSA, the accident sequence quantification calculates the core damage frequency and performs importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires an efficient computer code because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform the accident sequence quantification with KIRAP. (author). 6 refs
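
    The quantification step itself reduces to combining minimal cut sets. A minimal sketch with invented basic-event probabilities is shown below, using the rare-event approximation for the sequence frequency and a Fussell-Vesely importance for one event; KIRAP's actual algorithms and data are not reproduced here.

        import math

        # Hypothetical minimal cut sets: {basic event: probability or frequency}.
        cut_sets = [
            {"IE_LOOP": 1e-2, "EDG_A_FTR": 3e-3, "EDG_B_FTR": 3e-3},
            {"IE_LOOP": 1e-2, "HPSI_PUMP_FTS": 5e-4},
        ]

        # Rare-event approximation: sequence frequency ~ sum of cut-set products.
        freq = sum(math.prod(cs.values()) for cs in cut_sets)
        print(f"sequence frequency ~ {freq:.2e} /yr")

        # Fussell-Vesely importance of one basic event, same approximation.
        fv = sum(math.prod(cs.values()) for cs in cut_sets if "EDG_A_FTR" in cs) / freq
        print(f"FV importance of EDG_A_FTR ~ {fv:.2f}")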

  2. Accident sequence quantification with KIRAP

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Un; Han, Sang Hoon; Kim, Kil You; Yang, Jun Eon; Jeong, Won Dae; Chang, Seung Cheol; Sung, Tae Yong; Kang, Dae Il; Park, Jin Hee; Lee, Yoon Hwan; Hwang, Mi Jeong

    1997-01-01

    The tasks of a probabilistic safety assessment (PSA) consist of the identification of initiating events, the construction of an event tree for each initiating event, the construction of fault trees for the event tree logics, the analysis of reliability data and, finally, the accident sequence quantification. In the PSA, the accident sequence quantification calculates the core damage frequency and performs importance analysis and uncertainty analysis. Accident sequence quantification requires an understanding of the whole PSA model, because it has to combine all event tree and fault tree models, and it requires an efficient computer code because the computation takes a long time. The Advanced Research Group of the Korea Atomic Energy Research Institute (KAERI) has developed the PSA workstation KIRAP (Korea Integrated Reliability Analysis Code Package) for PSA work. This report describes the procedures to perform accident sequence quantification, the method to use KIRAP's cut set generator, and the method to perform the accident sequence quantification with KIRAP. (author). 6 refs.

  3. Adjusting for unrecorded consumption in survey and per capita sales data: quantification of impact on gender- and age-specific alcohol-attributable fractions for oral and pharyngeal cancers in Great Britain.

    Science.gov (United States)

    Meier, Petra Sylvia; Meng, Yang; Holmes, John; Baumberg, Ben; Purshouse, Robin; Hill-McManus, Daniel; Brennan, Alan

    2013-01-01

    Large discrepancies are typically found between per capita alcohol consumption estimated via survey data compared with sales, excise or production figures. This may lead to significant inaccuracies when calculating levels of alcohol-attributable harms. Using British data, we demonstrate an approach to adjusting survey data to give more accurate estimates of per capita alcohol consumption. First, sales and survey data are adjusted to account for potential biases (e.g. self-pouring, under-sampled populations) using evidence from external data sources. Secondly, survey and sales data are aligned using different implementations of Rehm et al.'s method [in (2010) Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example. Pop Health Metrics 8, 1-12]. Thirdly, the impact of our approaches is tested by using our revised survey dataset to calculate alcohol-attributable fractions (AAFs) for oral and pharyngeal cancers. British sales data under-estimate per capita consumption by 8%, primarily due to illicit alcohol. Adjustments to survey data increase per capita consumption estimates by 35%, primarily due to under-sampling of dependent drinkers and under-estimation of home-poured spirits volumes. Before aligning sales and survey data, the revised survey estimate remains 22% lower than the revised sales estimate. Revised AAFs for oral and pharyngeal cancers are substantially larger with our preferred method for aligning data sources, yielding increases in an AAF from the original survey dataset of 0.47-0.60 (males) and 0.28-0.35 (females). It is possible to use external data sources to adjust survey data to reduce the under-estimation of alcohol consumption and then account for residual under-estimation using a statistical calibration technique. These revisions lead to markedly higher estimated levels of alcohol-attributable harm.
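    Alcohol-attributable fractions of the kind computed in this study conventionally come from Levin's formula, which combines the prevalence of each consumption category with its relative risk. The sketch below is that generic calculation in Python with hypothetical prevalences and relative risks; none of the numbers are from the study.

        def attributable_fraction(prevalence, relative_risk):
            # Levin's formula: AAF = sum p_i (RR_i - 1) / (sum p_i (RR_i - 1) + 1)
            excess = sum(p * (rr - 1.0) for p, rr in zip(prevalence, relative_risk))
            return excess / (excess + 1.0)

        # hypothetical exposure categories: light, moderate, heavy drinking
        print(attributable_fraction([0.40, 0.30, 0.10], [1.2, 1.8, 5.0]))  # ~0.42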

  4. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    … descriptive trends are sufficient or an understanding of drivers and causes is needed. While there are certainly similar needs across uses and users, the necessary methods, data, and models for quantifying GHGs may vary. Common challenges for quantification noted in an informal survey of users of GHG information by Olander et al (2013) include the following. 3.1. Need for user-friendly methods that work across scales, regions, and systems. Much of the data gathered and models developed by the research community provide high confidence in data or indicators computed at one place or for one issue; thus they are relevant only for specific uses, not transparent, or not comparable. These research approaches need to be translated to practitioners through the development of farmer-friendly, transparent, comparable, and broadly applicable methods. Many users noted the need for quantification data and methods that work and are accurate across regions and scales. One of the interviewed users, Charlotte Streck, summed it up nicely: 'A priority would be to produce comparable datasets for agricultural GHG emissions of particular agricultural practices for a broad set of countries ... with a gradual increase in accuracy'. 3.2. Need for lower cost, feasible approaches. Concerns about the cost and complexity of existing quantification methods were raised by a number of users interviewed in the survey. In the field it is difficult to measure changes in GHGs from agricultural management due to spatial and temporal variability, and to the scale of the management-induced changes relative to background pools and fluxes. Many users noted data gaps and inconsistencies and insufficient technical capacity and infrastructure to generate necessary information, particularly in developing countries. The need for creative approaches to data collection and analysis, such as crowd sourcing and mobile technology, was noted. 3.3. Need for methods that can crosswalk between emission-reduction strategy and inventories …

  5. Rapid quantification and sex determination of forensic evidence materials.

    Science.gov (United States)

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for optimal use of the available biological material. Moreover, sex determination provides important additional information in criminal investigations as well as in the identification of missing persons, no-suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for the analysis of forensic casework samples.
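    Kinetic quantification against external standards, as described above, rests on the linear relation between quantification cycle (Cq) and the log of starting copy number. A minimal sketch, with hypothetical standard-curve values rather than the assay's actual calibration:

        import numpy as np

        # hypothetical standard curve: log10(copies) vs. quantification cycle
        log_copies = np.array([1, 2, 3, 4, 5])
        cq = np.array([36.9, 33.6, 30.2, 26.9, 23.6])  # ~ -3.3 cycles/decade, near 100% efficiency

        slope, intercept = np.polyfit(log_copies, cq, 1)

        def copies_from_cq(sample_cq):
            # invert the standard curve: Cq = slope * log10(N) + intercept
            return 10 ** ((sample_cq - intercept) / slope)

        print(f"{copies_from_cq(35.2):.0f} copies")  # small numbers resolvable down to single copies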

  6. Quantification of the sequestration of indium 111 labelled platelets

    International Nuclear Information System (INIS)

    Najean, Y.; Picard, N.; Dufour, V.; Rain, J.D.

    1988-01-01

    A simple method is proposed for accurate quantification of the splenic and/or hepatic sequestration of 111In-labelled platelets. It could allow a better prediction of the efficacy of splenectomy in idiopathic thrombocytopenic purpura.

  7. Quantification and presence of human ancient DNA in burial place ...

    African Journals Online (AJOL)

    Quantification and presence of human ancient DNA in burial place remains of Turkey using real time polymerase chain reaction. ... A published real-time PCR assay, which allows for the combined analysis of nuclear or ancient DNA and mitochondrial DNA, was modified. This approach can be used for recovering DNA from ...

  8. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Verb aspect, alternations and quantification. In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases be interpreted as lexical quantifiers as well. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that this scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while a detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  9. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman who makes predictions concerning relatively high probability events with a large data base to the phenomenological expert who must use his intuition tempered with basic knowledge and little or no measured data to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper provides some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work

  10. Quantification In Neurology

    Directory of Open Access Journals (Sweden)

    Netravati M

    2005-01-01

    There has been a distinct shift of emphasis in clinical neurology in the last few decades. A few years ago, it was sufficient for a clinician to precisely record the history, document signs, establish a diagnosis and write a prescription. In the present context, there has been a significant intrusion of scientific culture into clinical practice. Several criteria have been proposed, refined and redefined to ascertain accurate diagnoses for many neurological disorders. The introduction of the concepts of impairment, disability, handicap and quality of life has added a new dimension to the measurement of health and disease, and neurological disorders are no exception. "Best guess" treatment modalities are no longer accepted, and evidence-based medicine has become an integral component of medical care. Traditional treatments need validation and new therapies require rigorous trials. Thus, proper quantification has become essential, both in neurological practice and in research methodology. While this aspect is widely acknowledged, access to a comprehensive document on measurements in neurology remains limited. The following description is a critical appraisal of various measurements and also provides certain commonly used rating scales/scores in neurological practice.

  11. Evolution of allowable stresses in shear for lumber

    Science.gov (United States)

    Robert L. Ethington; William L. Galligan; Henry M. Montrey; Alan D. Freas

    1979-01-01

    This paper surveys research leading to allowable shear stress parallel to grain for lumber. In early flexure tests of lumber, some pieces failed in shear. The estimated shear stress at time of failure was generally lower than shear strength measured on small, clear, straight-grained specimens. This and other engineering observations gave rise to adjustments that...

  12. 46 CFR 154.440 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.440 Section 154.440 Shipping COAST... Tank Type A § 154.440 Allowable stress. (a) The allowable stresses for an independent tank type A must... Commandant (CG-522). (b) A greater allowable stress than required in paragraph (a)(1) of this section may be...

  13. 46 CFR 154.421 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.421 Section 154.421 Shipping COAST... § 154.421 Allowable stress. The allowable stress for the integral tank structure must meet the American Bureau of Shipping's allowable stress for the vessel's hull published in “Rules for Building and Classing...

  14. 34 CFR 304.21 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false Allowable costs. 304.21 Section 304.21 Education... Grantee § 304.21 Allowable costs. In addition to the allowable costs established in the Education... allowable expenditures by projects funded under the program: (a) Cost of attendance, as defined in Title IV...

  15. 2 CFR 215.27 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false Allowable costs. 215.27 Section 215.27... § 215.27 Allowable costs. For each kind of recipient, there is a set of Federal principles for determining allowable costs. Allowability of costs shall be determined in accordance with the cost principles...

  16. 50 CFR 80.15 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Allowable costs. 80.15 Section 80.15... WILDLIFE RESTORATION AND DINGELL-JOHNSON SPORT FISH RESTORATION ACTS § 80.15 Allowable costs. (a) What are allowable costs? Allowable costs are costs that are necessary and reasonable for accomplishment of approved...

  17. 49 CFR 266.11 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Allowable costs. 266.11 Section 266.11... TRANSPORTATION ACT § 266.11 Allowable costs. Allowable costs include only the following costs which are properly allocable to the work performed: Planning and program operation costs which are allowed under Federal...

  18. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provides a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the …

  19. Quantification practices in the nuclear industry

    International Nuclear Information System (INIS)

    1986-01-01

    In this chapter the risk quantification practices adopted by the nuclear industries in Germany, Britain and France are examined as representative of the practices adopted throughout Europe. From this examination a number of conclusions are drawn about the common features of the practices adopted. In making this survey, the views expressed in the report of the Task Force on Safety Goals/Objectives appointed by the Commission of the European Communities are taken into account. For each country considered, the legal requirements for the presentation of quantified risk assessment as part of the licensing procedure are examined, and the way in which the requirements have been developed for practical application is then examined. (author)

  20. PCR amplification of repetitive sequences as a possible approach in relative species quantification

    DEFF Research Database (Denmark)

    Ballin, Nicolai Zederkopff; Vogensen, Finn Kvist; Karlsson, Anders H

    2012-01-01

    Both relative and absolute quantification are possible in species quantification when single-copy genomic DNA is used. However, amplification of single-copy genomic DNA does not allow a limit of detection as low as that obtained from amplification of repetitive sequences. Amplification of repetitive sequences is therefore frequently used in absolute quantification, but problems occur in relative quantification, as the number of repetitive sequences is unknown. A promising approach was developed where data from amplification of repetitive sequences were used in relative quantification of species … to relatively quantify the amount of chicken DNA in a binary mixture of chicken DNA and pig DNA. However, the designed PCR primers lack the specificity required for regulatory species control.

  1. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    Science.gov (United States)

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
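    Quantitative 1H NMR of this kind determines analyte mass from the integral ratio against an internal standard of known purity. The sketch below encodes the standard qNMR relation; the standard compound, proton counts and integrals are hypothetical, not taken from the paper.

        def qnmr_mass(i_a, i_std, n_a, n_std, mw_a, mw_std, mass_std, purity_std):
            # analyte mass = (Ia/Istd) * (Nstd/Na) * (Ma/Mstd) * m_std * P_std
            return (i_a / i_std) * (n_std / n_a) * (mw_a / mw_std) * mass_std * purity_std

        # hypothetical: anatoxin-a (165.2 g/mol, 1-proton signal) vs. maleic acid
        # (116.1 g/mol, 2-proton signal), 50 ug of standard at 99.9% purity
        mass = qnmr_mass(0.84, 1.00, 1, 2, 165.2, 116.1, 50e-6, 0.999)
        print(f"{mass * 1e6:.0f} ug of anatoxin-a")  # ~119 ug, i.e. microgram quantities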

  2. Standardless quantification by parameter optimization in electron probe microanalysis

    International Nuclear Information System (INIS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-01-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. Highlights: a method for standardless quantification in EPMA is presented; it gives better results than the commercial software GENESIS Spectrum and the software DTSA; it allows the determination of the conductive-coating thickness; it gives an estimation of the concentration uncertainties.
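    The core idea above, minimizing the quadratic differences between an experimental spectrum and an analytical function of adjustable parameters, is ordinary least-squares fitting. A toy sketch with scipy on a single Gaussian peak over a linear background; the model is a placeholder, far simpler than POEMA's physical spectrum description.

        import numpy as np
        from scipy.optimize import least_squares

        energy = np.linspace(0.0, 10.0, 200)

        def model(params, x):
            height, center, width, b0, b1 = params
            return height * np.exp(-0.5 * ((x - center) / width) ** 2) + b0 + b1 * x

        rng = np.random.default_rng(0)
        observed = model([100.0, 5.0, 0.3, 2.0, 0.1], energy) + rng.normal(0.0, 1.0, energy.size)

        # minimize the quadratic differences between experiment and analytical prediction
        fit = least_squares(lambda p: model(p, energy) - observed,
                            x0=[80.0, 4.8, 0.5, 0.0, 0.0])
        print(fit.x)  # recovered parameters, close to [100, 5, 0.3, 2, 0.1]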

  3. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
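    The graph description of a thallus can be made concrete with a toy example: nodes carry coordinates, edges are thallus segments, and lengths and branching angles fall out of the geometry. This generic sketch (hypothetical coordinates, not MorphoSnake itself) illustrates the node-and-edge measurements:

        import math
        import networkx as nx

        # hypothetical thallus skeleton: node -> (x, y)
        coords = {0: (0.0, 0.0), 1: (0.0, 2.0), 2: (-1.0, 3.2), 3: (1.2, 3.1)}
        G = nx.Graph([(0, 1), (1, 2), (1, 3)])

        def edge_length(u, v):
            (x1, y1), (x2, y2) = coords[u], coords[v]
            return math.hypot(x2 - x1, y2 - y1)

        def branch_angle(center, a, b):
            # angle between the two daughter segments at a branching node
            ang = lambda u: math.atan2(coords[u][1] - coords[center][1],
                                       coords[u][0] - coords[center][0])
            return math.degrees(abs(ang(a) - ang(b)))

        branching_nodes = [n for n in G if G.degree[n] >= 3]  # here: node 1
        for u, v in G.edges:
            print(f"segment {u}-{v}: length {edge_length(u, v):.2f}")
        print(f"branching angle at node 1: {branch_angle(1, 2, 3):.1f} degrees")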

  4. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination for the quantification of carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Virus detection and quantification using electrical parameters

    Science.gov (United States)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count tracked the absolute value of the ratio between the relative change in dopant concentration (virus suspension versus mock) and the relative change in Debye volume (virus suspension versus mock). The virus type was identified by constructing a concentration-mobility relationship, which is unique for each kind of virus, allowing for fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique, and the results obtained by the two approaches corroborated well. We further demonstrate that the electrical technique can be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical identification and quantification of an unlimited number of viruses and other nano-sized particles.
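    The stated empirical relation reduces to a one-liner: the count is the magnitude of the relative dopant-concentration change divided by the relative Debye-volume change, each taken against the mock control. A minimal sketch with hypothetical measured values (all numbers and units are illustrative):

        def estimate_virus_count(n_virus, n_mock, vd_virus, vd_mock):
            # |(relative change in dopant concentration) / (relative change in Debye volume)|
            rel_dopant = (n_virus - n_mock) / n_mock
            rel_debye = (vd_virus - vd_mock) / vd_mock
            return abs(rel_dopant / rel_debye)

        # hypothetical extracted semiconductor parameters (arbitrary units)
        print(estimate_virus_count(n_virus=1.25e15, n_mock=1.00e15,
                                   vd_virus=0.95e-18, vd_mock=1.00e-18))  # -> 5.0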

  6. Resident away rotations allow adaptive neurosurgical training.

    Science.gov (United States)

    Gephart, Melanie Hayden; Derstine, Pamela; Oyesiku, Nelson M; Grady, M Sean; Burchiel, Kim; Batjer, H Hunt; Popp, A John; Barbaro, Nicholas M

    2015-04-01

    Subspecialization of physicians and regional centers concentrate the volume of certain rare cases into fewer hospitals. Consequently, the primary institution of a neurological surgery training program may not have sufficient case volume to meet the current Residency Review Committee case minimum requirements in some areas. To ensure the competency of graduating residents through a comprehensive neurosurgical education, programs may need for residents to travel to outside institutions for exposure to cases that are either less common or more regionally focused. We sought to evaluate off-site rotations to better understand the changing demographics and needs of resident education. This would also allow prospective monitoring of modifications to the neurosurgery training landscape. We completed a survey of neurosurgery program directors and query of data from the Accreditation Council of Graduate Medical Education to characterize the current use of away rotations in neurosurgical education of residents. We found that 20% of programs have mandatory away rotations, most commonly for exposure to pediatric, functional, peripheral nerve, or trauma cases. Most of these rotations are done during postgraduate year 3 to 6, lasting 1 to 15 months. Twenty-six programs have 2 to 3 participating sites and 41 have 4 to 6 sites distinct from the host program. Programs frequently offset potential financial harm to residents rotating at a distant site by support of housing and transportation costs. As medical systems experience fluctuating treatment paradigms and demographics, over time, more residency programs may adapt to meet the Accreditation Council of Graduate Medical Education case minimum requirements through the implementation of away rotations.

  7. Assessing allowable take of migratory birds

    Science.gov (United States)

    Runge, M.C.; Sauer, J.R.; Avery, M.L.; Blackwell, B.F.; Koneff, M.D.

    2009-01-01

    Legal removal of migratory birds from the wild occurs for several reasons, including subsistence, sport harvest, damage control, and the pet trade. We argue that harvest theory provides the basis for assessing the impact of authorized take, advance a simplified rendering of harvest theory known as potential biological removal as a useful starting point for assessing take, and demonstrate this approach with a case study of depredation control of black vultures (Coragyps atratus) in Virginia, USA. Based on data from the North American Breeding Bird Survey and other sources, we estimated that the black vulture population in Virginia was 91,190 (95% credible interval = 44,520–212,100) in 2006. Using a simple population model and available estimates of life-history parameters, we estimated the intrinsic rate of growth (rmax) to be in the range 7–14%, with 10.6% a plausible point estimate. For a take program to seek an equilibrium population size on the conservative side of the yield curve, the rate of take needs to be less than that which achieves a maximum sustained yield (0.5 × rmax). Based on the point estimate for rmax and using the lower 60% credible interval for population size to account for uncertainty, these conditions would be met if the take of black vultures in Virginia in 2006 was < 3,533 birds. Based on regular monitoring data, allowable harvest should be adjusted annually to reflect changes in population size. To initiate discussion about how this assessment framework could be related to the laws and regulations that govern authorization of such take, we suggest that the Migratory Bird Treaty Act requires only that take of native migratory birds be sustainable in the long term, that is, the sustained harvest rate should be < rmax. Further, the ratio of desired harvest rate to 0.5 × rmax may be a useful metric for ascertaining the applicability of specific requirements of the National Environmental Protection Act.
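    The potential-biological-removal logic above reduces to a one-line calculation: allowable take is at most half of rmax times a conservative population estimate. The sketch below reproduces the Virginia black vulture figure; the lower-bound population of about 66,700 is back-calculated from the reported 3,533 and is illustrative, not a number quoted in the record.

        def allowable_take(r_max, n_conservative):
            # take rate must stay below the maximum-sustained-yield rate of 0.5 * rmax
            return 0.5 * r_max * n_conservative

        print(round(allowable_take(0.106, 66_700)))  # ~3,535 birds, matching the ~3,533 reported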

  8. Molecular quantification of environmental DNA using microfluidics and digital PCR.

    Science.gov (United States)

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2012-09-01

    Real-time PCR has been widely used to evaluate gene abundance in natural microbial habitats. However, PCR-inhibitory substances often reduce the efficiency of PCR, leading to the underestimation of target gene copy numbers. Digital PCR using microfluidics is a new approach that allows absolute quantification of DNA molecules. In this study, digital PCR was applied to environmental samples, and the effect of PCR inhibitors on DNA quantification was tested. In the control experiment using λ DNA and humic acids, underestimation of λ DNA at 1/4400 of the theoretical value was observed with 6.58 ng/μL humic acids. In contrast, digital PCR provided accurate quantification data with concentrations of humic acids up to 9.34 ng/μL. The inhibitory effect of paddy field soil extract on quantification of the archaeal 16S rRNA gene was also tested. By diluting the DNA extract, the copy numbers quantified by real-time PCR and digital PCR became similar, indicating that dilution is a useful way to remedy PCR inhibition. The dilution strategy was, however, not applicable to all natural environmental samples. For example, when marine subsurface sediment samples were tested, the copy number of archaeal 16S rRNA genes was 1.04×10^3 copies/g-sediment by digital PCR, whereas real-time PCR only yielded 4.64×10^2 copies/g-sediment, most likely due to an inhibitory effect. The data from this study demonstrate that inhibitory substances had little effect on DNA quantification using microfluidics and digital PCR, and show the great advantages of digital PCR for accurate quantification of DNA extracted from various microbial habitats. Copyright © 2012 Elsevier GmbH. All rights reserved.
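    Digital PCR gets its absolute quantification from Poisson statistics on the fraction of positive partitions, which is why inhibitors that merely slow amplification matter less than in real-time PCR. A minimal sketch of that standard correction, with illustrative partition counts and volume (not values from the study):

        import math

        def dpcr_copies_per_ul(positive, total, partition_volume_ul):
            # mean copies per partition: lambda = -ln(fraction of negative partitions)
            lam = -math.log(1.0 - positive / total)
            return lam / partition_volume_ul

        print(f"{dpcr_copies_per_ul(312, 770, 6e-4):.0f} copies/uL")  # ~866 copies/uL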

  9. Clean Air Markets - Allowances Query Wizard

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Allowances Query Wizard is part of a suite of Clean Air Markets-related tools that are accessible at http://camddataandmaps.epa.gov/gdm/index.cfm. The Allowances...

  10. Allowance Holdings and Transfers Data Inventory

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Allowance Holdings and Transfers Data Inventory contains measured data on holdings and transactions of allowances under the NOx Budget Trading Program (NBP), a...

  11. 49 CFR 19.27 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...

  12. 29 CFR 95.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... Governments.” The allowability of costs incurred by non-profit organizations is determined in accordance with... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...

  13. 24 CFR 84.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... to the entity incurring the costs. Thus, allowability of costs incurred by State, local or federally..., “Cost Principles for State and Local Governments.” The allowability of costs incurred by non-profit...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...

  14. 7 CFR 550.25 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... cost principles applicable to the entity incurring the costs. Thus, allowability of costs incurred by... at 2 CFR part 225. The allowability of costs incurred by non-profit organizations is determined in... at 2 CFR part 230. The allowability of costs incurred by institutions of higher education is...

  15. 36 CFR 1210.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...

  16. 7 CFR 3019.27 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... applicable to the entity incurring the costs. Thus, allowability of costs incurred by State, local or... Circular A-87, “Cost Principles for State and Local Governments.” The allowability of costs incurred by non... Principles for Non-Profit Organizations.” The allowability of costs incurred by institutions of higher...

  17. 46 CFR 154.428 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.428 Section 154.428 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS FOR... § 154.428 Allowable stress. The membrane tank and the supporting insulation must have allowable stresses...

  18. 46 CFR 154.447 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.447 Section 154.447 Shipping COAST... Tank Type B § 154.447 Allowable stress. (a) An independent tank type B designed from bodies of revolution must have allowable stresses 3 determined by the following formulae: 3 See Appendix B for stress...

  19. 42 CFR 417.802 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 3 2010-10-01 2010-10-01 false Allowable costs. 417.802 Section 417.802 Public... PLANS Health Care Prepayment Plans § 417.802 Allowable costs. (a) General rule. The costs that are considered allowable for HCPP reimbursement are the same as those for reasonable cost HMOs and CMPs specified...

  20. 45 CFR 1180.56 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Allowable costs. 1180.56 Section 1180.56 Public... by a Grantee General Administrative Responsibilities § 1180.56 Allowable costs. (a) Determination of costs allowable under a grant is made in accordance with government-wide cost principles in applicable...

  1. 50 CFR 85.41 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false Allowable costs. 85.41 Section 85.41... Use/Acceptance of Funds § 85.41 Allowable costs. (a) Allowable grant costs are limited to those costs... applicable Federal cost principles in 43 CFR 12.60(b). Purchase of informational signs, program signs, and...

  2. 34 CFR 675.33 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... costs. An institution's share of allowable costs may be in cash or in the form of services. The... 34 Education 3 2010-07-01 2010-07-01 false Allowable costs. 675.33 Section 675.33 Education... costs. (a)(1) Allowable and unallowable costs. Except as provided in paragraph (a)(2) of this section...

  3. Rapid quantification of biomarkers during kerogen microscale pyrolysis

    Energy Technology Data Exchange (ETDEWEB)

    Stott, A.W.; Abbott, G.D. [Fossil Fuels and Environmental Geochemistry NRG, The University, Newcastle-upon-Tyne (United Kingdom)

    1995-02-01

    A rapid, reproducible method incorporating closed-system microscale pyrolysis and thermal desorption-gas chromatography/mass spectrometry has been developed and applied to the quantification of sterane biomarkers released during pyrolysis of the Messel oil shale kerogen under confined conditions. This method allows a substantial experimental concentration-time data set to be collected at accurately controlled temperatures, owing to the low thermal inertia of the microscale borosilicate glass reaction vessels, which facilitates kinetic studies of biomarker reactions during kerogen microscale pyrolysis.
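    Concentration-time data of this kind are typically reduced to rate constants, and then to Arrhenius parameters across temperatures. The sketch below fits a first-order decay at a single temperature; the data are made up and the kinetic scheme is generic, not the authors'.

        import numpy as np

        # illustrative biomarker concentration-time data at one pyrolysis temperature
        t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])       # hours
        c = np.array([1.00, 0.78, 0.61, 0.37, 0.14])  # normalized concentration

        # first-order kinetics: ln C = ln C0 - k*t, so the slope of a linear fit is -k
        slope, intercept = np.polyfit(t, np.log(c), 1)
        print(f"first-order rate constant k = {-slope:.3f} per hour")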

  4. Quantification of miRNAs by a simple and specific qPCR method

    DEFF Research Database (Denmark)

    Cirera Salicio, Susanna; Busk, Peter K.

    2014-01-01

    MicroRNAs (miRNAs) are powerful regulators of gene expression at the posttranscriptional level and play important roles in many biological processes and in disease. The rapid pace of the emerging field of miRNAs has opened new avenues for the development of techniques to quantitatively determine mi… in miRNA quantification. Furthermore, the method is easy to perform with common laboratory reagents, which allows miRNA quantification at low cost.
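    Records like this one typically report relative miRNA levels; the standard way to express them from qPCR data is the 2^-ΔΔCq method, sketched below with hypothetical Cq values (the record does not give the authors' exact calculation).

        def fold_change(cq_mirna_sample, cq_ref_sample, cq_mirna_control, cq_ref_control):
            # delta-delta-Cq: normalize to a reference RNA, then to the control condition
            d_sample = cq_mirna_sample - cq_ref_sample
            d_control = cq_mirna_control - cq_ref_control
            return 2.0 ** -(d_sample - d_control)

        print(f"{fold_change(24.1, 18.0, 26.3, 18.1):.1f}-fold change")  # ~4.3-fold up-regulation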

  5. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements, which captures both functional and nonfunctional requirements, and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified.
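    The fault-tree propagation mentioned above has simple mechanics: under independence, an AND gate multiplies input probabilities and an OR gate complements the product of complements. A minimal two-gate illustration with assumed failure-mode probabilities (not FASRE's actual trees):

        from functools import reduce

        def and_gate(probs):
            # all inputs must fail (independence assumed)
            return reduce(lambda acc, p: acc * p, probs, 1.0)

        def or_gate(probs):
            # at least one input fails (independence assumed)
            return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

        # hypothetical failure-mode probabilities for two functions and one attribute
        p_system = or_gate([and_gate([1e-3, 2e-2]), 5e-4])
        print(f"system-level failure probability ~ {p_system:.2e}")  # ~5.2e-04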

  6. NGS Survey Control Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NGS Survey Control Map provides a map of the US which allows you to find and display geodetic survey control points stored in the database of the National...

  7. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
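    LOD and LOQ figures like those above are commonly derived from the calibration curve as 3.3*sigma/S and 10*sigma/S, with sigma the residual standard deviation and S the slope. The sketch below applies that convention to made-up calibration data; it is one standard convention, not necessarily the authors' validation procedure.

        import numpy as np

        # hypothetical calibration: lipid concentration (mg/mL) vs. ELSD peak area
        conc = np.array([0.1, 0.25, 0.5, 1.0, 2.0])
        area = np.array([12.0, 31.5, 60.2, 122.8, 247.1])

        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)  # residual standard deviation of the linear fit

        print(f"LOD ~ {3.3 * sigma / slope:.3f} mg/mL, LOQ ~ {10 * sigma / slope:.3f} mg/mL")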

  8. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C; 5C and Hi-C) and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChiA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR; ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. DdPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.

  9. Collagen Quantification in Tissue Specimens.

    Science.gov (United States)

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  10. 42 CFR 61.8 - Benefits: Stipends; dependency allowances; travel allowances; vacation.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Benefits: Stipends; dependency allowances; travel...; dependency allowances; travel allowances; vacation. Individuals awarded regular fellowships shall be entitled...) Stipend. (b) Dependency allowances. (c) When authorized in advance, separate allowances for travel. Such...

  11. 42 CFR 61.9 - Payments: Stipends; dependency allowances; travel allowances.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Payments: Stipends; dependency allowances; travel... FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.9 Payments: Stipends; dependency allowances; travel allowances. Payments for stipends, dependency allowances, and the travel allowances...

  12. Good quantification practices of flavours and fragrances by mass spectrometry.

    Science.gov (United States)

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given.This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  13. Collaborative framework for PIV uncertainty quantification: comparative assessment of methods

    International Nuclear Information System (INIS)

    Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard

    2015-01-01

    A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)

  14. 33 CFR 136.235 - Compensation allowable.

    Science.gov (United States)

    2010-07-01

    ... allowable. The amount of compensation allowable is limited to the actual net reduction or loss of earnings or profits suffered. Calculations for net reductions or losses must clearly reflect adjustments for... available; (d) Any saved overhead or normal expenses not incurred as a result of the incident; and (e) State...

  15. 10 CFR 600.317 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... to the type of entity incurring the cost as follows: (1) For-profit organizations. Allowability of costs incurred by for-profit organizations and those nonprofit organizations listed in Attachment C to... specifically authorized in the award document. (2) Other types of organizations. Allowability of costs incurred...

  16. 29 CFR 97.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. [53 FR 8069, 8087... LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 97.22 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  17. 22 CFR 135.22 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... Procedures, or uniform cost accounting standards that comply with cost principles acceptable to the Federal... AGREEMENTS TO STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 135.22 Allowable... principles. For each kind of organization, there is a set of Federal principles for determining allowable...

  18. 34 CFR 74.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... Procedures or uniform cost accounting standards that comply with cost principles acceptable to ED. (b) The... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial... principles for determining allowable costs. Allowability of costs are determined in accordance with the cost...

  19. 44 CFR 13.22 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 13.22 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  20. 24 CFR 85.22 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... TRIBAL GOVERNMENTS Post-Award Requirements Financial Administration § 85.22 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  1. 36 CFR 1207.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... GOVERNMENTS Post-Award Requirements Financial Administration § 1207.22 Allowable costs. (a) Limitation on use... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  2. 32 CFR 33.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... Post-Award Requirements Financial Administration § 33.22 Allowable costs. (a) Limitation on use of... allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization...

  3. 20 CFR 631.84 - Allowable projects.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable projects. 631.84 Section 631.84... THE JOB TRAINING PARTNERSHIP ACT Disaster Relief Employment Assistance § 631.84 Allowable projects...) Shall be used exclusively to provide employment on projects that provide food, clothing, shelter and...

  4. 45 CFR 2541.220 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in...

  5. 15 CFR 14.27 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NON-PROFIT, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 14.27 Allowable costs. For each kind of... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...

  6. 45 CFR 2543.27 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 2543.27 Allowable costs. For each kind... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...

  7. 28 CFR 70.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... AND AGREEMENTS (INCLUDING SUBAWARDS) WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 70.27 Allowable costs. (a... Organizations.” The allowability of costs incurred by institutions of higher education is determined in...

  8. 38 CFR 49.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 49.27 Allowable...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...

  9. 20 CFR 435.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NON-PROFIT ORGANIZATIONS, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 435.27 Allowable costs. For each kind... Organizations.” (c) Allowability of costs incurred by institutions of higher education is determined in...

  10. 40 CFR 30.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 30.27 Allowable...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...

  11. Utility allowed returns and market extremes

    International Nuclear Information System (INIS)

    Murry, D.A.; Nan, G.D.; Harrington, B.M.

    1993-01-01

    In recent years interest rates have fluctuated from exceptionally high levels in the early 1980s to their current levels, the lowest in two decades. Observers and analysts generally have assumed that allowed returns by regulatory commissions follow the movement of interest rates; indeed some analysts use a risk premium method to estimate the cost of common equity, assuming a constant and linear relationship between interest rates and the cost of common equity. That suggests we could expect a relatively stable relationship between interest rates and allowed returns, as well. However, a simple comparison of allowed returns and interest rates shows that this is not the case in recent years. The relationship between market interest rates and the returns allowed by commissions varies and is obviously a great deal more complicated. Empirically, there appears to be only a narrow range where market interest rates significantly affect the allowed returns on common stock set by state commissions, at least for electric and combination utilities. If rates are at historically low levels, allowed returns based largely on market rates will hasten subsequent rate filings, and commissions appear to look beyond the low rate levels. Conversely, it appears that regulators do not let historically high market rates determine allowed returns either. At either high or low interest levels, caution seems to be the policy

  12. The future(s) of emission allowances

    International Nuclear Information System (INIS)

    Rosenzweig, K.M.; Villarreal, J.A.

    1993-01-01

The Clean Air Act Amendments of 1990 (CAAA) established a sulfur dioxide emission allowance system to be implemented by the US Environmental Protection Agency (EPA). Under the two-phase implementation of the program, electric utilities responsible for approximately 70 percent of SO₂ emissions in the United States will be issued emission allowances, each representing authorization to emit one ton of sulfur dioxide during a specified calendar year or a later year. Allowances will be issued to utilities with electric-generating units affected by the CAAA limits, as well as to certain entities which may choose to opt in to the program. Each utility or other emission source must hold a number of allowances at least equal to its total SO₂ emissions during any given year. Unused allowances may be sold, traded, or held in inventory for use against SO₂ emissions in future years. Anyone can buy and hold allowances, including affected utilities, non-utility companies, SO₂ allowance brokers and dealers, environmental groups, and individuals. During Phase I of the program, allowances equivalent to approximately 6.4 million tons of SO₂ emissions will be allocated annually to a group of 110 large, high-SO₂-emitting power plants. In Phase II, virtually all power-generating utilities (representing approximately 99.4 percent of total US utility emissions) will be subject to the program. The number of allowances issued will increase to approximately 8.9 million a year, with certain special allocations raising the actual number issued to 9.48 million between the years 2000 and 2009, and 8.95 million yearly thereafter. Thus, the CAAA goal of annual emissions of 9 million tons should be achieved by 2010, when virtually all US emission sources will be participating in the program.
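
    The compliance rule described above is simple ledger arithmetic: each year's holdings (new allocation plus banked surplus) must cover that year's emissions, with any remainder carried forward. A minimal sketch with illustrative figures, not taken from the article:

```python
# Illustrative SO2 allowance ledger: each allowance authorizes one ton of
# SO2 in its vintage year or any later year; surpluses are banked.
def run_ledger(annual_allocation, annual_emissions):
    bank = 0  # unused allowances carried forward from earlier years
    for year, (alloc, emitted) in enumerate(
            zip(annual_allocation, annual_emissions), start=1):
        bank += alloc - emitted
        if bank < 0:
            raise RuntimeError(f"Year {year}: holdings short by {-bank} tons")
        print(f"Year {year}: allocated {alloc}, emitted {emitted}, banked {bank}")

# Hypothetical utility: banks 20 tons in year 1, draws the bank down in year 2.
run_ledger(annual_allocation=[100, 100, 100], annual_emissions=[80, 110, 95])
```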

  13. Keynes, family allowances and Keynesian economic policy

    OpenAIRE

    Pressman, Steven

    2014-01-01

    This paper provides a short history of family allowances and documents the fact that Keynes supported family allowances as early as the 1920s, continuing through the 1930s and early 1940s. Keynes saw this policy as a way to help households raise their children and also as a way to increase consumption without reducing business investment. The paper goes on to argue that a policy of family allowances is consistent with Keynesian economics. Finally, the paper uses the Luxembourg Income Study to...

  14. A fast and robust hepatocyte quantification algorithm including vein processing

    Directory of Open Access Journals (Sweden)

    Homeyer André

    2010-03-01

Background: Quantification of different types of cells is often needed for the analysis of histological images. In our project, we compute the relative number of proliferating hepatocytes for the evaluation of the regeneration process after partial hepatectomy in normal rat livers. Results: Our automatic approach for hepatocyte (HC) quantification is suitable for the analysis of an entire digitized histological section given in the form of a series of images. It is the main part of an automatic hepatocyte quantification tool that computes the ratio between the number of proliferating HC-nuclei and the total number of all HC-nuclei for a series of images in one processing run. The processing pipeline delivers valuable results for a wide range of images with different properties without additional parameter adjustment. Comparing the obtained segmentation results with a manually retrieved segmentation mask considered to be the ground truth, we achieve sensitivity above 90% and a false positive fraction below 15%. Conclusions: The proposed automatic procedure gives results with high sensitivity and a low false positive fraction and can be applied to process entire stained sections.
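
    The reported figures come from comparing the automatic segmentation against a manual ground-truth mask. A minimal sketch of that evaluation; the false-positive-fraction convention below is one common choice and may differ from the paper's:

```python
import numpy as np

def evaluate_segmentation(predicted, ground_truth):
    """Compare a binary segmentation mask against a manual ground-truth mask.
    Sensitivity = TP / (TP + FN); the false positive fraction is taken here
    as FP / (TP + FP), i.e. the share of detections that are spurious."""
    predicted = predicted.astype(bool)
    ground_truth = ground_truth.astype(bool)
    tp = np.count_nonzero(predicted & ground_truth)
    fn = np.count_nonzero(~predicted & ground_truth)
    fp = np.count_nonzero(predicted & ~ground_truth)
    return tp / (tp + fn), fp / (tp + fp)

# The end point reported for a whole section is the proliferation ratio:
def proliferation_ratio(n_proliferating_nuclei, n_total_nuclei):
    return n_proliferating_nuclei / n_total_nuclei
```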

  15. Maximum allowable load on wheeled mobile manipulators

    International Nuclear Information System (INIS)

    Habibnejad Korayem, M.; Ghariblu, H.

    2003-01-01

This paper develops a computational technique for finding the maximum allowable load of a mobile manipulator during a given trajectory. The maximum allowable loads which can be achieved by a mobile manipulator during a given trajectory are limited by a number of factors; the dynamic properties of the mobile base and the mounted manipulator, their actuator limitations, and the additional constraints applied to resolve the redundancy are probably the most important. To resolve the extra degrees of freedom introduced by base mobility, additional constraint functions are proposed directly in the task space of the mobile manipulator. Finally, in two numerical examples involving a two-link planar manipulator mounted on a differentially driven mobile base, application of the method to determining the maximum allowable load is verified. The simulation results demonstrate that the maximum allowable load on a desired trajectory does not have a unique value but depends directly on the additional constraint functions applied to resolve the motion redundancy.
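
    Because payload mass enters the manipulator dynamics linearly, one conservative way to realize such a computation is to evaluate the torque headroom at each trajectory point; the sketch below works under that assumption and is not the authors' exact formulation:

```python
import numpy as np

def max_allowable_load(tau_no_load, dtau_dm, tau_limits):
    """Conservative maximum payload along a trajectory.

    Payload mass enters the dynamics linearly, so required joint torques
    can be modeled as tau(t) = tau_no_load(t) + dtau_dm(t) * m.
    Arrays are (n_points, n_joints); tau_limits is (n_joints,)."""
    headroom = tau_limits - np.abs(tau_no_load)         # remaining torque margin
    bounds = headroom / np.maximum(np.abs(dtau_dm), 1e-12)
    return float(bounds.min())                          # tightest bound wins

# Hypothetical two-joint example with three trajectory points.
tau0 = np.array([[5.0, 2.0], [8.0, 3.0], [6.0, 1.0]])  # N*m without payload
dtdm = np.array([[0.9, 0.4], [1.2, 0.5], [1.0, 0.3]])  # N*m per kg of payload
print(max_allowable_load(tau0, dtdm, tau_limits=np.array([20.0, 10.0])))  # 10 kg
```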

  16. 33 CFR 136.223 - Compensation allowable.

    Science.gov (United States)

    2010-07-01

    ...) MARINE POLLUTION FINANCIAL RESPONSIBILITY AND COMPENSATION OIL SPILL LIABILITY TRUST FUND; CLAIMS... allowable under paragraph (a) of this section must be reduced by— (1) All compensation made available to the... under § 136.235. Government Revenues ...

  17. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
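
    A qPCR library quantification of this kind typically fits a standard curve of quantification cycle (Cq) against the log concentration of known standards and inverts it for the unknown library; a minimal sketch with illustrative numbers, not those of the protocol:

```python
import numpy as np

# Known standards: concentrations in pM and their measured Cq values.
std_conc = np.array([20.0, 2.0, 0.2, 0.02])
std_cq = np.array([12.1, 15.5, 18.9, 22.3])

# Fit Cq = slope * log10(conc) + intercept.
slope, intercept = np.polyfit(np.log10(std_conc), std_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1  # amplification efficiency sanity check

def quantify(cq, dilution_factor=1.0):
    """Invert the standard curve to get library concentration in pM."""
    return 10 ** ((cq - intercept) / slope) * dilution_factor

print(f"efficiency {efficiency:.1%}, unknown at Cq 17.0: {quantify(17.0):.2f} pM")
```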

  18. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, Karam; Andersen, Irene Klærke; Gesmar, Henrik

    2001-01-01

Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice-selective (ss) inversion slab is usually three to four times thicker than the imaging slice. However, this reduces perfusion sensitivity due to the increased transit delay of the incoming blood with unperturbed spins. In the present article, the dependence of the magnetization on the RF pulse slice profiles is inspected both theoretically and experimentally. A perfusion quantification model is presented that allows the use of thinner ss inversion slabs by taking into account the offset of RF slice profiles between ss and nonselective inversion slabs. This model was tested in both phantom and human studies. Magn Reson Med 46:193-197, 2001.

  19. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation of the concentration uncertainties.
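
    The core loop is ordinary nonlinear least squares: adjust the model parameters until the analytical spectrum matches the measured one, then read parameter uncertainties off the Jacobian. A generic sketch; the placeholder model below stands in for POEMA's much more detailed physical prediction:

```python
import numpy as np
from scipy.optimize import least_squares

def model_spectrum(energies, params):
    # Placeholder analytical prediction: exponential bremsstrahlung background
    # plus one Gaussian characteristic line (POEMA's real model is far richer).
    bg_a, bg_b, line_area, line_center, line_width = params
    background = bg_a * np.exp(-bg_b * energies)
    line = line_area * np.exp(-0.5 * ((energies - line_center) / line_width) ** 2)
    return background + line

def fit_spectrum(energies, measured, p0):
    residuals = lambda p: model_spectrum(energies, p) - measured
    result = least_squares(residuals, p0)
    # Approximate parameter uncertainties from the Jacobian at the optimum.
    _, s, vt = np.linalg.svd(result.jac, full_matrices=False)
    cov = vt.T @ np.diag(1.0 / s**2) @ vt
    cov *= (result.fun @ result.fun) / max(len(measured) - len(p0), 1)
    return result.x, np.sqrt(np.diag(cov))  # best-fit parameters, 1-sigma errors
```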

  20. Emission allowances -- Long-term price trend

    International Nuclear Information System (INIS)

    Lennox, F.H.

    1994-01-01

Estimated trends in emission allowance (EA) values have been of interest to all those affected by the Clean Air Act Amendments of 1990 since it became law in 1990. The authors published estimates of the values of EAs in December 1991 and revised their estimate in November 1992. The summary trend of the 1992 estimate is shown here. General estimates such as these are no longer useful. Everyone directly involved in complying with the Act or in buying and selling allowances has developed their own outlook on EA values. Many recent trades have been publicized. The prices from the first auction are also well known. Therefore this article is concerned only with what might happen in the long run. Once Phase 2 compliance is essentially complete and emissions roughly match emission allowance allocations of some 9.8 million tons annually, what pressures will there be on prices? What will be the direction of values after Phase 2 is in balance?

  1. Tradable allowances in a restructuring electric industry

    International Nuclear Information System (INIS)

    Tschirhart, J.

    1999-01-01

The SO₂ tradable allowance program has been introduced into an electric industry undergoing dramatic changes. Entry of nonutilities into the industry and the emergence of stranded costs are two major changes that are shown to have an impact on the market for allowances and the industry's incentives to switch to cleaner fuels. The degree of impact depends on the extent to which consumers bypass traditional utilities and buy from entrants, and on public utility commission policies regarding the recovery of stranded costs. In turn, the amount of stranded costs depends on fuel switching. The results follow from simulations of a two-utility model that illustrate the qualitative effects of changing policies.

  2. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    Science.gov (United States)

    2017-11-27

The first of these introductory sections is an overview of UQ and its various methods. The second of these discusses issues pertaining to the use of UQ... can be readily assessed, as well as the variance or other statistical measures of the distribution of parameters. The uncertainty in the parameters is... statistics of the outputs of these methods, such as the moments of the probability distributions of model outputs. The module does not explicitly support...

  3. 22 CFR 226.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... Relations AGENCY FOR INTERNATIONAL DEVELOPMENT ADMINISTRATION OF ASSISTANCE AWARDS TO U.S. NON-GOVERNMENTAL ORGANIZATIONS Post-award Requirements Financial and Program Management § 226.27 Allowable costs. For each kind... organizations is determined in accordance with the provisions of OMB Circular A-122, “Cost Principles for Non...

  4. 13 CFR 143.22 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... to that circular 48 CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting... Financial Administration § 143.22 Allowable costs. (a) Limitation on use of funds. Grant funds may be used... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of...

  5. 38 CFR 43.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... Requirements Financial Administration § 43.22 Allowable costs. (a) Limitation on use of funds. Grant funds may... the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a...

  6. 29 CFR 1470.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... to that circular 48 CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting... Financial Administration § 1470.22 Allowable costs. (a) Limitation on use of funds. Grant funds may be used... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of...

  7. 40 CFR 31.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... Requirements Financial Administration § 31.22 Allowable costs. (a) Limitation on use of funds. Grant funds may... the grantee or sub-grantee. (b) Applicable cost principles. For each kind of organization, there is a...

  8. 34 CFR 80.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply... COOPERATIVE AGREEMENTS TO STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 80.22... kind of organization, there is a set of Federal principles for determining allowable costs. For the...

  9. 45 CFR 92.22 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... to that circular 48 CFR Part 31. Contract Cost Principles and Procedures, or uniform cost accounting... Financial Administration § 92.22 Allowable costs. (a) Limitation on use of funds. Grant funds may be used... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of...

  10. 7 CFR 3016.22 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... Regulations of the Department of Agriculture (Continued) OFFICE OF THE CHIEF FINANCIAL OFFICER, DEPARTMENT OF... GOVERNMENTS Post-Award Requirements Financial Administration § 3016.22 Allowable costs. (a) Limitation on use...

  11. 10 CFR 440.18 - Allowable expenditures.

    Science.gov (United States)

    2010-01-01

    ... part for labor, weatherization materials, and related matters for a renewable energy system, shall not... beginning in calendar year 2010 and the $3,000 average for renewable energy systems will be adjusted... 10 Energy 3 2010-01-01 2010-01-01 false Allowable expenditures. 440.18 Section 440.18 Energy...

  12. 42 CFR 417.534 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... typical “provider” costs, and costs (such as marketing, enrollment, membership, and operation of the HMO... principles applicable to provider costs, as set forth in § 417.536. (2) The allowability of other costs is determined in accordance with principles set forth in §§ 417.538 through 417.550. (3) Costs for covered...

  13. 44 CFR 295.21 - Allowable compensation.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Allowable compensation. 295.21 Section 295.21 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT... no-cost crisis counseling services available in the community. FEMA will not reimburse for treatment...

  14. 38 CFR 21.260 - Subsistence allowance.

    Science.gov (United States)

    2010-07-01

    ... rehabilitation facility or sheltered workshop; independent instructor; institutional non-farm cooperative: Full...) VOCATIONAL REHABILITATION AND EDUCATION Vocational Rehabilitation and Employment Under 38 U.S.C. Chapter 31... rehabilitation program under 38 U.S.C. Chapter 31 will receive a monthly subsistence allowance at the rates in...

  15. 43 CFR 12.62 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... COST PRINCIPLES FOR ASSISTANCE PROGRAMS Uniform Administrative Requirements for Grants and Cooperative... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  16. 45 CFR 74.27 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... FOR AWARDS AND SUBAWARDS TO INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, OTHER NONPROFIT ORGANIZATIONS, AND COMMERCIAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 74.27... Organizations” and paragraph (b) of this section. The allowability of costs incurred by institutions of higher...

  17. 22 CFR 145.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... Relations DEPARTMENT OF STATE CIVIL RIGHTS GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 145...-Profit Organizations.” The allowability of costs incurred by institutions of higher education is...

  18. 22 CFR 518.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... INSTITUTIONS OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial and Program Management § 518.27 Allowable costs. For each kind of recipient, there is a set of... by institutions of higher education is determined in accordance with the provisions of OMB Circular A...

  19. 33 CFR 136.217 - Compensation allowable.

    Science.gov (United States)

    2010-07-01

    ...) MARINE POLLUTION FINANCIAL RESPONSIBILITY AND COMPENSATION OIL SPILL LIABILITY TRUST FUND; CLAIMS PROCEDURES; DESIGNATION OF SOURCE; AND ADVERTISEMENT Procedures for Particular Claims § 136.217 Compensation... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Compensation allowable. 136.217...

  20. 33 CFR 136.205 - Compensation allowable.

    Science.gov (United States)

    2010-07-01

    ...) MARINE POLLUTION FINANCIAL RESPONSIBILITY AND COMPENSATION OIL SPILL LIABILITY TRUST FUND; CLAIMS PROCEDURES; DESIGNATION OF SOURCE; AND ADVERTISEMENT Procedures for Particular Claims § 136.205 Compensation... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Compensation allowable. 136.205...

  1. 20 CFR 632.37 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... otherwise indicated below, direct and indirect costs shall be charged in accordance with 41 CFR 29-70 and 41... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable costs. 632.37 Section 632.37 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR INDIAN AND NATIVE AMERICAN...

  2. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, the Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than... Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA... standard curve in the Quantifiler Human DNA Quantification kit, the DNA quantification results of the human DNA preparations were 31% higher than expected based on the manufacturers' information. The results indicate a calibration problem with the Quantifiler human DNA standard for its use...

  3. Super-allowed Fermi beta-decay

    International Nuclear Information System (INIS)

    Wilkinson, D.H.

    2005-01-01

A final analysis of $J^\pi = 0^+ \rightarrow 0^+$ super-allowed Fermi transitions yields $|V_{ud}|^2 = 0.9500 \pm 0.0007$ and $|V_{ud}|^2 + |V_{us}|^2 + |V_{ub}|^2 = 0.9999 \pm 0.0011$, with the operational vector coupling constant $G_V^*/(\hbar c)^3 = (1.15052 \pm 0.00021) \times 10^{-5}\,\mathrm{GeV}^{-2}$.

  4. Making It Personal: Per Capita Carbon Allowances

    DEFF Research Database (Denmark)

    Fawcett, Tina; Hvelplund, Frede; Meyer, Niels I

    2009-01-01

The chapter highlights the importance of introducing new, efficient schemes for the mitigation of global warming. One such scheme is Personal Carbon Allowances (PCA), whereby individuals are allotted a tradable ration of CO₂ emission per year. This chapter reviews the fundamentals of PCA and analyzes its merits and problems. The United Kingdom and Denmark have been chosen as case studies because the energy situation and the institutional setup are quite different between the two countries.

  5. Cross recurrence quantification for cover song identification

    Energy Technology Data Exchange (ETDEWEB)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G [Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, 08018 Barcelona (Spain)], E-mail: joan.serraj@upf.edu

    2009-09-15

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
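
    The object at the heart of the method, a cross recurrence plot, is just a thresholded distance matrix between the delay-embedded states of two signals; a minimal sketch, with embedding and threshold parameters chosen purely for illustration:

```python
import numpy as np

def embed(x, dim=3, tau=2):
    """Time-delay embedding of a 1-D signal into dim-dimensional states."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

def cross_recurrence_plot(x, y, dim=3, tau=2, eps=0.5):
    """CRP[i, j] = 1 where state i of x is within eps of state j of y."""
    X, Y = embed(x, dim, tau), embed(y, dim, tau)
    dists = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return (dists < eps).astype(np.uint8)

t = np.linspace(0, 20 * np.pi, 800)
crp = cross_recurrence_plot(np.sin(t), np.sin(1.05 * t))  # a "cover" of sin(t)
```

    The quantification measure proposed in the paper then tracks curved and disrupted diagonal traces across such a matrix.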

  6. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost which quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling at massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance 73% of theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
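
    The first ingredient, stochastic estimation of the diagonal of an inverse matrix, can be sketched with Rademacher probe vectors: solve A x = v for a modest number of random v and average v ⊙ x elementwise. Plain conjugate gradients stands in below for the paper's mixed-precision iterative refinement:

```python
import numpy as np
from scipy.sparse.linalg import cg

def estimate_inv_diagonal(A, n_probes=64, seed=0):
    """Stochastic estimate of diag(A^{-1}) for symmetric positive definite A.
    One iterative solve A x = v per Rademacher probe v."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(A.shape[0])
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=A.shape[0])
        x, info = cg(A, v)      # x ~ A^{-1} v, solved iteratively
        acc += v * x            # elementwise v * (A^{-1} v)
    return acc / n_probes       # E[v * A^{-1} v] = diag(A^{-1})

# Small sanity check against a direct inverse.
A = np.diag(np.linspace(1.0, 5.0, 6)) + 0.1
print(estimate_inv_diagonal(A, n_probes=256))
print(np.diag(np.linalg.inv(A)))
```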

  7. Cross recurrence quantification for cover song identification

    International Nuclear Information System (INIS)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G

    2009-01-01

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  8. Quantification of eDNA shedding rates from invasive bighead carp Hypophthalmichthys nobilis and silver carp Hypophthalmichthys molitrix

    Science.gov (United States)

    Klymus, Katy E.; Richter, Catherine A.; Chapman, Duane C.; Paukert, Craig P.

    2015-01-01

    Wildlife managers can more easily mitigate the effects of invasive species if action takes place before a population becomes established. Such early detection requires sensitive survey tools that can detect low numbers of individuals. Due to their high sensitivity, environmental DNA (eDNA) surveys hold promise as an early detection method for aquatic invasive species. Quantification of eDNA amounts may also provide data on species abundance and timing of an organism’s presence, allowing managers to successfully combat the spread of ecologically damaging species. To better understand the link between eDNA and an organism’s presence, it is crucial to know how eDNA is shed into the environment. Our study used quantitative PCR (qPCR) and controlled laboratory experiments to measure the amount of eDNA that two species of invasive bigheaded carps (Hypophthalmichthys nobilis and Hypophthalmichthys molitrix) shed into the water. We first measured how much eDNA a single fish sheds and the variability of these measurements. Then, in a series of manipulative lab experiments, we studied how temperature, biomass (grams of fish), and diet affect the shedding rate of eDNA by these fish. We found that eDNA amounts exhibit a positive relationship with fish biomass, and that feeding could increase the amount of eDNA shed by ten-fold, whereas water temperature did not have an effect. Our results demonstrate that quantification of eDNA may be useful for predicting carp density, as well as densities of other rare or invasive species.

  9. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    Science.gov (United States)

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

In this study, paper-based microfluidic devices (μPADs) capable of K⁺ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K⁺ in the presence of Na⁺, Li⁺, and Mg²⁺ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K⁺ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K⁺. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K⁺ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  10. The role of PET quantification in cardiovascular imaging.

    Science.gov (United States)

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (¹⁸FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for ¹³N-ammonia, ¹⁵O-water and ⁸²Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of ⁸²Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as ¹⁸F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of ¹⁸FDG and ¹⁸F-sodium fluoride tracers in carotids, aorta and coronary arteries

  11. Strategy study of quantification harmonization of SUV in PET/CT images

    International Nuclear Information System (INIS)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-01-01

In clinical practice, PET/CT images are often analyzed qualitatively by visual comparison of tumor lesion and normal tissue uptake, and semi-quantitatively by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantification information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy to harmonize the quantification of PET/CT images performed with scanners of different models and manufacturers. For this purpose, a survey of the technical characteristics of the equipment and the acquisition protocols for clinical images at different PET/CT services in the state of Rio Grande do Sul was conducted. For each scanner, the accuracy of SUV quantification and the Recovery Coefficient (RC) curves were determined, using the clinically relevant and available reconstruction parameters. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the most appropriate reconstruction parameters to harmonize SUV quantification in each scanner, either regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization through modification of the reconstruction parameters, and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies. Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
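
    For reference, the SUV being harmonized here is the body-weight-normalized ratio of tissue activity concentration to injected dose; a one-line computation, assuming decay correction of the dose to scan time has been done upstream:

```python
def suv_bw(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight SUV: activity concentration / (injected dose / weight).
    kBq/mL divided by MBq/kg is dimensionless because 1 MBq/kg == 1 kBq/g
    and 1 mL of tissue is taken as 1 g."""
    return tissue_kbq_per_ml / (injected_dose_mbq / body_weight_kg)

print(suv_bw(tissue_kbq_per_ml=5.0, injected_dose_mbq=370, body_weight_kg=74))  # 1.0
```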

  12. Surveying Future Surveys

    Science.gov (United States)

    Carlstrom, John E.

    2016-06-01

    The now standard model of cosmology has been tested and refined by the analysis of increasingly sensitive, large astronomical surveys, especially with statistically significant millimeter-wave surveys of the cosmic microwave background and optical surveys of the distribution of galaxies. This talk will offer a glimpse of the future, which promises an acceleration of this trend with cosmological information coming from new surveys across the electromagnetic spectrum as well as particles and even gravitational waves.

  13. Aspect-Oriented Programming is Quantification and Obliviousness

    Science.gov (United States)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.
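
    The "quantified assertion over oblivious code" idea can be made concrete in a few lines: the base functions below know nothing about the logging advice, while the aspect quantifies over all of them at once. A toy Python sketch standing in for a real AOP system:

```python
import types

# Base program, written with no knowledge of the aspect (obliviousness).
base = types.SimpleNamespace(
    add=lambda a, b: a + b,
    mul=lambda a, b: a * b,
)

def log_calls(ns):
    """Aspect: a quantified assertion over *all* functions in ns,
    attaching logging advice without touching their source."""
    for name, fn in list(vars(ns).items()):
        def wrapped(*args, _fn=fn, _name=name, **kwargs):
            print(f"calling {_name}{args}")
            return _fn(*args, **kwargs)
        setattr(ns, name, wrapped)

log_calls(base)        # the quantification step combines advice with base code
print(base.add(2, 3))  # advice fires, then 5 is returned
```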

  14. Comparison of Suitability of the Most Common Ancient DNA Quantification Methods.

    Science.gov (United States)

    Brzobohatá, Kristýna; Drozdová, Eva; Smutný, Jiří; Zeman, Tomáš; Beňuš, Radoslav

    2017-04-01

Ancient DNA (aDNA) extracted from historical bones is damaged and fragmented into short segments, present in low quantity, and usually copurified with microbial DNA. A wide range of DNA quantification methods are available. The aim of this study was to compare the five most common DNA quantification methods for aDNA. Quantification methods were tested on DNA extracted from skeletal material originating from an early medieval burial site. The tested methods included ultraviolet (UV) absorbance, real-time quantitative polymerase chain reaction (qPCR) based on SYBR® Green detection, real-time qPCR based on a forensic kit, quantification via fluorescent dyes bonded to DNA, and fragmentary analysis. Differences between groups were tested using a paired t-test. Methods that measure total DNA present in the sample (NanoDrop™ UV spectrophotometer and Qubit® fluorometer) showed the highest concentrations. Methods based on real-time qPCR underestimated the quantity of aDNA. The most accurate method of aDNA quantification was fragmentary analysis, which also allows DNA quantification of the desired length and is not affected by PCR inhibitors. Methods based on the quantification of the total amount of DNA in samples are unsuitable for ancient samples as they overestimate the amount of DNA, presumably due to the presence of microbial DNA. Real-time qPCR methods give undervalued results due to DNA damage and the presence of PCR inhibitors. DNA quantification methods based on fragment analysis show not only the quantity of DNA but also fragment length.

  15. Development of a test method that will allow evaluation and quantification of the effects of healing on asphalt mixture [summary].

    Science.gov (United States)

    2012-01-01

    Top-down cracking in flexible pavement is one of the most common and crucial modes of pavement distress in Florida, reducing both service quality and life of flexible pavement. The process begins with micro-cracks (micro-damage), which grow and merge...

  16. Dermatologic radiotherapy and thyroid cancer. Dose measurements and risk quantification

    International Nuclear Information System (INIS)

    Goldschmidt, H.; Gorson, R.O.; Lassen, M.

    1983-01-01

    Thyroid doses for various dermatologic radiation techniques were measured with thermoluminescent dosimeters and ionization rate meters in an Alderson-Rando anthropomorphic phantom. The effects of changes in radiation quality and of the use or nonuse of treatment cones and thyroid shields were evaluated in detail. The results indicate that the potential risk of radiogenic thyroid cancer is very small when proper radiation protection measures are used. The probability of radiogenic thyroid cancer developing and the potential mortality risk were assessed quantitatively for each measurement. The quantification of radiation risks allows comparisons with risks of other therapeutic modalities and the common hazards of daily life

  17. Sulfur dioxide allowances. Trading and technological progress

    International Nuclear Information System (INIS)

    Kumar, Surender; Managi, Shunsuke

    2010-01-01

The US Clean Air Act Amendments introduce an emissions trading system to regulate SO₂ emissions. This study finds that changes in SO₂ emissions prices are related to innovations induced by these amendments. We find that electricity-generating plants are able to increase electricity output and reduce emissions of SO₂ and NOₓ from 1995 to 2007 due to the introduction of the allowance trading system. However, compared to the approximate 8% per year of exogenous technological progress, the induced effect is relatively small, and the contribution of the induced effect to overall technological progress is about 1-2%. (author)

  18. What parents say about the allowance: Function of the allowance for parents of different economic incomes

    Directory of Open Access Journals (Sweden)

    Irani Lauer Lellis

    2012-06-01

The practice of giving an allowance is used by many parents in different parts of the world and can contribute to the economic education of children. This study aimed to investigate the purposes of the allowance with 32 parents of varying incomes. We used the focus group technique and the Alceste software to analyze the data. The results involved two classes related to the process of using the allowance. These classes covered aspects of the allowance's role in socialization and education, serving as an instrument of reward but sometimes encouraging bad habits in children. Parents' justifications concerning the amount of money to give their children and when to stop giving an allowance were also highlighted. Keywords: allowance; economic socialization; parenting practices.

  19. Development of Quantification Method for Bioluminescence Imaging

    International Nuclear Information System (INIS)

    Kim, Hyeon Sik; Min, Jung Joon; Lee, Byeong Il; Choi, Eun Seo; Tak, Yoon O; Choi, Heung Kook; Lee, Ju Young

    2009-01-01

Optical molecular luminescence imaging is widely used for the detection and imaging of bio-photons emitted by luminescent luciferase activation. The photons measured with this method indicate the degree of molecular alteration or the number of cells, with the advantage of a high signal-to-noise ratio. To extract useful information from the measured results, analysis based on a proper quantification method is necessary. In this research, we propose a quantification method that presents a linear response of the measured light signal to measurement time. We detected the luminescence signal using lab-made optical imaging equipment, the animal light imaging system (ALIS), and two different kinds of light sources. One is a set of three bacterial light-emitting sources containing different numbers of bacteria. The other is a set of three different non-bacterial sources emitting very weak light. By using the concepts of the candela and the flux, we could derive a simplified linear quantification formula. After experimentally measuring the light intensity, the data were processed with the proposed quantification function. We obtained a linear response of photon counts to measurement time by applying the pre-determined quantification function. The ratio of the re-calculated photon counts to measurement time presents a constant value even when different light sources are applied. The quantification function for linear response could be applicable to a standard quantification process. The proposed method could be used for exact quantitative analysis in various light imaging equipment exhibiting a linear response of constant light-emitting sources to measurement time.

  20. Quantification of residual host cell DNA in adenoviral vectors produced on PER.C6 cells

    NARCIS (Netherlands)

    Gijsbers, Linda; Koel, Björn; Weggeman, Miranda; Goudsmit, Jaap; Havenga, Menzo; Marzio, Giuseppe

    2005-01-01

    Recombinant adenoviral vectors for gene therapy and vaccination are routinely prepared on cultures of immortalized cells, allowing the production of vector batches of high titer and consistent quality. Quantification of residual DNA from the producing cell line is part of the purity tests for

  1. Automatic Drusen Quantification and Risk Assessment of Age-related Macular Degeneration on Color Fundus Images

    NARCIS (Netherlands)

    Grinsven, M.J.J.P. van; Lechanteur, Y.T.E.; Ven, J.P.H. van de; Ginneken, B. van; Hoyng, C.B.; Theelen, T.; Sanchez, C.I.

    2013-01-01

    PURPOSE: To evaluate a machine learning algorithm that allows for computer aided diagnosis (CAD) of non-advanced age-related macular degeneration (AMD) by providing an accurate detection and quantification of drusen location, area and size. METHODS: Color fundus photographs of 407 eyes without AMD

  2. A Novel Assay for Easy and Rapid Quantification of Helicobacter pylori Adhesion

    DEFF Research Database (Denmark)

    Skindersoe, Mette E; Rasmussen, Lone; Andersen, Leif P

    2015-01-01

    BACKGROUND: Reducing adhesion of Helicobacter pylori to gastric epithelial cells could be a new way to counteract infections with this organism. We here present a novel method for quantification of Helicobacter pylori adhesion to cells. METHODS: Helicobacter pylori is allowed to adhere to AGS...

  3. 34 CFR 656.30 - What are allowable costs and limitations on allowable costs?

    Science.gov (United States)

    2010-07-01

    ... FOREIGN LANGUAGE AND AREA STUDIES OR FOREIGN LANGUAGE AND INTERNATIONAL STUDIES What Conditions Must Be... 34 Education 3 2010-07-01 2010-07-01 false What are allowable costs and limitations on allowable costs? 656.30 Section 656.30 Education Regulations of the Offices of the Department of Education...

  4. 40 CFR 82.8 - Grant of essential use allowances and critical use allowances.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Grant of essential use allowances and critical use allowances. 82.8 Section 82.8 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Albemarle Bill Clark Pest Control, Inc. Burnside Services, Inc. Cardinal Professional Products Chemtura Corp...

  5. 75 FR 14442 - Federal Travel Regulation (FTR); Relocation Allowances-Relocation Income Tax Allowance (RITA) Tables

    Science.gov (United States)

    2010-03-25

    ... GENERAL SERVICES ADMINISTRATION [GSA Bulletin FTR 10-04] Federal Travel Regulation (FTR); Relocation Allowances-- Relocation Income Tax Allowance (RITA) Tables AGENCY: Office of Governmentwide Policy... (73 FR 35952) specifying that GSA would no longer publish the RITA tables found in 41 CFR Part 301-17...

  6. 76 FR 32340 - Federal Travel Regulation; Temporary Duty (TDY) Travel Allowances (Taxes); Relocation Allowances...

    Science.gov (United States)

    2011-06-06

    ... reflection of the actual tax impact on the employee. Therefore, this proposed rule offers the one-year RITA... to estimate the additional income tax liability that you incur as a result of relocation benefits and... Allowances (Taxes); Relocation Allowances (Taxes) AGENCY: Office of Governmentwide Policy (OGP), General...

  7. Allowable Pressure In Soils and Rocks by Seismic Wave Velocities

    International Nuclear Information System (INIS)

    Tezcan, S.; Keceli, A.; Oezdemir, Z.

    2007-01-01

Firstly, the historical background is presented for the determination of the ultimate bearing capacity of shallow foundations. The principles of plastic equilibrium used in the classical formulation of the ultimate bearing capacity are reviewed, followed by a discussion of the sources of approximation inherent in the classical theory. Secondly, based on a variety of case histories of site investigations, including extensive borehole data, laboratory testing and geophysical prospecting, an empirical formulation is proposed for the determination of the allowable bearing capacity of shallow foundations. The proposed expression corroborates consistently with the results of the classical theory and is proven to be reliable and safe, also from the viewpoint of maximum allowable settlements. It consists of only two soil parameters, namely the in situ measured shear wave velocity and the unit weight. The unit weight may also be determined with sufficient accuracy by means of another empirical expression, using the P-wave velocity. It is indicated that once the shear and P-wave velocities are measured in situ by an appropriate geophysical survey, the allowable bearing capacity is determined reliably through a single-step operation. Such an approach is considerably cost- and time-saving in practice.
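
    The shape of such a two-parameter computation can be sketched as below; the coefficients gamma0, k1, k2 and the safety factor are illustrative placeholders, not the paper's calibrated values, which should be taken from the original text:

```python
def unit_weight_from_vp(vp_m_s, gamma0_kn_m3=16.0, k1=0.002):
    """Empirical unit weight (kN/m^3) from P-wave velocity (m/s).
    gamma0 and k1 are illustrative stand-ins for the paper's constants."""
    return gamma0_kn_m3 + k1 * vp_m_s

def allowable_pressure(vs_m_s, gamma_kn_m3, k2=0.1, safety_factor=4.0):
    """Allowable bearing pressure (kPa) proportional to gamma * Vs;
    k2 and the safety factor are placeholders for the paper's calibration."""
    return k2 * gamma_kn_m3 * vs_m_s / safety_factor

gamma = unit_weight_from_vp(1400.0)      # ~18.8 kN/m^3 with these placeholders
print(allowable_pressure(250.0, gamma))  # ~117 kPa for a hypothetical soft soil
```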

  8. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  9. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  10. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  11. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

Jun 15, 2011 ... alternative for the quantification of the disease's syndromes in regard to this crop. The result of these ... comparison of treatments such as cultivars or control measures and ... vascular discoloration and stem necrosis.

  12. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.; Curioni, A.; Fedulova, I.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost

  13. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    Science.gov (United States)

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    Science.gov (United States)

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase-chain-reaction (qPCR) is a standard technique in most laboratories used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows, 5 web-based, 9 R-based and 5 tools from other platforms. Reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview about quantification strategies, and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adapt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  15. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields, as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods (quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, and gradient-enhanced versions of Kriging, radial basis functions and point collocation polynomial chaos) in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al '17]. For modeling we used the TAU code, developed at DLR, Germany.
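
    At its simplest, propagating such input uncertainty means sampling the uncertain inputs and collecting statistics of the outputs; a quasi-Monte Carlo sketch in which a thin-airfoil formula stands in for a full TAU solve:

```python
import numpy as np
from scipy.stats import qmc

def lift_coefficient(alpha_deg, mach):
    """Placeholder aerodynamic model (Prandtl-Glauert-corrected thin airfoil);
    in the study each sample would be a full RANS solve with the TAU code."""
    return 2 * np.pi * np.radians(alpha_deg) / np.sqrt(np.abs(1 - mach**2))

sampler = qmc.Sobol(d=2, scramble=True, seed=0)
u = sampler.random_base2(m=10)              # 1024 qMC points in [0, 1)^2
alpha = 2.0 + 0.5 * (u[:, 0] - 0.5)         # uncertain angle of attack (deg)
mach = 0.73 + 0.02 * (u[:, 1] - 0.5)        # uncertain Mach number
cl = lift_coefficient(alpha, mach)
print(f"mean CL = {cl.mean():.4f}, std = {cl.std():.4f}")
```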

  16. Stochastic approach for radionuclides quantification

    Science.gov (United States)

    Clement, A.; Saurel, N.; Perrin, G.

    2018-01-01

Gamma spectrometry is a passive non-destructive assay used to quantify radionuclides present in more or less complex objects. Basic methods using empirical calibration with a standard, in order to quantify the activity of nuclear materials by determining a calibration coefficient, are useless on non-reproducible, complex and singular nuclear objects such as waste packages. Package specifications such as composition or geometry change from one package to another and involve a high variability of objects. The current quantification process uses numerical modelling of the measured scene with the few available data, such as geometry or composition. These data are density, material, screen, geometric shape, matrix composition, and matrix and source distribution. Some of them depend strongly on knowledge of the package data and on operator background. The French Commissariat à l'Energie Atomique (CEA) is developing a new methodology to quantify nuclear materials in waste packages and waste drums without operator adjustment and without knowledge of the internal package configuration. This method combines a global stochastic approach which uses, among others, surrogate models available to simulate the gamma attenuation behaviour; a Bayesian approach which considers conditional probability densities of the problem inputs; and Markov chain Monte Carlo (MCMC) algorithms which solve the inverse problem, given the gamma-ray emission radionuclide spectrum and the outside dimensions of the objects of interest. The methodology is being tested by quantifying actinide activity in standards of different matrix, composition, and source configuration with known actinide masses, locations and distributions. Activity uncertainties are taken into account by this adjustment methodology.
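
    The described combination (surrogate forward model, Bayesian priors, MCMC inversion) can be sketched with a random-walk Metropolis sampler inferring a single activity from a measured count rate; the forward model and all numbers below are placeholders, not CEA's surrogate:

```python
import numpy as np

def log_posterior(activity, measured_rate, attenuation=0.3, sigma=5.0):
    """Gaussian likelihood around a toy forward model (rate = attenuation *
    activity) with a flat positivity prior; a stand-in for the surrogate."""
    if activity <= 0:
        return -np.inf
    predicted = attenuation * activity
    return -0.5 * ((measured_rate - predicted) / sigma) ** 2

def metropolis(measured_rate, n_steps=20000, step=2.0, seed=0):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    current, lp = 100.0, log_posterior(100.0, measured_rate)
    for i in range(n_steps):
        prop = current + step * rng.standard_normal()
        lp_prop = log_posterior(prop, measured_rate)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis acceptance
            current, lp = prop, lp_prop
        chain[i] = current
    return chain[n_steps // 2:]                   # discard burn-in

samples = metropolis(measured_rate=30.0)
print(samples.mean(), samples.std())  # activity estimate with its uncertainty
```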

  17. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
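
    For orientation, the linear special case of this conditional-expectation update coincides with the familiar Kalman-gain formula x_a = x_f + K (y - H x_f), with K = C_f H^T (H C_f H^T + R)^{-1}. The sketch below applies it to a two-dimensional toy state; all numbers are illustrative, and the quadratic update discussed in the record is not shown.

        import numpy as np

        # Sampling-free linear Bayesian update in Kalman-gain form.
        x_f = np.array([1.0, 0.0])                 # prior (forecast) mean
        C_f = np.array([[1.0, 0.3], [0.3, 0.5]])   # prior covariance
        H = np.array([[1.0, 0.0]])                 # observation operator
        R = np.array([[0.1]])                      # observation-noise covariance
        y = np.array([1.4])                        # measurement

        S = H @ C_f @ H.T + R                      # innovation covariance
        K = C_f @ H.T @ np.linalg.inv(S)           # gain
        x_a = x_f + K @ (y - H @ x_f)              # updated (analysis) mean
        C_a = (np.eye(2) - K @ H) @ C_f            # updated covariance

        print("posterior mean:", x_a)
        print("posterior covariance:\n", C_a)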

  18. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  19. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ), the propagation of uncertainty through a computational (forward) model, are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  20. Mixture quantification using PLS in plastic scintillation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bagan, H.; Tarancon, A.; Rauret, G. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Garcia, J.F., E-mail: jfgarcia@ub.ed [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain)

    2011-06-15

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares; PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made with this purpose in mind using liquid scintillation (LS), none had used PS, which has the great advantage of not producing mixed waste after the measurements are performed. With this objective, ternary mixtures of alpha and beta emitters (241Am, 137Cs and 90Sr/90Y) were quantified. Procedure optimisation evaluated the use of the net spectra or the sample spectra, the inclusion of spectra obtained at different values of the pulse shape analysis parameter, and the application of the PLS1 or PLS2 algorithm. The conclusions show that PS+PLS2 applied to the sample spectra, without any pulse shape discrimination, allows quantification of the activities with relative errors below 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and its application does not require detectors that include the pulse shape analysis parameter.
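
    The PLS2 step, which regresses all activities from a spectrum at once, can be reproduced with scikit-learn. In the sketch below the three emitter spectra are synthetic Gaussian shapes and the activities are random; only the overall workflow (fit on mixtures of known activity, predict an unknown mixture) mirrors the record.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)

        # Synthetic stand-in for plastic-scintillation spectra: three emitters
        # with fixed spectral shapes, mixed with random activities plus noise.
        channels = np.arange(256)
        shapes = np.stack([np.exp(-0.5 * ((channels - c) / w) ** 2)
                           for c, w in [(40, 8), (120, 20), (200, 30)]])

        activities = rng.uniform(0, 100, size=(60, 3))   # training activities
        spectra = activities @ shapes + rng.normal(0, 0.5, (60, 256))

        # PLS2 regresses all three activities simultaneously from the spectra.
        pls = PLSRegression(n_components=6)
        pls.fit(spectra, activities)

        test_act = np.array([[30.0, 60.0, 10.0]])
        test_spec = test_act @ shapes + rng.normal(0, 0.5, (1, 256))
        print("true:", test_act[0], "predicted:", pls.predict(test_spec)[0].round(1))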

  1. Spectroscopic quantification of 5-hydroxymethylcytosine in genomic DNA.

    Science.gov (United States)

    Shahal, Tamar; Gilat, Noa; Michaeli, Yael; Redy-Keisar, Orit; Shabat, Doron; Ebenstein, Yuval

    2014-08-19

    5-Hydroxymethylcytosine (5hmC), a modified form of the DNA base cytosine, is an important epigenetic mark linked to the regulation of gene expression in development and tumorigenesis. We have developed a spectroscopic method for global quantification of 5hmC in genomic DNA. The assay is performed within a multiwell plate, which allows simultaneous recording of up to 350 samples. Our 5hmC quantification procedure is direct, simple, and rapid. It relies on a two-step protocol that consists of enzymatic glucosylation of 5hmC with an azide-modified glucose, followed by a "click reaction" with an alkyne-fluorescent tag. The fluorescence intensity recorded from the DNA sample is proportional to its 5hmC content and can be quantified by a simple plate-reader measurement. This labeling technique is specific and highly sensitive, allowing detection of 5hmC down to 0.002% of the total nucleotides. Our results reveal significant variations in the 5hmC content obtained from different mouse tissues, in agreement with previously reported data.
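
    Because the fluorescence signal is proportional to 5hmC content, quantification reduces to a straight-line calibration against standards. A minimal sketch, with entirely made-up plate-reader numbers:

        import numpy as np

        # Hypothetical calibration: plate-reader fluorescence (a.u.) for
        # standards of known 5hmC content (% of total nucleotides).
        hmc_percent = np.array([0.0, 0.002, 0.01, 0.05, 0.1, 0.2])
        fluorescence = np.array([12.0, 15.1, 27.4, 89.2, 166.0, 321.5])

        # Fluorescence is proportional to 5hmC content, so a line suffices.
        slope, intercept = np.polyfit(hmc_percent, fluorescence, 1)

        unknown_signal = 120.0   # fluorescence of an unknown sample
        print(f"estimated 5hmC: {(unknown_signal - intercept) / slope:.3f} %")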

  2. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    Science.gov (United States)

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolution. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage resulting either from the electron beam or from sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.
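
    Step (i), Hessian-based enhancement of curvilinear structures, is available off the shelf, for example as the Frangi vesselness filter in scikit-image. The sketch below enhances a synthetic noisy line standing in for a tomogram slice; it omits the record's tensor-voting and network-delineation stages.

        import numpy as np
        from skimage.filters import frangi

        rng = np.random.default_rng(0)

        # Synthetic noisy image with one bright "filament" (a diagonal line).
        img = rng.normal(0, 0.1, (128, 128))
        rows = np.arange(128)
        img[rows, np.clip(rows // 2 + 30, 0, 127)] += 1.0

        # Hessian-based ridge (vesselness) filtering enhances thin bright
        # curvilinear structures across a range of scales.
        enhanced = frangi(img, sigmas=range(1, 4), black_ridges=False)
        mask = enhanced > enhanced.mean() + 3 * enhanced.std()
        print(f"pixels flagged as filament: {mask.sum()}")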

  3. Lung involvement quantification in chest radiographs

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Oliveira, Marcela de; Miranda, Jose Ricardo A.; Pina, Diana R.; Pereira, Paulo C.M.; Ribeiro, Sergio M.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an infectious disease that remains a global health problem. Chest radiography is the method commonly employed to assess the evolution of TB. Methods for quantifying chest abnormalities are usually applied to computed tomography (CT) scans. This quantification is important for assessing TB evolution and treatment and for comparing different treatments; however, precise quantification is often not feasible because of the number of CT scans required. The purpose of this work was to develop a methodology for quantifying lung damage caused by TB from chest radiographs. An algorithm for computational processing of the exams was developed in Matlab; it creates a 3D representation of the lungs, with the compromised regions delineated inside. The quantification of lung lesions was also performed for the same patients using CT scans. The measurements from the two methods were compared and showed a strong correlation. Applying Bland-Altman statistical analysis, all samples were within the limits of agreement, with a confidence interval of 95%. The results showed an average variation of around 13% between the two quantification methods. The results suggest the effectiveness and applicability of the method developed, providing a better risk-benefit ratio for the patient and a better cost-benefit ratio for the institution. (author)
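
    The Bland-Altman agreement check used here is a short computation: the bias is the mean of the paired differences and the 95% limits of agreement are the bias plus or minus 1.96 times their standard deviation. A sketch with simulated paired volumes (all values invented):

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical paired lesion-volume estimates (cm^3) from the
        # radiograph method and from CT for the same patients.
        ct = rng.uniform(50, 400, 20)
        radiograph = ct * rng.normal(1.0, 0.13, 20)

        diff = radiograph - ct
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)      # half-width of the 95% limits

        print(f"bias = {bias:.1f} cm^3, limits of agreement = "
              f"[{bias - loa:.1f}, {bias + loa:.1f}] cm^3")
        inside = np.abs(diff - bias) <= loa
        print(f"{inside.sum()}/{len(diff)} samples within the limits")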

  4. Does Confucianism allow for body donation?

    Science.gov (United States)

    Jones, D Gareth; Nie, Jing-Bao

    2018-01-16

    Confucianism has been widely perceived as a major moral and cultural obstacle to the donation of bodies for anatomical purposes. The rationale for this is the Confucian stress on xiao (filial piety), whereby individuals' bodies are to be intact at death. In the view of many, the result is a prohibition on the donation of bodies to anatomy departments for the purpose of dissection. The role of dissection throughout the development of anatomy within a Confucian context is traced, and in contemporary China the establishment of donation programs and the appearance of memorial monuments are noted. In reassessing Confucian attitudes, the stress laid on a particular interpretation of filial piety is questioned, and an attempt is made to balance this with the Confucian emphasis on a moral duty to those outside one's immediate family. The authors argue that the fundamental Confucian norm ren (humaneness or benevolence) allows for body donation, as people have a moral duty to help others. Moreover, the other central Confucian value, li (rites), offers important insights into how body donation should be performed as a communal activity, particularly the necessity of developing ethically and culturally appropriate rituals for body donation. Seen from a Western perspective, it is contended that in all societies the voluntary donation of bodies is a deeply human activity that should reflect the characteristics of the community within which it takes place, in large part because it has educational and personal repercussions for students. Anat Sci Educ. © 2018 American Association of Anatomists.

  5. Brazilian Oral Health Survey (SB Brazil 2003): data do not allow for population estimates, but correction is possible

    Directory of Open Access Journals (Sweden)

    Rejane Christine de Sousa Queiroz

    2009-01-01

    The Brazilian Oral Health Survey (SB Brazil 2003) was the most comprehensive study of oral health conditions ever conducted in Brazil. Probabilistic sampling methods were applied so that the collected data would represent the selected population age groups in the five regions of the country. However, this was not achieved because the sampling process was never concluded; concluding it requires estimating the sample weights and identifying the sample's other structural variables (selection strata and primary sampling units). This article describes the survey's sample design, formulates the inclusion probabilities in the multiple selection stages, and proposes strategies for estimating the sample weights. The solution proposed here for determining the sample weights and identifying the other structural variables draws on the recovery of information that should have been recorded in the reports produced during field work and, where that is absent, on information available from the Brazilian Institute of Geography and Statistics (IBGE) and the Ministry of Education, used as valid approximations.

  6. Quantification of trace metals in water using complexation and filter concentration.

    Science.gov (United States)

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow quantification in the ppm range, but proper pre-concentration of the colored complex on a paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques were examined and compared: filtering the already complexed mixture, complexation on the filter, and dipping a dye-covered filter into the solution. The best quantification was based on the ratio of the filter reflectance at a certain wavelength to that at zero metal concentration. The complex formations studied (Ni ions with TAN and Cd ions with PAN) involve the production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN was investigated and an optimum timing could be found. Kinetic optimization with regard to some interferences has also been suggested.

  7. Direct liquid chromatography method for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines.

    Science.gov (United States)

    Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen

    2011-11-09

    A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 non-pretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.

  8. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities, to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; and Markov chain Monte Carlo (MCMC) [10].
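
    Screening methods such as the method of Morris are among the cheapest entries in this list and need nothing beyond the model itself. The sketch below computes elementary effects along random one-at-a-time trajectories for a made-up three-parameter model; mu* (the mean absolute elementary effect) ranks parameter importance.

        import numpy as np

        rng = np.random.default_rng(0)

        def model(x):
            # Cheap stand-in for an environmental model with three parameters.
            return x[0] ** 2 + 2 * x[1] + 0.1 * np.sin(x[2])

        # Method of Morris: elementary effects from one-at-a-time steps
        # along r random trajectories through the unit hypercube.
        k, r, delta = 3, 50, 0.1
        effects = np.zeros((r, k))
        for t in range(r):
            x = rng.uniform(0, 1 - delta, k)
            base = model(x)
            for i in rng.permutation(k):
                x[i] += delta
                new = model(x)
                effects[t, i] = (new - base) / delta
                base = new

        mu_star = np.abs(effects).mean(axis=0)   # Morris importance measure
        print("Morris mu* per parameter:", mu_star.round(3))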

  9. Identification of Spectral Regions for Quantification of Red Wine Tannins with Fourier Transform Mid-Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Jensen, Jacob Skibsted; Egebo, Max; Meyer, Anne S.

    2008-01-01

    Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from the spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions that would allow quantification of tannins from the spectra using partial least-squares regression. The study included ... to be particularly important for tannin quantification. The spectral regions identified by the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0...

  10. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    Science.gov (United States)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multi-analyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS carries a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work discusses the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs commonly exist at relatively low concentrations, amplification methods (e.g. PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  11. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. Accurate quantification of GBNs is essential for research on material toxicity and for the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlations with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and made it possible to enhance conventional toxicology research by providing a direct correlation between GBN uptake at the single-cell level and cell viability status.

  12. 3D automatic quantification applied to optically sectioned images to improve microscopy analysis

    Directory of Open Access Journals (Sweden)

    JE Diaz-Zamboni

    2009-08-01

    New fluorescence microscopy techniques, such as confocal or digital deconvolution microscopy, make it easy to obtain three-dimensional (3D) information from specimens. However, there are few 3D quantification tools for extracting information from these volumes, so the amount of information acquired by these techniques is difficult to manipulate and analyze manually. The present study describes a model-based method which, for the first time, shows 3D visualization and quantification of fluorescent apoptotic-body signals from optical serial sections of porcine hepatocyte spheroids, correlating them with their morphological structures. The method consists of an algorithm that counts apoptotic bodies in a spheroid structure and extracts information from them, such as their centroids in Cartesian and radial coordinates relative to the spheroid centre, and their integrated intensity. 3D visualization of the extracted information allowed us to quantify the distribution of apoptotic bodies in three different zones of the spheroid.
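
    The core of such an algorithm (threshold, label connected components, then read off per-object centroids, radial positions and integrated intensities) maps directly onto scipy.ndimage. The sketch below does this on a synthetic 3D stack; every blob position and threshold value is invented for the illustration.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(0)

        # Synthetic 3D stack with a few bright blobs standing in for
        # fluorescent apoptotic-body signals inside a spheroid volume.
        vol = rng.normal(0, 0.05, (40, 64, 64))
        for z, y, x in [(10, 20, 20), (25, 40, 30), (30, 15, 50)]:
            vol[z-2:z+2, y-2:y+2, x-2:x+2] += 1.0

        # Threshold, label connected components, extract per-object measures.
        mask = vol > 0.5
        labels, n = ndimage.label(mask)
        centroids = ndimage.center_of_mass(vol, labels, range(1, n + 1))
        intensities = ndimage.sum(vol, labels, range(1, n + 1))

        centre = np.array(vol.shape) / 2
        for c, s in zip(centroids, intensities):
            radius = np.linalg.norm(np.array(c) - centre)  # radial coordinate
            print(f"centroid={tuple(round(v, 1) for v in c)}, "
                  f"radius={radius:.1f}, integrated intensity={s:.1f}")
        print(f"{n} objects counted")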

  13. A performance study on three qPCR quantification kits and their compatibilities with the 6-dye DNA profiling systems.

    Science.gov (United States)

    Lin, Sze-Wah; Li, Christina; Ip, Stephen C Y

    2018-03-01

    DNA quantification plays an integral role in forensic DNA profiling. Not only does it estimate the total amount of amplifiable human autosomal and male DNA to ensure optimal amplification of target DNA for subsequent analysis, but it also assesses the extraction efficiency and purity of the DNA extract. The latest DNA quantification systems even offer an estimate of the degree of DNA degradation in a sample. Here, we report the performance of three new-generation qPCR kits, namely the Investigator® Quantiplex HYres Kit from QIAGEN, the Quantifiler® Trio DNA Quantification Kit from Applied Biosystems™, and the PowerQuant® System from Promega, and their compatibilities with three 6-dye DNA profiling systems. Our results demonstrate that all three kits generate standard curves with satisfactory consistency and reproducibility, and are capable of screening out traces of male DNA in the presence of a 30-fold excess of female DNA. They also exhibit a higher tolerance to PCR inhibition than the Quantifiler® Human DNA Quantification Kit from Applied Biosystems™ in autosomal DNA quantification. PowerQuant®, as compared to Quantiplex HYres and Quantifiler® Trio, shows better precision for both autosomal and male DNA quantification. Quantifiler® Trio and PowerQuant®, in contrast to Quantiplex HYres, offer better correlations with lower discrepancies between autosomal and male DNA quantification, and their additional degradation-index features provide a detection platform for inhibited and/or degraded DNA templates. Regarding the compatibility between these quantification and profiling systems: (1) both Quantifiler® Trio and PowerQuant® work well with GlobalFiler and Fusion 6C, allowing a fairly accurate prediction of their DNA typing results based on the quantification values; (2) Quantiplex HYres offers a fairly reliable IPC system for detecting potential inhibition on Investigator 24plex, whereas Quantifiler® Trio and PowerQuant® are better suited to GlobalFiler and Fusion 6C.

  14. 76 FR 16629 - Federal Travel Regulation (FTR); Relocation Allowances-Relocation Income Tax Allowance (RITA) Tables

    Science.gov (United States)

    2011-03-24

    ... other FTR Bulletins can be found at http://www.gsa.gov/ftrbulletin. The RIT allowance tables are located at http://www.gsa.gov/relocationpolicy. DATES: This notice is effective March 24, 2011. FOR ... CFR part 301-17, Appendices A through D. The tables will be published at http://www.gsa.gov...

  15. 78 FR 26637 - Federal Travel Regulation (FTR); Relocation Allowance-Relocation Income Tax (RIT) Allowable Tables

    Science.gov (United States)

    2013-05-07

    ... The GSA published FTR Amendment 2008-04 in the Federal Register on June 25, 2008 (73 FR 35952), specifying that GSA would no longer publish the RIT Allowance tables in Title 41 of the Code of Federal ..., 2013. Carolyn Austin-Diggs, Principal Deputy Administrator, Office of Asset and Transportation ...

  16. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    Science.gov (United States)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25x higher global warming potential than carbon dioxide on a 100-year basis. Accurate monitoring and mitigation of methane emissions require cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies and one adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow rapid and automatic leak quantification. In particular, we use two-stream deep convolutional networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation by collecting about 20 videos (i.e., 397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10 to 60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, at one of eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that the two-stream ConvNet provides a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features, including texture, plume area, and pattern. The temporal stream, fed by the results of optical-flow analysis, results in an accuracy of 58.3%. The integration of the two streams gives a combined accuracy of 77.6%. For future work, we will split the training and testing datasets in distinct ways in order to test the generalization of the algorithm to different leak sources. Several analytic metrics, including the confusion matrix and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help to find and fix super-emitters and improve the cost-effectiveness of leak detection and repair.
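
    In outline, a two-stream architecture is two small classifiers whose class scores are fused. The PyTorch sketch below mirrors that outline with eight leak-size classes as in the record; the layer sizes, input resolution, and the simple logit-averaging fusion are all assumptions, since the abstract does not specify the architecture.

        import torch
        import torch.nn as nn

        class Stream(nn.Module):
            # Small CNN branch; in_ch=3 for an RGB plume frame (spatial
            # stream), in_ch=2 for a stacked optical-flow field (temporal).
            def __init__(self, in_ch, n_classes=8):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
                    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.head = nn.Linear(32, n_classes)

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        class TwoStream(nn.Module):
            # Late fusion: average the class logits of the two streams.
            def __init__(self, n_classes=8):
                super().__init__()
                self.spatial = Stream(3, n_classes)
                self.temporal = Stream(2, n_classes)

            def forward(self, frame, flow):
                return 0.5 * (self.spatial(frame) + self.temporal(flow))

        # Eight leak-size classes (0 to 140 MCFH); the inputs are dummies.
        model = TwoStream(n_classes=8)
        logits = model(torch.randn(4, 3, 96, 96), torch.randn(4, 2, 96, 96))
        print(logits.argmax(dim=1))   # predicted leak-size class per clip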

  17. The quantification of risk and tourism

    Directory of Open Access Journals (Sweden)

    Piet Croucamp

    2014-01-01

    Tourism in South Africa comprises 9.5% of Gross Domestic Product (GDP) but remains an under-researched industry, especially regarding the quantification of the risks prevailing in the social, political and economic environment in which the industry operates. Risk prediction and extrapolation forecasting are conducted largely within a qualitative methodology. This article reflects on the quantification of social constructs as variables of risk in the tourism industry with reference to South Africa. The theory and methodology of quantification are briefly reviewed, and the indicators of risk are conceptualized and operationalized. The identified indicators are scaled in indices for purposes of quantification. Risk assessments and the quantification of constructs rely heavily on the experience, often personal, of the researcher, and this scholarly endeavour is therefore not inclusive of all possible indicators of risk. It is accepted that tourism in South Africa is an industry comprising a large diversity of sectors, each with a different set of risk indicators and risk profiles. The emphasis of this article is thus on the methodology to be applied to a risk profile. A secondary endeavour is to provide clarity about the conceptual and operational confines of risk in general, as well as how quantified risk relates to the tourism industry. The indices provided include both domestic and international risk indicators. The motivation for the article is to encourage a greater emphasis on quantitative research in our efforts to understand and manage a risk profile for the tourist industry.

  18. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided-wave interactions with delamination damage and develop quantification algorithms based on wavefield data analysis. Trapped guided waves in the delamination region are observed in the wavefield data and further interpreted quantitatively using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate with trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial extent of the new wavenumbers can be identified, providing a useful means not only of detecting the presence of delamination damage but also of estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
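
    The frequency-wavenumber representation at the heart of this analysis is simply a 2D Fourier transform of the space-time wavefield u(x, t). The sketch below synthesizes a wavefield in which a second, larger wavenumber appears past a chosen position, crudely mimicking a trapped mode, and locates the dominant ridge; all wave parameters are invented.

        import numpy as np

        # Space-time wavefield u(x, t): one incident guided-wave mode plus a
        # second, slower mode appearing past x = 0.2 m (a stand-in for waves
        # trapped by a delamination).
        nx, nt, dx, dt = 256, 512, 1e-3, 1e-6
        x = np.arange(nx) * dx
        t = np.arange(nt) * dt
        X, T = np.meshgrid(x, t, indexing="ij")
        k1, k2, w = 800.0, 2000.0, 2 * np.pi * 150e3   # rad/m, rad/m, rad/s
        u = np.sin(k1 * X - w * T) + (X > 0.2) * 0.6 * np.sin(k2 * X - w * T)

        # 2D FFT gives the frequency-wavenumber (f-k) representation; a new
        # wavenumber ridge at the same frequency reveals the trapped mode.
        U = np.fft.fftshift(np.abs(np.fft.fft2(u)))
        k_axis = np.fft.fftshift(np.fft.fftfreq(nx, dx)) * 2 * np.pi
        f_axis = np.fft.fftshift(np.fft.fftfreq(nt, dt))

        ki, fi = np.unravel_index(np.argmax(U), U.shape)
        print(f"dominant ridge near k = {abs(k_axis[ki]):.0f} rad/m, "
              f"f = {abs(f_axis[fi]) / 1e3:.0f} kHz")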

  19. An EPGPT-based approach for uncertainty quantification

    International Nuclear Information System (INIS)

    Wang, C.; Abdel-Khalik, H. S.

    2012-01-01

    Generalized perturbation theory (GPT) has been widely used in many scientific disciplines to perform sensitivity analysis and uncertainty quantification. This manuscript employs recent developments in GPT, collectively referred to as exact-to-precision generalized perturbation theory (EPGPT), to enable uncertainty quantification for computationally challenging models, e.g. nonlinear models associated with many input parameters and many output responses and with general non-Gaussian parameter distributions. The core difference between EPGPT and existing GPT is in the way the problem is formulated. GPT formulates an adjoint problem that is dependent on the response of interest. It tries to capture, via the adjoint solution, the relationship between the response of interest and the constraints on the state variations. EPGPT recasts the problem in terms of a smaller set of what are referred to as the 'active' responses, which are solely dependent on the physics model and the boundary and initial conditions rather than on the responses of interest. The objective of this work is to apply the EPGPT methodology to propagate cross-section variations in typical reactor design calculations. The goal is to illustrate its use and the associated impact for situations where the typical Gaussian assumption for parameter uncertainties is not valid and where nonlinear behavior must be considered. To allow this demonstration, exaggerated variations are employed to stimulate nonlinear behavior in simple prototypical neutronics models. (authors)

  20. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. Realistically assessing uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for the assessments. For example, probabilistic risk assessment (PRA) specifically identifies uncertainties using probability theory, and PRAs therefore lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an uncertainty inventory. An uncertainty inventory should be considered and performed in the earliest stages of UQ.

  1. Complex Empiricism and the Quantification of Uncertainty in Paleoclimate Reconstructions

    Science.gov (United States)

    Brumble, K. C.

    2014-12-01

    Because the global climate cannot be observed directly, and because of vast and noisy data sets, climate science is a rich field in which to study how computational statistics informs what it means to do empirical science. Traditionally held virtues of empirical science and empirical methods, such as reproducibility, independence, and straightforward observation, are complicated by the representational choices involved in statistical modeling and data handling. Examining how climate reconstructions instantiate complicated empirical relationships between model, data, and predictions reveals that the path from data to prediction does not match traditional conceptions of empirical inference either. Rather, the empirical inferences involved are "complex" in that they require the articulation of a good deal of statistical processing wherein assumptions are adopted and representational decisions made, often in the face of substantial uncertainties. Proxy reconstructions are both statistical and paleoclimate-science activities aimed at using a variety of proxies to reconstruct past climate behavior. Paleoclimate proxy reconstructions also involve complex data handling and statistical refinement, leading to the current emphasis in the field on the quantification of uncertainty in reconstructions. In this presentation I explore how the processing needed for the correlation of diverse, large, and messy data sets necessitates the explicit quantification of the uncertainties stemming from wrangling proxies into manageable suites. I also address how semi-empirical pseudo-proxy methods allow for the exploration of signal detection in data sets and serve as intermediary steps for statistical experimentation.

  2. Terahertz identification and quantification of penicillamine enantiomers

    International Nuclear Information System (INIS)

    Ji Te; Zhao Hongwei; Chen Min; Xiao Tiqiao; Han Pengyu

    2013-01-01

    Identification and characterization of L-, D- and DL-penicillamine were demonstrated by terahertz time-domain spectroscopy (THz-TDS). To understand the physical origins of the low-frequency resonant modes, density functional theory (DFT) was adopted for the theoretical calculations. It was found that the collective THz-frequency motions were determined by the intramolecular and intermolecular hydrogen-bond interactions. Moreover, quantification of penicillamine enantiomer mixtures was demonstrated by a THz spectral fitting method with a relative error of less than 3.5%. This technique can be a valuable tool for the discrimination and quantification of chiral drugs in the pharmaceutical industry. (authors)
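
    Spectral fitting of a mixture comes down to expressing the measured spectrum as a non-negative combination of reference spectra. The sketch below does this with non-negative least squares on synthetic Gaussian resonances; the peak positions, widths and true fractions are all invented.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)

        # Synthetic THz absorbance spectra of two pure reference forms
        # (Gaussian resonances at assumed positions) and of a mixture.
        freq = np.linspace(0.2, 2.5, 300)                 # THz
        def peak(c, w):
            return np.exp(-0.5 * ((freq - c) / w) ** 2)
        L_form = peak(0.9, 0.06) + 0.5 * peak(1.8, 0.08)
        D_form = peak(1.1, 0.06) + 0.7 * peak(2.1, 0.08)

        true = np.array([0.65, 0.35])                     # L:D fractions
        mixture = (true[0] * L_form + true[1] * D_form
                   + rng.normal(0, 0.01, freq.size))

        # Non-negative least squares fits the mixture as a weighted sum of
        # the references; the weights estimate the enantiomer fractions.
        coef, _ = nnls(np.column_stack([L_form, D_form]), mixture)
        frac = coef / coef.sum()
        print(f"estimated L fraction: {frac[0]:.3f} (true {true[0]:.2f})")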

  3. Quantification model for energy consumption in edification

    Directory of Open Access Journals (Sweden)

    Mercader, Mª P.

    2012-12-01

    The research presented in this paper focuses on the generation of a model for the quantification of energy consumption in building construction. This is done through one of the most relevant environmental impact indicators associated with the weight per m2 of construction: the energy consumption resulting from the manufacture of the materials used in building construction. The practical application of the proposed model to different building typologies in Seville will provide information regarding the building materials, subsystems and construction elements with the greatest impact, making it possible to observe the influence the built surface has on the environment. The results obtained aim to serve as a reference for the scientific community, providing quantitative data comparable across other building types and geographical areas. Furthermore, the model may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the building types defined in this study.


  4. Benchmarking common quantification strategies for large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Hogrebe, Alexander; von Stechow, Louise; Bekker-Jensen, Dorte B

    2018-01-01

    Comprehensive mass spectrometry (MS)-based proteomics is now feasible, but reproducible quantification remains challenging, especially for post-translational modifications such as phosphorylation. Here, we compare the most popular quantification techniques for global phosphoproteomics: label-free...

  5. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    Science.gov (United States)

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected reaction monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed up this step, particularly because of its multiplexing capacities. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplexed PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found, demonstrating the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464

  6. 49 CFR 230.24 - Maximum allowable stress.

    Science.gov (United States)

    2010-10-01

    49 CFR 230.24, Maximum allowable stress: (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...

  7. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black-and-white densitometry (256 levels of intensity), the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissues and cells. In addition, we provide interested but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
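
    In practice, colour thresholding usually means converting to a hue-based colour space and keeping pixels inside a chosen band. A minimal sketch with scikit-image, using an entirely synthetic "stained" field; the hue, saturation and value cut-offs are arbitrary assumptions.

        import numpy as np
        from skimage.color import rgb2hsv

        rng = np.random.default_rng(0)

        # Synthetic RGB field: bluish "background" tissue plus a patch of
        # reddish "reaction product", standing in for a stained section.
        img = np.zeros((64, 64, 3))
        img[..., 2] = 0.6 + 0.1 * rng.random((64, 64))    # blue background
        img[20:40, 20:40] = [0.8, 0.2, 0.2]               # red patch

        # Threshold in HSV space: hue selects the colour band, while
        # saturation and value reject washed-out or dark pixels.
        hsv = rgb2hsv(img)
        mask = ((hsv[..., 0] < 0.08) | (hsv[..., 0] > 0.92)) & \
               (hsv[..., 1] > 0.3) & (hsv[..., 2] > 0.2)

        print(f"stained area: {100 * mask.mean():.1f} % of the field")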

  8. Recurrence quantification analysis in Liu's attractor

    International Nuclear Information System (INIS)

    Balibrea, Francisco; Caballero, M. Victoria; Molera, Lourdes

    2008-01-01

    Recurrence quantification analysis is used to detect transitions from chaos to periodic states, or from chaos to chaos, in a new dynamical system proposed by Liu et al. This system contains a control parameter in the second equation and was originally introduced to investigate the forming mechanism of the compound structure of the chaotic attractor that exists when the control parameter is zero.
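
    The computation behind such an analysis starts from a recurrence matrix of delay-embedded states and summary measures derived from it. A minimal numpy sketch on a stand-in signal; the embedding parameters, threshold, and the crude determinism estimate are all assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Stand-in scalar series (a noisy sine) in place of one state
        # variable of the Liu system.
        x = np.sin(0.2 * np.arange(500)) + 0.05 * rng.normal(size=500)

        m, tau = 3, 5                      # embedding dimension and delay
        N = len(x) - (m - 1) * tau
        emb = np.column_stack([x[i * tau : i * tau + N] for i in range(m)])

        # Recurrence matrix: pairs of states closer than a threshold.
        dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
        R = dists < 0.1 * dists.max()

        # Recurrence rate, plus a crude determinism estimate counting
        # recurrent points that continue along a diagonal (length >= 2).
        rr = R.mean()
        diag_pairs = R[:-1, :-1] & R[1:, 1:]
        det = 2.0 * diag_pairs.sum() / max(R.sum(), 1)
        print(f"recurrence rate = {rr:.3f}, determinism (approx.) = {det:.3f}")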

  9. Quantification of coating aging using impedance measurements

    NARCIS (Netherlands)

    Westing, E.P.M. van; Weijde, D.H. van der; Vreijling, M.P.W.; Ferrari, G.M.; Wit, J.H.W. de

    1998-01-01

    This chapter shows the application results of a novel approach to quantify the ageing of organic coatings using impedance measurements. The ageing quantification is based on the typical impedance behaviour of barrier coatings in immersion. This immersion behaviour is used to determine the limiting ...

  10. Quantification analysis of CT for aphasic patients

    International Nuclear Information System (INIS)

    Watanabe, Shunzo; Ooyama, Hiroshi; Hojo, Kei; Tasaki, Hiroichi; Hanazono, Toshihide; Sato, Tokijiro; Metoki, Hirobumi; Totsuka, Motokichi; Oosumi, Noboru.

    1987-01-01

    Using a microcomputer, the locus and extent of the lesions demonstrated by computed tomography in 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and the types of aphasia were investigated on slices 3, 4, 5, and 6 using quantification theory type 3 (pattern analysis). Some regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st and 2nd components of quantification theory type 3. On the other hand, the group with global aphasia lay between the groups with Broca's and Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia lay near those with Wernicke's aphasia. These results serve as a basis for applying quantification theory type 2 (discriminant analysis) and quantification theory type 1 (regression analysis). (author)

  11. Quantification analysis of CT for aphasic patients

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, S.; Ooyama, H.; Hojo, K.; Tasaki, H.; Hanazono, T.; Sato, T.; Metoki, H.; Totsuka, M.; Oosumi, N.

    1987-02-01

    Using a microcomputer, the locus and extent of the lesions demonstrated by computed tomography in 44 aphasic patients with various types of aphasia were superimposed onto standardized matrices composed of 10 slices with 3000 points (50 by 60). The relationships between the foci of the lesions and the types of aphasia were investigated on slices 3, 4, 5, and 6 using quantification theory type 3 (pattern analysis). Some regularities were observed on slices 3, 4, 5, and 6. The group of patients with Broca's aphasia and the group with Wernicke's aphasia were generally separated on the 1st and 2nd components of quantification theory type 3. On the other hand, the group with global aphasia lay between the groups with Broca's and Wernicke's aphasia. The group of patients with amnestic aphasia had no specific findings, and the group with conduction aphasia lay near those with Wernicke's aphasia. These results serve as a basis for applying quantification theory type 2 (discriminant analysis) and quantification theory type 1 (regression analysis).

  12. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities seeking to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be quantified spectrally using a spectrometer and to identify the optimal wavebands for quantifying cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and of the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and stepwise multivariate regression were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis in the field.

  13. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for the quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations, using a validated reverse-phase HPTLC method. Materials and Methods: The RP-HPTLC method was carried out using glass plates coated with RP-18 ...

  14. Noninvasive Quantification of Pancreatic Fat in Humans

    OpenAIRE

    Lingvay, Ildiko; Esser, Victoria; Legendre, Jaime L.; Price, Angela L.; Wertz, Kristen M.; Adams-Huet, Beverley; Zhang, Song; Unger, Roger H.; Szczepaniak, Lidia S.

    2009-01-01

    Objective: To validate magnetic resonance spectroscopy (MRS) as a tool for non-invasive quantification of pancreatic triglyceride (TG) content and to measure the pancreatic TG content in a diverse human population with a wide range of body mass index (BMI) and glucose control.

  15. Cues, quantification, and agreement in language comprehension.

    Science.gov (United States)

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  16. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated...

  17. Characterization and quantification of preferential flow in fractured rock systems, using resistivity tomography

    CSIR Research Space (South Africa)

    May, F

    2010-11-01

    Co-authors: N. Jovanovic (CSIR) and A. Rozanov (University of Stellenbosch). The study characterizes and quantifies preferential flow in fractured rock systems using resistivity tomography, distinguishing slow and fast flowing pathways. The Materials and Methods section tabulates the date, time and weather conditions for each resistivity tomography survey (e.g., survey KB001 on 8/27/2010, 12H00 to 13H40, 0.0 mm precipitation, sunny; survey KB002 ...).

  18. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    Science.gov (United States)

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are recognized as a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber numbers have been found to be underestimated with this approach. We propose modifications of these methods that allow the analysis of microplastics in bottom sediments, including small fibers. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported in neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for the imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical changes associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT independently of density. We therefore propose a new, non-density-based measure of diaphragm curvature for robust emphysema quantification. To evaluate the new method, 24 whole-lung scans were analyzed, using the ratios of lung height to diaphragm height and of diaphragm width to diaphragm height as curvature estimates, with the emphysema index computed for comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to correlate more highly, up to r=0.57, with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
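    Since the curvature proxies are simple ratios, the analysis step is easy to sketch; the following is an illustration with synthetic per-scan measurements (the variable names and value ranges are assumptions, not the authors' definitions).

        import numpy as np
        from scipy.stats import pearsonr

        rng = np.random.default_rng(0)
        n = 24                                        # cohort size used in the paper
        diaphragm_height = rng.uniform(20, 45, n)     # mm, synthetic
        diaphragm_width  = rng.uniform(180, 260, n)   # mm, synthetic
        lung_height      = rng.uniform(200, 300, n)   # mm, synthetic
        dlco_pct         = rng.uniform(40, 110, n)    # synthetic DLCO%

        height_ratio = lung_height / diaphragm_height     # flatter diaphragm -> larger
        width_ratio  = diaphragm_width / diaphragm_height

        for name, proxy in [("lung/diaphragm height", height_ratio),
                            ("diaphragm width/height", width_ratio)]:
            r, p = pearsonr(proxy, dlco_pct)
            print(f"{name}: r={r:.2f} (p={p:.3f})")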

  20. New technique using [125I]labeled rose bengal for the quantification in blood samples of pipecuronium bromide, a muscle relaxant drug

    International Nuclear Information System (INIS)

    Schopfer, C.; Benakis, A.; Pittet, J.-F.; Tassonyi, E.

    1991-01-01

    A new technique involving the use of [125I]-labeled rose bengal for the quantification of pipecuronium bromide (a muscle relaxant drug) is presented. This technique, which is based on the ability of rose bengal to react with pipecuronium and then form a complex which can be extracted into an organic solvent, involves two steps: the purification and labeling of rose bengal with 125I, and the quantification of pipecuronium. The specific activity of the compound (106 μCi/mg) allows for the quantification of pipecuronium in biological samples at concentrations as low as 5 ng/ml. (author)

  1. A method for the 3-D quantification of bridging ligaments during crack propagation

    International Nuclear Information System (INIS)

    Babout, L.; Janaszewski, M.; Marrow, T.J.; Withers, P.J.

    2011-01-01

    This letter shows how a hole-closing algorithm can be used to identify and quantify crack-bridging ligaments from a sequence of X-ray tomography images of intergranular stress corrosion cracking. This allows automatic quantification of the evolution of bridging ligaments through the crack propagation sequence providing fracture mechanics insight previously unobtainable from fractography. The method may also be applied to other three-dimensional materials science problems, such as closing walls in foams.

  2. Ex vivo activity quantification in micrometastases at the cellular scale using the α-camera technique

    DEFF Research Database (Denmark)

    Chouin, Nicolas; Lindegren, Sture; Frost, Sofia H L

    2013-01-01

    Targeted α-therapy (TAT) appears to be an ideal therapeutic technique for eliminating malignant circulating, minimal residual, or micrometastatic cells. These types of malignancies are typically infraclinical, complicating the evaluation of potential treatments. This study presents a method of ex vivo activity quantification with an α-camera device, allowing measurement of the activity taken up by tumor cells in biologic structures a few tens of microns ...

  3. FRANX. Application for analysis and quantification of the APS fire; FRANK. Aplicacion para el analisis y cuantificacion de los APS de incendios

    Energy Technology Data Exchange (ETDEWEB)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-07-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the quantification and updating of APS Fire models (the application also covers floods and earthquakes). With this application, fire scenarios are quantified at the plant, integrating the tasks performed during the fire APS. This paper describes the main features of the program that allow the quantification of an APS Fire. (Author)

  4. A Subaru galaxy redshift survey: WFMOS survey

    International Nuclear Information System (INIS)

    Takada, M

    2008-01-01

    A planned galaxy redshift survey with the Subaru 8.2m telescope, the WFMOS survey, offers a unique opportunity for probing detailed properties of large-scale structure formation in the expanding universe by measuring the clustering strength of the galaxy distribution as a function of distance scale and redshift. In particular, the precise measurement of the galaxy power spectrum, combined with the cosmic microwave background experiments, allows us to obtain stringent constraints on, or even determine, the absolute mass scales of the Big-Bang relic neutrinos, as the neutrinos imprint characteristic scale- and redshift-dependent modifications onto the galaxy power spectrum shape. Here we describe the basic concept of how the galaxy clustering measurement can be used to explore the neutrino masses, with particular emphasis on the advantages of the WFMOS survey over the existing low-redshift surveys such as SDSS.

  5. Quantification of Drosophila Grooming Behavior.

    Science.gov (United States)

    Barradale, Francesca; Sinha, Kairav; Lebestky, Tim

    2017-07-19

    Drosophila grooming behavior is a complex multi-step locomotor program that requires coordinated movement of both forelegs and hindlegs. Here we present a grooming assay protocol and novel chamber design that is cost-efficient and scalable for either small or large-scale studies of Drosophila grooming. Flies are dusted all over their body with Brilliant Yellow dye and given time to remove the dye from their bodies within the chamber. Flies are then deposited in a set volume of ethanol to solubilize the dye. The relative spectral absorbance of dye-ethanol samples for groomed versus ungroomed animals is measured and recorded. The protocol yields quantitative data of dye accumulation for individual flies, which can be easily averaged and compared across samples. This allows experimental designs to easily evaluate grooming ability for mutant animal studies or circuit manipulations. This efficient procedure is both versatile and scalable. We show the workflow of the protocol and comparative data between WT animals and mutant animals for the Drosophila type I Dopamine Receptor (DopR).
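    One plausible way to turn the absorbance readings into a per-fly score, not specified in the abstract, is to normalize against dusted-but-ungroomed controls; a hypothetical sketch:

        import numpy as np

        def grooming_index(a_groomed, a_ungroomed):
            # Fraction of dye removed relative to ungroomed controls
            # (an assumed normalization, for illustration only)
            return 1.0 - np.asarray(a_groomed) / np.mean(a_ungroomed)

        wt  = grooming_index([0.21, 0.18, 0.25], [0.92, 0.88, 0.95])
        mut = grooming_index([0.55, 0.61, 0.47], [0.92, 0.88, 0.95])
        print(wt.mean(), mut.mean())   # higher index = more dye removed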

  6. Optical coherence tomography assessment and quantification of intracoronary thrombus: Status and perspectives

    International Nuclear Information System (INIS)

    Porto, Italo; Mattesini, Alessio; Valente, Serafina; Prati, Francesco; Crea, Filippo; Bolognese, Leonardo

    2015-01-01

    Coronary angiography is the “gold standard” imaging technique in interventional cardiology and it is still widely used to guide interventions. A major drawback of this technique, however, is that it is inaccurate in the evaluation and quantification of intracoronary thrombus burden, a critical prognosticator and predictor of intraprocedural complications in acute coronary syndromes. The introduction of optical coherence tomography (OCT) holds the promise of overcoming this important limitation, as near-infrared light is uniquely sensitive to hemoglobin, the pigment of red blood cells trapped in the thrombus. This narrative review will focus on the use of OCT for the assessment, evaluation and quantification of intracoronary thrombosis. - Highlights: • Thrombotic burden in acute coronary syndromes is not adequately evaluated by standard coronary angiography, whereas Optical Coherence Tomography is exquisitely sensitive to the hemoglobin contained in red blood cells and can be used to precisely quantify thrombus. • Both research and clinical applications have been developed using the OCT-based evaluation of thrombus. In particular, whereas precise quantification scores are useful for comparing antithrombotic therapies in randomized trials, both pharmacological and mechanical, the most important practical applications for OCT-based assessment of thrombus are the individuation of culprit lesions in the context of diffuse atheromata in acute coronary syndromes, and the so-called “delayed stenting” strategies. • Improvements in 3D rendering techniques are on the verge of revolutionizing OCT-based thrombus assessment, allowing extremely precise quantification of the thrombotic burden.

  7. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    Science.gov (United States)

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but share the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
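    As a minimal generic illustration of the quantification idea (a textbook EM for abundance estimation from ambiguous read assignments, not Strawberry's actual latent class model; effective lengths and bias correction are omitted):

        import numpy as np

        def em_abundance(compat, n_transcripts, n_iter=100):
            # compat[r] = set of transcript indices that read r aligns to
            theta = np.full(n_transcripts, 1.0 / n_transcripts)
            for _ in range(n_iter):
                counts = np.zeros(n_transcripts)
                for tset in compat:                  # E-step: soft-assign reads
                    idx = list(tset)
                    w = theta[idx] / theta[idx].sum()
                    counts[idx] += w
                theta = counts / counts.sum()        # M-step: renormalize
            return theta

        # Toy example: 3 transcripts, 5 reads with ambiguous alignments
        reads = [{0}, {0, 1}, {1, 2}, {2}, {0, 2}]
        print(em_abundance(reads, 3))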

  8. Theoretical Study of Penalized-Likelihood Image Reconstruction for Region of Interest Quantification

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2006-01-01

    Region of interest (ROI) quantification is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Statistical image reconstruction methods based on the penalized maximum-likelihood (PML) or maximum a posteriori principle have been developed for emission tomography to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the regularization parameter in PML reconstruction controls the resolution and noise tradeoff and, hence, affects ROI quantification. In this paper, we theoretically analyze the performance of ROI quantification in PML reconstructions. Building on previous work, we derive simplified theoretical expressions for the bias, variance, and ensemble mean-squared-error (EMSE) of the estimated total activity in an ROI that is surrounded by a uniform background. When the mean and covariance matrix of the activity inside the ROI are known, the theoretical expressions are readily computable and allow for fast evaluation of image quality for ROI quantification with different regularization parameters. The optimum regularization parameter can then be selected to minimize the EMSE. Computer simulations are conducted for small ROIs with variable uniform uptake. The results show that the theoretical predictions match the Monte Carlo results reasonably well.
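    For orientation, the estimator discussed here has the generic penalized-likelihood form (the paper's exact notation may differ):

        \hat{x} = \arg\max_x \; L(y \mid x) - \beta R(x),

    where L(y|x) is the Poisson log-likelihood of the emission data y, R is a roughness penalty, and \beta is the regularization parameter. The quality metric for an ROI activity estimate \hat{a} decomposes as

        \mathrm{EMSE}(\hat{a}) = \mathrm{bias}^2(\hat{a}) + \mathrm{var}(\hat{a}),

    so sweeping \beta and minimizing the EMSE yields the optimum regularization.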

  9. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without needing to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as a gold standard was obtained (R^2>0.8, and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    Science.gov (United States)

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive, with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios (expressed as percentages) derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, making it a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  11. Absolute and direct microRNA quantification using DNA-gold nanoparticle probes.

    Science.gov (United States)

    Degliangeli, Federica; Kshirsagar, Prakash; Brunetti, Virgilio; Pompa, Pier Paolo; Fiammengo, Roberto

    2014-02-12

    DNA-gold nanoparticle probes are implemented in a simple strategy for direct microRNA (miRNA) quantification. Fluorescently labeled DNA-probe strands are immobilized on PEGylated gold nanoparticles (AuNPs). In the presence of target miRNA, DNA-RNA heteroduplexes are formed and become substrate for the endonuclease DSN (duplex-specific nuclease). Enzymatic hydrolysis of the DNA strands yields a fluorescence signal due to diffusion of the fluorophores away from the gold surface. We show that the molecular design of our DNA-AuNP probes, with the DNA strands immobilized on top of the PEG-based passivation layer, results in nearly unaltered enzymatic activity toward immobilized heteroduplexes compared to substrates free in solution. The assay, developed in a real-time format, allows absolute quantification of as little as 0.2 fmol of miR-203. We also show the application of the assay for direct quantification of cancer-related miR-203 and miR-21 in samples of extracted total RNA from cell cultures. The possibility of direct and absolute quantification may significantly advance the use of microRNAs as biomarkers in the clinical praxis.

  12. Tool for objective quantification of pulmonary sequelae in monitoring of patients with tuberculosis

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Alvarez, Matheus; Pina, Diana R. de; Bacchim Neto, Fernando A.; Pereira, Paulo C.M.; Ribeiro, Sergio M.; Miranda, Jose Ricardo de A.

    2014-01-01

    Tuberculosis (TB), caused by Mycobacterium tuberculosis, is an ancient infectious disease that remains a global health problem. Chest radiography is the method commonly employed in assessing the evolution of TB. However, lung damage quantification methods are usually performed on computerized tomography (CT). This objective quantification is important in the radiological monitoring of the patient, assessing the progression and treatment of TB. Precise quantification by CT is not feasible, however, because of the number of examinations required, given the high dose to the patient and the high cost to the institution. The purpose of this work is to develop a tool to quantify pulmonary sequelae caused by TB through chest X-rays. To this end, a computational algorithm was developed that creates a three-dimensional representation of the lungs, with regions of dilated sequelae inside. The pulmonary sequelae of the same patients were also quantified through CT scans performed at nearby dates, minimizing differences in disease progression. The measurements from the two methods were compared; the results suggest the effectiveness and applicability of the developed tool, allowing radiological monitoring of the patient during treatment at lower doses.

  13. Optical coherence tomography assessment and quantification of intracoronary thrombus: Status and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Porto, Italo, E-mail: italo.porto@gmail.com [Interventional Cardiology Unit, San Donato Hospital, Arezzo (Italy); Mattesini, Alessio; Valente, Serafina [Interventional Cardiology Unit, Careggi Hospital, Florence (Italy); Prati, Francesco [Interventional Cardiology San Giovanni Hospital, Rome (Italy); CLI foundation (Italy); Crea, Filippo [Department of Cardiovascular Sciences, Catholic University of the Sacred Heart, Rome (Italy); Bolognese, Leonardo [Interventional Cardiology Unit, San Donato Hospital, Arezzo (Italy)

    2015-04-15

    Coronary angiography is the “gold standard” imaging technique in interventional cardiology and it is still widely used to guide interventions. A major drawback of this technique, however, is that it is inaccurate in the evaluation and quantification of intracoronary thrombus burden, a critical prognosticator and predictor of intraprocedural complications in acute coronary syndromes. The introduction of optical coherence tomography (OCT) holds the promise of overcoming this important limitation, as near-infrared light is uniquely sensitive to hemoglobin, the pigment of red blood cells trapped in the thrombus. This narrative review will focus on the use of OCT for the assessment, evaluation and quantification of intracoronary thrombosis. - Highlights: • Thrombotic burden in acute coronary syndromes is not adequately evaluated by standard coronary angiography, whereas Optical Coherence Tomography is exquisitely sensitive to the hemoglobin contained in red blood cells and can be used to precisely quantify thrombus. • Both research and clinical applications have been developed using the OCT-based evaluation of thrombus. In particular, whereas precise quantification scores are useful for comparing antithrombotic therapies in randomized trials, both pharmacological and mechanical, the most important practical applications for OCT-based assessment of thrombus are the individuation of culprit lesions in the context of diffuse atheromata in acute coronary syndromes, and the so-called “delayed stenting” strategies. • Improvements in 3D rendering techniques are on the verge of revolutionizing OCT-based thrombus assessment, allowing extremely precise quantification of the thrombotic burden.

  14. 40 CFR 60.4160 - Submission of Hg allowance transfers.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Submission of Hg allowance transfers... Times for Coal-Fired Electric Steam Generating Units Hg Allowance Transfers § 60.4160 Submission of Hg allowance transfers. An Hg authorized account representative seeking recordation of a Hg allowance transfer...

  15. 40 CFR 60.4142 - Hg allowance allocations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Hg allowance allocations. 60.4142... Coal-Fired Electric Steam Generating Units Hg Allowance Allocations § 60.4142 Hg allowance allocations. (a)(1) The baseline heat input (in MMBtu) used with respect to Hg allowance allocations under...

  16. 40 CFR 73.27 - Special allowance reserve.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Special allowance reserve. 73.27 Section 73.27 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) SULFUR DIOXIDE ALLOWANCE SYSTEM Allowance Allocations § 73.27 Special allowance reserve. (a...

  17. 40 CFR 73.30 - Allowance tracking system accounts.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Allowance tracking system accounts. 73.30 Section 73.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) SULFUR DIOXIDE ALLOWANCE SYSTEM Allowance Tracking System § 73.30 Allowance tracking system...

  18. Coal sulfur-premium models for SO2 allowance valuation

    International Nuclear Information System (INIS)

    Henry, J.B. II; Radulski, D.R.; Ellingson, E.G.; Engels, J.P.

    1995-01-01

    Clean Air Capital Markets, an investment bank structuring SO2 allowance transactions, has designed two allowance value models. The first forecasts an equilibrium allowance value based on coal supply and demand. The second estimates the sulfur premium of all reported coal deliveries to utilities. Both models demonstrate that the fundamental allowance value is approximately double current spot market prices for small volumes of off-system allowances.

  19. Ratemaking and accounting for allowances and compliance costs

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The regulatory treatment of compliance costs and allowances will significantly affect both the utility's CAAA compliance decisions and the cost of compliance. Sections in this chapter include ratemaking treatment of allowances, utility buy-ins, the market test of compliance costs and utility incentive, FERC account classification, measuring the value of allowances, inventory methods for allowances, expense recognition of allowances, regulatory-created assets and liabilities, and application of the FERC proposal. 8 refs., 1 tab

  20. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
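    As context for the model-error theme, a minimal stochastic ensemble Kalman filter update, a common building block in such studies, is sketched below. This is a generic textbook illustration, not the author's method; the toy state, observation operator and noise levels are assumptions.

        import numpy as np

        def enkf_update(ensemble, H, y, obs_cov, rng):
            # Stochastic EnKF analysis step with perturbed observations
            X = ensemble                               # (n_members, n_state)
            A = X - X.mean(axis=0)                     # ensemble anomalies
            n = X.shape[0]
            P = A.T @ A / (n - 1)                      # sample forecast covariance
            S = H @ P @ H.T + obs_cov
            K = P @ H.T @ np.linalg.solve(S, np.eye(len(y)))   # Kalman gain
            # Perturbing the observations yields the correct analysis spread
            Y = y + rng.multivariate_normal(np.zeros(len(y)), obs_cov, size=n)
            return X + (Y - X @ H.T) @ K.T

        rng = np.random.default_rng(1)
        ens = rng.normal(0.0, 1.0, size=(50, 3))       # toy 3-variable state
        H = np.array([[1.0, 0.0, 0.0]])                # observe first variable only
        analysis = enkf_update(ens, H, np.array([0.5]), np.array([[0.1]]), rng)
        print(analysis.mean(axis=0))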

  1. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation, that is, an ensemble of independent MD simulations, which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.

  2. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  3. Level 2 probabilistic event analyses and quantification

    International Nuclear Information System (INIS)

    Boneham, P.

    2003-01-01

    In this paper an example of quantification of a severe accident phenomenological event is given. The analysis performed to assess the probability that the debris released from the reactor vessel was in a coolable configuration in the lower drywell is presented, together with the assessment of the type of core/concrete attack that would occur. The evaluation of ex-vessel debris coolability by an event in the Simplified Boiling Water Reactor (SBWR) Containment Event Tree (CET), and a detailed Decomposition Event Tree (DET) developed to aid in the quantification of this CET event, are considered. The headings in the DET were selected to represent plant physical states (e.g., reactor vessel pressure at the time of vessel failure) and the uncertainties associated with the occurrence of critical physical phenomena (e.g., debris configuration in the lower drywell) considered important to assessing whether the debris was coolable or not coolable ex-vessel.

  4. SPECT quantification of regional radionuclide distributions

    International Nuclear Information System (INIS)

    Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1986-01-01

    SPECT quantification of regional radionuclide activities within the human body is affected by several physical and instrumental factors including attenuation of photons within the patient, Compton scattered events, the system's finite spatial resolution and object size, finite number of detected events, partial volume effects, the radiopharmaceutical biokinetics, and patient and/or organ motion. Furthermore, other instrumentation factors such as calibration of the center-of-rotation, sampling, and detector nonuniformities will affect the SPECT measurement process. These factors are described, together with examples of compensation methods that are currently available for improving SPECT quantification. SPECT offers the potential to improve in vivo estimates of absorbed dose, provided the acquisition, reconstruction, and compensation procedures are adequately implemented and utilized. 53 references, 2 figures

  5. Uncertainty quantification for hyperbolic and kinetic equations

    CERN Document Server

    Pareschi, Lorenzo

    2017-01-01

    This book explores recent advances in uncertainty quantification for hyperbolic, kinetic, and related problems. The contributions address a range of different aspects, including: polynomial chaos expansions, perturbation methods, multi-level Monte Carlo methods, importance sampling, and moment methods. The interest in these topics is rapidly growing, as their applications have now expanded to many areas in engineering, physics, biology and the social sciences. Accordingly, the book provides the scientific community with a topical overview of the latest research efforts.

  6. Quantification of heterogeneity observed in medical images

    OpenAIRE

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    Background There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging mod...

  7. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    Science.gov (United States)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.

  8. Artifacts Quantification of Metal Implants in MRI

    Science.gov (United States)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, as a percentage of image area, by an automated cross entropy thresholding method. The proposed quantification method was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
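    A minimal sketch of the described pipeline, under the assumption that "cross entropy thresholding" refers to minimum cross-entropy (Li) thresholding as implemented in scikit-image; the artifact extent is then the percentage of image area above threshold. The synthetic phantom is illustrative only.

        import numpy as np
        from skimage.filters import threshold_li

        def artifact_extent(image):
            # Gradient magnitude captures abrupt signal alterations
            gy, gx = np.gradient(image.astype(float))
            grad = np.hypot(gx, gy)
            mask = grad > threshold_li(grad)       # automated threshold
            return 100.0 * mask.mean()             # percent of image area

        phantom = np.random.default_rng(2).normal(100, 5, (256, 256))
        phantom[100:140, 100:140] = 0              # synthetic signal void
        print(f"artifact extent: {artifact_extent(phantom):.1f}%")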

  9. Quantification of rutile in anatase by X-ray diffraction

    International Nuclear Information System (INIS)

    Chavez R, A.

    2001-01-01

    Nowadays, the discovery of new and better materials required in all areas of industry has led researchers into the small but vast world of crystalline matter. Crystalline materials have markedly directional properties, and quantitative analysis of such materials is not an easy task. The main objective of this work is the investigation of a real problem, its solution, and the refinement of a technique combining the theoretical and experimental principles that allow the quantification of crystalline phases. Chapter 1 reviews the study of the crystalline state over the last century by means of the X-ray diffraction technique. Chapter 2 covers the nature and production of X-rays, and chapter 3 expounds the principles of the diffraction technique, which applies when the Bragg law is satisfied, describing the powder diffraction method and its applications. Chapter 4 explains how the intensities of the diffracted beams are determined by the positions of the atoms within the unit cell of the crystal. The properties of the crystalline samples of anatase and rutile are described in chapter 5. The results of the analysis are processed by means of the auxiliary software Diffrac AT, Axum and Peakfit, as well as the TAFOR and CUANTI software; this part is described in more detail in chapters 6 and 7, which explain step by step the function of each program up to the quantification of crystalline phases, the objective of this work. Finally, chapter 8 presents the analysis of results and conclusions. The contribution of this work is aimed at institutions of limited resources, which can tackle the characterization of materials in this way. (Author)
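    Although the abstract does not give the working equation, rutile-in-anatase quantification from powder XRD is commonly done with the Spurr-Myers relation between the rutile (110) and anatase (101) peak intensities; a sketch follows (the thesis may use a different calibration).

        def rutile_weight_fraction(i_anatase_101, i_rutile_110):
            # Spurr-Myers (1957): W_R = 1 / (1 + 0.8 * I_A / I_R)
            return 1.0 / (1.0 + 0.8 * i_anatase_101 / i_rutile_110)

        print(rutile_weight_fraction(1500.0, 500.0))   # ~0.29 rutile by weight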

  10. Developmental validation of the Quantifiler(®) HP and Trio Kits for human DNA quantification in forensic samples.

    Science.gov (United States)

    Holt, Allison; Wootton, Sharon Chao; Mulero, Julio J; Brzoska, Pius M; Langit, Emanuel; Green, Robert L

    2016-03-01

    The quantification of human genomic DNA is a necessary first step in the DNA casework sample analysis workflow. DNA quantification determines optimal sample input amounts for subsequent STR (short tandem repeat) genotyping procedures, as well as being a useful screening tool to identify samples most likely to provide probative genotypic evidence. To better mesh with the capabilities of newest-generation STR analysis assays, the Quantifiler(®) HP and Quantifiler(®) Trio DNA Quantification Kits were designed for greater detection sensitivity and more robust performance with samples that contain PCR inhibitors or degraded DNA. The new DNA quantification kits use multiplex TaqMan(®) assay-based fluorescent probe technology to simultaneously quantify up to three human genomic targets, allowing samples to be assessed for total human DNA, male contributor (i.e., Y-chromosome) DNA, as well as a determination of DNA degradation state. The Quantifiler HP and Trio Kits use multiple-copy loci to allow for significantly improved sensitivity compared to earlier-generation kits that employ single-copy target loci. The kits' improved performance provides better predictive ability for results with downstream, newest-generation STR assays, and their shortened time-to-result allows more efficient integration into the forensic casework analysis workflow. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Inter-laboratory assessment of different digital PCR platforms for quantification of human cytomegalovirus DNA.

    Science.gov (United States)

    Pavšič, Jernej; Devonshire, Alison; Blejec, Andrej; Foy, Carole A; Van Heuverswyn, Fran; Jones, Gerwyn M; Schimmel, Heinz; Žel, Jana; Huggett, Jim F; Redshaw, Nicholas; Karczmarczyk, Maria; Mozioğlu, Erkan; Akyürek, Sema; Akgöz, Müslüm; Milavec, Mojca

    2017-04-01

    Quantitative PCR (qPCR) is an important tool in pathogen detection. However, the use of different qPCR components, calibration materials and DNA extraction methods reduces comparability between laboratories, which can result in false diagnosis and discrepancies in patient care. The wider establishment of a metrological framework for nucleic acid tests could improve the degree of standardisation of pathogen detection and the quantification methods applied in the clinical context. To achieve this, accurate methods need to be developed and implemented as reference measurement procedures, and to facilitate characterisation of suitable certified reference materials. Digital PCR (dPCR) has already been used for pathogen quantification by analysing nucleic acids. Although dPCR has the potential to provide robust and accurate quantification of nucleic acids, further assessment of its actual performance characteristics is needed before it can be implemented in a metrological framework, and to allow adequate estimation of measurement uncertainties. Here, four laboratories demonstrated reproducibility (expanded measurement uncertainties below 15%) of dPCR for quantification of DNA from human cytomegalovirus, with no calibration to a common reference material. Using whole-virus material and extracted DNA, an intermediate precision (coefficients of variation below 25%) between three consecutive experiments was noted. Furthermore, discrepancies in estimated mean DNA copy number concentrations between laboratories were less than twofold, with DNA extraction as the main source of variability. These data demonstrate that dPCR offers a repeatable and reproducible method for quantification of viral DNA, and due to its satisfactory performance it should be considered as a candidate for reference methods for implementation in a metrological framework.

  12. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Science.gov (United States)

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
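    The LRE idea can be sketched briefly (a simplified, hypothetical illustration, not the LRE Analyzer's code): within the central region of the amplification profile, the cycle efficiency E_C = F_C/F_{C-1} - 1 declines approximately linearly with fluorescence F_C, so a linear regression yields the maximal efficiency E_max as the intercept, from which target quantity can be derived without a standard curve.

        import numpy as np

        def lre_emax(fluorescence):
            # Regress per-cycle efficiency against fluorescence (central region)
            F = np.asarray(fluorescence, dtype=float)
            E = F[1:] / F[:-1] - 1.0
            slope, intercept = np.polyfit(F[1:], E, 1)
            return intercept, slope        # E_max, deltaE

        # Synthetic sigmoidal amplification profile for illustration
        c = np.arange(40)
        F = 100.0 / (1 + np.exp(-(c - 22) / 2.2))
        emax, dE = lre_emax(F[15:30])      # central region of the profile
        print(f"E_max ~ {emax:.2f}")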

  13. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    Directory of Open Access Journals (Sweden)

    Robert G Rutledge

    Full Text Available BACKGROUND: Linear regression of efficiency (LRE introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. FINDINGS: Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. CONCLUSIONS: The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.

  14. 42 CFR 50.504 - Allowable cost of drugs.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Allowable cost of drugs. 50.504 Section 50.504... APPLICABILITY Maximum Allowable Cost for Drugs § 50.504 Allowable cost of drugs. (a) The maximum amount which may be expended from program funds for the acquisition of any drug shall be the lowest of (1) The...

  15. 46 CFR 54.25-5 - Corrosion allowance.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Corrosion allowance. 54.25-5 Section 54.25-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Construction With Carbon, Alloy, and Heat Treated Steels § 54.25-5 Corrosion allowance. The corrosion allowance...

  16. 48 CFR 2152.231-70 - Accounting and allowable cost.

    Science.gov (United States)

    2010-10-01

    ... allowable cost. As prescribed in 2131.270, insert the following clause: Accounting and Allowable Cost (OCT... cost; (ii) Incurred with proper justification and accounting support; (iii) Determined in accordance... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Accounting and allowable...

  17. 45 CFR 1801.43 - Allowance for books.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Allowance for books. 1801.43 Section 1801.43... HARRY S. TRUMAN SCHOLARSHIP PROGRAM Payments to Finalists and Scholars § 1801.43 Allowance for books. The cost allowance for a Scholar's books is $1000 per year, or such higher amount published on the...

  18. 40 CFR 60.4153 - Recordation of Hg allowance allocations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Recordation of Hg allowance allocations... Times for Coal-Fired Electric Steam Generating Units Hg Allowance Tracking System § 60.4153 Recordation of Hg allowance allocations. (a) By December 1, 2006, the Administrator will record in the Hg Budget...

  19. 17 CFR 190.07 - Calculation of allowed net equity.

    Science.gov (United States)

    2010-04-01

    ...; and (iii) The current realizable market value, determined as of the close of the market on the last... 17 Commodity and Securities Exchanges 1 2010-04-01 2010-04-01 false Calculation of allowed net... BANKRUPTCY § 190.07 Calculation of allowed net equity. Allowed net equity shall be computed as follows: (a...

  20. 32 CFR 842.35 - Depreciation and maximum allowances.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Depreciation and maximum allowances. 842.35... LITIGATION ADMINISTRATIVE CLAIMS Personnel Claims (31 U.S.C. 3701, 3721) § 842.35 Depreciation and maximum allowances. The military services have jointly established the “Allowance List-Depreciation Guide” to...

  1. 50 CFR 665.127 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Fisheries § 665.127 Allowable gear and gear restrictions. (a) American Samoa coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...

  2. 50 CFR 665.627 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Island Area Fisheries § 665.627 Allowable gear and gear restrictions. (a) Coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...

  3. 50 CFR 665.227 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Fisheries § 665.227 Allowable gear and gear restrictions. (a) Hawaii coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...

  4. 50 CFR 665.427 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Archipelago Fisheries § 665.427 Allowable gear and gear restrictions. (a) Mariana coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp...

  5. Quantification of structural uncertainties in multi-scale models; case study of the Lublin Basin, Poland

    Science.gov (United States)

    Małolepszy, Zbigniew; Szynkaruk, Ewa

    2015-04-01

    The multiscale static modeling of the regional structure of the Lublin Basin is being carried out at the Polish Geological Institute, in accordance with the principles of integrated 3D geological modelling. The model is based on all available geospatial data from Polish digital databases and analogue archives. The mapped regional structure covers an area of 260x80 km between Warsaw and the Polish-Ukrainian border, along the NW-SE-trending margin of the East European Craton. Within the basin, the Paleozoic beds, with coal-bearing Carboniferous and older formations containing hydrocarbons and unconventional prospects, are covered unconformably by Permo-Mesozoic and younger rocks. The vertical extent of the regional model is set from the topographic surface to 6000 m ssl and at the bottom includes some Proterozoic crystalline formations of the craton. The project focuses on the internal consistency of the models built at different scales, from basin (small) scale to field (large) scale. The models, nested in a common structural framework, are being constructed with regional geological knowledge, ensuring a smooth transition in 3D model resolution and amount of geological detail. A major challenge of the multiscale approach to subsurface modelling is the assessment and consistent quantification of the various types of geological uncertainties tied to the various scale sub-models. The decreasing amount of information with depth, particularly the very limited data collected below exploration targets, as well as the accuracy and quality of the data, have the most critical impact on the modelled structure. In the deeper levels of the Lublin Basin model, seismic interpretation of 2D surveys is sparsely tied to well data. Therefore time-to-depth conversion carries one of the major uncertainties in the modeling of structures, especially below 3000 m ssl. Furthermore, as all models at different scales are based on the same dataset, we must deal with different levels of generalization of geological structures. The

  6. Factors affecting the carbon allowance market in the US

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Seok; Koo, Won W. [Center for Agricultural Policy and Trade Studies, Department of Agribusiness and Applied Economics, North Dakota State University, Dept 7610, P.O. Box 6050, Fargo, ND 58103-6050 (United States)

    2010-04-15

    The US carbon allowance market has different characteristics and a different price determination process from the EU ETS market, since emitting installations participate voluntarily in the emission trading scheme. This paper examines factors affecting the US carbon allowance market. An autoregressive distributed lag model is used to examine the short- and long-run relationships between the US carbon allowance market and its determinant factors. In the long-run, the price of coal is the main factor in the determination of carbon allowance trading. In the short-run, on the other hand, changes in crude oil and natural gas prices as well as the coal price have significant effects on the carbon allowance market. (author)

  7. Factors affecting the carbon allowance market in the US

    International Nuclear Information System (INIS)

    Kim, Hyun Seok; Koo, Won W.

    2010-01-01

    The US carbon allowance market has different characteristics and a different price determination process from the EU ETS market, since emitting installations participate voluntarily in the emission trading scheme. This paper examines factors affecting the US carbon allowance market. An autoregressive distributed lag model is used to examine the short- and long-run relationships between the US carbon allowance market and its determinant factors. In the long-run, the price of coal is the main factor in the determination of carbon allowance trading. In the short-run, on the other hand, changes in crude oil and natural gas prices as well as the coal price have significant effects on the carbon allowance market.

  8. Quantification of viral DNA during HIV-1 infection: A review of relevant clinical uses and laboratory methods.

    Science.gov (United States)

    Alidjinou, E K; Bocket, L; Hober, D

    2015-02-01

    Effective antiretroviral therapy usually leads to undetectable HIV-1 RNA in the plasma. However, the virus persists in some cells of infected patients as various DNA forms, both integrated and unintegrated. This reservoir represents the greatest challenge to the complete cure of HIV-1 infection, and its characteristics strongly impact the course of the disease. The quantification of HIV-1 DNA in blood samples currently constitutes the most practical approach to measuring this residual infection. Real-time quantitative PCR (qPCR) is the most common method used for HIV-DNA quantification, and many strategies have been developed to measure the different forms of HIV-1 DNA. In the literature, several "in-house" PCR methods have been used, and standardization is needed to obtain comparable results. In addition, the precision of qPCR at low copy numbers is limited by background noise. Among new assays in development, digital PCR has been shown to allow accurate quantification of HIV-1 DNA. Total HIV-1 DNA is most commonly measured in clinical routine. The absolute quantification of proviruses and unintegrated forms is more often used for research purposes. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  9. "Suntelligence" Survey

    Science.gov (United States)

    An interactive sun-smart survey from the American Academy of Dermatology; respondents answer a series of questions and can then view a ranking of major cities' "suntelligence" based on residents' responses.

  10. Absolute quantification of Bovine Viral Diarrhea Virus (BVDV) RNA by the digital PCR technique

    Science.gov (United States)

    Flatschart, R. B.; Almeida, D. O.; Heinemann, M. B.; Medeiros, M. N.; Granjeiro, J. M.; Folgueras-Flatschart, A. V.

    2015-01-01

    The quality control of cell lines used in research and industry is critical to ensure confidence in experimental results and to guarantee the safety of biopharmaceuticals to consumers. BVDV is a common adventitious agent in many cell lines. We preliminarily evaluated the use of Droplet Digital PCR (ddPCR) for the detection and enumeration of genome copies of BVDV in cell culture and in FBS. The application of a commercial Real-Time PCR kit with the ddPCR technique was successful on different matrices. The technique allowed the absolute quantification of the genome without the use of calibration standards, suggesting its promising application in the development of reference materials for the quantification of nucleic acids.
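    Calibration-free quantification in dPCR rests on Poisson statistics: with p the fraction of positive droplets and v the droplet volume, the mean copies per droplet is lambda = -ln(1 - p) and the concentration is lambda/v. A small sketch, where the 0.85 nL droplet volume is a typical value for one commercial system, assumed for illustration and not taken from this abstract:

        import math

        def dpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
            # Copies per microliter from droplet counts via Poisson correction
            p = n_positive / n_total
            lam = -math.log(1.0 - p)                 # mean copies per droplet
            return lam / (droplet_volume_nl * 1e-3)  # copies/uL

        print(dpcr_concentration(4_000, 18_000))     # ~296 copies/uL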

  11. Self-optimized construction of transition rate matrices from accelerated atomistic simulations with Bayesian uncertainty quantification

    Science.gov (United States)

    Swinburne, Thomas D.; Perez, Danny

    2018-05-01

    A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.

  12. Rapid capillary electrophoresis approach for the quantification of ewe milk adulteration with cow milk.

    Science.gov (United States)

    Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico

    2017-10-13

    The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis (CE) method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for the analysis of human blood and urine proteins, which achieved the separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum amount of detectable fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.
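
    Quantification from a peak of this kind typically reduces to a linear calibration: the peak area is regressed on the known ewe-milk fraction of standard mixtures, and the line is inverted for unknowns. The sketch below illustrates this with hypothetical numbers; the standards, areas, and function names are not from the paper.

    ```python
    import numpy as np

    # Hypothetical calibration data: % ewe milk in the mixture vs. ewe-peak area
    ewe_fraction = np.array([5, 25, 50, 75, 100])          # % (v/v), assumed standards
    peak_area    = np.array([1.1, 5.3, 10.2, 15.6, 20.4])  # arbitrary CE units

    slope, intercept = np.polyfit(ewe_fraction, peak_area, 1)
    pred = slope * ewe_fraction + intercept
    r2 = 1 - np.sum((peak_area - pred) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)

    def estimate_ewe_percent(area):
        """Invert the calibration line to estimate % ewe milk in an unknown sample."""
        return (area - intercept) / slope

    print(f"R^2 = {r2:.3f}, unknown with area 8.0 -> {estimate_ewe_percent(8.0):.1f}% ewe milk")
    ```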

  13. A simple method to improve the quantification accuracy of energy-dispersive X-ray microanalysis

    International Nuclear Information System (INIS)

    Walther, T

    2008-01-01

    Energy-dispersive X-ray spectroscopy in a transmission electron microscope is a standard tool for chemical microanalysis and routinely provides qualitative information on the presence of all major elements above Z=5 (boron) in a sample. Spectrum quantification relies on suitable corrections for absorption and fluorescence, in particular for thick samples and soft X-rays. A brief presentation is given of an easy way to improve quantification accuracy by evaluating the intensity ratio of two measurements acquired at different detector take-off angles. As the take-off angle determines the effective sample thickness seen by the detector, this method corresponds to taking two measurements from the same position at two different thicknesses, which allows absorption and fluorescence to be corrected more reliably. An analytical solution for determining the depth of a feature embedded in the specimen foil is also provided.
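
    The idea can be made concrete with a simplified absorption-only model: for a thin foil, the detected intensity carries an absorption factor A = (1 - exp(-x))/x with x = (μ/ρ)·ρt/sin(θ), so the ratio of intensities measured at two take-off angles depends only on the mass thickness ρt and can be solved for numerically. The sketch below assumes this absorption-only model with illustrative values for the mass absorption coefficient, angles, and measured ratio; the paper's treatment also handles fluorescence.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def absorption_factor(x):
        """Thin-foil X-ray absorption factor (1 - exp(-x)) / x."""
        return (1.0 - np.exp(-x)) / x

    def intensity_ratio(rho_t, mu_rho, theta1, theta2):
        """Predicted ratio of intensities at two detector take-off angles."""
        x1 = mu_rho * rho_t / np.sin(theta1)  # effective absorption path, angle 1
        x2 = mu_rho * rho_t / np.sin(theta2)  # effective absorption path, angle 2
        return absorption_factor(x1) / absorption_factor(x2)

    # Assumed values: mass absorption coefficient 1000 cm^2/g, take-off angles 20/70 deg
    mu_rho = 1000.0
    t1, t2 = np.radians(20.0), np.radians(70.0)
    measured_ratio = 0.90  # hypothetical measured I(20 deg) / I(70 deg)

    rho_t = brentq(lambda rt: intensity_ratio(rt, mu_rho, t1, t2) - measured_ratio,
                   1e-9, 1e-2)  # bracket in g/cm^2
    print(f"mass thickness = {rho_t:.2e} g/cm^2")
    ```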

  14. FIM imaging and FIMtrack: two new tools allowing high-throughput and cost effective locomotion analysis.

    Science.gov (United States)

    Risse, Benjamin; Otto, Nils; Berh, Dimitri; Jiang, Xiaoyi; Klämbt, Christian

    2014-12-24

    The analysis of neuronal network function requires a reliable measurement of behavioral traits. Since the behavior of freely moving animals is variable to a certain degree, many animals have to be analyzed to obtain statistically significant data. This in turn requires a computer-assisted, automated quantification of locomotion patterns. To obtain high-contrast images of small, almost translucent moving objects, a novel imaging technique based on frustrated total internal reflection, called FIM, was developed. In this setup, animals are illuminated with infrared light only at the very specific positions of contact with the underlying crawling surface. This methodology results in very high contrast images. Subsequently, these high-contrast images are processed using established contour tracking algorithms. Based on this, we developed the FIMTrack software, which serves to extract a number of features needed to quantitatively describe a large variety of locomotion characteristics. During the development of this software package, we focused our efforts on an open-source architecture allowing the easy addition of further modules. The program is platform independent and is accompanied by an intuitive GUI guiding the user through data analysis. All locomotion parameter values are given in the form of CSV files, allowing further data analyses. In addition, a Results Viewer integrated into the tracking software provides the opportunity to interactively review and adjust the output, as might be needed during stimulus integration. The power of FIM and FIMTrack is demonstrated by studying the locomotion of Drosophila larvae.

  15. Quantification in single photon emission computed tomography (SPECT)

    International Nuclear Information System (INIS)

    Buvat, Irene

    2005-01-01

    The objective of this lecture is to understand the possibilities and limitations of the quantitative analysis of single photon emission computed tomography (SPECT) images. It is also to identify the conditions that must be fulfilled to obtain reliable quantitative measurements from images. Content: 1 - Introduction: Quantification in emission tomography - definition and challenges; quantification biasing phenomena; 2 - Quantification in SPECT, problems and correction methods: attenuation, scattering, non-stationary spatial resolution, partial volume effect, movement, tomographic reconstruction, calibration; 3 - Synthesis: actual quantification accuracy; 4 - Beyond the activity concentration measurement

  16. Digital PCR for direct quantification of viruses without DNA extraction.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    DNA extraction before amplification is considered an essential step for quantification of viral DNA using real-time PCR (qPCR). However, this can directly affect the final measurements due to variable DNA yields and removal of inhibitors, which leads to increased inter-laboratory variability of qPCR measurements and reduced agreement on viral loads. Digital PCR (dPCR) might be an advantageous methodology for the measurement of virus concentrations, as it does not depend on any calibration material and it has higher tolerance to inhibitors. DNA quantification without an extraction step (i.e. direct quantification) was performed here using dPCR and two different human cytomegalovirus whole-virus materials. Two dPCR platforms were used for this direct quantification of the viral DNA, and these were compared with quantification of the extracted viral DNA in terms of yield and variability. Direct quantification of both whole-virus materials present in simple matrices like cell lysate or Tris-HCl buffer provided repeatable measurements of virus concentrations that were probably in closer agreement with the actual viral load than when estimated through quantification of the extracted DNA. Direct dPCR quantification of other viruses, reference materials and clinically relevant matrices is now needed to show the full versatility of this very promising and cost-efficient development in virus quantification.

  17. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    Erythropoietin (EPO) quantification during cell line selection and bioreactor cultivation has traditionally been performed with ELISA or HPLC. As these techniques suffer from several drawbacks, we developed a novel EPO quantification assay. A camelid single-domain antibody fragment directed against...... human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  18. 40 CFR 82.10 - Availability of consumption allowances in addition to baseline consumption allowances for class I...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Availability of consumption allowances in addition to baseline consumption allowances for class I controlled substances. 82.10 Section 82.10... STRATOSPHERIC OZONE Production and Consumption Controls § 82.10 Availability of consumption allowances in...

  19. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    Full Text Available The majority of Internet users search the global network for information using full-text search engines such as Google, Yahoo!, or Seznam. Website operators try, with the help of various optimization techniques, to reach the top places in full-text search engine results. This is where Search Engine Optimization and Search Engine Marketing become important, because typical users follow links only on the first few results pages for given keywords, and in catalogs they primarily use links placed higher in each category's hierarchy. The key to success is the application of optimization methods that address keywords, the structure and quality of content, domain names, individual pages, and the quantity and reliability of backlinks. The process is demanding, long-lasting, and without a guaranteed outcome. Without advanced analytical tools, a website operator cannot identify the contribution of the individual documents of which the entire website consists. If website operators want an overview of their documents and of the website as a whole, it is appropriate to quantify these positions in a specific way for specific keywords. This purpose is served by the quantification of the competitive value of documents, which in turn determines the global competitive value of a website. Quantification of competitive values is performed on a specific full-text search engine, and each search engine can, and often does, return different results. According to published reports by the ClickZ agency and Market Share, Google is the most widely used search engine by number of searches among English-speaking users, with a market share of more than 80%. The overall procedure for quantifying competitive values is generic; however, the initial step, keyword analysis, depends on the choice of full-text search engine.

  20. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length) and quantity of the resultant DNA extract can vary greatly. This affects the downstream method, as the quantity of input DNA and its relative length can determine which genotyping procedure to use: standard short-tandem repeat (STR) typing, mini-STR typing, or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, providing potential reasons for poor genotyping results and indicating which methods may succeed in downstream typing. In general, the more information available, the easier it is to interpret and process the sample, resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals: improving the precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV Challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
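
    The propagation step can be illustrated with a simplified symmetric two-camera geometry, where the out-of-plane component is w = (u1 - u2) / (2 tan α) and first-order variance propagation combines the planar and angle uncertainties. This is a sketch under that assumed geometry, with illustrative numbers; the paper's framework propagates full calibration-coefficient uncertainties rather than a single angle.

    ```python
    import numpy as np

    def stereo_w_uncertainty(u1, u2, alpha, sig_u1, sig_u2, sig_alpha):
        """First-order propagation of planar and angle uncertainties to the
        out-of-plane velocity w = (u1 - u2) / (2 tan(alpha)).

        u1, u2 : in-plane displacements seen by cameras 1 and 2
        alpha  : half-angle between the camera axes (rad)
        sig_*  : one-sigma uncertainties of the respective inputs
        """
        dw_du1 = 1.0 / (2.0 * np.tan(alpha))
        dw_du2 = -dw_du1
        dw_dalpha = -(u1 - u2) / (2.0 * np.sin(alpha) ** 2)  # d(cot a)/da = -1/sin^2 a
        var_w = ((dw_du1 * sig_u1) ** 2 + (dw_du2 * sig_u2) ** 2
                 + (dw_dalpha * sig_alpha) ** 2)
        return np.sqrt(var_w)

    # Hypothetical numbers: 0.1 px planar uncertainty, 0.5 deg angle uncertainty
    print(stereo_w_uncertainty(u1=2.0, u2=1.0, alpha=np.radians(45),
                               sig_u1=0.1, sig_u2=0.1, sig_alpha=np.radians(0.5)))
    ```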

  2. Quantification of thermal damage in skin tissue

    Institute of Scientific and Technical Information of China (English)

    Xu Feng; Wen Ting; Lu Tianjian; Seffen Keith

    2008-01-01

    Skin thermal damage, or skin burns, is the most commonly encountered type of trauma in civilian and military communities. In addition, advances in laser, microwave and similar technologies have led to recent developments of thermal treatments for disease and damage involving skin tissue, where the objective is to induce thermal damage precisely within targeted tissue structures without affecting the surrounding healthy tissue. Further, the extended pain sensation induced by thermal damage poses a great problem for burn patients. Thus, it is of great importance to quantify thermal damage in skin tissue. In this paper, the available models and experimental methods for the quantification of thermal damage in skin tissue are discussed.
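
    The most widely used of these models is the Arrhenius damage integral of Henriques and Moritz, Ω(t) = ∫ A·exp(-ΔE/(R·T(τ))) dτ, where Ω = 1 is conventionally associated with irreversible injury. A minimal numerical sketch follows; the frequency factor and activation energy vary across studies, so the constants below are commonly cited illustrative values, not the paper's.

    ```python
    import numpy as np

    # Commonly cited Henriques-Moritz constants for skin (illustrative assumptions)
    A = 3.1e98    # frequency factor, 1/s
    dE = 6.28e5   # activation energy, J/mol
    R = 8.314     # gas constant, J/(mol K)

    def damage_integral(T, dt):
        """Omega = integral of A*exp(-dE/(R*T(t))) dt over a sampled temperature
        history T (K); Omega near 1 indicates threshold irreversible injury."""
        return np.sum(A * np.exp(-dE / (R * T)) * dt)

    # Example: skin held at 317.15 K (44 C) for 6 hours, sampled every second
    T = np.full(6 * 3600, 317.15)
    print(damage_integral(T, dt=1.0))
    ```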

  3. Automated Quantification of Pneumothorax in CT

    Science.gov (United States)

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091

  4. Uncertainty quantification for PZT bimorph actuators

    Science.gov (United States)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used in micro-air vehicles, including the Robobee. The model is built within the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  5. Linking probe thermodynamics to microarray quantification

    International Nuclear Information System (INIS)

    Li, Shuzhao; Pozhitkov, Alexander; Brouwer, Marius

    2010-01-01

    Understanding the difference in probe properties holds the key to absolute quantification of DNA microarrays. So far, Langmuir-like models have failed to link sequence-specific properties to hybridization signals in the presence of a complex hybridization background. Data from washing experiments indicate that the post-hybridization washing has no major effect on the specifically bound targets, which give the final signals. Thus, the amount of specific targets bound to probes is likely determined before washing, by the competition against nonspecific binding. Our competitive hybridization model is a viable alternative to Langmuir-like models. (comment)
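
    For context, a Langmuir-like model ties the hybridization signal to target concentration c as S = S_max·c/(c + K); the competitive picture adds a background term that suppresses specific binding. The sketch below shows a generic competitive-adsorption form to illustrate the distinction; it is not the authors' specific model, and all constants are illustrative.

    ```python
    import numpy as np

    def langmuir_signal(c, K, smax):
        """Classic Langmuir-like hybridization signal for target concentration c."""
        return smax * c / (c + K)

    def competitive_signal(c, K, c_bg, K_bg, smax):
        """Signal when nonspecific background competes for the same probe sites:
        the background term (c_bg / K_bg) suppresses specific binding."""
        return smax * (c / K) / (1.0 + c / K + c_bg / K_bg)

    c = np.logspace(-12, -6, 7)  # target concentrations, M
    print(langmuir_signal(c, K=1e-9, smax=1.0))
    print(competitive_signal(c, K=1e-9, c_bg=1e-7, K_bg=1e-8, smax=1.0))
    ```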

  6. Image cytometry: nuclear and chromosomal DNA quantification.

    Science.gov (United States)

    Carvalho, Carlos Roberto; Clarindo, Wellington Ronildo; Abreu, Isabella Santiago

    2011-01-01

    Image cytometry (ICM) associates microscopy, digital image and software technologies, and has been particularly useful in spatial and densitometric cytological analyses, such as DNA ploidy and DNA content measurements. Basically, ICM integrates methodologies of optical microscopy calibration, standard density filters, digital CCD cameras, and image analysis software for quantitative applications. Beyond system calibration and setup, cytological protocols must provide good slide preparations for efficient and reliable ICM analysis. In this chapter, procedures for ICM applications employed in our laboratory are described. The protocols shown here for human DNA ploidy determination and for quantification of nuclear and chromosomal DNA content in plants can be used as described, or adapted for other studies.

  7. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
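
    Once sensitivities are in hand, the propagation itself is the classic "sandwich rule": the relative variance of a response is S^T C S, with S the vector of relative sensitivities and C the relative covariance matrix of the nuclear data. A toy numerical sketch follows; the sensitivities, standard deviations, and correlation are invented for illustration.

    ```python
    import numpy as np

    # Sandwich rule: relative variance of a response R is S^T C S, where S holds
    # relative sensitivities (dR/R per dp/p) and C is the relative covariance
    # matrix of the nuclear data parameters. Numbers are purely illustrative.
    S = np.array([0.30, -0.15, 0.05])       # sensitivities to 3 parameters
    C = np.diag([0.02, 0.01, 0.03]) ** 2    # 2%, 1%, 3% standard deviations
    C[0, 1] = C[1, 0] = 0.5 * 0.02 * 0.01   # assumed correlation of 0.5

    rel_var = S @ C @ S
    print(f"relative uncertainty in R: {np.sqrt(rel_var):.4%}")
    ```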

  8. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  9. Preclinical imaging characteristics and quantification of Platinum-195m SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Aalbersberg, E.A.; Wit-van der Veen, B.J. de; Vegt, E.; Vogel, Wouter V. [The Netherlands Cancer Institute (NKI-AVL), Department of Nuclear Medicine, Amsterdam (Netherlands); Zwaagstra, O.; Codee-van der Schilden, K. [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands)

    2017-08-15

    In vivo biodistribution imaging of platinum-based compounds may allow better patient selection for treatment with chemo(radio)therapy. Radiolabeling with Platinum-195m ({sup 195m}Pt) allows SPECT imaging, without altering the chemical structure or biological activity of the compound. We have assessed the feasibility of {sup 195m}Pt SPECT imaging in mice, with the aim to determine the image quality and accuracy of quantification for current preclinical imaging equipment. Enriched (>96%) {sup 194}Pt was irradiated in the High Flux Reactor (HFR) in Petten, The Netherlands (NRG). A 0.05 M HCl {sup 195m}Pt-solution with a specific activity of 33 MBq/mg was obtained. Image quality was assessed for the NanoSPECT/CT (Bioscan Inc., Washington DC, USA) and U-SPECT{sup +}/CT (MILabs BV, Utrecht, the Netherlands) scanners. A radioactivity-filled rod phantom (rod diameter 0.85-1.7 mm) filled with 1 MBq {sup 195m}Pt was scanned with different acquisition durations (10-120 min). Four healthy mice were injected intravenously with 3-4 MBq {sup 195m}Pt. Mouse images were acquired with the NanoSPECT for 120 min at 0, 2, 4, or 24 h after injection. Organs were delineated to quantify {sup 195m}Pt concentrations. Immediately after scanning, the mice were sacrificed, and the platinum concentration was determined in organs using a gamma counter and graphite furnace atomic absorption spectroscopy (GF-AAS) as reference standards. A 30-min acquisition of the phantom provided visually adequate image quality for both scanners. The smallest visible rods were 0.95 mm in diameter on the NanoSPECT and 0.85 mm in diameter on the U-SPECT{sup +}. The image quality in mice was visually adequate. Uptake was seen in the kidneys with excretion to the bladder, and in the liver, blood, and intestine. No uptake was seen in the brain. The Spearman correlation between SPECT and gamma counter was 0.92, between SPECT and GF-AAS it was 0.84, and between GF-AAS and gamma counter it was 0.97 (all p < 0

  10. Uncertainty Quantification Bayesian Framework for Porous Media Flows

    Science.gov (United States)

    Demyanov, V.; Christie, M.; Erbas, D.

    2005-12-01

    Uncertainty quantification is an increasingly important aspect of many areas of applied science, where the challenge is to make reliable predictions about the performance of complex physical systems in the absence of complete or reliable data. Predicting flows of fluids through subsurface reservoirs is an example of a complex system where accuracy in prediction is needed (e.g., in the oil industry it is essential for financial reasons). Simulation of fluid flow in oil reservoirs is usually carried out using large, commercially written finite difference simulators solving conservation equations describing the multi-phase flow through the porous reservoir rocks, which is a highly computationally expensive task. This work examines a Bayesian framework for uncertainty quantification in porous media flows that uses a stochastic sampling algorithm to generate models that match observed time series data. The framework is flexible for a wide range of general physical/statistical parametric models, which are used to describe the underlying hydro-geological process in its temporal dynamics. The approach is based on exploration of the parameter space and update of the prior beliefs about what the most likely model definitions are. Optimization problems for highly parametric physical models usually have multiple solutions, which affect the uncertainty of the resulting predictions. A stochastic search algorithm (e.g., a genetic algorithm) makes it possible to identify multiple "good enough" models in the parameter space. Furthermore, inference on the generated model ensemble via an MCMC-based algorithm evaluates the posterior probability of the generated models and quantifies the uncertainty of the predictions. Machine learning algorithms (artificial neural networks) are used to speed up the identification of regions in parameter space where good matches to observed data can be found. The adaptive nature of ANNs allows different ways of integrating them into the Bayesian framework: as direct time

  11. Slotting allowances to coordinate manufacturers’ retail sales effort

    OpenAIRE

    Foros, Øystein; Kind, Hans Jarle; Sand, Jan Yngve

    2007-01-01

    Slotting allowances are fees paid by manufacturers to get access to retailers’ shelf space. Although the main attention towards slotting allowances has been within the grocery industry, slotting allowances have also been applied within e.g. e-commerce and mobile telephony. In these industries we observe that distributors have large market power due to their control of access to customers. We analyse how shifting bargaining power from manufacturers to retailers and the use of slotting allowanc...

  12. Impediments to markets for SO2 emission allowances

    International Nuclear Information System (INIS)

    Walsh, M.; Ramesh, V.C.; Ghosh, K.

    1996-01-01

    The Clean Air Act (CAA) of 1990 imposed tighter limits on allowed emissions from electric utilities. The CAA also introduced an innovative SO2 market mechanism to help lower the cost of compliance. The annual Environmental Protection Agency (EPA) auctions of emission allowances were intended to help usher in the market mechanisms for trading allowances. In that respect, the results have been mixed: a full-fledged market for emission allowances has been slow to emerge. Starting with a detailed study of the EPA auctions to date, this paper analyzes and discusses some of the reasons for this slow development.

  13. Wavelets in quantification of liver tumors in contrasted computed tomography images

    International Nuclear Information System (INIS)

    Rodrigues, Bruna T.; Alvarez, Matheus; Souza, Rafael T.F.; Miranda, Jose R.A.; Romeiro, Fernando G.; Pina, Diana R. de; Trindade, Andre Petean

    2012-01-01

    This paper presents an original methodology for liver tumor segmentation based on the wavelet transform. A virtual phantom was constructed with the same mean and standard deviation of gray-level intensity as the measured liver tissue. The optimized algorithm had a sensitivity ranging from 0.81 to 0.83 and a specificity of 0.95 for differentiating hepatic tumors from normal tissue. We obtained 96% agreement between the pixels segmented by an experienced radiologist and by the algorithm presented here. According to these results, the algorithm is well suited for initial tests of liver tumor quantification in retrospective surveys. (author)

  14. Seed shape quantification in the order Cucurbitales

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2018-02-01

    Full Text Available Seed shape quantification in diverse species of the families belonging to the order Cucurbitales is done based on the comparison of seed images with geometric figures. Quantification of seed shape is a useful tool in plant description, for phenotypic characterization and taxonomic analysis. The J index gives the percentage similarity between the image of a seed and a geometric figure, and it is useful in taxonomy for the study of relationships between plant groups. The geometric figures used as models in the Cucurbitales are the ovoid, two ellipses with different x/y ratios, and the outline of the Fibonacci spiral. The images of seeds have been compared with these figures and values of the J index obtained. The results obtained for 29 species in the family Cucurbitaceae support a relationship between seed shape and species ecology. Simple seed shape, with images resembling simple geometric figures like the ovoid, the ellipse or the Fibonacci spiral, may be a feature of the basal clades of taxonomic groups.
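
    A similarity index of this kind can be computed as an area-overlap percentage between two aligned binary silhouettes. The sketch below uses a Jaccard-style shared-area-over-combined-area form as an assumed stand-in for the published J index definition; the masks are synthetic toys.

    ```python
    import numpy as np

    def j_index(seed_mask, figure_mask):
        """Percent similarity between a seed silhouette and a geometric model,
        computed here as shared area over combined area (x100). This mirrors
        the published J index in spirit; the exact definition compares the
        pixels of the two aligned images."""
        inter = np.logical_and(seed_mask, figure_mask).sum()
        union = np.logical_or(seed_mask, figure_mask).sum()
        return 100.0 * inter / union

    # Toy example: a circular "seed" against an elliptical model, both 200x200
    y, x = np.mgrid[:200, :200]
    seed = (x - 100) ** 2 + (y - 100) ** 2 < 80 ** 2
    ellipse = ((x - 100) / 90.0) ** 2 + ((y - 100) / 70.0) ** 2 < 1
    print(f"J = {j_index(seed, ellipse):.1f}%")
    ```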

  15. Quantification of abdominal aortic deformation after EVAR

    Science.gov (United States)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR) treatment, the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to elicit stent graft failure and to cause reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to only include local non-rigid deformation and therefore eliminate all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. In order to evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from EVAR treatment. A visual assessment of the registration results and an evaluation of the usage of deformation quantification were performed by two vascular surgeons and one interventional radiologist who are all experts in EVAR procedures.

  16. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As matched controls, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT with end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway on expiration, measured by CT quantification, could serve as new diagnostic indicators of TBM.
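
    The group comparison in such a study is a two-sample Student t-test per airway parameter. A minimal sketch with hypothetical per-subject values, invented to mimic the reported tracheal means rather than taken from the actual data:

    ```python
    from scipy import stats

    # Hypothetical expiratory tracheal wall perimeters (mm), TBM vs. controls,
    # loosely mimicking the reported group means (43.97 vs. 49.04 mm)
    tbm      = [42.1, 44.5, 43.0, 45.2, 44.9, 42.8, 44.3]
    controls = [48.8, 50.1, 49.5, 47.9, 49.6, 48.4, 50.0]

    t, p = stats.ttest_ind(tbm, controls)  # two-sample Student t-test
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```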

  17. Bathymetric survey and estimation of the water balance of Lake ...

    African Journals Online (AJOL)

    Quantification of the water balance components and a bathymetric survey are crucial for the sustainable management of lake waters. This paper focuses on the bathymetry and water balance of the crater lake Ardibo, recently utilized for irrigation. The bathymetric map of the lake is established at a contour interval of 10 ...

  18. 24 CFR 982.517 - Utility allowance schedule.

    Science.gov (United States)

    2010-04-01

    ... utilities and services paid by energy-conservative households that occupy housing of similar size and type... utility allowance for an individual family, must include the utilities and services that are necessary in...-family detached, and manufactured housing) that are typical in the community. (4) The utility allowance...

  19. 41 CFR 105-71.122 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... GOVERNMENTS 71.12-Post-Award Requirements/Financial Administration § 105-71.122 Allowable costs. (a... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  20. Assessing the Implications of Allowing Transgender Personnel to Serve Openly

    Science.gov (United States)

    2016-01-01

    Openly? There are 18 countries that allow transgender personnel to serve openly in their militaries: Australia, Austria, Belgium, Bolivia, Canada...clinical and cultural competence for the proper care of transgender patients. Surgical procedures quite similar to those used for gender transition...tries that allow transgender personnel to serve openly in their militaries: Australia, Austria, Belgium, Bolivia, Canada, Czech Republic, Denmark

  1. 48 CFR 1652.216-71 - Accounting and Allowable Cost.

    Science.gov (United States)

    2010-10-01

    ... of FEHBP Clauses 1652.216-71 Accounting and Allowable Cost. As prescribed in section 1616.7002, the...). Accounting and Allowable Cost (FEHBAR 1652.216-71) (JAN 2003) (a) Annual Accounting Statements. (1) The... addition, the Carrier must: (i) on request, document and make available accounting support for the cost to...

  2. 19 CFR 148.103 - Family grouping of allowances.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Family grouping of allowances. 148.103 Section 148... Value § 148.103 Family grouping of allowances. (a) Generally. When members of a family residing in one... household. “Members of a family residing in one household” shall include all persons, regardless of age, who...

  3. 40 CFR 96.142 - CAIR NOX allowance allocations.

    Science.gov (United States)

    2010-07-01

    ... the 3 highest amounts of the unit's adjusted control period heat input for 2000 through 2004, with the adjusted control period heat input for each year calculated as follows: (A) If the unit is coal-fired... CAIR NOX Allowance Allocations § 96.142 CAIR NOX allowance allocations. (a)(1) The baseline heat input...

  4. 40 CFR 73.21 - Phase II repowering allowances.

    Science.gov (United States)

    2010-07-01

    ... in the following table (Unit: Year 2000 adjusted basic allowances): RE Burger 1: 1273; RE Burger 2: 1245; RE... manner: [equation displayed as graphic EC01SE92.082] where: Forfeiture Period = difference (as a portion of a year) between the end of... table in paragraph (a) of this section. (c)(2) The Administrator will reallocate any allowances...

  5. 26 CFR 1.42-10 - Utility allowances.

    Science.gov (United States)

    2010-04-01

    ... Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY INCOME TAX INCOME TAXES Credits Against Tax § 1.42-10 Utility allowances. (a) Inclusion of utility allowances in gross rent. If the cost.... (b) Applicable utility allowances—(1) Buildings assisted by the Rural Housing Service. If a building...

  6. 42 CFR 61.37 - Stipends, allowances, and benefits.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Stipends, allowances, and benefits. 61.37 Section..., INTERNSHIPS, TRAINING FELLOWSHIPS Service Fellowships § 61.37 Stipends, allowances, and benefits. (a) Stipends... employees of the Public Health Service. (c) Benefits. In addition to other benefits provided herein, service...

  7. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    International Nuclear Information System (INIS)

    Seebauer, Matthias

    2014-01-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was performed using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, with mitigation benefits ranging between 4 and 6.5 t CO2 per ha per year and differing significantly depending on the typology of the crop-livestock systems, their agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms. (paper)

  8. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  9. Quantification in Kabiye: a linguistic approach | Pali ...

    African Journals Online (AJOL)

    ... which is denoted by lexical quantifiers. Quantification with specific reference is provided by different types of linguistic units (nouns, numerals, adjectives, adverbs, ideophones and verbs) in arguments/noun phrases and in the predicative phrase in the sense of Chomsky. Keywords: quantification, class, number, reference, ...

  10. Survey research.

    Science.gov (United States)

    Alderman, Amy K; Salem, Barbara

    2010-10-01

    Survey research is a unique methodology that can provide insight into individuals' perspectives and experiences and can be collected on a large population-based sample. Specifically, in plastic surgery, survey research can provide patients and providers with accurate and reproducible information to assist with medical decision-making. When using survey methods in research, researchers should develop a conceptual model that explains the relationships of the independent and dependent variables. The items of the survey are of primary importance. Collected data are only useful if they accurately measure the concepts of interest. In addition, administration of the survey must follow basic principles to ensure an adequate response rate and representation of the intended target sample. In this article, the authors review some general concepts important for successful survey research and discuss the many advantages this methodology has for obtaining limitless amounts of valuable information.

  11. In vivo MRS metabolite quantification using genetic optimization

    Science.gov (United States)

    Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject of this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. This paper proposes a general framework that treats the quantification procedure as an optimization problem solved using a genetic algorithm (GA). Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique handles overlapping metabolite peaks, a considerably difficult situation under real conditions. Experiments on artificial MRS data demonstrate the efficiency of the introduced methodology and establish it as a generic metabolite quantification procedure.

  12. In vivo MRS metabolite quantification using genetic optimization

    International Nuclear Information System (INIS)

    Papakostas, G A; Mertzios, B G; Karras, D A; Van Ormondt, D; Graveron-Demilly, D

    2011-01-01

    The in vivo quantification of metabolite concentrations revealed in magnetic resonance spectroscopy (MRS) spectra is the main subject of this work. Significant contributions based on artificial intelligence tools such as neural networks (NNs) have been presented lately with good results, but they show several drawbacks regarding quantification accuracy under difficult conditions. This paper proposes a general framework that treats the quantification procedure as an optimization problem solved using a genetic algorithm (GA). Two different lineshape models are examined, and two GA configurations are applied to artificial data. Moreover, the introduced quantification technique handles overlapping metabolite peaks, a considerably difficult situation under real conditions. Experiments on artificial MRS data demonstrate the efficiency of the introduced methodology and establish it as a generic metabolite quantification procedure.

  13. Forecasting the market for SO2 emission allowances under uncertainty

    International Nuclear Information System (INIS)

    Hanson, D.; Molburg, J.; Fisher, R.; Boyd, G.; Pandola, G.; Lurie, G.; Taxon, T.

    1991-01-01

    This paper deals with the effects of uncertainty and risk aversion on market outcomes for SO2 emission allowance prices and on electric utility compliance choices. The 1990 Clean Air Act Amendments (CAAA), which are briefly reviewed here, provide for about twice as many SO2 allowances to be issued per year in Phase 1 (1995-1999) as in Phase 2. Considering the scrubber incentives in Phase 1, there is likely to be substantial emission banking for use in Phase 2. Allowance prices are expected to increase over time at a rate less than the return on alternative investments, so utilities which are risk neutral, or potential speculators in the allowance market, are not expected to bank allowances; the allowances will be banked by utilities that are risk averse. The Argonne Utility Simulation Model (ARGUS2) is being revised to incorporate the provisions of the CAAA acid rain title and to simulate SO2 allowance prices, compliance choices, capacity expansion, system dispatch, fuel use, and emissions using a unit-level database and alternative scenario assumptions. 1 fig

  14. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR.

    Science.gov (United States)

    Daems, Devin; Peeters, Bernd; Delport, Filip; Remans, Tony; Lammertyn, Jeroen; Spasic, Dragana

    2017-07-31

    Accurate identification and quantification of allergens is key in healthcare, biotechnology and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the golden standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM-0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  15. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR

    Directory of Open Access Journals (Sweden)

    Devin Daems

    2017-07-01

    Full Text Available Accurate identification and quantification of allergens is key in healthcare, biotechnology and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food allergic reactions in Europe. Currently, the golden standards to identify, quantify and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM–0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  16. Application of adaptive hierarchical sparse grid collocation to the uncertainty quantification of nuclear reactor simulators

    Energy Technology Data Exchange (ETDEWEB)

    Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)

    2013-07-01

    Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity by using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant is converged to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to current methods used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)
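
    In one dimension, the adaptive hierarchical construction can be sketched in a few lines: each node carries a hat function and a hierarchical surplus (the difference between the model output and the coarser interpolant), and children are spawned only where the surplus exceeds a threshold. This toy assumes a function on [0, 1] that is small at the boundaries and omits the multi-dimensional Smolyak combination; names and tolerances are illustrative.

    ```python
    import numpy as np

    def interp(x, nodes):
        """Evaluate the hierarchical hat-function interpolant at x."""
        return sum(s * max(0.0, 1.0 - abs(x - c) / w) for c, w, s in nodes)

    def adaptive_hierarchical_1d(f, tol=1e-3, max_level=12):
        """Build a 1D adaptive interpolant of f on [0, 1].

        Each accepted node stores (center, half-width, hierarchical surplus);
        a node's children are added only where the surplus exceeds tol.
        """
        nodes = []                # accepted nodes
        frontier = [(0.5, 0.5)]   # level-0 candidate
        for _ in range(max_level):
            next_frontier = []
            for c, w in frontier:
                surplus = f(c) - interp(c, nodes)  # local interpolation error
                nodes.append((c, w, surplus))
                if abs(surplus) > tol:             # refine only where needed
                    next_frontier += [(c - w / 2, w / 2), (c + w / 2, w / 2)]
            if not next_frontier:
                break
            frontier = next_frontier
        return nodes

    f = lambda x: np.exp(-50.0 * (x - 0.3) ** 2)  # sharp bump: needs local refinement
    nodes = adaptive_hierarchical_1d(f)
    print(len(nodes), abs(interp(0.31, nodes) - f(0.31)))
    ```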

  17. Rapid quantification of plant-powdery mildew interactions by qPCR and conidiospore counts.

    Science.gov (United States)

    Weßling, Ralf; Panstruga, Ralph

    2012-08-31

    The powdery mildew disease represents a valuable patho-system to study the interaction between plant hosts and obligate biotrophic fungal pathogens. Numerous discoveries have been made on the basis of the quantitative evaluation of plant-powdery mildew interactions, especially in the context of hyper-susceptible and/or resistant plant mutants. However, the presently available methods to score the pathogenic success of powdery mildew fungi are laborious and thus not well suited for medium- to high-throughput analysis. Here we present two new protocols that allow the rapid quantitative assessment of powdery mildew disease development. One procedure depends on quantitative polymerase chain reaction (qPCR)-based evaluation of fungal biomass, while the other relies on the quantification of fungal conidiospores. We validated both techniques using the powdery mildew pathogen Golovinomyces orontii on a set of hyper-susceptible and resistant Arabidopsis thaliana mutants and found that they cover wide dynamic ranges of one to two (qPCR) and four to five (quantification of conidia) orders of magnitude, respectively. The two approaches yield reproducible results and are easy to perform without specialized equipment. The qPCR and spore count assays rapidly and reproducibly quantify powdery mildew pathogenesis. Our methods are performed at later stages of infection and discern mutant phenotypes accurately. The assays therefore complement currently used procedures of powdery mildew quantification and can overcome some of their limitations. In addition, they can easily be adapted to other plant-powdery mildew patho-systems.
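
    qPCR-based biomass scoring of this kind typically normalizes a fungal target gene to a plant reference gene and expresses the result relative to a reference genotype, i.e., the 2^-ddCt method. A minimal sketch, assuming ~100% amplification efficiency and using invented Ct values; the paper's exact genes and calculations may differ.

    ```python
    def relative_fungal_biomass(ct_fungus, ct_plant, ct_fungus_ref, ct_plant_ref):
        """Relative fungal biomass by the 2^-ddCt method: the fungal target Ct is
        normalized to a plant reference gene, then to a reference genotype.
        Assumes ~100% amplification efficiency for both targets."""
        d_ct_sample = ct_fungus - ct_plant
        d_ct_ref = ct_fungus_ref - ct_plant_ref
        return 2.0 ** -(d_ct_sample - d_ct_ref)

    # Hypothetical Ct values: mutant vs. wild-type reference
    print(relative_fungal_biomass(ct_fungus=22.1, ct_plant=18.0,
                                  ct_fungus_ref=25.3, ct_plant_ref=18.2))
    # A value > 1 means more fungal biomass than on the wild type (hyper-susceptible)
    ```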

  18. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Adami, Susan R. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Sinkov, Sergey I. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Lumetta, Gregg J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Bryan, Samuel A. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States

    2017-08-09

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Univariate analysis (i.e., Beer's law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
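
    A common chemometric choice for exactly this situation is partial least squares (PLS) regression, whose latent variables can absorb the acid-dependent spectral distortion. The sketch below trains scikit-learn's PLSRegression on simulated spectra; the band shapes, concentration ranges, and number of components are invented for illustration and are not the paper's calibration model.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Simulated training set: absorption spectra (rows) for solutions of known
    # Pu(IV) concentration at several nitric acid strengths (all values assumed).
    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 40, 200
    pu_conc = rng.uniform(0.001, 0.05, n_samples)   # mol/L, assumed range
    acid = rng.uniform(0.5, 8.0, n_samples)         # mol/L HNO3
    base = np.exp(-((np.arange(n_wavelengths) - 100) / 15.0) ** 2)  # Pu(IV) band

    X = (pu_conc[:, None] * base                              # Pu(IV) absorption
         + 0.01 * (acid[:, None] - 4.0) * np.roll(base, 8)    # acid-dependent distortion
         + 0.0005 * rng.normal(size=(n_samples, n_wavelengths)))  # noise

    pls = PLSRegression(n_components=4)  # latent variables absorb the acid effect
    pls.fit(X, pu_conc)
    print("predicted:", pls.predict(X[:3]).ravel(), "actual:", pu_conc[:3])
    ```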

  19. Quantification of cellular uptake of DNA nanostructures by qPCR.

    Science.gov (United States)

    Okholm, Anders Hauge; Nielsen, Jesper Sejrup; Vinther, Mathias; Sørensen, Rasmus Schøler; Schaffert, David; Kjems, Jørgen

    2014-05-15

    DNA nanostructures facilitating drug delivery are likely soon to be realized. In the past few decades, programmed self-assembly of DNA building blocks has successfully been employed to construct sophisticated nanoscale objects. By conjugating functionalities to DNA, other molecules such as peptides, proteins and polymers can be precisely positioned on DNA nanostructures. This exceptional ability to produce modular nanoscale devices with tunable and controlled behavior has initiated an interest in employing DNA nanostructures for drug delivery. However, to achieve this, the relationship between cellular interactions and the structural and functional features of the DNA delivery device must be thoroughly investigated. Here, we present a rapid and robust method for the precise quantification of the component materials of DNA origami structures capable of entering cells in vitro. The quantification is performed by quantitative polymerase chain reaction, allowing a linear dynamic range of detection of five orders of magnitude. We demonstrate the use of this method for high-throughput screening, which could prove efficient in identifying key features of DNA nanostructures enabling cell penetration. The method described here is suitable for quantification in in vitro uptake studies but should easily be extended to quantify DNA nanostructures in blood or tissue samples. Copyright © 2014 Elsevier Inc. All rights reserved.
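
    Absolute qPCR quantification of this kind rests on a standard curve: Ct is linear in log10 of the input copy number, and the fitted line is inverted for unknowns. A minimal sketch with invented standards spanning five orders of magnitude, matching the reported dynamic range in spirit but not the paper's actual data:

    ```python
    import numpy as np

    # Hypothetical standard curve: Ct is linear in log10(copies)
    copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
    ct     = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0  # ~1.0 corresponds to 100% efficiency

    def copies_from_ct(ct_unknown):
        """Invert the standard curve to estimate copy number in an unknown sample."""
        return 10 ** ((ct_unknown - intercept) / slope)

    print(f"efficiency = {efficiency:.2f}, "
          f"unknown Ct 24.5 -> {copies_from_ct(24.5):.3g} copies")
    ```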

  20. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    Science.gov (United States)

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with an inbuilt graphics capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density, and (iii) of the postischemic leakage of FITC-labeled high-molecular-weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive, commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  1. Future prices and market for SO2 allowances

    International Nuclear Information System (INIS)

    Sanghi, A.; Joseph, A.; Michael, K.; Munro, W.; Wang, J.

    1993-01-01

    The expected price of SO2 emission allowances is an important issue in energy and integrated resource planning activities. For example, the expected price of SO2 allowances is needed in order to evaluate alternative strategies for meeting the SO2 provisions of the Clean Air Act Amendments of 1990. In addition, the expected SO2 allowance price is important to state public utility regulators, who must provide guidance on rate-making issues regarding utility compliance plans which involve allowance trading and direct investment in SO2 control technologies. Last but not least, the expected SO2 allowance price is an important determinant of the future market for natural gas and low sulfur coal. The paper develops estimates of SO2 allowance prices over time by constructing national supply and demand curves for SO2 reductions. Both the supply and demand for SO2 reductions are based on an analysis of the sulfur content of fuels burned in 1990 by utilities throughout the United States, and on assumptions about plant retirements, the rate of new capacity growth, the types of new and replacement plants constructed, the costs of SO2 reduction measures, and legislation by midwest states to maintain the use of high sulfur coal to protect local jobs. The paper shows that SO2 allowance prices will peak around the year 2000 at about $500 per ton, and will eventually fall to zero by about the year 2020. A sensitivity analysis indicates that the price of SO2 allowances is relatively insensitive to assumptions regarding the availability of natural gas or energy demand growth. However, SO2 allowance prices tend to be quite sensitive to assumptions regarding regulations which may force early retirement of existing power plants and possible legislation which may reduce CO2 emissions.

  2. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Udo D. [Yale Univ., New Haven, CT (United States). Dept. of Mechanical Engineering and Materials Science; Altman, Eric I. [Yale Univ., New Haven, CT (United States). Dept. of Chemical and Environmental Engineering

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces that include species-selective imaging and an ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3D-AFM), a new measurement mode that allows the mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectronvolt resolution in force/energy. From this experimental platform, we expanded further by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  3. Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals

    Science.gov (United States)

    Crooks, Kevin R.; Burdett, Christopher L.; Theobald, David M.; King, Sarah R. B.; Rondinini, Carlo; Boitani, Luigi

    2017-01-01

    Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world’s terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world’s terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation. PMID:28673992

  4. Serendipity: Global Detection and Quantification of Plant Stress

    Science.gov (United States)

    Schimel, D.; Verma, M.; Drewry, D.

    2016-12-01

    Detecting and quantifying plant stress is a grand challenge for remote sensing, and is important for understanding climate impacts on ecosystems broadly and also for early warning systems supporting food security. The long record from moderate resolution sensors providing frequent data has allowed using phenology to detect stress in forest and agroecosystems, but this approach can fail or give ambiguous results when stress occurs during later phases of growth and in high leaf area systems. The recent recognition that greenhouse gas satellites such as GOSAT and OCO-2 observe Solar-Induced Fluorescence (SIF) has added a new and complementary tool for the quantification of stress, but algorithms to detect and quantify stress using SIF are in their infancy. Here we report new results showing a more complex response of SIF to stress, obtained by evaluating spaceborne SIF against in situ eddy covariance data. The response observed is as predicted by theory, and shows that SIF, used in conjunction with moderate resolution remote sensing, can detect and likely quantify stress by indexing the nonlinear part of the SIF-GPP relationship using the photochemical reflectance index and remotely observed light absorption. There are several exciting opportunities on the near horizon for the implementation of SIF, together with synergistic measurements such as PRI and evapotranspiration, that suggest the next few years will be a golden age for global ecology. Advancing the science and associated algorithms now is essential to fully exploiting the next wave of missions.

  5. Quantification of deep medullary veins at 7 T brain MRI

    Energy Technology Data Exchange (ETDEWEB)

    Kuijf, Hugo J.; Viergever, Max A.; Vincken, Koen L. [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands); Bouvy, Willem H.; Razoux Schultz, Tom B.; Biessels, Geert Jan [University Medical Center Utrecht, Department of Neurology, Brain Center Rudolf Magnus, Utrecht (Netherlands); Zwanenburg, Jaco J.M. [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands); University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands)

    2016-10-15

    Deep medullary veins support the venous drainage of the brain and may display abnormalities in the context of different cerebrovascular diseases. We present and evaluate a method to automatically detect and quantify deep medullary veins at 7 T. Five participants were scanned twice, to assess the robustness and reproducibility of manual and automated vein detection. Additionally, the method was evaluated on 24 participants to demonstrate its application. Deep medullary veins were assessed within an automatically created region-of-interest around the lateral ventricles, defined such that all veins must intersect it. A combination of vesselness, tubular tracking, and hysteresis thresholding located individual veins, which were quantified by counting and computing (3-D) density maps. Visual assessment was time-consuming (2 h/scan), with an intra-/inter-observer agreement on absolute vein count of ICC = 0.76 and 0.60, respectively. The automated vein detection showed excellent inter-scan reproducibility before (ICC = 0.79) and after (ICC = 0.88) visually censoring false positives. It had a positive predictive value of 71.6 %. Imaging at 7 T allows visualization and quantification of deep medullary veins. The presented method offers fast and reliable automated assessment of deep medullary veins. (orig.)
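
    The detection chain (vesselness filtering followed by hysteresis thresholding) can be sketched with scikit-image; the tubular-tracking step is omitted, and the file name, filter scales, and thresholds are assumptions rather than the authors' settings.

        import numpy as np
        from skimage import filters, io

        vol = io.imread("t2star_7T.tif").astype(float)      # hypothetical 3-D volume
        vesselness = filters.frangi(vol, sigmas=np.arange(0.5, 2.5, 0.5),
                                    black_ridges=True)      # veins appear dark on T2*
        mask = filters.apply_hysteresis_threshold(vesselness, low=0.02, high=0.10)
        print("candidate vein voxels:", int(mask.sum()))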

  6. Towards an uncertainty quantification methodology with CASMO-5

    International Nuclear Information System (INIS)

    Wieselquist, W.; Vasiliev, A.; Ferroukhi, H.

    2011-01-01

    We present the development of an uncertainty quantification (UQ) methodology for the CASMO-5 lattice physics code, used extensively at the Paul Scherrer Institut for standalone neutronics calculations, as well as the generation of nuclear fuel segment libraries for the downstream core simulator, SIMULATE-3. We focus here on propagation of nuclear data uncertainties and describe the framework required for 'black box' UQ--in this case minor modifications of the code are necessary to allow perturbation of the CASMO-5 nuclear data library. We then implement a basic first-order UQ method, direct perturbation, which directly produces sensitivity coefficients and, when folded with the input nuclear data variance-covariance matrix (VCM), yields output uncertainties in the form of an output VCM. We discuss the implementation, including how to map the VCMs of a different group structure to the code library group structure (in our case the ENDF/B-VII-based 586-group library in CASMO-5), present some results for pin cell calculations, and conclude with future work. (author)
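
    The direct-perturbation step amounts to the first-order "sandwich" rule: build a sensitivity matrix S by perturbing each input, then fold it with the input VCM. A minimal sketch, with a stand-in function in place of a CASMO-5 run:

        import numpy as np

        def model(x):
            # placeholder for a lattice-physics calculation (not CASMO-5)
            return np.array([x[0] * x[1], x[0] + 0.5 * x[2]])

        x0 = np.array([1.0, 2.0, 3.0])      # nominal nuclear-data inputs
        V = np.diag([0.01, 0.04, 0.09])     # assumed input VCM

        S = np.empty((2, x0.size))
        for j in range(x0.size):
            dx = 0.01 * max(abs(x0[j]), 1.0)
            xp = x0.copy()
            xp[j] += dx
            S[:, j] = (model(xp) - model(x0)) / dx   # sensitivity coefficients

        V_out = S @ V @ S.T                 # output variance-covariance matrix
        print("output standard deviations:", np.sqrt(np.diag(V_out)))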

  7. Real-time PCR protocols for the quantification of the begomovirus tomato yellow leaf curl Sardinia virus in tomato plants and in its insect vector.

    Science.gov (United States)

    Noris, Emanuela; Miozzi, Laura

    2015-01-01

    Tomato yellow leaf curl Sardinia virus (TYLCSV) (Geminiviridae) is an important pathogen, transmitted by the whitefly Bemisia tabaci, that severely affects tomato production in the Mediterranean basin. Here, we describe real-time PCR protocols suitable for relative and absolute quantification of TYLCSV in tomato plants and in whitefly extracts. Using primers and a probe specifically designed for TYLCSV, the protocols for relative quantification make it possible to compare the amount of TYLCSV present in different plant or whitefly samples, normalized to the amount of DNA present in each sample using endogenous tomato or Bemisia genes as internal references. The absolute quantification protocol allows calculation of the number of genomic units of TYLCSV per genomic unit of the plant host (tomato), with a sensitivity of as few as ten viral genome copies per sample. The described protocols are potentially suitable for several applications, such as plant breeding for resistance, analysis of virus replication, and virus-vector interaction studies.
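
    Relative quantification of this kind is conventionally reported with the 2^-ddCt method; a short sketch, with invented Ct values and the host gene as the internal reference:

        def relative_amount(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
            # normalize the viral signal to an endogenous reference gene,
            # then express it relative to a calibrator sample (2^-ddCt)
            dct_sample = ct_target - ct_ref
            dct_calibrator = ct_target_cal - ct_ref_cal
            return 2.0 ** -(dct_sample - dct_calibrator)

        fold = relative_amount(ct_target=21.3, ct_ref=18.0,
                               ct_target_cal=24.1, ct_ref_cal=18.2)
        print(f"TYLCSV load: {fold:.1f}x the calibrator plant")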

  8. 17 CFR 240.17i-7 - Calculations of allowable capital and risk allowances or alternative capital assessment.

    Science.gov (United States)

    2010-04-01

    ...) Allowance for market risk. The supervised investment bank holding company must compute an allowance for market risk on a consolidated basis for all proprietary positions, including debt instruments, equity instruments, commodity instruments, foreign exchange contracts, and derivative contracts as the aggregate of...

  9. 40 CFR 82.20 - Availability of consumption allowances in addition to baseline consumption allowances for class...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment, 2010-07-01: Availability of consumption allowances in addition to baseline consumption allowances for class II controlled substances. Section 82.20 ...) PROTECTION OF STRATOSPHERIC OZONE Production and Consumption Controls § 82.20 Availability of consumption...

  10. Indexing contamination surveys

    International Nuclear Information System (INIS)

    Brown, R.L.

    1998-01-01

    The responsibility for safely managing the Tank Farms at Hanford belongs to Lockheed Martin Hanford Corporation, which is part of the six-company Project Hanford Management Team led by Fluor Daniel Hanford, Inc. These Tank Farm facilities contain numerous outdoor contamination areas which are surveyed at a periodicity consistent with the potential radiological conditions, occupancy, and risk of changes in radiological conditions. This document describes the survey documentation and data tracking method devised to track the results of contamination surveys; this process is referred to as indexing. The indexing process takes a representative data set as an indicator of the contamination status of the facility. The data are further reduced to a single value that can be tracked and trended using standard statistical methodology. To report meaningful data, the routine contamination surveys must be performed in a manner that allows the survey method and the data collection process to be recreated. Three criteria are key to accomplishing this goal: accurate maps, consistent documentation, and consistent consolidation of data. Meeting these criteria provides data of sufficient quality to be tracked. Tracking of survey data is accomplished by converting the individual survey results into a weighted value, corrected for the actual number of survey points; a sketch of this idea follows below. This information can be compared over time using standard statistical analysis to identify trends. At the Tank Farms, the need to track and trend the facility's radiological status presents unique challenges. Many of these Tank Farm facilities date back to the Second World War. The Tank Farm facilities are exposed to weather extremes, plant and animal intrusion, as well as all of the normal challenges associated with handling radiological waste streams. Routine radiological surveys did not provide a radiological status adequate for continuing comparisons
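
    A hypothetical sketch of the indexing computation described above: each survey is reduced to one weighted value normalized by the number of points surveyed, so successive surveys can be trended. The severity weights and data are invented; the document does not specify the actual scheme.

        severity_weight = {"clean": 0, "low": 1, "moderate": 5, "high": 25}

        def survey_index(findings):
            # findings: one severity label per surveyed point
            total = sum(severity_weight[f] for f in findings)
            return total / len(findings)     # correct for points surveyed

        surveys = [
            ["clean"] * 18 + ["low"] * 2,
            ["clean"] * 15 + ["low"] * 4 + ["moderate"],
        ]
        print("index trend:", [round(survey_index(s), 3) for s in surveys])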

  11. Strategy study of quantification harmonization of SUV in PET/CT images; Estudo da estrategia de harmonizacao da quantificacao do SUV em imagens de PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-07-01

    In clinical practice, PET/CT images are often analyzed qualitatively by visual comparison of uptake in tumor lesions and normal tissues, and semi-quantitatively by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and that quantification information is comparable, it is necessary to establish a strategy to harmonize SUV quantification. The aim of this study is to evaluate a strategy for harmonizing the quantification of PET/CT images acquired on scanners of different models and manufacturers. For this purpose, a survey of the technical characteristics of the equipment and of the clinical image acquisition protocols of different PET/CT services in the state of Rio Grande do Sul was conducted. For each scanner, the accuracy of SUV quantification and the Recovery Coefficient (RC) curves were determined, using the clinically relevant reconstruction parameters available. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the reconstruction parameters most appropriate for harmonizing SUV quantification in each scanner, either regionally or internationally, were identified. It was found that the RC values of the analyzed scanners were overestimated by up to 38%, particularly for objects larger than 17 mm. These results demonstrate the need for further optimization, through modification of the reconstruction parameters, and even a change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies. Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative

  12. 40 CFR 86.1725-01 - Allowable maintenance.

    Science.gov (United States)

    2010-07-01

    ... Trucks § 86.1725-01 Allowable maintenance. This section includes text that specifies requirements that... this subpart, with the following additions: (a) Hybrid electric vehicles that use Otto-cycle or diesel...

  13. 50 CFR 86.44 - What are my allowable costs?

    Science.gov (United States)

    2010-10-01

    ...) PROGRAM Funding Availability § 86.44 What are my allowable costs? (a) The State may spend grant funds to pay only costs that are necessary and reasonable to accomplish the approved grant objectives. Grant...

  14. The economic efficiency of allowing longer combination vehicles in Texas.

    Science.gov (United States)

    2011-08-01

    This paper shows the economic efficiency of allowing longer combination vehicles in Texas. First, an : overview of the truck size and weight policies is explained, with an emphasis on those that affect : Texas. Next, LCV operations in other countries...

  15. 30 CFR 206.157 - Determination of transportation allowances.

    Science.gov (United States)

    2010-07-01

    ... a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research... industry and gas customers. GRI fees are allowable provided such fees are mandatory in FERC-approved...

  16. Economic rationale for an emission allowance trading program

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    The assumption behind the economic model of allowance trading is that managers of firms are better at solving pollution abatement problems than government overseers. This is because firms know more than an environmental regulator about their own operations and because the profit motive, rather than direct government mandate of compliance decisions, may be more effective at minimizing emission control costs. The allowance trading program in the CAAA is designed to provide firms with an incentive to make good choices about how to reduce emissions by allowing the firm to reduce compliance cost and profit from trading. This chapter discusses the benefits of allowance trading and summarizes the economic literature on tradable pollution rights. 17 refs., 2 figs

  17. 14 CFR 1261.109 - Computation of allowance.

    Science.gov (United States)

    2010-01-01

    ... exchange). There will be no allowance for replacement cost or for appreciation in the value of the property...) Depreciation in value is determined by considering the type of article involved, its cost, its condition when...

  18. Surveys & Programs

    Science.gov (United States)


  19. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    International Nuclear Information System (INIS)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    Highlights: ► Survey of bio-analytical approaches utilizing biomolecule labelling. ► Detailed discussion of methodology and chemistry of elemental labelling. ► Biomedical and bio-analytical applications of elemental labelling. ► FI-ICP-MS and LC–ICP-MS for quantification of elementally labelled biomolecules. ► Review of selected applications. - Abstract: This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. The first sections present an overview of general aspects of biomolecule quantification, as well as of labelling, emphasizing the potential which lies in such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling will be highlighted, and analytical as well as biomedical applications will be presented. A special focus will lie on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given.

  20. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to determine the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using the financial information of these companies for the years 2005-2010. To carry out the work, methods for the quantification of managerial skills are applied to CNFR NAVROM SA Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.

  1. Recurrence quantification analysis of global stock markets

    Science.gov (United States)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
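
    Two of the RQA measures used in such studies, the recurrence rate (RR) and determinism (DET), can be computed directly from a recurrence matrix. The toy series, threshold, and minimum line length below are assumptions, and the line of identity is kept for brevity although RQA practice usually excludes it.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(size=500)                 # stand-in for index returns
        eps = 0.5 * x.std()                      # assumed recurrence threshold
        R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
        rr = R.sum() / R.size                    # recurrence rate

        def det(R, lmin=2):
            # fraction of recurrence points on diagonal lines of length >= lmin
            n = R.shape[0]
            diag_pts = 0
            for k in range(-(n - 1), n):
                run = 0
                for v in list(np.diagonal(R, k)) + [0]:
                    if v:
                        run += 1
                    else:
                        if run >= lmin:
                            diag_pts += run
                        run = 0
            return diag_pts / R.sum()

        print(f"RR = {rr:.3f}, DET = {det(R):.3f}")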

  2. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  3. The Effect of Allowing Pollution Offsets with Imperfect Enforcement

    OpenAIRE

    Hilary Sigman; Howard F. Chang

    2011-01-01

    Public policies for pollution control, including climate change policies, sometimes allow polluters in one sector subject to an emissions cap to offset excessive emissions in that sector with pollution abatement in another sector. The government may often find it more costly to verify offset claims than to verify compliance with emissions caps. Concerns about such difficulties in enforcement may lead regulators to restrict the use of offsets. In this paper, we demonstrate that allowing offset...

  4. Rapid Quantification of Low-Viscosity Acetyl-Triacylglycerols Using Electrospray Ionization Mass Spectrometry.

    Science.gov (United States)

    Bansal, Sunil; Durrett, Timothy P

    2016-09-01

    Acetyl-triacylglycerols (acetyl-TAG) possess an sn-3 acetate group, which confers useful chemical and physical properties to these unusual triacylglycerols (TAG). Current methods for the quantification of acetyl-TAG are time consuming and do not provide any information on the molecular species profile. Electrospray ionization mass spectrometry (ESI-MS)-based methods can overcome these drawbacks. However, the ESI-MS signal intensity for TAG depends on the aliphatic chain length and unsaturation index of the molecule. Therefore, response factors for different molecular species need to be determined before any quantification. The effects of the chain length and the number of double bonds of the sn-1/2 acyl groups on the signal intensity for the neutral loss of short-chain sn-3 groups were quantified using a series of synthesized sn-3-specific structured TAG. The signal intensity for the neutral loss of the sn-3 acyl group was found to be negatively correlated with the aliphatic chain length and unsaturation index of the sn-1/2 acyl groups, and also with the size of the sn-3 chain itself. Further, the position of the group undergoing neutral loss was also important, with the signal from an sn-2 acyl group much lower than that from one located at sn-3. Response factors obtained from these analyses were used to develop a method for the absolute quantification of acetyl-TAG. The increased sensitivity of this ESI-MS-based approach allowed successful quantification of acetyl-TAG in various biological settings, including the products of in vitro enzyme activity assays.
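
    Once response factors are in hand, the absolute quantification reduces to a correction against an internal standard. A sketch with invented intensities and response factors:

        internal_std = {"intensity": 1.8e6, "amount_nmol": 1.0}

        species = {  # molecular species -> (neutral-loss intensity, response factor)
            "acetyl-TAG 36:2": (9.2e5, 0.82),
            "acetyl-TAG 38:3": (6.4e5, 0.71),
        }

        for name, (intensity, rf) in species.items():
            # divide out the species-specific response before scaling to the standard
            amount = (intensity / rf) / internal_std["intensity"] * internal_std["amount_nmol"]
            print(f"{name}: {amount:.2f} nmol")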

  5. Assessment of probiotic viability during Cheddar cheese manufacture and ripening using propidium monoazide-PCR quantification

    Directory of Open Access Journals (Sweden)

    Emilie eDesfossés-Foucault

    2012-10-01

    Full Text Available The use of a suitable food carrier such as cheese could significantly enhance probiotic viability during storage. The main goal of this study was to assess the viability of commercial probiotic strains during Cheddar cheesemaking and ripening (four to six months) by comparing the efficiency of microbiological and molecular approaches. Molecular methods such as quantitative PCR (qPCR) allow bacterial quantification, and DNA-blocking molecules such as propidium monoazide (PMA) select only the DNA of living cells. Cheese samples were manufactured with a lactococci starter and with one of three probiotic strains (Bifidobacterium animalis subsp. lactis BB-12, Lactobacillus rhamnosus RO011 or Lactobacillus helveticus RO052) or a mixed culture containing B. animalis subsp. lactis BB-12 and L. helveticus RO052 (MC1), both lactobacilli strains (MC2), or all three strains (MC3). DNA extractions were then carried out on PMA-treated and non-treated cell pellets in order to assess PMA treatment efficiency, followed by quantification using the 16S rRNA gene, the elongation factor Tu gene (tuf) or the transaldolase gene (tal). Results with intact/dead ratios of bacteria showed that PMA-treated cheese samples had significantly lower bacterial counts than non-treated DNA samples (P<0.005), confirming that PMA did eliminate dead bacteria from PCR quantification. For both quantification methods, the addition of probiotic strains seemed to accelerate the loss of lactococci viability in comparison to control cheese samples, especially when L. helveticus RO052 was added. The viability of all three probiotic strains was also significantly reduced in mixed-culture cheese samples (P<0.0001), B. animalis subsp. lactis BB-12 being the most sensitive to the presence of other strains. However, all probiotic strains did retain their viability (9 log cfu/g of cheese) throughout ripening. This study was successful in monitoring living probiotic species in Cheddar cheese samples through PMA-qPCR.

  6. Accurate quantification of mouse mitochondrial DNA without co-amplification of nuclear mitochondrial insertion sequences.

    Science.gov (United States)

    Malik, Afshan N; Czajka, Anna; Cunningham, Phil

    2016-07-01

    Mitochondria contain an extra-nuclear genome in the form of mitochondrial DNA (MtDNA), damage to which can lead to inflammation and bioenergetic deficit. Changes in MtDNA levels are increasingly used as a biomarker of mitochondrial dysfunction. We previously reported that in humans, fragments in the nuclear genome known as nuclear mitochondrial insertion sequences (NumtS) affect accurate quantification of MtDNA. In the current paper our aim was to determine whether mouse NumtS affect the quantification of MtDNA and to establish a method designed to avoid this. The existence of NumtS in the mouse genome was confirmed using BLASTN, unique MtDNA regions were identified using FASTA, and MtDNA primers that do not co-amplify NumtS were designed and tested. MtDNA copy numbers were determined in a range of mouse tissues as the ratio of the mitochondrial to the nuclear genome using real-time qPCR and absolute quantification. Approximately 95% of mouse MtDNA was duplicated in the nuclear genome as NumtS, located in 15 out of 21 chromosomes. A unique region was identified and primers flanking this region were used. MtDNA levels differed significantly among mouse tissues, being highest in the heart, with levels in descending order (highest to lowest) in kidney, liver, blood, brain, islets and lung. The presence of NumtS in the nuclear genome of the mouse could lead to erroneous data when studying MtDNA content or mutation. The unique primers described here will allow accurate quantification of MtDNA content in mouse models without co-amplification of NumtS. Copyright © 2016 Elsevier B.V. and Mitochondria Research Society. All rights reserved.
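
    With NumtS-free primers, mtDNA content follows from the mitochondrial-to-nuclear Ct difference. A sketch assuming ~100% PCR efficiency and a diploid nuclear reference, with illustrative Ct values:

        def mtdna_copies_per_cell(ct_mito, ct_nuclear, efficiency=2.0):
            # 2 nuclear copies per diploid cell; one amplification factor per cycle
            return 2.0 * efficiency ** (ct_nuclear - ct_mito)

        for tissue, (ct_m, ct_n) in {"heart": (12.1, 24.8), "lung": (16.0, 24.6)}.items():
            print(tissue, f"~{mtdna_copies_per_cell(ct_m, ct_n):.0f} mtDNA copies/cell")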

  7. Quantification of Structural Isomers via Mode-Selective Irmpd

    Science.gov (United States)

    Polfer, Nicolas C.

    2016-06-01

    Mixtures of structural isomers can pose a challenge for vibrational ion spectroscopy. In cases where particular structures display diagnostic vibrations, these structures can be selectively "burned away". In ion traps, the ion population can be subjected to multiple laser shots in order to fully deplete a particular structure, in effect allowing a quantification of this structure. Protonated para-amino benzoic acid (PABA) serves as an illustrative example. PABA is known to preferentially exist in the N-protonated (N-prot) form in solution, but in the gas phase it is energetically favorable in the O-protonated (O-prot) form. As shown in Figure 1, the N-prot structure can be kinetically trapped in the gas phase when sprayed from a non-protic solvent, whereas the O-prot structure is obtained when sprayed from protic solvents, analogous to results by others [1,2]. By parking the light source on the diagnostic 3440 cm-1 mode, the percentage of the O-prot structure can be determined, and by default the remainder is assumed to adopt the N-prot structure. It will be shown that the relative percentages of O-prot vs N-prot are highly dependent on the solvent mixture, going from close to 0% O-prot in non-protic solvents to 99% in protic solvents. Surprisingly, water behaves much more like a non-protic solvent than methanol. It is observed that the capillary temperature, which aids droplet desolvation by black-body radiation in the ESI source, is critical to promote the appearance of O-prot structures. These results are consistent with the picture that a protic bridge mechanism is at play to facilitate proton transfer, and thus allow conversion from N-prot to O-prot, but that this mechanism is subject to appreciable kinetic barriers on the timescale of solvent evaporation. 1. J. Phys. Chem. A 2011, 115, 7625. 2. Anal. Chem. 2012, 84, 7857.

  8. Kinetic quantification of plyometric exercise intensity.

    Science.gov (United States)

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

    Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011-Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.

  9. Quantification of heterogeneity observed in medical images

    International Nuclear Information System (INIS)

    Brooks, Frank J; Grigsby, Perry W

    2013-01-01

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity

  10. Quantification of heterogeneity observed in medical images.

    Science.gov (United States)

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  11. Engineering surveying

    CERN Document Server

    Schofield, W

    2001-01-01

    The aim of Engineering Surveying has always been to impart and develop a clear understanding of the basic topics of the subject. The author has fully revised the book to make it the most up-to-date and relevant textbook available on the subject.The book also contains the latest information on trigonometric levelling, total stations and one-person measuring systems. A new chapter on satellites ensures a firm grasp of this vitally important topic.The text covers engineering surveying modules for civil engineering students on degree courses and forms a reference for the engineering surveying module in land surveying courses. It will also prove to be a valuable reference for practitioners.* Simple clear introduction to surveying for engineers* Explains key techniques and methods* Details reading systems and satellite position fixing

  12. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography

    Science.gov (United States)

    Venhuizen, Freerk G.; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2018-01-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies. PMID:29675301
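
    The reported Dice coefficient is the standard overlap score; a short reference implementation on toy binary masks:

        import numpy as np

        def dice(a, b, eps=1e-8):
            # Dice = 2|A ∩ B| / (|A| + |B|)
            a, b = a.astype(bool), b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum() + eps)

        pred = np.zeros((64, 64), bool); pred[20:40, 20:40] = True
        ref = np.zeros((64, 64), bool); ref[25:45, 22:42] = True
        print(f"Dice = {dice(pred, ref):.3f}")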

  13. Quantification of low-expressed mRNA using 5' LNA-containing real-time PCR primers

    International Nuclear Information System (INIS)

    Malgoyre, A.; Banzet, S.; Mouret, C.; Bigard, A.X.; Peinnequin, A.

    2007-01-01

    Real-time RT-PCR is the most sensitive and accurate method for mRNA quantification. Using specific recombinant DNA as a template, real-time PCR allows accurate quantification within a 7-log range and increased sensitivity below 10 copies. However, when using RT-PCR to quantify mRNA in biological samples, a stochastic off-target amplification can occur. Classical adjustments of assay parameters have minimal effects on such amplification. This undesirable amplification appears mostly to be dependent on the specific-to-non-specific target ratio rather than on the absolute quantity of the specific target. This drawback, which decreases assay reliability, appears mostly when quantifying low-expressed transcripts in a whole organ. An original primer design using the properties of LNA makes it possible to block off-target amplification. 5'-LNA substitution strengthens 5'-hybridization. Consequently, on-target hybridization is stabilized and the probability that the off-target leads to amplification is decreased

  14. Aerosol-type retrieval and uncertainty quantification from OMI data

    Science.gov (United States)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval.This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions.The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model
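
    The averaging step can be sketched as textbook Bayesian model averaging: each candidate LUT model contributes its AOD posterior weighted by its normalized evidence. The Gaussian posteriors and evidence values below are invented for illustration.

        import numpy as np

        tau = np.linspace(0.0, 1.0, 501)         # AOD grid

        def gauss(x, mu, sig):
            return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

        posteriors = [gauss(tau, 0.30, 0.04), gauss(tau, 0.36, 0.05)]
        evidence = np.array([2.1e-3, 1.4e-3])    # p(y | model m), invented
        w = evidence / evidence.sum()            # BMA weights
        p_avg = sum(wi * pi for wi, pi in zip(w, posteriors))
        aod_mean = (tau * p_avg).sum() / p_avg.sum()
        print(f"BMA AOD estimate ~ {aod_mean:.3f}")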

  15. Aerosol-type retrieval and uncertainty quantification from OMI data

    Directory of Open Access Journals (Sweden)

    A. Kauppi

    2017-11-01

    Full Text Available We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval.This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions.The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the

  16. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently abstract in order to handle a large spectrum of models, (b) be algorithmically extensible, allowing an easy insertion of new and improved algorithms, and (c) take advantage of parallel computing, in order to handle realistic models. Such objectives demand a combination of an object-oriented design with robust software engineering practices. QUESO is written in C++, uses MPI, and leverages libraries already available to the scientific community. We describe some UQ concepts, present QUESO, and list planned enhancements.

  17. High-throughput telomere length quantification by FISH and its application to human population studies.

    Science.gov (United States)

    Canela, Andrés; Vera, Elsa; Klatt, Peter; Blasco, María A

    2007-03-27

    A major limitation of studies of the relevance of telomere length to cancer and age-related diseases in human populations and to the development of telomere-based therapies has been the lack of suitable high-throughput (HT) assays to measure telomere length. We have developed an automated HT quantitative telomere FISH platform, HT quantitative FISH (Q-FISH), which allows the quantification of telomere length as well as percentage of short telomeres in large human sample sets. We show here that this technique provides the accuracy and sensitivity to uncover associations between telomere length and human disease.

  18. Note: Electrical detection and quantification of spin rectification effect enabled by shorted microstrip transmission line technique

    International Nuclear Information System (INIS)

    Soh, Wee Tee; Ong, C. K.; Peng, Bin; Chai, Guozhi

    2014-01-01

    We describe a shorted microstrip method for the sensitive quantification of the Spin Rectification Effect (SRE). SRE for a Permalloy (Ni80Fe20) thin film strip sputtered onto a SiO2 substrate is demonstrated. Our method obviates the need for simultaneous lithographic patterning of the sample and transmission line, therefore greatly simplifying the SRE measurement process. Such a shorted microstrip method can allow different contributions to SRE (anisotropic magnetoresistance, Hall effect, and anomalous Hall effect) to be simultaneously determined. Furthermore, SRE signals from unpatterned 50 nm thick Permalloy films of area dimensions 5 mm × 10 mm can even be detected

  19. Use of cesium 137 as a radiotracer in the quantification of tropical soil erosion

    International Nuclear Information System (INIS)

    Sibello Hernandez, Rita Y.; Cartas Aguila, Hector; Martin Perez, Jorge

    2005-01-01

    The main objective of this work was to evaluate the applicability of this technique for quantifying soil erosion in the tropical region. With this purpose the technique was applied to tropical soils belonging to a glide parcel in Cienfuegos province, Cuba, in the Caribbean area. This allowed us to compare and demonstrate the good agreement between the soil-loss quantification obtained using the 137Cs technique (37.00 ± 0.80 t·ha-1·year-1) and that obtained using erosion plots at the Soil Experimental Station in Barajagua (40 t·ha-1·year-1)

  20. Feasibility and accuracy of dual-layer spectral detector computed tomography for quantification of gadolinium: a phantom study.

    Science.gov (United States)

    van Hamersvelt, Robbert W; Willemink, Martin J; de Jong, Pim A; Milles, Julien; Vlassenbroek, Alain; Schilham, Arnold M R; Leiner, Tim

    2017-09-01

    The aim of this study was to evaluate the feasibility and accuracy of dual-layer spectral detector CT (SDCT) for the quantification of clinically encountered gadolinium concentrations. The cardiac chamber of an anthropomorphic thoracic phantom was equipped with 14 tubular inserts containing different gadolinium concentrations, ranging from 0 to 26.3 mg/mL (0.0, 0.1, 0.2, 0.4, 0.5, 1.0, 2.0, 3.0, 4.0, 5.1, 10.6, 15.7, 20.7 and 26.3 mg/mL). Images were acquired using a novel 64-detector-row SDCT system at 120 and 140 kVp. Acquisitions were repeated five times to assess reproducibility. Regions of interest (ROIs) were drawn on three slices per insert. A spectral plot was extracted for every ROI and mean attenuation profiles were fitted to known attenuation profiles of water and pure gadolinium using in-house-developed software to calculate gadolinium concentrations. At both 120 and 140 kVp, excellent correlations between scan repetitions and between true and measured gadolinium concentrations were found (R > 0.99, P < 0.001; ICC > 0.99, CI 0.99-1.00). Relative mean measurement errors stayed below 10% down to a true gadolinium concentration of 2.0 mg/mL at 120 kVp and below 5% down to 1.0 mg/mL at 140 kVp. SDCT allows for accurate quantification of gadolinium at both 120 and 140 kVp. The lowest measurement errors were found for 140 kVp acquisitions. • Gadolinium quantification may be useful in patients with contraindication to iodine. • Dual-layer spectral detector CT allows for overall accurate quantification of gadolinium. • Interscan variability of gadolinium quantification using SDCT material decomposition is excellent.
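
    The fitting step is, in essence, a two-material least-squares decomposition of each voxel's spectral attenuation into water and gadolinium basis profiles. A sketch with invented basis vectors (the real basis would come from calibration):

        import numpy as np

        energies = np.linspace(40, 140, 11)                  # keV bins (assumed)
        mu_water = 0.30 * np.exp(-energies / 80) + 0.15      # invented water basis
        mu_gd_1mg = 0.020 * np.exp(-(energies - 50) ** 2 / 800) + 0.004  # invented Gd basis, per mg/mL

        measured = 1.0 * mu_water + 7.5 * mu_gd_1mg          # "unknown" voxel profile
        A = np.column_stack([mu_water, mu_gd_1mg])
        coef, *_ = np.linalg.lstsq(A, measured, rcond=None)
        print(f"estimated Gd concentration ~ {coef[1]:.1f} mg/mL")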

  1. Physical Characterisation and Quantification of Total Above Ground Biomass Derived from First Thinnings for Wood Fuel Consumption in Ireland

    OpenAIRE

    Mockler, Nicholas

    2013-01-01

    Comprehensive knowledge of wood fuel properties assists in the optimisation of operations concerned with the harvesting, seasoning, processing and conversion of wood to energy. This study investigated the physical properties of wood fuel. These properties included moisture content and basic density. The field work also allowed for the quantification of above ground biomass partitions. The species investigated were alder (Alnus glutinosa), ash (Fraxinus excelsior L.), birch (Betula spp.), lodg...

  2. ACCOUNTING FOR GREENHOUSE GASES EMISSIONS ALLOWANCES IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Marius Deac

    2013-02-01

    Full Text Available The present paper analyzes the accounting challenges that the implementation of the EU Emissions Trading Scheme has raised. On 2 December 2004, the IASB issued an interpretation regarding the accounting of GHG emissions allowances (IFRIC 3 „Emission Rights”). This interpretation should have been effective for annual periods beginning after 1 March 2005, the first year of the EU Emissions Trading Scheme implementation. Less than a year after it was issued, IFRIC withdrew IFRIC 3. In December 2007, the IASB started a new project, the Emissions Trading Schemes Project, in order to provide guidance on accounting for carbon allowances. In the absence of an accounting standard regarding the accounting of these emissions allowances, a diversity of accounting practices has been identified. Nowadays, there are three main accounting practices for the recognition of emissions allowances and GHG emissions liabilities: the IFRIC 3 approach, the government grants approach, and the net liability or off-balance-sheet approach. The accounting treatment of greenhouse gas emissions allowances by Romanian companies resembles the net liability or off-balance-sheet approach. Finance Ministry Order no. 1118/2012 states that GHG emission certificates should be recognized as fixed assets (if the entity is expecting a profit in the long term) or in the category of short-term investments (if the entity is expecting a profit in the short term). The accounting of the greenhouse gas emissions allowances described above is applicable mainly to traders of such certificates and not to the installations in the scope of the EU ETS directive, which should recognize GHG emissions off balance sheet, at their nominal value (nil if received for free). The shortfall or excess of allowances will be recognized in profit or loss as they are bought or sold by the entity (the accounting treatment imposed by Finance Ministry Order no. 3055/2009).

  3. Quantification of uncertainty associated with United States high resolution fossil fuel CO2 emissions: updates, challenges and future plans

    Science.gov (United States)

    Gurney, K. R.; Chandrasekaran, V.; Mendoza, D. L.; Geethakumar, S.

    2010-12-01

    The Vulcan Project has estimated United States fossil fuel CO2 emissions at the hourly time scale and at spatial scales below the county level for the year 2002. Vulcan is built from a wide variety of observational data streams including regulated air pollutant emissions reporting, traffic monitoring, energy statistics, and US census data. In addition to these data sets, Vulcan relies on a series of modeling assumptions and constructs to interpolate in space and time and to transform non-CO2 reporting into an estimate of CO2 combustion emissions. The recent version 2.0 of the Vulcan inventory has produced advances in a number of categories, with particular emphasis on improved temporal structure. Onroad transportation emissions now draw on roughly 5,000 automated traffic count monitors, allowing for much improved diurnal and weekly time structure in our onroad transportation emissions. Though the inventory shows excellent agreement with independent national-level CO2 emissions estimates, uncertainty quantification has been a challenging task given the large number of data sources and numerous modeling assumptions. However, we have now accomplished a complete uncertainty estimate across all the Vulcan economic sectors and will present uncertainty estimates as a function of space, time, sector and fuel. We find that, like the underlying distribution of CO2 emissions itself, the uncertainty is also strongly lognormal, with high uncertainty associated with a relatively small number of locations. These locations are typically reliant upon coal combustion as the dominant CO2 source. We will also compare and contrast Vulcan fossil fuel CO2 emissions estimates against estimates built from DOE fuel-based surveys at the state level. We conclude that much of the difference between the Vulcan inventory and DOE statistics is due not to biased estimation but to mechanistic differences in supply versus demand and combustion in space/time.

  4. Sea-Level Allowances along the World Coastlines

    Science.gov (United States)

    Vandewal, R.; Tsitsikas, C.; Reerink, T.; Slangen, A.; de Winter, R.; Muis, S.; Hunter, J. R.

    2017-12-01

    Sea level changes as a result of climate change. For projections we take ocean mass changes and volume changes into account. Including gravitational and rotational fingerprints, this provides regional sea level changes. Hence we can calculate sea-level rise patterns based on CMIP5 projections. In order to account for the variability around the mean state, which follows from the climate models, we use the concept of allowances. The allowance indicates the height by which a coastal structure needs to be raised to maintain the likelihood of sea-level extremes; the sketch below illustrates the idea for Gumbel-distributed extremes. Here we use a global reanalysis of storm surges and extreme sea levels based on a global hydrodynamic model in order to calculate allowances. It is shown that the model compares favourably in most regions with tide gauge records from the GESLA data set. Combining the CMIP5 projections and the global hydrodynamic model, we calculate sea-level allowances along the global coastlines and expand the number of points by a factor of 50 relative to tide-gauge-based results. Results show that allowances increase gradually along continental margins, with the largest values near the equator. In general, values are lower at midlatitudes in both the Northern and Southern Hemisphere. Increases in the expected frequency of extremes are typically a factor of 10³-10⁴ for the majority of the coastline under the RCP8.5 scenario at the end of the century. Finally, we will show preliminary results of the effect of changing wave heights based on the coordinated ocean wave project.
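
    For Gumbel-distributed extremes, the allowance has a closed form (Hunter's approach): with projected mean rise dz, normally distributed uncertainty sigma, and Gumbel scale parameter lam, the required raise is dz + sigma^2/(2*lam). The numbers below are illustrative only.

        def allowance(dz, sigma, lam):
            # height increase keeping the expected flooding frequency unchanged
            return dz + sigma ** 2 / (2.0 * lam)

        print(f"allowance = {allowance(dz=0.5, sigma=0.2, lam=0.12):.2f} m")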

  5. Myoblots: dystrophin quantification by in-cell western assay for a streamlined development of Duchenne muscular dystrophy (DMD) treatments.

    Science.gov (United States)

    Ruiz-Del-Yerro, E; Garcia-Jimenez, I; Mamchaoui, K; Arechavala-Gomeza, V

    2017-10-31

    New therapies for neuromuscular disorders are often mutation-specific and must be studied in patients' cell cultures. In Duchenne muscular dystrophy (DMD), dystrophin restoration drugs are being developed, but as muscle cell cultures from DMD patients are scarce and do not grow or differentiate well, only a limited number of candidate drugs are tested. Moreover, dystrophin quantification by western blotting requires a large number of cultured cells, so fewer compounds are screened as thoroughly as is desirable. We aimed to develop a quantitative assessment tool using fewer cells to contribute to the study of dystrophin and to identify better drug candidates. An 'in-cell western' assay is a quantitative immunofluorescence assay performed in cell culture microplates that allows protein quantification directly in culture, permitting more experimental repeats and higher throughput. We have optimized the assay ('myoblot') for the study of differentiated myoblast cultures. After an exhaustive optimization of the technique to adapt it to the growth and differentiation rates of our cultures and the low intrinsic expression of our proteins of interest, our myoblot protocol allows the quantification of dystrophin and other muscle-associated proteins in muscle cell cultures. We are able to distinguish accurately between the different sets of patients based on their dystrophin expression and to detect dystrophin restoration after treatment. We expect that this new tool to quantify muscle proteins in DMD and other muscle disorders will aid in their diagnosis and in the development of new therapies. © 2017 British Neuropathological Society.
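
    A minimal sketch of the normalization arithmetic typical of in-cell western quantification (hypothetical channels, well layout and intensities, not the published myoblot protocol):

        import numpy as np

        # Assumed plate readout: a dystrophin channel and a cell-number stain,
        # two healthy-control wells followed by two patient wells.
        dystrophin = np.array([1200.0, 1350.0, 310.0, 290.0])
        cell_stain = np.array([5000.0, 5200.0, 4800.0, 4900.0])
        background = 100.0

        normalized = (dystrophin - background) / (cell_stain - background)
        control_mean = normalized[:2].mean()          # wells 0-1: healthy control
        percent_of_control = 100.0 * normalized / control_mean
        print(percent_of_control.round(1))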

  6. Identification of spectral regions for the quantification of red wine tannins with fourier transform mid-infrared spectroscopy.

    Science.gov (United States)

    Jensen, Jacob S; Egebo, Max; Meyer, Anne S

    2008-05-28

    Fast tannin measurement is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from the spectral responses of other wine components. Four different variable selection tools were investigated for identifying the spectral regions most important for quantifying tannins from the spectra using partial least-squares (PLS) regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable-size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which were therefore concluded to be particularly important for tannin quantification. The spectral regions identified by the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
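
    A toy sketch of interval-based backward elimination wrapped around PLS regression (synthetic spectra; scikit-learn's PLSRegression stands in for the chemometrics software actually used):

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        rng = np.random.default_rng(1)
        X = rng.normal(size=(60, 200))     # 60 synthetic spectra x 200 variables
        y = X[:, 40:60].sum(axis=1) + 0.1 * rng.normal(size=60)  # one informative region

        intervals = [np.arange(i, i + 20) for i in range(0, 200, 20)]

        def cv_rmse(interval_list):
            cols = np.concatenate(interval_list)
            pred = cross_val_predict(PLSRegression(n_components=3), X[:, cols], y, cv=5)
            return float(np.sqrt(np.mean((y - pred.ravel()) ** 2)))

        # Iteratively drop the interval whose removal most improves CV error.
        while len(intervals) > 1:
            scores = [cv_rmse(intervals[:i] + intervals[i + 1:]) for i in range(len(intervals))]
            best = int(np.argmin(scores))
            if scores[best] >= cv_rmse(intervals):
                break                      # no removal helps any more
            intervals.pop(best)

        print("retained interval start indices:", [int(iv[0]) for iv in intervals])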

  7. Simultaneous Assessment of Cardiomyocyte DNA Synthesis and Ploidy: A Method to Assist Quantification of Cardiomyocyte Regeneration and Turnover.

    Science.gov (United States)

    Richardson, Gavin D

    2016-05-23

    Although it is accepted that the heart has a limited potential to regenerate cardiomyocytes following injury and that low levels of cardiomyocyte turnover occur during normal ageing, quantification of these events remains challenging. This is in part due to the rarity of the process and the fact that multiple cellular sources contribute to myocardial maintenance. Furthermore, DNA duplication within cardiomyocytes often leads to a polyploid cardiomyocyte and only rarely leads to new cardiomyocytes by cellular division. To accurately quantify cardiomyocyte turnover, discrimination between these processes is essential. The protocol described here employs long-term nucleoside labeling to mark all nuclei that have arisen as a result of DNA replication, with cardiomyocyte nuclei identified by nuclei isolation and subsequent PCM1 immunolabeling. Together this allows the accurate and sensitive identification of nucleoside labeling within the cardiomyocyte nuclei population. Furthermore, 4',6-diamidino-2-phenylindole labeling and analysis of nuclear ploidy enable the discrimination of neo-cardiomyocyte nuclei from nuclei that have incorporated nucleoside during polyploidization. Although this method cannot control for cardiomyocyte binucleation, it allows a rapid and robust quantification of neo-cardiomyocyte nuclei while accounting for polyploidization. This method has a number of downstream applications, including assessing potential therapeutics to enhance cardiomyocyte regeneration or investigating the effects of cardiac disease on cardiomyocyte turnover and ploidy. This technique is also compatible with additional downstream immunohistological techniques, allowing quantification of nucleoside incorporation in all cardiac cell types.
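
    A minimal sketch of the discrimination logic (hypothetical markers, intensities and thresholds; EdU stands in for the unspecified nucleoside label):

        import numpy as np

        edu_positive = np.array([True, True, False, True])    # nucleoside label
        dapi_intensity = np.array([1.0, 2.1, 1.0, 0.95])      # DAPI; diploid ~ 1.0
        pcm1_positive = np.array([True, True, True, False])   # cardiomyocyte nuclei

        is_cm = pcm1_positive
        neo_cm = is_cm & edu_positive & (dapi_intensity < 1.5)          # 2N and labeled
        polyploidized = is_cm & edu_positive & (dapi_intensity >= 1.5)  # >2N and labeled
        print(f"neo-cardiomyocyte nuclei: {neo_cm.sum()}, polyploidized: {polyploidized.sum()}")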

  8. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.
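
    A loose numerical sketch of the embedded idea (a toy 1-D heat equation, not the stochastic Galerkin implementation itself): all P uncertainty realizations travel through each low-level update together, so the sample dimension becomes contiguous, vectorizable work:

        import numpy as np

        P, N = 32, 1000                     # samples x grid points (invented sizes)
        rng = np.random.default_rng(2)
        kappa = rng.uniform(0.5, 1.5, size=(P, 1))     # uncertain diffusivity
        u = np.tile(np.sin(np.linspace(0.0, np.pi, N)), (P, 1))

        dt, dx = 1e-7, 1.0 / (N - 1)
        for _ in range(100):                # one vectorized step updates every sample
            u[:, 1:-1] += dt * kappa * (u[:, 2:] - 2 * u[:, 1:-1] + u[:, :-2]) / dx**2

        mid = u[:, N // 2]
        print(f"midpoint mean = {mid.mean():.4f}, std = {mid.std():.2e}")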

  9. Application of Fuzzy Comprehensive Evaluation Method in Trust Quantification

    Directory of Open Access Journals (Sweden)

    Shunan Ma

    2011-10-01

    Full Text Available Trust can play an important role in the sharing of resources and information in open network environments. Trust quantification is thus an important issue in dynamic trust management. Considering the fuzziness and uncertainty of trust, in this paper we propose a fuzzy comprehensive evaluation method to quantify trust, along with a trust quantification algorithm. Simulation results show that the proposed algorithm can effectively quantify trust and that the quantified value of an entity's trust is consistent with the entity's behavior.
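
    A minimal sketch of the fuzzy comprehensive evaluation step itself (invented weights and memberships): a factor weight vector W is composed with a membership matrix R, and the trust grade with maximum membership is selected:

        import numpy as np

        W = np.array([0.5, 0.3, 0.2])          # weights of three trust factors
        R = np.array([[0.6, 0.3, 0.1],         # factor memberships over grades
                      [0.2, 0.5, 0.3],         # {high, medium, low}
                      [0.1, 0.4, 0.5]])

        B = W @ R                              # weighted-average composition operator
        grades = ["high", "medium", "low"]
        print(B.round(2), "->", grades[int(np.argmax(B))])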

  10. Allowable variance set on left ventricular function parameter

    International Nuclear Information System (INIS)

    Zhou Li'na; Qi Zhongzhi; Zeng Yu; Ou Xiaohong; Li Lin

    2010-01-01

    Purpose: To evaluate the influence of allowable variance settings on left ventricular function parameters in arrhythmia patients during gated myocardial perfusion imaging. Method: 42 patients with evident arrhythmia underwent myocardial perfusion SPECT; three different allowable variance settings (20%, 60%, 100%) were set before acquisition for every patient, and the data were acquired simultaneously. After reconstruction with Astonish, end-diastolic volume (EDV), end-systolic volume (ESV) and left ventricular ejection fraction (LVEF) were computed with Quantitative Gated SPECT (QGS), and the EDV, ESV and LVEF values were compared by analysis of variance using SPSS software. Result: There was no statistically significant difference between the three groups. Conclusion: When arrhythmia patients undergo gated myocardial perfusion imaging, the allowable variance setting has no statistically significant effect on the EDV, ESV and LVEF values. (authors)

  11. Development of a Taqman real-time PCR assay for rapid detection and quantification of Vibrio tapetis in extrapallial fluids of clams

    Directory of Open Access Journals (Sweden)

    Adeline Bidault

    2015-12-01

    Full Text Available The Gram-negative bacterium Vibrio tapetis is known as the causative agent of Brown Ring Disease (BRD) in the Manila clam Venerupis (=Ruditapes) philippinarum. This bivalve is the second most important species produced in aquaculture and has a high commercial value. In spite of the development of several molecular methods, no assay has yet been established to rapidly quantify the bacterium in the clam. In this study, we developed a Taqman real-time PCR assay targeting the virB4 gene for accurate and quantitative identification of V. tapetis strains pathogenic to clams. Sensitivity and reproducibility of the method were assessed using either filtered sea water or extrapallial fluids of clams injected with the CECT4600T V. tapetis strain. Quantification curves of the V. tapetis strain seeded in filtered seawater (FSW) or extrapallial fluid (EF) samples were equivalent, showing reliable qPCR efficiencies. With this protocol, we were able to specifically detect V. tapetis strains down to 1.125 x 10(1) bacteria per mL of EF or FSW, taking into account the dilution factor used for appropriate template DNA preparation. This qPCR assay allowed us to monitor V. tapetis load in both experimentally and naturally infected Manila clams. This technique will be particularly useful for monitoring the kinetics of massive infections by V. tapetis and for designing appropriate control measures for aquaculture purposes.
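
    A generic standard-curve calculation of the kind that underlies such qPCR assays (synthetic calibration values, not the published data):

        import numpy as np

        log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0, 1.0])
        cq = np.array([15.1, 18.5, 21.9, 25.3, 28.8, 32.2])   # invented Cq values

        slope, intercept = np.polyfit(log10_copies, cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0               # ~1.0 means 100%

        cq_unknown, dilution_factor = 27.0, 10.0
        copies_per_ml = 10 ** ((cq_unknown - intercept) / slope) * dilution_factor
        print(f"efficiency = {efficiency:.2f}, unknown ~ {copies_per_ml:.2e} bacteria/mL")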

  12. Strategic partitioning of emissions allowances in the EU ETS

    Energy Technology Data Exchange (ETDEWEB)

    Boehringer, Christoph (Carl von Ossietzky Univ. Oldenburg (Germany)); Rosendahl, Knut Einar (Research Dept., Statistics Norway, Oslo (Norway))

    2008-07-01

    The EU ETS opens up for strategic partitioning of emissions allowances by the Member States. In this paper we examine the potential effects of such strategic behavior on quota prices and abatement costs. We show that although marginal abatement costs in the sectors outside the EU ETS become quite differentiated, the effects on the quota price and total abatement costs are small. More abatement, however, takes place in the old Member States that are importers of allowances, compared to the cost-effective outcome. Single countries can nevertheless significantly affect the outcome of the EU ETS by exploiting their market power.

  13. Isospin impurity and super-allowed β transitions

    International Nuclear Information System (INIS)

    Sagawa, H.; Van Giai Nguyen; Suzuki, T.

    1999-01-01

    We study the effect of isospin impurity on super-allowed Fermi β decay using microscopic HF and RPA (or TDA) models, taking into account CSB and CIB interactions. It is found that the isospin impurity of N = Z nuclei enhances the sum rule of Fermi transition probabilities. On the other hand, the super-allowed transitions between odd-odd J = 0 nuclei and even-even J = 0 nuclei are quenched because of the cancellation of the isospin impurity effects of mother and daughter nuclei. An implication of the calculated Fermi transition rate for the unitarity of the Cabibbo-Kobayashi-Maskawa mixing matrix is also discussed. (authors)

  14. On test and maintenance: Optimization of allowed outage time

    International Nuclear Information System (INIS)

    Mavko, B.; Cepin, M.T.

    2000-01-01

    Probabilistic Safety Assessment is fast becoming a standard method for assessing, maintaining, assuring and improving nuclear power plant safety. To achieve one of its many potential benefits, the optimization of the allowed outage time specified in technical specifications is investigated. A risk comparison approach for the evaluation of allowed outage time is proposed: the risk of shutting the plant down due to failure of certain equipment is compared to the risk of continued plant operation with the specified equipment down. The core damage frequency serves as the risk measure. (author)
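
    A back-of-the-envelope sketch of the risk comparison (all frequencies invented): the outage time at which the accumulated risk of continued operation equals the one-time risk of shutting down gives a natural allowed outage time:

        # Core damage frequencies per year; values are illustrative only.
        cdf_base = 2.0e-5        # all equipment available
        cdf_down = 8.0e-5        # specified equipment out of service
        risk_shutdown = 4.0e-7   # one-time risk of the shutdown/restart transition

        delta_cdf = cdf_down - cdf_base
        aot_hours = risk_shutdown / delta_cdf * 8760.0   # equal-risk outage time
        print(f"risk-balanced allowed outage time ~ {aot_hours:.0f} h")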

  15. Structure determination of electrodeposited zinc-nickel alloys: thermal stability and quantification using XRD and potentiodynamic dissolution

    International Nuclear Information System (INIS)

    Fedi, B.; Gigandet, M.P.; Hihn, J-Y; Mierzejewski, S.

    2016-01-01

    Highlights: • Quantification of zinc-nickel phases between 1.2% and 20% nickel. • Coupling XRD with partial potentiodynamic dissolution. • Deconvolution of anodic stripping curves. • Phase quantification after annealing. - Abstract: Zinc-nickel coatings obtained by electrodeposition reveal the presence of metastable phases in various quantities, thus requiring their identification, a study of their thermal stability, and, finally, determination of their respective proportions. By combining XRD measurement with partial potentiodynamic dissolution, anodic peaks were indexed to allow their quantification. Quantification of electrodeposited zinc-nickel alloys approximately 10 μm thick was thus carried out for nickel contents between 1.2% and 20%, and exhibited good accuracy. This method was then extended to the same set of alloys after annealing (250 °C, 2 h), thus bringing the structural organization closer to its thermodynamic equilibrium. The results obtained ensure a better understanding of the crystallization of metastable phases and of the evolution of phase proportions in a bi-phasic zinc-nickel coating. Finally, the presence of a single γ phase and its thermal stability in the 12% to 15% nickel range provide important information for coating anti-corrosion behavior.

  16. A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.

    Science.gov (United States)

    Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo

    2008-01-01

    Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 day of blood withdrawal.
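
    A sketch of the final proviral-load arithmetic common to such assays (illustrative copy numbers; normalization to a two-copy nuclear reference gene is an assumption here, not a statement of this protocol's exact reference):

        hiv_copies = 85.0             # from the HIV-1 standard curve
        nuclear_gene_copies = 3.2e5   # from the cellular DNA standard curve

        cells = nuclear_gene_copies / 2.0          # two copies per diploid cell
        proviral_load = hiv_copies / cells * 1e6   # copies per million cells
        print(f"{proviral_load:.0f} HIV-1 DNA copies per 10^6 cells")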

  17. Exploiting multicompartment effects in triple-echo steady-state T2 mapping for fat fraction quantification.

    Science.gov (United States)

    Liu, Dian; Steingoetter, Andreas; Curcic, Jelena; Kozerke, Sebastian

    2018-01-01

    To investigate and exploit the effect of intravoxel off-resonance compartments in the triple-echo steady-state (TESS) sequence without fat suppression for T2 mapping, and to leverage the results for fat fraction quantification. In multicompartment tissue, where at least one compartment is excited off-resonance, the total signal exhibits periodic modulations as a function of echo time (TE). Simulated multicompartment TESS signals were synthesized at various TEs. Fat emulsion phantoms were prepared and scanned at the same TE combinations using TESS. In vivo knee data were obtained with TESS to validate the simulations. The multicompartment effect was exploited for fat fraction quantification in the stomach by acquiring TESS signals at two TE combinations. Simulated and measured multicompartment signal intensities were in good agreement. Multicompartment effects caused erroneous T2 offsets, even at low water-fat ratios. The choice of TE caused T2 variations of as much as 28% in cartilage. The feasibility of fat fraction quantification to monitor the decrease of fat content in the stomach during digestion is demonstrated. Intravoxel off-resonance compartments are a confounding factor for T2 quantification using TESS, causing errors that are dependent on the TE. At the same time, off-resonance effects may allow for efficient fat fraction mapping using steady-state imaging. Magn Reson Med 79:423-429, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
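
    A minimal simulation of the underlying modulation (generic two-compartment picture with an illustrative fat fraction and chemical-shift frequency, not the full TESS signal model):

        import numpy as np

        f, delta_f = 0.2, 440.0               # fat fraction; fat-water shift in Hz
        te = np.linspace(2e-3, 12e-3, 6)      # echo times in seconds

        # Magnitude of a water + off-resonance fat voxel oscillates with TE.
        signal = np.abs((1.0 - f) + f * np.exp(2j * np.pi * delta_f * te))
        for t, s in zip(te, signal):
            print(f"TE = {t * 1e3:4.1f} ms -> |S| = {s:.3f}")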

  18. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    Science.gov (United States)

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies of determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry, techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  19. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato.

    Science.gov (United States)

    Péréfarres, Frédéric; Hoareau, Murielle; Chiroleu, Frédéric; Reynaud, Bernard; Dintinger, Jacques; Lett, Jean-Michel

    2011-08-05

    Five duplex real-time PCRs were developed in association with a novel strategy for the quantification standard. These assays should be of great interest for breeding programs and epidemiological surveys to monitor viral populations.

  20. A novel synthetic quantification standard including virus and internal report targets: application for the detection and quantification of emerging begomoviruses on tomato

    Directory of Open Access Journals (Sweden)

    Lett Jean-Michel

    2011-08-01

    To quantify a wide range of begomoviruses, five duplex real-time PCRs were developed in association with a novel strategy for the quantification standard. These assays should be of great interest for breeding programs and epidemiological surveys to monitor viral populations.

  1. Automated processing of zebrafish imaging data: a survey.

    Science.gov (United States)

    Mikut, Ralf; Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A; Kausler, Bernhard X; Ledesma-Carbayo, María J; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-09-01

    Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines.

  2. Automated Processing of Zebrafish Imaging Data: A Survey

    Science.gov (United States)

    Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-01-01

    Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125

  3. Survey Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Cleaned and QCd data for the Fishing Effort Survey. Questions are asked on weather and outdoor activity, including fishing trips. Used for...

  4. Surveying Humanness

    DEFF Research Database (Denmark)

    Markussen, Randi; Gad, Christopher

    For various reasons we both were subjected to a specific survey procedure carried out in a Danish county in order to improve the treatment of people who have suffered from long-term illnesses. The surveys concern not only feedback on how people experience their present and past interaction with the social services and health care system; they also ask people to indicate the state and development of a large collection of biological and psychological symptoms and psycho-social problems. However, the surveys say nothing about how the information will be of use to the people who answer the procedure, or how this scientific intervention will be put to use more specifically within the public...

  5. Quantification of human reliability in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.; Dang, Vinh N.

    1996-01-01

    Human performance may substantially influence the reliability and safety of complex technical systems. For this reason, Human Reliability Analysis (HRA) constitutes an important part of Probabilistic Safety Assessments (PSAs) or Quantitative Risk Analyses (QRAs). The results of these studies, as well as analyses of past accidents and incidents, clearly demonstrate the importance of human interactions. The contribution of human errors to the core damage frequency (CDF), as estimated in the Swedish nuclear PSAs, is between 15 and 88%. A survey of the HRAs in the Swiss PSAs shows that for the Swiss nuclear power plants, too, the estimated human error contributions are substantial (49% of the CDF due to internal events in the case of Beznau and 70% in the case of Muehleberg; for the total CDF, including external events, 25% and 20%, respectively). Similar results can be extracted from the PSAs carried out for French, German, and US plants. In PSAs or QRAs, the adequate treatment of the human interactions with the system is a key to the understanding of accident sequences and their relative importance to overall risk. The main objectives of HRA are: first, to ensure that the key human interactions are systematically identified and incorporated into the safety analysis in a traceable manner, and second, to quantify the probabilities of their success and failure. Adopting a structured and systematic approach to the assessment of human performance makes it possible to provide greater confidence that the safety and availability of human-machine systems is not unduly jeopardized by human performance problems. Section 2 discusses the different types of human interactions analysed in PSAs. More generally, the section presents how HRA fits into the overall safety analysis, that is, how the human interactions to be quantified are identified. Section 3 addresses the methods for quantification. Section 4 concludes the paper by presenting some recommendations and pointing out the limitations of the

  6. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

    Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high-resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis and emphysema benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice, an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase, n = 9; 0.5 U elastase, n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly, until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes, compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high-resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This
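
    A minimal sketch of deriving aerated lung volume from a CT volume (assumed HU window, voxel size and a stand-in segmentation, not the published algorithm's calibrated values):

        import numpy as np

        rng = np.random.default_rng(3)
        hu = rng.normal(-500.0, 250.0, size=(200, 200, 200))  # synthetic CT (HU)
        lung_mask = np.ones(hu.shape, dtype=bool)             # stand-in lung mask

        aerated = lung_mask & (hu > -1000.0) & (hu < -200.0)  # assumed aerated window
        voxel_mm = 0.035                                      # assumed isotropic voxel
        volume_ml = aerated.sum() * voxel_mm ** 3 / 1000.0    # mm^3 -> mL
        print(f"aerated lung volume ~ {volume_ml:.2f} mL")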

  7. Conditions allowing the formation of biogenic amines in cheese

    NARCIS (Netherlands)

    Joosten, H.M.L.J.

    1988-01-01

    A study was undertaken to reveal the conditions that allow the formation of biogenic amines in cheese.

    The starters most commonly used in the Dutch cheese industry do not have decarboxylative properties. Only if the milk or curd is contaminated with non-starter bacteria, amine

  8. Is It Safe to Allow Cell Phones in School?

    Science.gov (United States)

    Trump, Kenneth S.

    2009-01-01

    Cell phones were banned from most schools years ago, but after the Columbine High School and 9/11 tragedies, parents started pressuring some school boards and administrators to reverse the bans. On its surface, allowing students to have cell phones under the guise of improved school safety may seem like a "no-brainer" to many board members and…

  9. 48 CFR 243.204-70-6 - Allowable profit.

    Science.gov (United States)

    2010-10-01

    ... profit allowed reflects— (a) Any reduced cost risk to the contractor for costs incurred during contract performance before negotiation of the final price; (b) The contractor's reduced cost risk for costs incurred during performance of the remainder of the contract; and (c) The extent to which costs have been incurred...

  10. 48 CFR 217.7404-6 - Allowable profit.

    Science.gov (United States)

    2010-10-01

    ... ensure the profit allowed reflects— (a) Any reduced cost risk to the contractor for costs incurred during contract performance before negotiation of the final price; (b) The contractor's reduced cost risk for costs incurred during performance of the remainder of the contract; and (c) The requirements at 215.404...

  11. 30 CFR 206.56 - Transportation allowances-general.

    Science.gov (United States)

    2010-07-01

    ... oil has been determined under § 206.52 or § 206.53 of this subpart at a point (e.g., sales point or point of value determination) off the lease, MMS shall allow a deduction for the reasonable, actual... sales type code may not exceed 50 percent of the value of the oil at the point of sale as determined...

  12. Allowed unhindered beta connected states in rare earth nuclei

    International Nuclear Information System (INIS)

    Sood, P.C.; Ray, R.S.

    1986-03-01

    The beta-connected states in odd-mass as well as even-mass rare earth nuclei, where the transition is of allowed unhindered nature, are listed. The tabulation includes 54 cases of such transitions. Validity of the Alaga selection rules is examined and the results are used to assign configurations to the involved single-particle and two-particle states. (author)

  13. 5 CFR 2610.107 - Allowable fees and expenses.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Allowable fees and expenses. 2610.107 Section 2610.107 Administrative Personnel OFFICE OF GOVERNMENT ETHICS ORGANIZATION AND PROCEDURES... factors as may bear on the value of the services provided. (d) The reasonable cost of any study, analysis...

  14. 19 CFR 212.06 - Allowable fees and expenses.

    Science.gov (United States)

    2010-04-01

    ... Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE IMPLEMENTATION OF THE EQUAL ACCESS TO JUSTICE ACT General Provisions § 212.06 Allowable fees and...) If the attorney, agent or expert witness is in private practice, his or her customary fee for similar...

  15. 28 CFR 0.142 - Per diem and travel allowances.

    Science.gov (United States)

    2010-07-01

    ... authorized to approve travel expenses of newly appointed special agents and the transportation expenses of... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Per diem and travel allowances. 0.142... Authorizations With Respect to Personnel and Certain Administrative Matters § 0.142 Per diem and travel...

  16. Mitigation of Global Warming with Focus on Personal Carbon Allowances

    DEFF Research Database (Denmark)

    Meyer, Niels I

    2008-01-01

    The mitigation of global warming requires new efficient systems and methods. The paper presents a new proposal called personal carbon allowances, with caps on CO2 emissions from household heating and electricity and on emissions from transport in private cars and personal air flights. Results...

  17. 41 CFR 101-27.503 - Allowable credit.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Allowable credit. 101-27.503 Section 101-27.503 Public Contracts and Property Management Federal Property Management Regulations System FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 27-INVENTORY MANAGEMENT 27.5...

  18. 34 CFR 389.41 - What are allowable costs?

    Science.gov (United States)

    2010-07-01

    ... Education programs— (a) Trainee per diem costs; (b) Trainee travel in connection with a training course; (c... 34 Education 2 2010-07-01 2010-07-01 false What are allowable costs? 389.41 Section 389.41 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION...

  19. 48 CFR 752.7028 - Differential and allowances.

    Science.gov (United States)

    2010-10-01

    ... allowance) at rates prescribed by the Federal Travel Regulations, as from time to time amended, during the... providing adequate elementary and secondary education for his/her children. The Contractor will be...), Chapter 270, as from time to time amended. (g) Educational travel. Educational travel is travel to and...

  20. Allowable stem nut wear and diagnostic monitoring for MOVs

    International Nuclear Information System (INIS)

    Swinburne, P.

    1994-01-01

    After a motor-operated valve (MOV) stem nut failure in 1991 that contributed to a forced plant shutdown, the FitzPatrick Plant staff developed criteria to check for excessive stem nut wear in MOVs. Allowable stem nut wear monitoring uses both direct dimensional measurement and diagnostic test data interpretation. The wear allowance is based on the recommended permitted backlash discussed in the Electric Power Research Institute/Nuclear Maintenance Assistance Center Technical Repair Guideline for the Limitorque SMB-000 Motor Actuator. The diagnostic analysis technique measures the time at zero load and compares this with a precalculated allowable zero-force time. Excessive zero-force time may be the result of other MOV problems, such as a loose stem nut lock nut or excessive free play in the drive sleeve bearing. Stress levels for new or nominal stem nuts and stem nuts with the full wear allowance were compared. Bending and shear stresses at the thread root increase for the maximum wear condition when compared with a "new" stem nut. These stresses are directly related to the thread root thickness. For typical MOV loading and common stem threading (with two diameters of thread engagement), the thread stresses are well within acceptable limits for ASTM B584-C86300 (formerly B147-863) manganese bronze (typical stem nut material)

  1. 9 CFR 73.10 - Permitted dips; substances allowed.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Permitted dips; substances allowed. 73.10 Section 73.10 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT... flowers of sulphur or sulphur flour to 100 gallons of water; or a specifically permitted proprietary brand...

  2. 76 FR 14282 - U.S. Paralympics Monthly Assistance Allowance

    Science.gov (United States)

    2011-03-16

    ... paid as a subsistence allowance for a full-time institutional program under chapter 31 of title 38...' rehabilitation following injury or disease. To the extent the commenter objects to the payment of money to... under this section. (a) Payment will be made at the rate paid for a full-time institutional program...

  3. 40 CFR 97.142 - CAIR NOX allowance allocations.

    Science.gov (United States)

    2010-07-01

    ... heat input for each year calculated as follows: (A) If the unit is coal-fired during the year, the unit... the first such 5 years. (2)(i) A unit's control period heat input, and a unit's status as coal-fired... Allocations § 97.142 CAIR NOX allowance allocations. (a)(1) The baseline heat input (in mmBtu) used with...

  4. 9 CFR 52.6 - Claims not allowed.

    Science.gov (United States)

    2010-01-01

    ... violation of a law or regulation administered by the Secretary regarding animal disease, or in violation of... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Claims not allowed. 52.6 Section 52.6 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  5. 9 CFR 55.7 - Claims not allowed.

    Science.gov (United States)

    2010-01-01

    ... owner in violation of a law or regulation administered by the Secretary regarding animal disease, or in... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Claims not allowed. 55.7 Section 55.7 Animals and Animal Products ANIMAL AND PLANT HEALTH INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE...

  6. 41 CFR 105-72.307 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Allowable costs. 105-72.307 Section 105-72.307 Public Contracts and Property Management Federal Property Management... Administration 72-UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER...

  7. Can preapproval jump-start the allowance market

    Energy Technology Data Exchange (ETDEWEB)

    Dudek, D.J.; Goffman, J.

    1992-06-01

    With compliance deadlines approaching in three years, utility, environmental and financial planners and their regulators are in the process of grappling with the requirements imposed, and opportunities created, by the acid rain program established under Title 4 of the Clean Air Act amendments of 1990. The novel element of the program - emissions or allowance trading through a nationwide allowance market - presents great challenges for utilities and their regulators. Perhaps the foremost challenge is establishing the allowance market. If state utility commissions subject utilities' compliance strategies to traditional after-the-fact prudence reviews, as tradition would impel them to do, the attendant regulatory risks are likely to push utilities toward more conservative compliance schemes that underuse allowance trading (as the exchange at the head of this article demonstrates). If that happens, the market will fail to develop, and its full potential for environmental benefit at least cost will go unrealized. This, in turn, is likely to strengthen the case for non-market regulatory mechanisms - a vicious circle. In this paper, the authors suggest a way out of this.

  8. 37 CFR 1.311 - Notice of allowance.

    Science.gov (United States)

    2010-07-01

    ... fee, in which case the issue fee and publication fee (§ 1.211(e)) must both be paid within three... notice of allowance will operate as a request to charge the correct issue fee or any publication fee due... incorrect issue fee or publication fee; or (2) A fee transmittal form (or letter) for payment of issue fee...

  9. Can preapproval jump-start the allowance market?

    International Nuclear Information System (INIS)

    Dudek, D.J.; Goffman, J.

    1992-01-01

    With compliance deadlines approaching in three years, utility, environmental and financial planners and their regulators are in the process of grappling with the requirements imposed, and opportunities created, by the acid rain program established under Title 4 of the Clean Air Act amendments of 1990. The novel element of the program - emissions or allowance trading through a nationwide allowance market - presents great challenges for utilities and their regulators. Perhaps the foremost challenge is establishing the allowance market. If state utility commissions subject utilities' compliance strategies to traditional after-the-fact prudence reviews, as tradition would impel them to do, the attendant regulatory risks are likely to push utilities toward more conservative compliance schemes that underuse allowance trading (as the exchange at the head of this article demonstrates). If that happens, the market will fail to develop, and its full potential for environmental benefit at least cost will go unrealized. This, in turn, is likely to strengthen the case for non-market regulatory mechanisms - a vicious circle. In this paper, the authors suggest a way out of this

  10. 75 FR 63184 - Temporary Duty (TDY) Travel Allowances

    Science.gov (United States)

    2010-10-14

    ... GENERAL SERVICES ADMINISTRATION [Docket 2010-0009, Sequence 4] Temporary Duty (TDY) Travel Allowances AGENCY: Office of Governmentwide Policy, General Services Administration (GSA). ACTION: Notice of... agency travel programs, save money on travel costs, better protect the environment, and conserve natural...

  11. Precise quantification of minimal residual disease at day 29 allows identification of children with acute lymphoblastic leukemia and an excellent outcome

    DEFF Research Database (Denmark)

    Nyvold, Charlotte; Madsen, Hans O; Ryder, Lars P

    2002-01-01

    The postinduction level of minimal residual disease (MRD) was quantified with a competitive polymerase chain reaction (PCR) technique in 104 children with acute lymphoblastic leukemia (ALL) diagnosed between June 1993 and January 1998 and followed for a median of 4.2 years. A significant correlation was found between the day 29 (D29) MRD level and outcome: event-free survival for patients with higher MRD levels was 0.52 (P = .0007). The group of patients with a D29 MRD less than 0.01% included patients with T-cell disease, white blood cell count more than 50 x 10(9)/L at diagnosis, or age 10 years or older, and could not be identified by up-front criteria. The best...

  12. Fluorescence recovery allows the implementation of a fluorescence reporter gene platform applicable for the detection and quantification of horizontal gene transfer in anoxic environments

    DEFF Research Database (Denmark)

    Pinilla-Redondo, Rafael; Riber, Leise; Sørensen, Søren Johannes

    2018-01-01

    H limitations, and provide experimental tools that will help broaden its horizon of application to other fields. IMPORTANCE: Many anaerobic environments, like the gastrointestinal tract, anaerobic digesters, and the interiors of dense biofilms, have been shown to be hotspots for horizontal gene transfer (HGT). Despite the increasing wealth of reports warning about the alarming spread of antibiotic resistance determinants, to date, HGT studies mainly rely on cultivation-based methods. Unfortunately, the relevance of these studies is often questionable, as only a minor fraction of bacteria can be cultivated...

  13. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  14. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.; Adams, M.L.; McClarren, R.G.; Mallick, B.K.

    2011-01-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework

  15. Direct quantification of negatively charged functional groups on membrane surfaces

    KAUST Repository

    Tiraferri, Alberto; Elimelech, Menachem

    2012-01-01

    groups at the surface of dense polymeric membranes. Both techniques consist of associating the membrane surface moieties with chemical probes, followed by quantification of the bound probes. Uranyl acetate and toluidine blue O dye, which interact

  16. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    International Nuclear Information System (INIS)

    Olander, Lydia P; Wollenberg, Eva; Tubiello, Francesco N; Herold, Martin

    2014-01-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term. (paper)

  17. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  18. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim; Sun, Shuyu

    2015-01-01

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving

  19. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  20. The value of serum Hepatitis B surface antigen quantification in ...

    African Journals Online (AJOL)

    The value of serum Hepatitis B surface antigen quantification in determining viral activity in chronic Hepatitis B virus infection. ... of CHB and also higher in hepatitis e antigen positive patients compared to hepatitis e antigen negative patients.

  1. an expansion of the aboveground biomass quantification model for ...

    African Journals Online (AJOL)

    Research Note BECVOL 3: an expansion of the aboveground biomass quantification model for ... African Journal of Range and Forage Science ... encroachment and estimation of food to browser herbivore species, was proposed during 1989.

  2. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

    Proton magnetic resonance spectroscopy ((1)H-MRS) can be used to quantify in vivo metabolite levels, such as lactate, γ-aminobutyric acid (GABA) and glutamate (Glu). However, there are considerable analysis choices which can alter the accuracy or precision of (1)H-MRS metabolite quantification. We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1)H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results...

  3. Quantification Model for Estimating Temperature Field Distributions of Apple Fruit

    OpenAIRE

    Zhang, Min; Yang, Le; Zhao, Huizhong; Zhang, Leijie; Zhong, Zhiyou; Liu, Yanling; Chen, Jianhua

    2009-01-01

    A quantification model of transient heat conduction was provided to simulate apple fruit temperature distribution in the cooling process. The model was based on the energy variation of different points in the apple fruit. It took into account heat exchange of the representative elemental volume, metabolic heat and external heat. The following conclusions could be obtained: first, the quantification model can satisfactorily describe the tendency of apple fruit temperature dis...

  4. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the regurgit...

  5. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    International Nuclear Information System (INIS)

    Jackson, Christopher B.; Gallati, Sabina; Schaller, André

    2012-01-01

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates the different preservation of the nuclear and mitochondrial genomes. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow precise determination of the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of the measured decay constants for nuclear DNA (λnDNA) and mtDNA (λmtDNA) we present an approach to possibly correct measurements in degraded samples in the future. To our knowledge this is the first time the different degradation impact of the two
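
    A sketch of the serial-amplicon idea (synthetic copy numbers): under random fragmentation the number of intact templates falls off with amplicon length roughly as N(L) = N0·exp(-λL), so a log-linear fit over increasing amplicon sizes yields the decay constant; comparing λ for a nuclear and a mitochondrial target exposes their disparate preservation:

        import numpy as np

        amplicon_bp = np.array([100.0, 200.0, 400.0, 800.0])
        copies = np.array([9.5e4, 8.2e4, 6.0e4, 3.3e4])   # invented qPCR results

        lam = -np.polyfit(amplicon_bp, np.log(copies), 1)[0]  # per base pair
        print(f"decay constant lambda = {lam:.2e} / bp")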

  6. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departements of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    Highlights: ► Serial qPCR accurately determines the fragmentation state of any given DNA sample. ► Serial qPCR demonstrates the different preservation of the nuclear and mitochondrial genomes. ► Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. ► Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow precise determination of the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of the measured decay constants for nuclear DNA (λnDNA) and mtDNA (λmtDNA) we present an approach to possibly correct measurements in

  7. Quantification of fossil fuel CO2 at the building/street level for large US cities

    Science.gov (United States)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans for a global, integrated, carbon monitoring system (CMS). A space/time explicit emissions data product can act as both a verification and a planning system. It can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high resolution (e.g. building- and road-link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort for a CMS. A complete data product has been built for the city of Indianapolis and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start from the sector-specific Vulcan Project estimates at a mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of the effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into both improving our emissions data product and measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel and process-level detail (e.g. combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban

  8. Tissue quantification for development of pediatric phantom

    International Nuclear Information System (INIS)

    Alves, A.F.F.; Miranda, J.R.A.; Pina, D.R.

    2013-01-01

    The optimization of the risk-benefit ratio is a major concern in pediatric radiology, owing to the greater vulnerability of children, compared to adults, to the late somatic and genetic effects of radiation exposure. In Brazil, head trauma is estimated to account for 18% of deaths in the 1-5 year age group, and radiography is the primary diagnostic test for the detection of skull fracture. Since image quality is essential to ensure the identification of anatomical structures and to minimize errors of diagnostic interpretation, this work proposed the development and construction of homogeneous skull phantoms for the 1-5 year age group. The homogeneous phantoms were constructed using the classification and quantification of the tissues present in the skulls of pediatric patients. In this procedure, computational algorithms written in Matlab were used to quantify the distinct biological tissues present in the anatomical regions studied, using retrospective CT images. Preliminary measurements show that, for ages 1-5 years, an average anteroposterior skull diameter of 145.73 ± 2.97 mm can be represented by 92.34 ± 5.22 mm of Lucite and 1.75 ± 0.21 mm of aluminum plates in a patient-equivalent phantom (PEP) arrangement. After their construction, the phantoms will be used for image and dose optimization of pediatric computed radiography protocols
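
    The tissue classification step can be illustrated with simple Hounsfield-unit thresholding. A minimal Python sketch (the study used Matlab algorithms); the HU windows and the toy image are assumptions for illustration only:

      import numpy as np

      # Hypothetical Hounsfield-unit windows for the tissue classes of interest.
      HU_WINDOWS = {"soft_tissue": (-100, 100), "bone": (300, 3000)}

      def tissue_fractions(ct_slice):
          # Fraction of voxels of a CT slice falling inside each tissue window.
          return {name: float(np.mean((ct_slice >= lo) & (ct_slice <= hi)))
                  for name, (lo, hi) in HU_WINDOWS.items()}

      rng = np.random.default_rng(0)
      fake_slice = rng.normal(40.0, 30.0, size=(64, 64))  # mostly soft tissue
      fake_slice[:8, :] = 700.0                           # a band of bone-range voxels
      print(tissue_fractions(fake_slice))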

  9. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
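
    A minimal Python sketch of residual-bootstrap confidence intervals for a concentration-response fit; the two-parameter Hill model with unit slope, the data values and the bootstrap size are illustrative stand-ins, not the actual ToxCast pipeline:

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, top, ac50):
          # Hill model with slope fixed at 1.
          return top * conc / (ac50 + conc)

      rng = np.random.default_rng(1)
      conc = np.logspace(-2, 2, 8)
      resp = hill(conc, top=80.0, ac50=1.0) + rng.normal(0.0, 5.0, conc.size)

      popt, _ = curve_fit(hill, conc, resp, p0=[70.0, 2.0])
      residuals = resp - hill(conc, *popt)

      ac50_boot = []
      for _ in range(500):
          # Resample residuals onto the fitted curve and refit.
          fake = hill(conc, *popt) + rng.choice(residuals, size=conc.size, replace=True)
          try:
              p, _ = curve_fit(hill, conc, fake, p0=popt, maxfev=2000)
              ac50_boot.append(p[1])
          except RuntimeError:
              continue  # failed refits are simply dropped

      print(np.percentile(ac50_boot, [2.5, 97.5]))  # 95% CI for the AC50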

  10. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  11. Quantification of the vocal folds’ dynamic displacements

    International Nuclear Information System (INIS)

    Hernández-Montes, María del Socorro; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-01-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ∼100–1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues. (paper)

  12. Tentacle: distributed quantification of genes in metagenomes.

    Science.gov (United States)

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.
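
    The dynamic master-worker pattern can be sketched with Python's standard library; the toy "aligner" below is a hypothetical stand-in (Tentacle delegates this step to one of six supported aligners):

      from collections import Counter
      from multiprocessing import Pool

      def align_fragment(fragment):
          # Stand-in for an aligner call: map a DNA fragment to a gene identifier.
          return "geneA" if fragment.startswith("ATG") else "geneB"

      def quantify(fragments, workers=4):
          # Master: stream fragments to a pool of workers and tally gene counts.
          with Pool(workers) as pool:
              return Counter(pool.imap_unordered(align_fragment, fragments, chunksize=64))

      if __name__ == "__main__":
          reads = ["ATGCC", "TTGAC", "ATGAA"] * 1000
          print(quantify(reads))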

  13. Verification Validation and Uncertainty Quantification for CGS

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kamm, James R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Weirs, V. Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    The overall conduct of verification, validation and uncertainty quantification (VVUQ) is discussed through the construction of a workflow relevant to computational modeling including the turbulence problem in the coarse grained simulation (CGS) approach. The workflow contained herein is defined at a high level and constitutes an overview of the activity. Nonetheless, the workflow represents an essential activity in predictive simulation and modeling. VVUQ is complex and necessarily hierarchical in nature. The particular characteristics of VVUQ elements depend upon where the VVUQ activity takes place in the overall hierarchy of physics and models. In this chapter, we focus on the differences between and interplay among validation, calibration and UQ, as well as the difference between UQ and sensitivity analysis. The discussion in this chapter is at a relatively high level and attempts to explain the key issues associated with the overall conduct of VVUQ. The intention is that computational physicists can refer to this chapter for guidance regarding how VVUQ analyses fit into their efforts toward conducting predictive calculations.

  14. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
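
    The two-step calculation the essay builds on, Bayes' rule followed by an entropy measure of the remaining diagnostic uncertainty, can be sketched as follows (Python; the test characteristics and pre-test probability are illustrative):

      import numpy as np

      def post_test_probability(pretest, sensitivity, specificity, positive=True):
          # Bayes' rule for a binary test result.
          if positive:
              num = sensitivity * pretest
              den = num + (1.0 - specificity) * (1.0 - pretest)
          else:
              num = (1.0 - sensitivity) * pretest
              den = num + specificity * (1.0 - pretest)
          return num / den

      def binary_entropy(p):
          # Diagnostic uncertainty in bits; 1 bit = maximal uncertainty (p = 0.5).
          p = np.clip(p, 1e-12, 1.0 - 1e-12)
          return float(-(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p)))

      pre = 0.30
      post = post_test_probability(pre, sensitivity=0.90, specificity=0.80)
      print(post, binary_entropy(pre) - binary_entropy(post))  # information gained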

  15. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. We have developed a practical AutoLISP program on PC AutoCAD to quantify acetabular coverage through numerical expression of computed tomography images. Thirty adults (60 hips) with normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial CT sections. These slices were prepared with a fixed coordinate system and in continuous sections of 5 mm thickness. The contours of the cartilage of each section were digitized into a PC and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio of greater than 80%, an anterior coverage ratio of greater than 75% and a posterior coverage ratio of greater than 80% can be categorized as normal. The polar edge distance is a good indicator for the evaluation of preoperative and postoperative coverage conditions. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, the medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.

  16. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.
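
    The multilevel (multi-resolution) Monte Carlo idea mentioned above can be sketched in a few lines. In this illustrative Python sketch the forward model is a stand-in, and for brevity the level pairs are not driven by common random inputs, which a practical MLMC implementation would do to obtain its variance reduction:

      import numpy as np

      rng = np.random.default_rng(2)

      def model(theta, level):
          # Stand-in forward simulator: resolution (and cost) grow with level.
          n_steps = 2 ** (level + 2)
          return np.sin(theta) + rng.normal(0.0, 1.0 / n_steps)

      def mlmc_estimate(n_samples=(4000, 1000, 250)):
          # Telescoping sum: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
          total = 0.0
          for level, n in enumerate(n_samples):
              thetas = rng.uniform(0.0, np.pi, n)
              if level == 0:
                  total += np.mean([model(t, 0) for t in thetas])
              else:
                  total += np.mean([model(t, level) - model(t, level - 1) for t in thetas])
          return total

      print(mlmc_estimate())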

  17. Quantification of the vocal folds’ dynamic displacements

    Science.gov (United States)

    del Socorro Hernández-Montes, María; Muñoz, Silvino; De La Torre, Manuel; Flores, Mauricio; Pérez, Carlos; Mendoza-Santoyo, Fernando

    2016-05-01

    Fast dynamic data acquisition techniques are required to investigate the motional behavior of the vocal folds (VFs) when they are subjected to a steady air-flow through the trachea. High-speed digital holographic interferometry (DHI) is a non-invasive full-field-of-view technique that has proved its usefulness to study rapid and non-repetitive object movements. Hence it is an ideal technique used here to measure VF displacements and vibration patterns at 2000 fps. Analyses from a set of 200 displacement images showed that VFs’ vibration cycles are established along their width (y) and length (x). Furthermore, the maximum deformation for the right and left VFs’ area may be quantified from these images, which in itself represents an important result in the characterization of this structure. At a controlled air pressure, VF displacements fall within the range ~100-1740 nm, with a calculated precision and accuracy that yields a variation coefficient of 1.91%. High-speed acquisition of full-field images of VFs and their displacement quantification are on their own significant data in the study of their functional and physiological behavior since voice quality and production depend on how they vibrate, i.e. their displacement amplitude and frequency. Additionally, the use of high speed DHI avoids prolonged examinations and represents a significant scientific and technological alternative contribution in advancing the knowledge and working mechanisms of these tissues.

  18. Standardless quantification methods in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Trincavelli, Jorge, E-mail: trincavelli@famaf.unc.edu.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Limandri, Silvina, E-mail: s.limandri@conicet.gov.ar [Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba (Argentina); Instituto de Física Enrique Gaviola, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Medina Allende s/n, Ciudad Universitaria, 5000 Córdoba (Argentina); Bonetto, Rita, E-mail: bonetto@quimica.unlp.edu.ar [Centro de Investigación y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco, Consejo Nacional de Investigaciones Científicas y Técnicas de la República Argentina, Facultad de Ciencias Exactas, de la Universidad Nacional de La Plata, Calle 47 N° 257, 1900 La Plata (Argentina)

    2014-11-01

    The elemental composition of a solid sample can be determined by electron probe microanalysis with or without the use of standards. Standardless algorithms are considerably faster than methods that require standards; they are useful when a suitable set of standards is not available or for rough samples, and they also help to overcome the problem of beam current variation, for example in equipment with a cold field-emission gun. Owing to the significant advances in accuracy achieved in recent years, the product of successive efforts to improve the description of the generation, absorption and detection of X-rays, standardless methods have increasingly become an interesting option for the user. Nevertheless, up to now, algorithms that use standards are still more precise than standardless methods. It is important to remark that care must be taken with results provided by standardless methods that normalize the calculated concentration values to 100%, unless an estimate of the errors is reported. In this work, a comprehensive discussion of the key features of the main standardless quantification methods, as well as the level of accuracy achieved by them, is presented. - Highlights: • Standardless methods are a good alternative when no suitable standards are available. • Their accuracy reaches 10% for 95% of the analyses when traces are excluded. • Some of them are suitable for the analysis of rough samples.

  19. Quantification of variability in trichome patterns

    Directory of Open Access Journals (Sweden)

    Bettina eGreese

    2014-11-01

    Full Text Available While pattern formation is studied in various areas of biology, little is known about the intrinsic noise leading to variations between individual realizations of the pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To deepen the understanding of the regulatory processes underlying pattern formation it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches towards characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability.
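
    As one concrete example of such spatial-pattern quantification, the Clark-Evans nearest-neighbour index (a common spatial statistic, named here as an illustration rather than as the authors' specific method) distinguishes regular, random and clustered point patterns. A minimal Python sketch that ignores edge corrections:

      import numpy as np
      from scipy.spatial import cKDTree

      def clark_evans(points, area):
          # R ~ 1: random; R > 1: regular/inhibited (as expected for trichomes);
          # R < 1: clustered. Edge effects are ignored in this sketch.
          tree = cKDTree(points)
          d, _ = tree.query(points, k=2)     # k=2: nearest point other than itself
          observed = d[:, 1].mean()
          expected = 0.5 / np.sqrt(len(points) / area)
          return observed / expected

      rng = np.random.default_rng(4)
      pts = rng.uniform(0.0, 1.0, size=(200, 2))
      print(clark_evans(pts, area=1.0))      # ~1 for complete spatial randomness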

  20. Quality Quantification of Evaluated Cross Section Covariances

    International Nuclear Information System (INIS)

    Varet, S.; Dossantos-Uzarralde, P.; Vayatis, N.

    2015-01-01

    Presently, several methods are used to estimate the covariance matrix of evaluated nuclear cross sections. Because the resulting covariance matrices can differ according to the method used and the assumptions of the method, we propose a general and objective approach to quantify the quality of the covariance estimation for evaluated cross sections. The first step consists in defining an objective criterion. The second step is the computation of the criterion. In this paper the Kullback-Leibler distance is proposed for the quality quantification of a covariance matrix estimation and its inverse. It is based on the distance to the true covariance matrix. A method based on the bootstrap is presented for the estimation of this criterion, which can be applied with most methods for covariance matrix estimation and without knowledge of the true covariance matrix. The full approach is illustrated on evaluations of the 85Rb nucleus, and the results are then used for a discussion of scoring and Monte Carlo approaches for covariance matrix estimation of the cross section evaluations.
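
    For zero-mean Gaussians the Kullback-Leibler distance between an estimated covariance matrix and a reference one has a closed form; a minimal Python sketch (in the paper the unavailable true matrix is handled via the bootstrap, so the "true" matrix below is purely illustrative):

      import numpy as np

      def kl_gaussian_zero_mean(sigma_est, sigma_true):
          # KL( N(0, sigma_est) || N(0, sigma_true) )
          k = sigma_true.shape[0]
          inv_true = np.linalg.inv(sigma_true)
          trace_term = np.trace(inv_true @ sigma_est)
          logdet_term = np.log(np.linalg.det(sigma_true) / np.linalg.det(sigma_est))
          return 0.5 * (trace_term - k + logdet_term)

      sigma_true = np.array([[1.0, 0.3], [0.3, 0.5]])
      sigma_est = np.array([[1.1, 0.25], [0.25, 0.6]])
      print(kl_gaussian_zero_mean(sigma_est, sigma_true))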

  1. A study on assessment methodology of surveillance test interval and allowed outage time

    International Nuclear Information System (INIS)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol

    1996-07-01

    The objective of this study is to develop a methodology for assessing optimized Surveillance Test Intervals (STI) and Allowed Outage Times (AOT) using PSA methods, which can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that remedies the problems revealed in many other studies is presented, and its application to an example system confirms the feasibility of the method
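
    A standard PSA building block for this kind of STI optimization is the mean unavailability of a periodically tested standby component; a minimal Python sketch with illustrative parameter values (not from the study):

      def mean_unavailability(failure_rate, test_interval_h, test_duration_h, q0=0.0):
          # q0: per-demand failure probability
          # failure_rate * T / 2: undetected standby failures between tests
          # test_duration_h / T: downtime contributed by the test itself
          return q0 + failure_rate * test_interval_h / 2.0 + test_duration_h / test_interval_h

      # Longer intervals reduce test downtime but let standby failures accumulate.
      for T in (730.0, 2190.0):  # roughly monthly vs quarterly, in hours
          print(T, mean_unavailability(failure_rate=1e-5, test_interval_h=T, test_duration_h=2.0))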

  2. A study on assessment methodology of surveillance test interval and allowed outage time

    Energy Technology Data Exchange (ETDEWEB)

    Che, Moo Seong; Cheong, Chang Hyeon; Lee, Byeong Cheol [Seoul Nationl Univ., Seoul (Korea, Republic of)] (and others)

    1996-07-15

    The objective of this study is to develop a methodology for assessing optimized Surveillance Test Intervals (STI) and Allowed Outage Times (AOT) using PSA methods, which can supplement the current deterministic methods and improve the safety of Korean nuclear power plants. In the first year of this study, a survey of the assessment methodologies, models and results of domestic and international research was performed as the basic step before developing the assessment methodology of this study. An assessment methodology that remedies the problems revealed in many other studies is presented, and its application to an example system confirms the feasibility of the method.

  3. In situ Biofilm Quantification in Bioelectrochemical Systems by using Optical Coherence Tomography.

    Science.gov (United States)

    Molenaar, Sam D; Sleutels, Tom; Pereira, Joao; Iorio, Matteo; Borsje, Casper; Zamudio, Julian A; Fabregat-Santiago, Francisco; Buisman, Cees J N; Ter Heijne, Annemiek

    2018-04-25

    Detailed studies of microbial growth in bioelectrochemical systems (BESs) are required for their suitable design and operation. Here, we report the use of optical coherence tomography (OCT) as a tool for in situ and noninvasive quantification of biofilm growth on electrodes (bioanodes). An experimental platform is designed and described in which transparent electrodes are used to allow real-time, 3D biofilm imaging. The accuracy and precision of the developed method are assessed by relating the OCT results to well-established standards for biofilm quantification (chemical oxygen demand (COD) and total N content), and the results show high correspondence to these standards. Biofilm thickness observed by OCT ranged between 3 and 90 μm for experimental durations ranging from 1 to 24 days. This translated to growth yields between 38 and 42 mg COD_biomass per g COD_acetate at an anode potential of -0.35 V versus Ag/AgCl. Time-lapse observations of an experimental run performed in duplicate show high reproducibility of the microbial growth yield obtained by the developed method. As such, we identify OCT as a powerful tool for conducting in-depth characterizations of microbial growth dynamics in BESs. Additionally, the presented platform allows concomitant application of this method with various optical and electrochemical techniques.

  4. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Science.gov (United States)

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy were employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  5. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    Directory of Open Access Journals (Sweden)

    Pramod K Avti

    Full Text Available In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) photoacoustic microscopy (PAM) was employed to detect, map and quantify the SWCNTs in a variety of tissue histological specimens and compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  6. Methods for modeling and quantification in functional imaging by positron emission tomography and magnetic resonance imaging

    International Nuclear Information System (INIS)

    Costes, Nicolas

    2017-01-01

    This report presents experience and research in the field of in vivo medical imaging by positron emission tomography (PET) and magnetic resonance imaging (MRI). In particular, advances in terms of reconstruction, quantification and modeling in PET are described. The validation of processing and analysis methods is supported by the creation of data through simulation of the imaging process in PET. The recent advent of combined PET/MRI clinical cameras, allowing simultaneous acquisition of molecular/metabolic PET information and functional/structural MRI information, opens the door to unique methodological innovations exploiting the spatial alignment and simultaneity of the PET and MRI signals. It will lead to an increase in accuracy and sensitivity in the measurement of biological phenomena. In this context, the developed projects address new methodological issues related to quantification and to the respective contributions of MRI or PET information for a reciprocal improvement of the signals of the two modalities. They open perspectives for combined analysis of the two imaging techniques, allowing optimal use of synchronous, anatomical, molecular and functional information for brain imaging. These innovative concepts, as well as data correction and analysis methods, will be easily translated into other areas of investigation using combined PET/MRI. (author)

  7. Just do it: action-dependent learning allows sensory prediction.

    Directory of Open Access Journals (Sweden)

    Itai Novick

    Full Text Available Sensory-motor learning is commonly considered as a mapping process, whereby sensory information is transformed into the motor commands that drive actions. However, this directional mapping, from inputs to outputs, is part of a loop; sensory stimuli cause actions and vice versa. Here, we explore whether actions affect the understanding of the sensory input that they cause. Using a visuo-motor task in humans, we demonstrate two types of learning-related behavioral effects. Stimulus-dependent effects reflect stimulus-response learning, while action-dependent effects reflect a distinct learning component, allowing the brain to predict the forthcoming sensory outcome of actions. Together, the stimulus-dependent and the action-dependent learning components allow the brain to construct a complete internal representation of the sensory-motor loop.

  8. Spent fuel treatment to allow storage in air

    International Nuclear Information System (INIS)

    Williams, K.L.

    1988-01-01

    During Fiscal Year 1987 (FY-87), research began at the Idaho National Engineering Laboratory (INEL) to develop a treatment material and process to coat fuel rods in commercial spent fuel assemblies to allow the assemblies to be stored in hot (up to 380 °C) air without oxidation of the fuel. This research was conducted under a research and development fund provided by the U.S. Department of Energy (DOE) and independently administered by EG&G Idaho, Inc., DOE's prime contractor at the INEL. The objectives of the research were to identify and evaluate possible treatment processes and materials, identify areas of uncertainty, and to recommend the most likely candidate to allow spent fuel dry storage in hot air. The results of the research are described: results were promising and several good candidates were identified, but further research is needed to examine the candidates to the point where comparison is possible

  9. A new approach for primary overloads allowance in ratcheting evaluation

    International Nuclear Information System (INIS)

    Cabrillat, M.T.; Gatt, J.M.; Lejeail, Y.

    1995-01-01

    Seismic loading must be taken into account in ratchetting design analysis. In LMFBR structures it mainly produces primary overloads, which are characterised by severe magnitudes but a generally low number of occurrences. Other cases of repeated primary overloads can also be observed, for instance in pipes during emptying operations. In the RCC-MR design code rule, the maximum primary stress supported by a structure is considered as permanent; no allowance is made for temporary loads. Experimental ratchetting tests conducted on different structures with and without overloads clearly show that temporary overloads lead to a smaller ratchetting effect. A method using the RCC-MR efficiency diagram framework is proposed. A general theoretical approach allows its field of application to be extended to various cases of primary loading: constant or null primary loading, or overloads. Experimental results are then used to check the validity of this new approach. (author). 2 refs., 2 figs., 2 tabs

  10. Assessment of allowable transuranic activity levels for WIPP wastes

    International Nuclear Information System (INIS)

    1987-12-01

    This study provides a technical evaluation for the establishment of an upper limit on the transuranic content of waste packages to be received. To accomplish this, the predicted radiological performance of WIPP is compared to the radiological performance requirements applicable to WIPP. These performance requirements include radiation protection standards for both routine facility operations and credible operational accidents. These requirements are discussed in Chapter 2.0. From the margin between predicted performance and the performance requirements, the maximum allowable transuranic content of waste packages can then be inferred. Within the resulting compliance envelope, a waste acceptance criterion can be established that delineates the allowable level of transuranic radioactivity content for contact handled (CH) and remote handled (RH) waste packages. 13 refs., 8 tabs

  11. Allowable minimum upper shelf toughness for nuclear reactor pressure vessels

    Energy Technology Data Exchange (ETDEWEB)

    Zahoor, A.

    1988-05-01

    The paper develops a methodology and procedure for determining the allowable minimum upper shelf toughness for continued safe operation of nuclear reactor pressure vessels. An elastic-plastic fracture mechanics analysis method based on the J-integral/tearing modulus (J/T) approach is used. Closed-form expressions for the applied J and tearing modulus are presented for a finite-length, part-through-wall axial flaw with an aspect ratio of 1/6. Solutions are then presented for the Section III, Appendix G flaw. A simple flaw evaluation procedure that can be applied quickly by utility engineers is presented. An attractive feature of the simple procedure is that tearing modulus calculations are not required by the user, and a solution for the slope of the applied J/T line is provided. Results for the allowable minimum upper shelf toughness are presented for a range of reactor pressure vessel thicknesses and heatup/cooldown rates.
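
    The J/T stability criterion referred to here is conventionally written as follows (standard elastic-plastic fracture mechanics notation, not reproduced from the paper; E is Young's modulus, sigma_f the flow stress, a the crack depth, J_R the material resistance curve):

      T_{\mathrm{applied}} \;=\; \frac{E}{\sigma_f^{2}}
          \left( \frac{\partial J_{\mathrm{applied}}}{\partial a} \right)_{\Delta_T},
      \qquad
      T_{\mathrm{material}} \;=\; \frac{E}{\sigma_f^{2}} \, \frac{\mathrm{d}J_R}{\mathrm{d}a}

    Tearing is stable while J_applied = J_R and T_applied < T_material; instability is predicted once T_applied >= T_material.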

  12. Allowable minimum upper shelf toughness for nuclear reactor pressure vessels

    International Nuclear Information System (INIS)

    Zahoor, A.

    1988-01-01

    The paper develops a methodology and procedure for determining the allowable minimum upper shelf toughness for continued safe operation of nuclear reactor pressure vessels. An elastic-plastic fracture mechanics analysis method based on the J-integral/tearing modulus (J/T) approach is used. Closed-form expressions for the applied J and tearing modulus are presented for a finite-length, part-through-wall axial flaw with an aspect ratio of 1/6. Solutions are then presented for the Section III, Appendix G flaw. A simple flaw evaluation procedure that can be applied quickly by utility engineers is presented. An attractive feature of the simple procedure is that tearing modulus calculations are not required by the user, and a solution for the slope of the applied J/T line is provided. Results for the allowable minimum upper shelf toughness are presented for a range of reactor pressure vessel thicknesses and heatup/cooldown rates. (orig.)

  13. Detection and Quantification of the Entomopathogenic Fungal Endophyte Beauveria bassiana in Plants by Nested and Quantitative PCR.

    Science.gov (United States)

    Garrido-Jurado, Inmaculada; Landa, Blanca B; Quesada-Moraga, Enrique

    2016-01-01

    The described protocol allows detection of as little as 10 fg of DNA of the entomopathogenic fungal endophyte Beauveria bassiana in host plants by using a two-step nested PCR with the ITS1F/ITS4 and BB.fw/BB.rv primer pairs. In addition, a qPCR protocol using the BB.fw and BB.rv primers is also available, allowing quantification down to 26 fg of B. bassiana DNA per 20 ng of leaf DNA.
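
    Quantification at these femtogram levels typically proceeds via a Ct-versus-log10(quantity) standard curve; a minimal Python sketch with illustrative dilution-series values (not the paper's data):

      import numpy as np

      known_fg = np.array([1e5, 1e4, 1e3, 1e2, 26.0])   # standards, fg of DNA
      ct = np.array([18.1, 21.5, 24.9, 28.3, 30.2])     # measured Ct values

      slope, intercept = np.polyfit(np.log10(known_fg), ct, 1)
      efficiency = 10.0 ** (-1.0 / slope) - 1.0          # ~1.0 means 100% efficiency

      def quantify_fg(ct_sample):
          # Invert the calibration line to get the template amount.
          return 10.0 ** ((ct_sample - intercept) / slope)

      print(f"efficiency ~ {efficiency:.2f}, unknown ~ {quantify_fg(27.0):.0f} fg")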

  14. Development of software for the quantification of tumour density from images of head and neck cancer biopsies

    International Nuclear Information System (INIS)

    Fernandez, J. M.; Alba, J. L.; Mera, M.; Lorenzo, Y.; Iglesias, M. B.; Lopez Medina, A.; Munoz, A.

    2013-01-01

    Software has been developed for the automatic quantification of tumour density in biopsied tissue stained with cytokeratin, using colorimetric and morphological information, which also allows malignant cells to be distinguished from healthy cells. The software finds, within the biopsied area, the 1 mm2 region with the highest tumour density, which in the future may be associated with the minimum ADC of the patients included in the ARTFIBio project, in order to check the inverse correlation between the two measures. (Author)
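
    Finding the densest 1 mm2 window reduces to a sliding-window mean over a binary malignant-cell mask; a minimal Python sketch in which the pixel scale, the toy mask and the function names are assumptions:

      import numpy as np
      from scipy.ndimage import uniform_filter

      def densest_window(mask, px_per_mm):
          # mask: binary image, 1 where a pixel is classified as malignant
          # (e.g. by colour thresholding of the cytokeratin stain).
          side = int(round(px_per_mm))                   # 1 mm expressed in pixels
          density = uniform_filter(mask.astype(float), size=side)
          idx = np.unravel_index(np.argmax(density), density.shape)
          return idx, float(density[idx])                # window centre, mean density

      rng = np.random.default_rng(3)
      toy_mask = (rng.random((300, 300)) < 0.05).astype(np.uint8)
      toy_mask[100:140, 200:240] = 1                     # a dense tumour focus
      print(densest_window(toy_mask, px_per_mm=40))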

  15. Bribery vs. extortion: allowing the lesser of two evils

    OpenAIRE

    Fahad Khalil; Jacques Lawarree; Sungho Yun

    2009-01-01

    Rewards to prevent supervisors from accepting bribes create incentives for extortion. This raises the question whether a supervisor who can engage in bribery and extortion can still be useful in providing incentives. By highlighting the role of team work in forging information, we present a notion of soft information that makes supervision valuable. We show that a fear of inducing extortion may make it optimal to allow bribery, but extortion is never tolerated. Even though both increase incen...

  16. Scientific substantiation of the maximum allowable concentration of fluopicolide in water

    Directory of Open Access Journals (Sweden)

    Pelo I.М.

    2014-03-01

    Full Text Available In order to substantiate the maximum allowable concentration of fluopicolide in the water of water reservoirs, research was carried out. Methods of study: laboratory hygienic experiment using organoleptic, sanitary-chemical, sanitary-toxicological, sanitary-microbiological and mathematical methods. The results of the influence of fluopicolide on the organoleptic properties of water and on the sanitary regimen of reservoirs for household purposes are given, and its subthreshold concentration in water by the sanitary-toxicological hazard index was calculated. The threshold concentration of the substance by the main hazard criteria was established, and the maximum allowable concentration in water was substantiated. The studies led to the following conclusions: the fluopicolide threshold concentration in water by the organoleptic hazard index (limiting criterion: the smell) is 0.15 mg/dm3; by the general sanitary hazard index (limiting criteria: impact on the number of saprophytic microflora, biochemical oxygen demand and nitrification) it is 0.015 mg/dm3; the maximum non-effective concentration is 0.14 mg/dm3; and the maximum allowable concentration is 0.015 mg/dm3.

  17. Allowable pillar to diameter ratio for strategic petroleum reserve caverns.

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L.; Park, Byoung Yoon

    2011-05-01

    This report compiles 3-D finite element analyses performed to evaluate the stability of Strategic Petroleum Reserve (SPR) caverns over multiple leach cycles. When oil is withdrawn from a cavern in salt using freshwater, the cavern enlarges. As a result, the pillar separating caverns in the SPR fields is reduced over time due to usage of the reserve. The enlarged cavern diameters and smaller pillars reduce underground stability. Advances in geomechanics modeling enable the allowable pillar to diameter ratio (P/D) to be defined. Prior to such modeling capabilities, the allowable P/D was established as 1.78 based on some very limited experience in other cavern fields. While appropriate for 1980, the ratio conservatively limits the allowable number of oil drawdowns and hence limits the overall utility and life of the SPR cavern field. Analyses from all four cavern fields are evaluated along with operating experience gained over the past 30 years to define a new P/D for the reserve. A new ratio of 1.0 is recommended. This ratio is applicable only to existing SPR caverns.

  18. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  19. Quantification of ocular inflammation with technetium-99m glucoheptonate

    International Nuclear Information System (INIS)

    Roizenblatt, J.; Caldeira, J.A.F.; Buchpiguel, C.A.; Meneguetti, J.C.; Camargo, E.E.; Sao Paulo Univ., SP

    1991-01-01

    Histological and morphometric evaluation of ocular inflammation is difficult, particularly when there is extensive ocular involvement with abscess formation and necrosis. A quantitative imaging procedure applicable to humans would be important clinically. To establish such a procedure, turpentine-induced ocular inflammation was obtained by subconjunctival injection in the right eye of 55 rabbits. The left eye was used as control and injected with a volume of saline equal to the volume of turpentine in the right eye. Volumes of turpentine or saline were 0.02, 0.04, 0.06, 0.2 and 0.6 ml, and the rabbits were divided into groups 1-5, according to these volumes. Imaging was performed 48 h after turpentine injection and 6 h after intravenous injection of 10 mCi of technetium-99m glucoheptonate (99mTc-GH). An inflammatory reaction index (IRI), defined as the ratio of counts of the right eye divided by counts of the left eye, was used. IRIs were proportional to the degree of inflammation and allowed the distinction of 3 subgroups: one represented by group 4, one by group 5 and one by groups 1, 2 and 3. This method of quantification of ocular inflammatory processes using 99mTc-GH is original, rapid, non-invasive, reproducible and safe, although unable to differentiate inflammatory processes caused by doses of turpentine which are very small and close to each other. It is conceivable that its application to humans will bring new insight into the ocular inflammatory process and response to therapy. (orig.)

  20. Engineering surveying

    CERN Document Server

    Schofield, W

    2007-01-01

    Engineering surveying involves determining the position of natural and man-made features on or beneath the Earth's surface and utilizing these features in the planning, design and construction of works. It is a critical part of any engineering project. Without an accurate understanding of the size, shape and nature of the site the project risks expensive and time-consuming errors or even catastrophic failure. Engineering Surveying 6th edition covers all the basic principles and practice of this complex subject and the authors bring expertise and clarity. Previous editions of this classic text have given readers a clear understanding of fundamentals such as vertical control, distance, angles and position right through to the most modern technologies, and this fully updated edition continues that tradition. This sixth edition includes: * An introduction to geodesy to facilitate greater understanding of satellite systems * A fully updated chapter on GPS, GLONASS and GALILEO for satellite positioning in surveying * Al...

  1. Progress Report on Alloy 617 Time Dependent Allowables

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Julie Knibloe [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-06-01

    Time dependent allowable stresses are required in the ASME Boiler and Pressure Vessel Code for the design of components in the temperature range where time dependent deformation (i.e., creep) is expected to become significant. There are time dependent allowable stresses in Section IID of the Code for use in the non-nuclear construction codes; however, there are additional criteria that must be considered in developing time dependent allowables for nuclear components. These criteria are specified in Section III NH. St is defined as the least of three quantities: 100% of the average stress required to obtain a total (elastic, plastic, primary and secondary creep) strain of 1%; 67% of the minimum stress to cause rupture; and 80% of the minimum stress to cause the initiation of tertiary creep. The values are reported for a range of temperatures and for time increments up to 100,000 hours. These values are determined from uniaxial creep tests, which involve the elevated-temperature application of a constant, relatively small load, resulting in deformation over a long time period prior to rupture. The minimum stress resulting from these criteria is the time dependent allowable stress St. In this report, data from a large number of creep and creep-rupture tests on Alloy 617 are analyzed using the ASME Section III NH criteria. The data used in the analysis are from the ongoing DOE-sponsored high temperature materials program, from the Korea Atomic Energy Institute through the Generation IV VHTR Materials Program, and historical data from previous HTR research and vendor data generated in developing the alloy. It is found that the tertiary creep criterion determines St at the highest temperatures, while the stress to cause 1% total strain controls at low temperatures. The ASME Section III Working Group on Allowable Stress Criteria has recommended that the uncertainties associated with determining the onset of tertiary creep and the lack of significant
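
    The St selection rule itself is a simple minimum over the three Section III NH criteria; a minimal Python sketch with illustrative (non-code-book) stress values:

      def allowable_stress_st(s_1pct_avg, s_rupture_min, s_tertiary_min):
          # The least of: 100% of the average stress to 1% total strain,
          # 67% of the minimum stress to rupture, and
          # 80% of the minimum stress to the onset of tertiary creep.
          return min(1.00 * s_1pct_avg,
                     0.67 * s_rupture_min,
                     0.80 * s_tertiary_min)

      # Illustrative values for one temperature/time combination (MPa):
      print(allowable_stress_st(s_1pct_avg=55.0, s_rupture_min=90.0, s_tertiary_min=70.0))
      # -> min(55.0, 60.3, 56.0) = 55.0, i.e. the 1% strain criterion controls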

  2. Should we Allow our Children to Watch TV Independently

    Directory of Open Access Journals (Sweden)

    Tariq Jalees

    2008-01-01

    Full Text Available The purpose of this study is to (1) deliberate upon the impacts of television advertising on children, (2) identify the critical impacts, and (3) empirically test the significant factors. Based on a literature survey, several impacts of advertising were identified, including: (1) unnecessary purchasing, (2) low-nutritional food, (3) violence, and (4) materialism. The variables derived through the literature survey were used to develop a close-ended questionnaire that was administered to a sample of 108, drawn through a non-proportionate stratified technique. The ratings of the impacts of advertising were as high as 3.9 for “low nutritional value” and as low as 3.5 for “materialism”, on a scale of 5 to 1. Pearson correlation was used to measure the relationships of the variables on a one-to-one basis, indicating that “unnecessary purchasing” had a strong relationship with “materialism” (r = 0.54) and “exposure” (r = 0.54). The weakest relationship was found between “materialism” and “low nutritional value”, with a correlation of 0.22.

  3. Electrical detection and quantification of single and mixed DNA nucleotides in suspension

    Science.gov (United States)

    Ahmad, Mahmoud Al; Panicker, Neena G.; Rizvi, Tahir A.; Mustafa, Farah

    2016-09-01

    High-speed sequential identification of the building blocks of DNA (deoxyribonucleotides, or nucleotides for short), without labeling or processing, in long reads of DNA is the need of the hour. This can be accomplished by exploiting their unique electrical properties. In this study, the four different types of nucleotides that constitute a DNA molecule were suspended in a buffer, and several types of electrical measurements were performed. These electrical parameters were then used to quantify the suspended DNA nucleotides. Thus, we present a purely electrical counting scheme based on semiconductor theory that allows one to determine the number of nucleotides in a solution by measuring their capacitance-voltage dependency. The nucleotide count was observed to be similar to the product of the corresponding dopant concentration and the Debye volume, after de-embedding the buffer contribution. The presented approach allows for fast and label-free quantification of single and mixed nucleotides in a solution.
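
    The counting rule summarised above (extracted "dopant" concentration times the Debye volume) can be sketched directly; the concentration and relative permittivity below are illustrative assumptions:

      import numpy as np

      EPS0 = 8.854e-12   # vacuum permittivity, F/m
      KB_T = 4.11e-21    # thermal energy at ~298 K, J
      Q = 1.602e-19      # elementary charge, C

      def debye_length(n_per_m3, eps_r=78.5):
          # Debye screening length for charge density n (1/m^3) in a water-like medium.
          return np.sqrt(eps_r * EPS0 * KB_T / (Q ** 2 * n_per_m3))

      def nucleotide_count(n_per_m3, eps_r=78.5):
          # Count ~ concentration x Debye volume (sphere of radius L_D).
          ld = debye_length(n_per_m3, eps_r)
          return n_per_m3 * (4.0 / 3.0) * np.pi * ld ** 3

      print(nucleotide_count(1e21))  # illustrative concentration only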

  4. Quantification and Sequencing of Crossover Recombinant Molecules from Arabidopsis Pollen DNA.

    Science.gov (United States)

    Choi, Kyuha; Yelina, Nataliya E; Serra, Heïdi; Henderson, Ian R

    2017-01-01

    During meiosis, homologous chromosomes undergo recombination, which can result in formation of reciprocal crossover molecules. Crossover frequency is highly variable across the genome, typically occurring in narrow hotspots, which has a significant effect on patterns of genetic diversity. Here we describe methods to measure crossover frequency in plants at the hotspot scale (bp-kb), using allele-specific PCR amplification from genomic DNA extracted from the pollen of F1 heterozygous plants. We describe (1) titration methods that allow amplification, quantification and sequencing of single crossover molecules, (2) quantitative PCR methods to more rapidly measure crossover frequency, and (3) application of high-throughput sequencing for study of crossover distributions within hotspots. We provide detailed descriptions of key steps including pollen DNA extraction, prior identification of hotspot locations, allele-specific oligonucleotide design, and sequence analysis approaches. Together, these methods allow the rate and recombination topology of plant hotspots to be robustly measured and compared between varied genetic backgrounds and environmental conditions.
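
    The single-molecule titration step rests on Poisson statistics: at limiting dilution, the fraction of reactions yielding no amplification estimates the mean number of template molecules per reaction. A minimal Python sketch with hypothetical counts:

      import numpy as np

      def molecules_per_reaction(n_negative, n_total):
          # Poisson zero term: P(0) = exp(-lam), so lam = -ln(P_negative).
          return -np.log(n_negative / n_total)

      # Hypothetical: 24 of 96 allele-specific PCRs show no crossover product.
      lam = molecules_per_reaction(24, 96)
      print(lam)  # ~1.39 crossover molecules per reaction

      # Crossover frequency then follows from the number of genomes screened per
      # reaction (obtained from a parallel titration of parental molecules).
      genomes_per_reaction = 20000.0  # hypothetical input amount
      print(100.0 * lam / genomes_per_reaction, "% recombinants (illustrative)")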

  5. Identification and Quantification of Carbonate Species Using Rock-Eval Pyrolysis

    Directory of Open Access Journals (Sweden)

    Pillot D.

    2013-03-01

    Full Text Available This paper presents a new, reliable and rapid method to characterise and quantify carbonates in solid samples, based on monitoring the CO2 flux emitted by progressive thermal decomposition of carbonates during programmed heating. The different decomposition peaks allow the different types of carbonates present in the analysed sample to be determined, and the quantification of each peak gives the respective proportions of these different types of carbonates in the sample. In addition to the procedure itself, which uses a standard Rock-Eval 6 pyrolyser, characteristic calibration profiles are presented for the carbonates most common in nature. This method should allow different types of application in different disciplines, either academic or industrial.

  6. Surveying Education

    DEFF Research Database (Denmark)

    Enemark, Stig

    2009-01-01

    In relation to surveying education there is one big question to be asked: Is the role of the surveyors changing? In a global perspective the answer will be "Yes". There is a big swing that could be entitled "From Measurement to Management". This does not imply that measurement is no longer....... In surveying education there are a range of other challenges to be faced. These relate to the focus on learning to learn; the need for flexible curriculum to deal with constant change; the move towards introducing virtual academy; the demand for creating a quality culture; and the perspective of lifelong...... on an efficient interaction between education, research, and professional practice....

  7. High performance liquid chromatography-charged aerosol detection applying an inverse gradient for quantification of rhamnolipid biosurfactants.

    Science.gov (United States)

    Behrens, Beate; Baune, Matthias; Jungkeit, Janek; Tiso, Till; Blank, Lars M; Hayen, Heiko

    2016-07-15

    A method using high performance liquid chromatography coupled to charged-aerosol detection (HPLC-CAD) was developed for the quantification of rhamnolipid biosurfactants. Qualitative sample composition was determined by liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS). The relative quantification of different rhamnolipid derivatives, including di-rhamnolipids, mono-rhamnolipids, and their precursors 3-(3-hydroxyalkanoyloxy)alkanoic acids (HAAs), differed between two LC-MS instruments and revealed instrument-dependent responses. The HPLC-CAD method reported here provides a uniform response. An inverse gradient was applied for the absolute quantification of rhamnolipid congeners to account for the detector's dependency on the solvent composition. The CAD produces a uniform response not only for the analytes but also for structurally different (nonvolatile) compounds, and it was demonstrated that n-dodecyl-β-d-maltoside or deoxycholic acid can be used as alternative standards. The method of HPLC-ultraviolet (UV) detection after derivatization of rhamnolipids and HAAs to their corresponding phenacyl esters confirmed the obtained results but required additional, laborious sample preparation steps. Sensitivity, determined as limit of detection and limit of quantification for four mono-rhamnolipids, was in the range of 0.3-1.0 and 1.2-2.0 μg/mL, respectively, for HPLC-CAD and 0.4 and 1.5 μg/mL, respectively, for HPLC-UV. Linearity for HPLC-CAD was at least 0.996 (R²) in the calibrated range of about 1-200 μg/mL. Hence, the HPLC-CAD method presented here allows absolute quantification of rhamnolipids and derivatives.
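
    Limits of the kind quoted above are commonly derived from a linear calibration via the 3.3σ/slope (LOD) and 10σ/slope (LOQ) rule, with σ the residual standard deviation of the fit. A minimal sketch with made-up calibration points; nothing here is data from the study.

      import numpy as np

      def lod_loq(conc, response):
          """LOD/LOQ from a linear calibration (ICH-style 3.3/10 sigma rule)."""
          slope, intercept = np.polyfit(conc, response, 1)
          residuals = response - (slope * conc + intercept)
          sigma = residuals.std(ddof=2)        # two fitted parameters
          return 3.3 * sigma / slope, 10 * sigma / slope

      conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 200.0])   # ug/mL, hypothetical
      resp = 4.2 * conc + np.random.default_rng(1).normal(0, 2.0, conc.size)
      lod, loq = lod_loq(conc, resp)
      print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")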

  8. Alternative Work Schedules: Many Agencies Do Not Allow Employees the Full Flexibility Permitted by Law. Report to Congressional Committees.

    Science.gov (United States)

    General Accounting Office, Washington, DC. General Government Div.

    A review was conducted of the extent to which selected federal agencies are allowing employees to use alternative work schedules (AWS) as authorized by the Federal Employees Flexible and Compressed Work Schedules Act. The statute permits, rather than requires, agencies to institute AWS programs. The study surveyed the policies and practices of 59…

  9. Novel isotopic N, N-Dimethyl Leucine (iDiLeu) Reagents Enable Absolute Quantification of Peptides and Proteins Using a Standard Curve Approach

    Science.gov (United States)

    Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun

    2015-01-01

    Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N,N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost-effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods were validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
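
    The one-run standard curve amounts to fitting reporter intensity against known amount for the four standard channels and inverting the fit for the sample channel. A schematic sketch; the amounts and intensities are invented, not values from the paper.

      import numpy as np

      # Four iDiLeu standard channels at known amounts (fmol), hypothetical values
      amounts = np.array([10.0, 50.0, 100.0, 500.0])
      intensities = np.array([2.1e4, 9.8e4, 2.05e5, 1.01e6])

      slope, intercept = np.polyfit(amounts, intensities, 1)

      # Fifth channel carries the unknown sample
      sample_intensity = 3.4e5
      sample_amount = (sample_intensity - intercept) / slope
      print(f"estimated analyte amount: {sample_amount:.1f} fmol")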

  10. Calculation of the Incremental Conditional Core Damage Probability on the Extension of Allowed Outage Time

    International Nuclear Information System (INIS)

    Kang, Dae Il; Han, Sang Hoon

    2006-01-01

    RG 1.177 requires that the conditional risk (incremental conditional core damage probability and incremental conditional large early release probability: ICCDP and ICLERP), given that a specific component is out of service (OOS), be quantified for a permanent change of the allowed outage time (AOT) of a safety system. An AOT is the length of time that a particular component or system is permitted to be OOS while the plant is operating. The ICCDP is defined as: ICCDP = [(conditional CDF with the subject equipment OOS) - (baseline CDF with nominal expected equipment unavailabilities)] × [duration of the single AOT under consideration]. Any event rendering the component OOS can start the time clock for the limiting condition of operation for a nuclear power plant. Thus, the largest ICCDP among the ICCDPs estimated from any occurrence of the basic events for the component fault tree should be selected when determining whether the AOT can be extended. If the component is under preventive maintenance, the conditional risk can be calculated straightforwardly without changing the CCF probability. The main concern is the estimation of the CCF probabilities, because the same root causes may also fail other similar components. The quantifications of the risk, given that the subject equipment is in a failed state, are performed by setting the identified event of the subject equipment to TRUE. The CCF probabilities are also changed according to the identified failure cause. In previous studies, however, the ICCDP was quantified with consideration of the possibility of a simultaneous occurrence of two CCF events. Based on the above, we derived formulas for the CCF probabilities for the cases where a specific component is in a failed state, and we present sample calculation results of the ICCDP for the low pressure safety injection system (LPSIS) of Ulchin Unit 3.
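
    Numerically the definition is a single expression; the sketch below only shows the unit handling (CDFs per reactor-year, AOT in hours). The CDF values and the 72-hour AOT are invented for illustration.

      HOURS_PER_YEAR = 8760.0

      def iccdp(cdf_conditional, cdf_baseline, aot_hours):
          """Incremental conditional core damage probability for one AOT.
          CDFs are in units of events per reactor-year."""
          return (cdf_conditional - cdf_baseline) * aot_hours / HOURS_PER_YEAR

      # Hypothetical: conditional CDF 5e-5/yr, baseline 2e-5/yr, 72 h AOT
      print(f"ICCDP = {iccdp(5e-5, 2e-5, 72.0):.2e}")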

  11. Quantification accuracy and partial volume effect in dependence of the attenuation correction of a state-of-the-art small animal PET scanner

    International Nuclear Information System (INIS)

    Mannheim, Julia G; Judenhofer, Martin S; Schmid, Andreas; Pichler, Bernd J; Tillmanns, Julia; Stiller, Detlef; Sossi, Vesna

    2012-01-01

    Quantification accuracy and partial volume effect (PVE) of the Siemens Inveon PET scanner were evaluated. The influence of transmission source activities (40 and 160 MBq) on the quantification accuracy and the PVE were determined. Dynamic range, object size and PVE for different sphere sizes, contrast ratios and positions in the field of view (FOV) were evaluated. The acquired data were reconstructed using different algorithms and correction methods. The activity level of the transmission source and the total emission activity in the FOV strongly influenced the attenuation maps. Reconstruction algorithms, correction methods, object size and location within the FOV had a strong influence on the PVE in all configurations. All evaluated parameters potentially influence the quantification accuracy. Hence, all protocols should be kept constant during a study to allow a comparison between different scans. (paper)

  12. Funding child rearing: child allowance and parental leave.

    Science.gov (United States)

    Walker, J R

    1996-01-01

    This article proposes two financing plans to address what the author identifies as the two primary concerns in the child care field: (1) a child allowance for poor and near-poor households to address the child care problems of low-income families, and (2) a program of voluntary parental leave, available to all parents at childbirth or adoption, to ensure the adequacy of infant care. The child allowance plan would cover the first three children in families up to 175% of the poverty level (more than 22 million children) at an annual cost of $45 billion. The author suggests that the allowance could be financed by redirecting funds from existing income support (for example, Aid to Families with Dependent Children), tax credit, and tax deduction programs. Financing the parental leave program would require new revenues, generated by an employee-paid increase in payroll tax totaling 3.5%. Each employee's contributions would create a parental leave account (PLA). Families could use the funds in these accounts to cover the cost of a one-year leave from work after the birth or adoption of a child. If families did not have enough dollars in their accounts to cover the cost of the leave, the federal government would extend a low-interest loan to them, which they would have to pay back. The amount individuals receive through Social Security would be adjusted upward or downward according to the balances in their parental leave accounts at retirement. The author suggests that both proposals would help parents balance work and family obligations and protect parental freedom of choice over the care and upbringing of their children.

  13. Environmental surveys

    International Nuclear Information System (INIS)

    Costa-Ribeiro, C.

    1977-01-01

    An environmental survey conducted in areas of high natural radioactivity and the methods used to evaluate the radiation doses received by the population are presented. Absorbed doses due to the ingestion of radioactively contaminated food and water are shown. Exposure to external gamma radiation fields and inhalation of abnormal quantities of natural airborne radioactivity are discussed [pt

  14. Survey < > Creation

    DEFF Research Database (Denmark)

    2017-01-01

    The project, Survey Creation suggests that point cloud models from 3D scans of an existing space can be the source for explorative drawings. By probing into the procedure of 3D laser scanning, it became possible to make use of the available point clouds to both access geometric representation......) and the creation drawing (of the anticipated)....

  15. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and to study their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with the fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked over that period. A machine learning-based computational framework dedicated to the automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage at different time points. An uptake of 85% by HCT 116 cells is observed after 24 hours of incubation at an NW-to-cell ratio of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the utilization of a machine learning-based, time-resolved automatic analysis pipeline for the quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and with another cell line derived from cervix carcinoma, HeLa. It thus has the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell type.

  16. Quantification of water in hydrous ringwoodite

    Directory of Open Access Journals (Sweden)

    Sylvia-Monique Thomas

    2015-01-01

    Ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth’s mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78,180 to 158,880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimates of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum for a natural hydrous ringwoodite inclusion in diamond from the study of Pearson et al. (2014) indicates the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming near-maximum amounts of H2O for this sample from the transition zone.
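
    Turning an IR spectrum into water content with such integrated absorption coefficients is a Beer-Lambert calculation: integrated absorbance divided by (coefficient × thickness) gives mol H2O per litre, which the mineral density converts to wt%. A sketch under assumed values; the absorbance, thickness, coefficient and density below are illustrative only.

      def water_wt_percent(integrated_abs, thickness_cm, eps_int, density_g_per_l):
          """Beer-Lambert: A_int / (eps_int * t) = c (mol/L); convert to wt% H2O."""
          mol_per_l = integrated_abs / (eps_int * thickness_cm)
          return 100.0 * mol_per_l * 18.015 / density_g_per_l

      # Hypothetical sample: 30 um thick section, A_int = 250 cm^-1,
      # eps_int = 1.0e5 L mol^-1 cm^-2, ringwoodite density ~3.90 g/cm^3
      print(f"{water_wt_percent(250.0, 0.003, 1.0e5, 3900.0):.2f} wt% H2O")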

  17. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  18. Predicted allowable doses to normal organs for biologically targeted radiotherapy

    International Nuclear Information System (INIS)

    O'Donoghue, J.A.; Wheldon, T.E.; Western Regional Hospital Board, Glasgow

    1988-01-01

    The authors have used Dale's extension of the "linear quadratic" (LQ) model (Dale, 1985) to evaluate "equivalent doses" in cases involving exponentially decaying dose rates. This analysis indicates that the dose-rate effect will be a significant determinant of allowable doses to organs such as liver, kidney and lung. These organ tolerance doses constitute independent constraints on the therapeutic intensity of biologically targeted radiotherapy in exactly the same way as for conventional external beam radiotherapy. In the context of marrow rescue they will in all likelihood constitute the dose-limiting side-effects and thus be especially important. (author)
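
    For reference, a commonly quoted form of Dale's result for a mono-exponentially decaying dose rate (initial dose rate R_0, decay constant λ, mono-exponential repair constant μ) gives the biologically effective dose delivered to complete decay as:

      \mathrm{BED} \;=\; \frac{R_0}{\lambda}\left[\,1 \;+\; \frac{R_0}{(\mu + \lambda)\,(\alpha/\beta)}\right]

    Setting this equal to an organ's BED tolerance (with the organ's own α/β and μ) and solving for R_0 is one way to back out a predicted allowable dose; this form is quoted from the general brachytherapy literature and has not been checked against the paper's own notation.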

  19. Maximum allowable heat flux for a submerged horizontal tube bundle

    International Nuclear Information System (INIS)

    McEligot, D.M.

    1995-01-01

    For application to industrial heating of large pools by immersed heat exchangers, the so-called maximum allowable (or "critical") heat flux is studied for unconfined tube bundles aligned horizontally in a pool without forced flow. In general, we are considering boiling after the pool reaches its saturation temperature, rather than the sub-cooled pool boiling which should occur during the early stages of transient operation. A combination of literature review and simple approximate analysis has been used. To date our main conclusion is that estimates of q''_chf are highly uncertain for this configuration.
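
    As a point of comparison for such estimates, the classical Zuber correlation for saturated pool boiling on a flat surface is a common starting point before geometry-specific corrections for bundles. A hedged sketch for water at atmospheric pressure; the choice of correlation and the approximate property values are assumptions, not the paper's analysis.

      def zuber_chf(h_fg, rho_g, rho_f, sigma, g=9.81):
          """Zuber critical heat flux (W/m^2) for saturated pool boiling:
          q''_chf = 0.131 * h_fg * rho_g^0.5 * [sigma * g * (rho_f - rho_g)]^0.25"""
          return 0.131 * h_fg * rho_g**0.5 * (sigma * g * (rho_f - rho_g))**0.25

      # Approximate saturated water properties at 1 atm
      q = zuber_chf(h_fg=2.257e6, rho_g=0.598, rho_f=958.0, sigma=0.0589)
      print(f"q''_chf ~ {q / 1e6:.2f} MW/m^2")   # about 1.1 MW/m^2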

  20. Allowable carbon emissions for medium-to-high mitigation scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Tachiiri, Kaoru; Hargreaves, Julia C.; Annan, James D.; Kawamiya, Michio [Research Inst. for Global Change, Japan Agency for Marine-Earth Science and Technology, Yokohama, (Japan)], e-mail: tachiiri@jamstec.go.jp; Huntingford, Chris [Centre for Ecology and Hydrology, Wallingford (United Kingdom)

    2013-11-15

    Using an ensemble of simulations with an intermediate complexity climate model and in a probabilistic framework, we estimate future ranges of carbon dioxide (CO2) emissions in order to follow three medium-high mitigation concentration pathways: RCP2.6, RCP4.5 and SCP4.5 to 2.6. Uncertainty is first estimated by allowing modelled equilibrium climate sensitivity, aerosol forcing and intrinsic physical and biogeochemical processes to vary within widely accepted ranges. Results are then constrained by comparison against contemporary measurements. For both constrained and unconstrained projections, our calculated allowable emissions are close to the standard (harmonised) emission scenarios associated with these pathways. For RCP4.5, which is the most moderate scenario considered in terms of required emission abatement, then after year 2100 very low net emissions are needed to maintain prescribed year 2100 CO2 concentrations. As expected, RCP2.6 and SCP4.5 to 2.6 require more strict emission reductions. The implication of this is that direct sequestration of carbon dioxide is likely to be required for RCP4.5 or higher mitigation scenarios, to offset any minimum emissions for society to function (the 'emissions floor'). Despite large uncertainties in the physical and biogeochemical processes, constraints from model-observational comparisons support a high degree of confidence in predicting the allowable emissions consistent with a particular concentration pathway. In contrast the uncertainty in the resulting temperature range remains large. For many parameter sets, and especially for RCP2.6, the land will turn into a carbon source within the twenty first century, but the ocean will remain as a carbon sink. For land carbon storage and our modelling framework, major reductions are seen in northern high latitudes and the Amazon basin even after atmospheric CO2 is stabilised, while for ocean carbon uptake, the tropical ocean regions will be a

  1. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    Science.gov (United States)

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling; and (4) matrix assisted laser desorption ionization time of flight mass spectrometry (MALDI) analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with parallel losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just a few minutes. To do so, it takes several MALDI spectra as input and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with parallel losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase.
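
    The heart of the direct/inverse comparison can be pictured as follows: for each peptide, compute the labeled-to-unlabeled intensity ratio in the direct experiment and check it against the reciprocal ratio from the inverse experiment; only peptides whose ratios track are trusted as internal standards. This is a schematic sketch of that idea, not DPD's actual algorithm; the peptide names, intensities and tolerance are invented.

      def consistent_peptides(direct, inverse, tol=0.15):
          """Keep peptides whose direct 18O/16O ratio matches the reciprocal
          ratio from the inverse labeling experiment within a relative tolerance.

          direct, inverse: dict peptide -> (labeled_intensity, unlabeled_intensity)
          """
          selected = []
          for pep in direct.keys() & inverse.keys():
              r_direct = direct[pep][0] / direct[pep][1]
              r_inverse = inverse[pep][1] / inverse[pep][0]   # reciprocal view
              if abs(r_direct - r_inverse) / r_direct <= tol:
                  selected.append(pep)
          return selected

      direct = {"LFTGHPETLEK": (8.0e4, 7.6e4), "HGTVVLTALGGILK": (5.1e4, 9.9e4)}
      inverse = {"LFTGHPETLEK": (7.7e4, 7.5e4), "HGTVVLTALGGILK": (6.0e4, 5.4e4)}
      print(consistent_peptides(direct, inverse))   # -> ['LFTGHPETLEK']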

  2. Statistical considerations of graphite strength for assessing design allowable stresses

    International Nuclear Information System (INIS)

    Ishihara, M.; Mogi, H.; Ioka, I.; Arai, T.; Oku, T.

    1987-01-01

    Several aspects of statistics need to be considered in determining design allowable stresses for graphite structures. These include: 1) the statistical variation of graphite material strength; 2) the uncertainty of the calculated stress; 3) the reliability (survival probability) required by the operational and safety performance of graphite structures. This paper deals with some statistical considerations of structural graphite for assessing design allowable stress. Firstly, probability distribution functions of tensile and compressive strengths are investigated for experimental Very High Temperature candidate graphites. Normal, logarithmic normal and Weibull distribution functions are compared in terms of their coefficient of correlation to the measured strength data. This leads to the adoption of the normal distribution function. Then, the relation between the factor of safety and the fracture probability is discussed for the following items: 1) as graphite strength is more variable than the strength of metallic materials, the effect of strength variation on the fracture probability is evaluated; 2) the fracture probability corresponding to a survival probability of 99-99.9% with a confidence level of 90-95% is discussed; 3) as the material properties used in the design analysis are usually the mean values of their variation, the additional effect of these variations on the fracture probability is discussed. Finally, the way to assure the minimum ultimate strength with the required survival probability and confidence level is discussed in view of the statistical treatment of strength data from varying sample numbers in a material acceptance test. (author)
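
    Under the adopted normal strength model, the link between the factor of safety and the fracture probability is direct: if the design stress is the mean strength divided by the safety factor SF, the fracture probability is Φ((μ/SF − μ)/s). A sketch with illustrative numbers; the strength values are assumptions, not data from the paper.

      from scipy.stats import norm

      def fracture_probability(mean_strength, std_strength, safety_factor):
          """P(strength < design stress) for a normal strength distribution,
          with design stress = mean strength / safety factor."""
          design_stress = mean_strength / safety_factor
          return norm.cdf((design_stress - mean_strength) / std_strength)

      # Illustrative graphite tensile strength: mean 25 MPa, CV 10%
      for sf in (1.5, 2.0, 3.0):
          p = fracture_probability(25.0, 2.5, sf)
          print(f"SF = {sf}: fracture probability ~ {p:.2e}")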

  3. Contested change: how Germany came to allow PGD

    Directory of Open Access Journals (Sweden)

    Bettina Bock von Wülfingen

    2016-12-01

    Until recently, German laws protecting the human embryo from the moment of conception were some of the strictest internationally. These laws had previously prevented any manipulation of the embryo, such as in preimplantation genetic diagnosis (PGD), and continue to affect stem cell research. In 2011, however, the German parliament voted in favour of allowing PGD in specific cases. While the modification in the law in earlier analysis was interpreted as being in keeping with the usual norms in Germany, this article argues instead that the reasoning behind the partial acceptance of PGD, rather than the legal decision itself, is indicative of a sociocultural change that needs to be accredited. Demonstrating that a significant change occurred, this article analyses the arguments that led to the amendment in law: not only has the identity of the embryo been redefined towards a pragmatic concept but the notions of parenting and pregnancy have also changed. The focus on the mother and the moment of birth has given way to a focus on conception and ‘genetic couplehood’. The professional discourse preceding the decision allowing PGD suggested that the rights of the not-yet-implanted embryo should be negotiated with those of the two parents-to-be, a concept that may be called ‘in-vitro pregnancy’.

  4. [Niacin allowance of students of a sports college].

    Science.gov (United States)

    Borisov, I M

    1977-01-01

    In 227 students of the Institute for Physical Culture examined in the winter-spring and summer-fall seasons of the year, the hourly excretion of N1-methylnicotinamide (MNA) in urine on an empty stomach amounted to 245 +/- 15.9 and 311 +/- 14.6 microgram/hour (the difference between seasons is significant). These figures point to the dependence of MNA excretion in urine on the quantity of niacin equivalents supplied with the food. The content of such equivalents in the rations of the student sportsmen (7-9.5 mg per 1000 calories per day) proved insufficient to maintain MNA excretion in urine at the level accepted as a standard niacin allowance for the organism, i.e., 400-500 microgram/hour. Furthermore, the author shows changes in the niacin allowances of the students' organisms depending on the kind of sporting activity, the sporting qualification of the examinees, the work performed by them, the periods of training, and the conditions of their everyday life.

  5. Characteristics and allowed behaviors of gay male couples' sexual agreements.

    Science.gov (United States)

    Mitchell, Jason W

    2014-01-01

    Research has shown that gay male couples' sexual agreements may affect their risk for HIV. Few U.S. studies have collected dyadic data nationally from gay male couples to assess what sexual behaviors they allow to occur by agreement type and the sequence of when certain behaviors occur within their relationships. In our cross-sectional study, dyadic data from a convenience sample of 361 male couples were collected electronically throughout the United States by using paid Facebook ads. Findings revealed that couples discussed their HIV status before having unprotected anal intercourse (UAI) but established their agreement some time after having UAI. About half of the couples (N = 207) concurred about having an agreement. Among these couples, 58% concurred about explicitly discussing their agreement, 84% concurred about having the same type of agreement, and 54% had both men adhering to it. A variety of sexual behaviors were endorsed and varied by agreement type. Concordance about aspects of couples' agreements varied, suggesting the need to engage couples to be more explicit and detailed when establishing and communicating about their agreements. The allowed behaviors and primary reasons for establishing and breaking sexual agreements further highlight the need to bolster HIV prevention for gay male couples.

  6. Evaluation of allowed outage time using PRA results

    International Nuclear Information System (INIS)

    Johanson, G.

    1985-01-01

    In a probabilistic risk assessment (PRA), different measures of risk importance can be established. These measures can be used as a basis for further evaluation and determination of the allowed outage time for specific components within the safety systems of a nuclear power plant. In order to optimize the allowed outage time (AOT) stipulated in the plant's Technical Specifications, it is necessary to create a methodology which can incorporate existing PRA data into a quantitative extrapolation. In order to evaluate the plant risk status due to an AOT in a quantitative manner, the risk achievement worth is utilized. Risk achievement worth is defined as follows: to measure the worth of a feature in achieving the present risk, one approach is to remove the feature and then determine how much the risk has increased. Thus, the risk achievement worth is formally defined to be the increase in risk if the feature were assumed not to be there or to be failed. Another parameter of interest for this analysis is the shutdown risk increase. The shutdown risk achievement worth must be incorporated into the accident sequence risk achievement worth to arrive at an optimal set of plant-specific AOTs.
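
    In symbols, with R_0 the baseline risk metric (e.g. core damage frequency) and R(q_i = 1) the risk with feature i assumed failed, the risk achievement worth is commonly written either as a ratio or as an increment; the notation below follows general PRA usage rather than this paper's own:

      \mathrm{RAW}_i \;=\; \frac{R(q_i = 1)}{R_0}
      \qquad\text{or}\qquad
      \Delta R_i \;=\; R(q_i = 1) - R_0

    Multiplying the increment by an outage duration T then gives the incremental risk of a single AOT, which is the quantity traded off when an AOT extension is considered.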

  7. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    Science.gov (United States)

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method was developed for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean), using normal-phase liquid chromatography and chemometric tools. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). Notably, the chromatographic analysis of the proposed method takes only eight minutes, and the results obtained demonstrate its potential for the quantification of mixtures of olive oil and palm oil with other vegetable oils.
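
    As an illustration of the chemometric step, a partial least squares regression can map chromatographic fingerprints to blend percentages. The sketch below uses scikit-learn on synthetic fingerprints; all data are invented and the preprocessing of the real method is omitted.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Synthetic "fingerprints": 120 blends x 300 retention-time points,
      # response y = % olive oil in the blend
      y = rng.uniform(0, 100, 120)
      basis = rng.normal(size=(2, 300))
      X = np.outer(y, basis[0]) + np.outer(100 - y, basis[1])
      X += rng.normal(0, 0.5, X.shape)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      pls = PLSRegression(n_components=2).fit(X_tr, y_tr)
      pred = pls.predict(X_te).ravel()
      rmsev = np.sqrt(np.mean((pred - y_te) ** 2))
      print(f"RMSEV = {rmsev:.2f} % olive oil")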

  8. Multi-tissue partial volume quantification in multi-contrast MRI using an optimised spectral unmixing approach.

    Science.gov (United States)

    Collewet, Guylaine; Moussaoui, Saïd; Deligny, Cécile; Lucas, Tiphaine; Idier, Jérôme

    2018-06-01

    Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The contribution of this paper is twofold. It firstly proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which in the context of multi-contrast MRI data acquisition allows the imaging sequence parameters to be set appropriately. Secondly, an efficient proportion quantification algorithm is proposed, based on the minimisation of a penalised least-squares criterion incorporating a regularity constraint on the spatial distribution of the proportions. The resulting developments are discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis, on the proving of a Danish pastry.
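
    Stripped of the spatial penalty, the per-voxel problem is constrained linear unmixing: the multi-contrast signal is modelled as a mixture of pure-tissue signatures and the proportions are recovered by, for example, non-negative least squares. A simplified sketch; the signatures and noise level are invented, and the paper's actual method additionally penalises spatial irregularity.

      import numpy as np
      from scipy.optimize import nnls

      # Columns: two pure-tissue signatures across 4 hypothetical MRI contrasts
      S = np.array([[1.00, 0.20],
                    [0.65, 0.45],
                    [0.40, 0.80],
                    [0.15, 0.95]])

      true_props = np.array([0.7, 0.3])
      voxel = S @ true_props + np.random.default_rng(2).normal(0, 0.01, 4)

      props, _ = nnls(S, voxel)
      props /= props.sum()                 # enforce sum-to-one a posteriori
      print(f"estimated proportions: {props.round(3)}")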

  9. Survey and analysis of deep water mineral deposits using nuclear methods

    International Nuclear Information System (INIS)

    Staehle, C.M.; Noakes, J.E.; Spaulding, J.

    1991-01-01

    Present knowledge of the location, quality, quantity and recoverability of sea floor minerals is severely limited, particularly in the abyssal depths and deep water within the 200 mile Exclusive Economic Zone (EEZ) surrounding the U.S. Pacific Islands. To improve this understanding and permit exploitation of these mineral reserves, much additional data is needed. This paper discusses a sponsored program for extending existing proven nuclear survey methods, currently used on the shallow continental margins of the Atlantic and Gulf of Mexico, into the deeper waters of the Pacific. This nuclear technology can be readily integrated and extended to depths of 2000 m using the existing RCV-150 remotely operated vehicle (ROV) and the PISCES V manned deep submersible vehicle (DSV) operated by the University of Hawaii's Hawaii Undersea Research Laboratory (HURL). Previous papers by the authors have also proposed incorporating these nuclear analytical methods for surveying the deep ocean through the use of an Autonomous Underwater Vehicle (AUV). Such a vehicle could extend the use of passive nuclear instrument operation, in addition to conventional analytical methods, into the abyssal depths, and do so with a speed and economy not otherwise possible. The natural radioactivity associated with manganese nodules and crustal deposits is sufficiently above normal background levels to allow discrimination and quantification in near real time.

  10. Quantification of silver nanoparticle uptake and distribution within individual human macrophages by FIB/SEM slice and view.

    Science.gov (United States)

    Guehrs, Erik; Schneider, Michael; Günther, Christian M; Hessing, Piet; Heitz, Karen; Wittke, Doreen; López-Serrano Oliver, Ana; Jakubowski, Norbert; Plendl, Johanna; Eisebitt, Stefan; Haase, Andrea

    2017-03-21

    Quantification of nanoparticle (NP) uptake in cells or tissues is very important for safety assessment. Often, electron microscopy based approaches are used for this purpose, which allow imaging at very high resolution. However, precise quantification of NP numbers in cells and tissues remains challenging. The aim of this study was to present a novel approach, that combines precise quantification of NPs in individual cells together with high resolution imaging of their intracellular distribution based on focused ion beam/ scanning electron microscopy (FIB/SEM) slice and view approaches. We quantified cellular uptake of 75 nm diameter citrate stabilized silver NPs (Ag 75 Cit) into an individual human macrophage derived from monocytic THP-1 cells using a FIB/SEM slice and view approach. Cells were treated with 10 μg/ml for 24 h. We investigated a single cell and found in total 3138 ± 722 silver NPs inside this cell. Most of the silver NPs were located in large agglomerates, only a few were found in clusters of fewer than five NPs. Furthermore, we cross-checked our results by using inductively coupled plasma mass spectrometry and could confirm the FIB/SEM results. Our approach based on FIB/SEM slice and view is currently the only one that allows the quantification of the absolute dose of silver NPs in individual cells and at the same time to assess their intracellular distribution at high resolution. We therefore propose to use FIB/SEM slice and view to systematically analyse the cellular uptake of various NPs as a function of size, concentration and incubation time.

  11. Quantification of carbonate by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Tsikas, Dimitrios; Chobanyan-Jürgens, Kristine

    2010-10-01

    Carbon dioxide and carbonates are widely distributed in nature, are constituents of inorganic and organic matter, and are essential in vegetable and animal organisms. CO(2) is the principal greenhouse gas in the atmosphere. In human blood, CO(2)/HCO(3)(-) is an important buffering system. Quantification of bicarbonate and carbonate in inorganic and organic matter and in biological fluids such as blood or blood plasma by means of the GC-MS technology has been impossible so far, presumably because of the lack of suitable derivatization reactions to produce volatile and thermally stable derivatives. Here, a novel derivatization reaction is described for carbonate that allows for its quantification in aqueous alkaline solutions and alkalinized plasma and urine. Carbonate in acetonic solutions of these matrices (1:4 v/v) and added (13)C-labeled carbonate for use as the internal standard were heated in the presence of the derivatization agent pentafluorobenzyl (PFB) bromide for 60 min at 50 °C. Investigations with (12)CO(3)(2-), (13)CO(3)(2-), (CH(3))(2)CO, and (CD(3))(2)CO in alkaline solutions and GC-MS and GC-MS/MS analyses under negative-ion chemical ionization (NICI) or electron ionization (EI) conditions of toluene extracts of the reactants revealed formation of two minor [i.e., PFB-OCOOH and O=CO(2)-(PFB)(2)] and two major [i.e., CH(3)COCH(2)-C(OH)(OPFB)(2) and CH(3)COCH=C(OPFB)(2)] carbonate derivatives. The latter have different retention times (7.9 and 7.5 min, respectively) but virtually identical EI and NICI mass spectra. It is assumed that CH(3)COCH(2)-C(OH)(OPFB)(2) is formed from the reaction of the carbonate dianion with two molecules of PFB bromide to form the diPFB ester of carbonic acid, which further reacts with one molecule of acetone. Subsequent loss of water finally generates the major derivative CH(3)COCH=C(OPFB)(2). This derivatization reaction was utilized to quantify total CO(2)/HCO(3)(-)/CO(3)(2-) (tCO(2)) in human plasma and urine by GC
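
    Because a (13)C-labeled internal standard is added, quantification reduces to isotope-dilution arithmetic: the analyte concentration is the measured (12)C/(13)C peak-area ratio multiplied by the known standard concentration. A minimal sketch; the peak areas and spike concentration are invented.

      def carbonate_conc(area_12c, area_13c, istd_mmol_per_l):
          """Isotope dilution: C_analyte = (A_12C / A_13C) * C_internal_standard."""
          return area_12c / area_13c * istd_mmol_per_l

      # Hypothetical GC-MS peak areas and a 25 mmol/L 13C-carbonate spike
      print(f"tCO2 = {carbonate_conc(5.6e6, 5.0e6, 25.0):.1f} mmol/L")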

  12. The greenhouse gases emissions allowances trading in the Czech Republic

    International Nuclear Information System (INIS)

    Chemisinec, Igor; Marvan, Miroslav; Tuma, Jiri

    2006-01-01

    The energy policy of the State is very important for a state's development. The aim of this policy is the development of the energy sector, which is essential for improving the quality of life and the living standards of people in every country. Unfortunately, energy sector development also has negative impacts, primarily on the environment. Some possible solutions exist for reducing these negative impacts. This paper deals with the reduction of greenhouse gas (GHG) emissions in the Czech Republic according to the Kyoto Protocol to the United Nations Framework Convention on Climate Change. The ultimate objective of the United Nations Framework Convention on Climate Change is to achieve stabilization of greenhouse gas concentrations in the atmosphere. GHG emissions allowance trading, as one of the instruments for stabilising GHG emissions, is described in the paper. (authors)

  13. AICPA allows low-cost options for compiled financial statements.

    Science.gov (United States)

    Reinstein, Alan; Luecke, Randall W

    2002-02-01

    The AICPA Accounting and Review Services Committee's (ARSC) SSARS No. 8, Amendment to Statement on Standards for Accounting and Review Services No. 1, Compilation and Review of Financial Statements, issued in October 2000, allows financial managers to provide plain-paper, compiled financial statements for the exclusive use of management. Such financial statements were disallowed in 1979 when the AICPA issued SSARS No. 1, Compilation and Review of Financial Statements. With the issuance of SSARS No. 8, financial managers can prepare plain-paper, compiled financial statements when third parties are not expected to rely on the financial statements, management acknowledges such restrictions in writing, and management acknowledges its primary responsibility for the adequacy of the financial statements.

  14. Finding words in a language that allows words without vowels.

    Science.gov (United States)

    El Aissati, Abder; McQueen, James M; Cutler, Anne

    2012-07-01

    Across many languages from unrelated families, spoken-word recognition is subject to a constraint whereby potential word candidates must contain a vowel. This constraint minimizes competition from embedded words (e.g., in English, disfavoring win in twin because t cannot be a word). However, the constraint would be counter-productive in certain languages that allow stand-alone vowelless open-class words. One such language is Berber (where t is indeed a word). Berber listeners here detected words affixed to nonsense contexts with or without vowels. Length effects seen in other languages replicated in Berber, but in contrast to prior findings, word detection was not hindered by vowelless contexts. When words can be vowelless, otherwise universal constraints disfavoring vowelless words do not feature in spoken-word recognition.

  15. Short peptides allowing preferential detection of Candida albicans hyphae.

    Science.gov (United States)

    Kaba, Hani E J; Pölderl, Antonia; Bilitewski, Ursula

    2015-09-01

    Whereas the detection of pathogens via recognition of surface structures by specific antibodies and various types of antibody mimics is frequently described, the applicability of short linear peptides as sensor molecules or diagnostic tools is less well-known. We selected peptides which were previously reported to bind to recombinant S. cerevisiae cells, expressing members of the C. albicans Agglutinin-Like-Sequence (ALS) cell wall protein family. We slightly modified amino acid sequences to evaluate peptide sequence properties influencing binding to C. albicans cells. Among the selected peptides, decamer peptides with an "AP"-N-terminus were superior to shorter peptides. The new decamer peptide FBP4 stained viable C. albicans cells more efficiently in their mature hyphal form than in their yeast form. Moreover, it allowed distinction of C. albicans from other related Candida spp. and could thus be the basis for the development of a useful tool for the diagnosis of invasive candidiasis.

  16. Public pensions, family allowances and endogenous demographic change.

    Science.gov (United States)

    Peters, W

    1995-05-01

    "A tax-transfer system deals with redistribution a PAYGmong generations and corrective taxation a PAYGt the same time. Since such a policy is a government's task, we take a normative approach and pose the question: Which tax-transfer system should a government apply to maximize social welfare? The framework we consider allows for endogenous demographic aspects...: first, fertility has a great impact on a PAYG [pay-as-you-go] financed pension insurance; and second, through education human capital is accumulated.... We analyzed the optimal extent of a public pension scheme in the presence of external effects of fertility and education on the net domestic product." Pension schemes in Germany and the United States are compared. excerpt

  17. Gamma camera based Positron Emission Tomography: a study of the viability on quantification

    International Nuclear Information System (INIS)

    Pozzo, Lorena

    2005-01-01

    Positron Emission Tomography (PET) is a Nuclear Medicine imaging modality for diagnostic purposes. Pharmaceuticals labeled with positron emitters are used, and images which represent the in vivo biochemical processes within tissues can be obtained. The positron/electron annihilation photons are detected in coincidence and this information is used for object reconstruction. Presently, there are two types of systems available for this imaging modality: dedicated systems and those based on gamma camera technology. In this work, we utilized PET/SPECT systems, which also allow for traditional Nuclear Medicine studies based on single photon emitters. There are inherent difficulties which affect the quantification of activity and other indices. They are related to the Poisson nature of radioactivity, to radiation interactions with the patient body and the detector, to noise due to the statistical nature of these interactions and of all the detection processes, as well as to the patient acquisition protocols. Corrections are described in the literature and not all of them are implemented by the manufacturers: scatter, attenuation, randoms, decay, dead time, spatial resolution, and others related to the properties of each piece of equipment. The goal of this work was to assess the methods adopted by two manufacturers, as well as the influence of some technical characteristics of PET/SPECT systems, on the estimation of SUV. Data from a set of phantoms were collected in 3D mode by one camera and in 2D by the other. We concluded that quantification is viable in PET/SPECT systems, including the estimation of SUVs. This is only possible if, apart from the above mentioned corrections, the camera is well tuned and coefficients for sensitivity normalization and partial volume corrections are applied. We also verified that the shapes of the sources used for obtaining these factors play a role in the final results and should be dealt with carefully in clinical quantification. Finally, the choice of the region
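
    For reference, the SUV discussed here is the imaged activity concentration normalised by the decay-corrected injected dose per unit body mass. A minimal sketch of that arithmetic; the patient numbers are invented, and the default half-life is that of F-18.

      import math

      def suv(c_img_bq_per_ml, injected_mbq, body_weight_g,
              minutes_since_injection, half_life_min=109.77):
          """SUV (g/mL) = C_tissue / (decay-corrected dose / body weight)."""
          decayed_dose_bq = injected_mbq * 1e6 * math.exp(
              -math.log(2) * minutes_since_injection / half_life_min)
          return c_img_bq_per_ml / (decayed_dose_bq / body_weight_g)

      # Hypothetical: 12 kBq/mL lesion, 350 MBq injected, 70 kg patient, 60 min uptake
      print(f"SUV = {suv(12_000, 350, 70_000, 60):.2f}")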

  18. Offshore wind turbine risk quantification/evaluation under extreme environmental conditions

    International Nuclear Information System (INIS)

    Taflanidis, Alexandros A.; Loukogeorgaki, Eva; Angelides, Demos C.

    2013-01-01

    A simulation-based framework is discussed in this paper for quantification/evaluation of risk and development of automated risk assessment tools, focusing on applications to offshore wind turbines under extreme environmental conditions. The framework is founded on a probabilistic characterization of the uncertainty in the models for the excitation, the turbine and its performance. Risk is then quantified as the expected value of some risk consequence measure over the probability distributions considered for the uncertain model parameters. Stochastic simulation is proposed for the risk assessment, corresponding to the evaluation of some associated probabilistic integral quantifying risk, as it allows for the adoption of comprehensive computational models for describing the dynamic turbine behavior. For improvement of the computational efficiency, a surrogate modeling approach is introduced based on moving least squares response surface approximations. The assessment is also extended to a probabilistic sensitivity analysis that identifies the importance of each of the uncertain model parameters, i.e. risk factors, towards the total risk as well as towards each of the failure modes contributing to this risk. The versatility and computational efficiency of the advocated approaches is finally exploited to support the development of standalone risk assessment applets for automated implementation of the probabilistic risk quantification/assessment.
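
    The risk integral described, an expected consequence over the model-parameter distribution, is naturally estimated by stochastic simulation: draw parameter samples, run the (surrogate) response model, and average the consequence measure. A schematic sketch; the probabilistic model and the consequence function below are stand-ins for the actual turbine model, not the paper's.

      import numpy as np

      rng = np.random.default_rng(42)

      def consequence(theta):
          """Stand-in for the consequence measure evaluated through the turbine
          response model (in practice, a simulation or moving-least-squares
          surrogate)."""
          wave_height, wind_speed = theta
          demand = 0.04 * wave_height**2 + 0.01 * wind_speed**1.5
          return max(0.0, demand - 1.0)        # exceedance over unit capacity

      # Hypothetical probabilistic model for the uncertain parameters
      n = 20_000
      samples = np.column_stack([rng.lognormal(1.2, 0.3, n),    # wave height, m
                                 rng.weibull(2.0, n) * 12.0])   # wind speed, m/s

      risk = np.mean([consequence(t) for t in samples])
      print(f"estimated risk (expected consequence) = {risk:.4f}")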

  19. Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions.

    Science.gov (United States)

    Johnson, Derek R; Covington, April N; Clark, Nigel N

    2016-06-12

    The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny for methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). A variety of techniques of varied uncertainties exists to measure or estimate methane emissions from components or facilities, but only one commercial system is available for the quantification of component-level emissions, and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane emissions and greenhouse gases based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosion-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser based greenhouse gas sensor, data acquisition device, and analysis software. Dependent upon the blower and hose configuration employed, the current FFS is able to achieve a flow rate ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+). Co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications ranging from being carried by a person to being mounted in a hand drawn cart, on-road vehicle bed, or the bed of a utility terrain vehicle (UTV). The FFS is able to quantify methane emission rates with a relative uncertainty of ± 4.4%. The FFS has proven, real-world operation for the quantification of methane emissions occurring in conventional and remote facilities.
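
    At its core, the full-flow measurement multiplies the captured air flow by the background-corrected methane mole fraction, converted to a mass rate at standard conditions. A simplified sketch of that arithmetic; the constants are approximate and the flow and concentration values are invented.

      def methane_rate_g_per_s(flow_scfm, c_sample_ppm, c_background_ppm):
          """Mass emission rate from a full-flow sample.
          Standard conditions assumed: 1 mol of gas = 0.02404 m^3 (~20 C, 1 atm)."""
          flow_m3_s = flow_scfm * 0.0283168 / 60.0          # SCFM -> m^3/s
          mole_fraction = (c_sample_ppm - c_background_ppm) * 1e-6
          mol_per_s = flow_m3_s * mole_fraction / 0.02404   # ideal-gas molar volume
          return mol_per_s * 16.043                         # g CH4 per mol

      # Hypothetical: 800 SCFM captured flow, 150 ppm sample vs 2 ppm background
      print(f"{methane_rate_g_per_s(800, 150, 2):.3f} g CH4/s")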
