WorldWideScience

Sample records for surveys allowed quantification

  1. Survey and Evaluate Uncertainty Quantification Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). 
The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  2. Accelerator Mass Spectrometry Allows for Cellular Quantification of Doxorubicin at Femtomolar Concentrations

    Energy Technology Data Exchange (ETDEWEB)

    DeGregorio, M W; Dingley, K H; Wurz, G T; Ubick, E; Turteltaub, K W

    2005-04-12

    Accelerator mass spectrometry (AMS) is a highly sensitive analytical methodology used to quantify the content of radioisotopes, such as ¹⁴C, in a sample. The primary goals of this work were to demonstrate the utility of AMS in determining cellular [¹⁴C]doxorubicin (DOX) concentrations and to develop a sensitive assay that is superior to high performance liquid chromatography (HPLC) for the quantification of DOX at the tumor level. In order to validate the superior sensitivity of AMS versus HPLC with fluorescence detection, we performed three studies comparing the cellular accumulation of DOX: one in vitro cell line study, and two in vivo xenograft mouse studies. Using AMS, we quantified cellular DOX content up to 4 hours following in vitro exposure at concentrations ranging from 0.2 pg/ml (345 fM) to 2 µg/ml (3.45 µM) [¹⁴C]DOX. The results of this study show that, compared to standard fluorescence-based HPLC, the AMS method was over five orders of magnitude more sensitive. Two in vivo studies compared the sensitivity of AMS to HPLC using a nude mouse xenograft model in which breast cancer cells were implanted subcutaneously. After sufficiently large tumors formed, DOX was administered intravenously at two dose levels. Additionally, we tested the AMS method in a nude mouse xenograft model of multidrug resistance (MDR) in which each mouse was implanted with both wild type and MDR+ cells on opposite flanks. The results of the second and third studies showed that DOX concentrations were significantly higher in the wild type tumors compared to the MDR+ tumors, consistent with the MDR model. The extreme sensitivity of AMS should facilitate similar studies in humans to establish target site drug delivery and to potentially determine the optimal treatment dose and regimen.

  3. The Vital Role of Administrative Cost Allowances to Student Financial Aid Offices: Key Findings from NASFAA's Administrative Cost Allowance Survey, July 2011

    Science.gov (United States)

    National Association of Student Financial Aid Administrators (NJ1), 2011

    2011-01-01

    The National Association of Student Financial Aid Administrators (NASFAA) recently conducted a survey on the 2009-10 award year Administrative Cost Allowances (ACA), which are funds used by colleges and universities to support operations and professional development. Specifically, ACA is often used in essential areas that support the day-to-day…

  4. Predicting medical professionals' intention to allow family presence during resuscitation: A cross sectional survey.

    Science.gov (United States)

    Lai, Meng-Kuan; Aritejo, Bayu Aji; Tang, Jing-Shia; Chen, Chien-Liang; Chuang, Chia-Chang

    2017-05-01

    Family presence during resuscitation is an emerging trend, yet it remains controversial, even in countries with relatively high acceptance of family presence during resuscitation among medical professionals. Family presence during resuscitation is not common in many countries, and medical professionals in these regions are unfamiliar with it. Therefore, this study predicted medical professionals' intention to allow family presence during resuscitation by applying the theory of planned behaviour. A cross-sectional survey. A single medical centre in southern Taiwan. Medical staff, including physicians and nurses, in a single medical centre (n=714). A questionnaire was constructed to measure the theory of planned behaviour constructs of attitudes, subjective norms, perceived behavioural control, and behavioural intentions, as well as awareness of family presence during resuscitation and demographics. In total, 950 questionnaires were distributed to doctors and nurses in a medical centre. Among the 714 valid questionnaires, only 11 participants were aware of any association in Taiwan that promotes family presence during resuscitation; 94.7% replied that they were unsure (30.4%) or that their unit did not have a family presence during resuscitation policy (74.8%). Regression analysis was performed to predict medical professionals' intention to allow family presence during resuscitation. The results indicated that only positive attitudes and subjective norms regarding family presence during resuscitation and clinical tenure could predict the intention to allow family presence during resuscitation. Because family presence during resuscitation practice is not common in Taiwan and only 26.19% of the participants agreed to both items measuring the intention to allow family presence during resuscitation, we recommend the implementation of a family presence during resuscitation education program that will enhance the positive beliefs

  5. Quantification of Small Non-Coding RNAs Allows an Accurate Comparison of miRNA Expression Profiles

    Directory of Open Access Journals (Sweden)

    Andrea Masotti

    2009-01-01

    MicroRNAs (miRNAs) are highly conserved ∼22-mer RNA molecules, encoded by plants and animals, that regulate the expression of genes by binding to the 3′-UTR of specific target mRNAs. The amount of miRNAs in a total RNA sample depends on the recovery efficiency, which may be significantly affected by the different purification methods employed. Traditional approaches may be inefficient at recovering small RNAs, and common spectrophotometric determination is not adequate to selectively quantify these low molecular weight (LMW) species in total RNA samples. Here, we describe the use of qualitative and quantitative lab-on-a-chip tools for the analysis of these LMW RNA species. Our data emphasize the close correlation of LMW RNAs with the expression levels of some miRNAs. We therefore applied our result to the comparison of some miRNA expression profiles in different tissues. Finally, the methods we used in this paper allowed us to analyze the efficiency of extraction protocols, to study the small (but significant) differences among various preparations and to properly compare some miRNA expression profiles in various tissues.

  6. Estimating biomass, fishing mortality, and “total allowable discards” for surveyed non-target fish

    OpenAIRE

    Shephard, S.; Reid, D G; Gerritsen, H. D.; Farnsworth, K. D.

    2014-01-01

    Demersal fisheries targeting a few high-value species often catch and discard other “non-target” species. It is difficult to quantify the impact of this incidental mortality when population biomass of a non-target species is unknown. We calculate biomass for 14 demersal fish species in ICES Area VIIg (Celtic Sea) by applying species- and length-based catchability corrections to catch records from the Irish Groundfish Survey (IGFS). We then combine these biomass estimates with records of comme...

  7. The quantification of free Amadori compounds and amino acids allows to model the bound Maillard reaction products formation in soybean products

    NARCIS (Netherlands)

    Troise, Antonio Dario; Wiltafsky, Markus; Fogliano, Vincenzo; Vitaglione, Paola

    2018-01-01

    The quantification of protein bound Maillard reaction products (MRPs) is still a challenge in food chemistry. Protein hydrolysis is the bottleneck step: it is time consuming and the protein degradation is not always complete. In this study, the quantitation of free amino acids and Amadori products

  8. The quantification of free Amadori compounds and amino acids allows to model the bound Maillard reaction products formation in soybean products.

    Science.gov (United States)

    Troise, Antonio Dario; Wiltafsky, Markus; Fogliano, Vincenzo; Vitaglione, Paola

    2018-05-01

    The quantification of protein-bound Maillard reaction products (MRPs) is still a challenge in food chemistry. Protein hydrolysis is the bottleneck step: it is time consuming, and protein degradation is not always complete. In this study, the quantitation of free amino acids and Amadori products (APs) was compared to the percentage of blocked lysine by using chemometric tools. Eighty thermally treated soybean samples were analyzed by mass spectrometry to measure the concentration of free amino acids, free APs and the protein-bound markers of the Maillard reaction (furosine, Nε-(carboxymethyl)-l-lysine, Nε-(carboxyethyl)-l-lysine, total lysine). Results demonstrated that Discriminant Analysis (DA) and Correlated Component Regression (CCR) correctly estimated the percentage of blocked lysine in a validation and prediction set. These findings indicate that the measurement of free markers reflects the extent of protein damage in soybean samples and suggests the possibility of obtaining rapid information on the quality of industrial processes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Inference of pain stimulus level from stereotypical behavioral response of C.elegans allows quantification of effects of anesthesia and mutation

    Science.gov (United States)

    Leung, Kawai; Mohammadi, Aylia; Ryu, William; Nemenman, Ilya

    In animals, we must infer the pain level from experimental characterization of behavior. This is not trivial, since behaviors are complex and multidimensional. To establish C. elegans as a model for pain research, we propose for the first time a quantitative model that allows inference of a thermal nociceptive stimulus level from the behavior of an individual worm. We apply controlled levels of pain by locally heating worms with an infrared laser and capturing the subsequent behavior. We discover that the behavioral response is a product of stereotypical behavior and a nonlinear function of the strength of stimulus. The same stereotypical behavior is observed in normal, anesthetized and mutated worms. From this result we build a Bayesian model to infer the strength of the laser stimulus from the behavior. This model allows us to measure the efficacy of anesthetization and mutation by comparing the inferred strength of stimulus. Based on the measured nociceptive escape of over 200 worms, our model is able to significantly differentiate normal, anesthetized and mutated worms with 40 worm samples. This work was partially supported by NSF Grant No. IOS/1208126 and HFSP Grant No. RGY0084/.
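    The inference described in this record can be illustrated with a minimal sketch (not the authors' code: the response function, noise model and every parameter below are assumptions for illustration). A saturating behavioral response plus Gaussian noise is inverted with Bayes' rule on a grid of stimulus strengths:

```python
import numpy as np

def response_model(s, r_max=1.0, s_half=0.5):
    """Assumed saturating (nonlinear) behavioral response to stimulus strength s."""
    return r_max * s / (s_half + s)

def posterior(r_obs, s_grid, sigma=0.05):
    """Posterior over stimulus strength given one observed response (flat prior)."""
    likelihood = np.exp(-0.5 * ((r_obs - response_model(s_grid)) / sigma) ** 2)
    return likelihood / likelihood.sum()

s_grid = np.linspace(0.0, 2.0, 401)          # candidate stimulus strengths
post = posterior(r_obs=0.4, s_grid=s_grid)   # posterior from one observation
s_map = s_grid[np.argmax(post)]              # point estimate of the stimulus
```

    With a flat prior the posterior peaks near the stimulus whose predicted response matches the observation; comparing inferred strengths for treated and untreated worms is the sense in which the efficacy of anesthesia or mutation can be read off as a shift in the inferred stimulus.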

  10. Socially responsible ethnobotanical surveys in the Cape Floristic Region: ethical principles, methodology and quantification of data

    Directory of Open Access Journals (Sweden)

    Ben-Erik Van Wyk

    2012-03-01

    A broad overview of published and unpublished ethnobotanical surveys in the Cape Floristic Region (the traditional home of the San and Khoi communities) shows that the data are incomplete. There is an urgent need to record the rich indigenous knowledge about plants in a systematic and socially responsible manner in order to preserve this cultural and scientific heritage for future generations. Improved methods for quantifying data are introduced, with special reference to the simplicity and benefits of the new Matrix Method. This methodology prevents or reduces the number of false negatives, and also ensures the participation of elderly people who might be immobile. It also makes it possible to compare plant uses in different local communities. This method enables the researcher to quantify the knowledge on plant use that was preserved in a community, and to determine the relative importance of a specific plant in a more objective way. Ethical considerations for such ethnobotanical surveys are discussed through the lens of current ethical codes and international conventions. This is an accessible approach, which can also be used in the life sciences classroom.

  11. Weekly observations of online survey metadata obtained through home computer use allow for detection of changes in everyday cognition before transition to mild cognitive impairment.

    Science.gov (United States)

    Seelye, Adriana; Mattek, Nora; Sharma, Nicole; Riley, Thomas; Austin, Johanna; Wild, Katherine; Dodge, Hiroko H; Lore, Emily; Kaye, Jeffrey

    2018-02-01

    Subtle changes in instrumental activities of daily living often accompany the onset of mild cognitive impairment (MCI) but are difficult to measure using conventional tests. Weekly online survey metadata metrics, annual neuropsychological tests, and an instrumental activity of daily living questionnaire were examined in 110 healthy older adults with intact cognition (mean age = 85 years) followed up for up to 3.6 years; 29 transitioned to MCI during study follow-up. In the baseline period, incident MCI participants completed their weekly surveys 1.4 hours later in the day than stable cognitively intact participants, P = .03, d = 0.47. Significant associations were found between earlier survey start time of day and higher memory scores (r = -0.34; P …). Weekly online survey metadata allowed for detection of changes in everyday cognition before transition to MCI. Published by Elsevier Inc.

  12. Quantification of free convection for embarked QFN64 electronic package: An experimental and numerical survey

    Science.gov (United States)

    Baïri, A.

    2017-08-01

    Embarked Quad Flat Non-lead (QFN) electronic devices are equipped with a significant number of sensors used for flight parameter measurement. Their accuracy directly depends on the package's thermal state. Flight safety therefore depends on the reliability of these QFNs, whose junction temperature must remain as low as possible while operating. The QFN64 is favored for these applications. In the operating power range considered here (0.01-0.1 W), the study shows that radiative heat transfer is negligible with respect to natural convective exchanges. It is therefore essential to quantify the convective heat transfer coefficient on the package's different areas so that the arrangement is properly dimensioned. This is the objective of this work. The device is welded on a PCB which may be inclined with respect to the horizontal plane by an angle ranging from 0° to 90°. Numerical results are confirmed by thermal and electrical measurements carried out on prototypes for all power-inclination angle combinations. The correlations proposed here help determine the natural convective heat transfer coefficient in any area of the assembly. This work made it possible to thermally characterize and certify a new QFN64 package equipped with sensors designed for aeronautics, currently undergoing industrialization.
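    As a hedged illustration of the kind of correlation the abstract refers to (the paper's actual QFN64 correlations are not reproduced here; the constants below are generic textbook-style values for a heated horizontal plate in air, and every number is an assumption):

```python
def rayleigh(delta_T, L, nu=1.6e-5, alpha=2.2e-5, T_film=300.0, g=9.81):
    """Rayleigh number for air; beta ~ 1/T_film for an ideal gas."""
    beta = 1.0 / T_film
    return g * beta * delta_T * L**3 / (nu * alpha)

def h_natural(delta_T, L, k=0.026, C=0.54, n=0.25):
    """Convective coefficient h = Nu*k/L from a power-law Nusselt correlation."""
    Nu = C * rayleigh(delta_T, L) ** n
    return Nu * k / L  # W m^-2 K^-1

# Illustrative case: a 9 mm package running 20 K above ambient.
h = h_natural(delta_T=20.0, L=0.009)
```

    Once h is known on each area of the package, the junction temperature rise for a given dissipated power follows from the usual convective balance, which is why the abstract ties h directly to proper dimensioning of the arrangement.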

  13. CHILD ALLOWANCE

    CERN Multimedia

    Human Resources Division

    2001-01-01

    HR Division wishes to clarify to members of the personnel that the allowance for a dependent child continues to be paid during all training courses ('stages'), apprenticeships, 'contrats de qualification', sandwich courses or other courses of similar nature. Any payment received for these training courses, including apprenticeships, is however deducted from the amount reimbursable as school fees. HR Division would also like to draw the attention of members of the personnel to the fact that any contract of employment will lead to the suppression of the child allowance and of the right to reimbursement of school fees.

  14. Uncertainty quantification of seabed parameters for large data volumes along survey tracks with a tempered particle filter

    Science.gov (United States)

    Dettmer, J.; Quijano, J. E.; Dosso, S. E.; Holland, C. W.; Mandolesi, E.

    2016-12-01

    Geophysical seabed properties are important for the detection and classification of unexploded ordnance. However, current surveying methods such as vertical seismic profiling, coring, or inversion are of limited use when surveying large areas with high spatial sampling density. We consider surveys based on a source and receiver array towed by an autonomous vehicle which produce large volumes of seabed reflectivity data that contain unprecedented and detailed seabed information. The data are analyzed with a particle filter, which requires efficient reflection-coefficient computation, efficient inversion algorithms and efficient use of computer resources. The filter quantifies information content of multiple sequential data sets by considering results from previous data along the survey track to inform the importance sampling at the current point. Challenges arise from environmental changes along the track where the number of sediment layers and their properties change. This is addressed by a trans-dimensional model in the filter which allows layering complexity to change along a track. Efficiency is improved by likelihood tempering of various particle subsets and including exchange moves (parallel tempering). The filter is implemented on a hybrid computer that combines central processing units (CPUs) and graphics processing units (GPUs) to exploit three levels of parallelism: (1) fine-grained parallel computation of spherical reflection coefficients with a GPU implementation of Levin integration; (2) updating particles by concurrent CPU processes which exchange information using automatic load balancing (coarse grained parallelism); (3) overlapping CPU-GPU communication (a major bottleneck) with GPU computation by staggering CPU access to the multiple GPUs. The algorithm is applied to spherical reflection coefficients for data sets along a 14-km track on the Malta Plateau, Mediterranean Sea. We demonstrate substantial efficiency gains over previous methods. [This
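    A minimal sketch of the likelihood-tempering idea with particles, assuming a toy one-parameter Gaussian model rather than the paper's seabed reflection-coefficient model; the tempering schedule, jitter scale and data are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_like(theta, data, sigma=1.0):
    """Gaussian log-likelihood of the data for each particle's parameter value."""
    return -0.5 * ((data[None, :] - theta[:, None]) ** 2).sum(axis=1) / sigma**2

def tempered_filter(data, n_particles=2000, betas=np.linspace(0.0, 1.0, 11)):
    theta = rng.normal(0.0, 5.0, n_particles)            # broad prior draws
    for b0, b1 in zip(betas[:-1], betas[1:]):
        incr = (b1 - b0) * log_like(theta, data)         # tempered weight update
        w = np.exp(incr - incr.max())
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)  # resample by weight
        theta = theta[idx] + rng.normal(0.0, 0.1, n_particles)  # jitter/move step
    return theta

data = rng.normal(1.5, 1.0, 50)   # synthetic observations, true parameter 1.5
theta = tempered_filter(data)
estimate = theta.mean()           # posterior-mean estimate of the parameter
```

    Raising the likelihood exponent gradually from β = 0 to 1 moves the particles from the prior toward the posterior in small steps, avoiding the weight collapse a single reweighting would cause; the paper's parallel-tempering exchange moves extend this by letting particle subsets at different temperatures swap.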

  15. Detection, quantification and genotyping of Herpes Simplex Virus in cervicovaginal secretions by real-time PCR: a cross sectional survey

    Directory of Open Access Journals (Sweden)

    Natividad-Sancho Angels

    2005-08-01

    Background: Herpes Simplex Virus (HSV) Genital Ulcer Disease (GUD) is an important public health problem, whose interaction with HIV results in mutually enhancing epidemics. Conventional methods for detecting HSV tend to be slow and insensitive. We designed a rapid PCR-based assay to quantify and type HSV in cervicovaginal lavage (CVL) fluid of subjects attending a Genito-Urinary Medicine (GUM) clinic. Vaginal swabs, CVL fluid and venous blood were collected. Quantitative detection of HSV was conducted using real-time PCR with HSV-specific primers and SYBR Green I. Fluorogenic TaqMan Minor Groove Binder (MGB) probes designed around a single base mismatch in the HSV DNA polymerase I gene were used to type HSV in a separate reaction. The Kalon test was used to detect anti-HSV-2 IgG antibodies in serum. Testing for HIV, other Sexually Transmitted Infections (STIs) and related infections was based on standard clinical and laboratory methods. Results: Seventy consecutive GUM clinic attendees were studied. Twenty-seven subjects (39%) had detectable HSV DNA in CVL fluid; HSV-2 alone was detected in 19 (70%) subjects, HSV-1 alone was detected in 4 (15%) subjects and both HSV types were detected in 4 (15%) subjects. Eleven out of 27 subjects (41%) with anti-HSV-2 IgG had detectable HSV-2 DNA in CVL fluid. Seven subjects (10%) were HIV-positive. Three of seven (43%) HIV-infected subjects and two of five subjects with GUD (40%) were secreting HSV-2. None of the subjects in whom HSV-1 was detected had GUD. Conclusion: Quantitative real-time PCR and TaqMan MGB probes specific for HSV-1 or -2 were used to develop an assay for quantification and typing of HSV. The majority of subjects in which HSV was detected had low levels of CVL fluid HSV, with no detectable HSV-2 antibodies, and were asymptomatic.

  16. Wrappers, Aspects, Quantification and Events

    Science.gov (United States)

    Filman, Robert E.

    2005-01-01

    Talk overview: the Object Infrastructure Framework (OIF), a system developed to simplify building distributed applications by allowing independent implementation of multiple concerns; the essence and state of AOP; Trinity; quantification over events; and current work on a generalized AOP technology.

  17. Validation of a food quantification picture book targeting children of 0-10 years of age for pan-European and national dietary surveys.

    Science.gov (United States)

    Trolle, Ellen; Vandevijvere, Stefanie; Ruprich, Jiří; Ege, Majken; Dofková, Marcela; de Boer, Evelien; Ocké, Marga

    2013-12-01

    The aim of the present study was to validate thirty-eight picture series of six pictures each, developed within the PANCAKE (Pilot study for the Assessment of Nutrient intake and food Consumption Among Kids in Europe) project, for portion size estimation of foods consumed by infants, toddlers and children for future pan-European and national dietary surveys. Identical validation sessions were conducted in three European countries. In each country, forty-five foods were evaluated; thirty-eight foods were the same as the depicted foods, and seven foods were different, but meant to be quantified by the use of one of the thirty-eight picture series. Each single picture within a picture series was evaluated six times by means of predefined portions. Therefore, thirty-six pre-weighed portions of each food were evaluated by convenience samples of parents having children aged from 3 months to 10 years. The percentages of participants choosing the correct picture, the picture adjacent to the correct picture or a distant picture were calculated, and the performance of individual pictures within the series was assessed. For twenty foods, the picture series performed acceptably (mean difference between the estimated portion number and the served portion number less than 0.4 (SD …)); these picture series were acceptable for inclusion in the PANCAKE picture book. However, the picture series of baby food, salads and cakes either can only be used for foods that are very similar to those depicted or need to be substituted by another quantification tool.

  18. Quantification of mixed chimerism allows early therapeutic interventions

    Directory of Open Access Journals (Sweden)

    Jóice Merzoni

    2014-10-01

    Hematopoietic stem cell transplantation is the curative option for patients with myelodysplastic syndrome; however, it requires a long post-transplantation follow-up. A 53-year-old woman with a diagnosis of myelodysplastic syndrome underwent related-donor allogeneic hematopoietic stem cell transplantation in July 2006. Three months after transplantation, a comparative short tandem repeat analysis between donor and recipient revealed full chimerism, indicating complete, healthy bone marrow reconstitution. Three years and ten months after hematopoietic stem cell transplantation, the patient developed leukopenia and thrombocytopenia. Another short tandem repeat analysis was carried out, which showed mixed chimerism (52.62%), indicating relapsed disease. A donor lymphocyte infusion was administered. The purpose of donor lymphocyte infusion is to induce a graft-versus-leukemia effect; in fact, this donor lymphocyte infusion induced full chimerism. Successive short tandem repeat analyses were performed as part of post-transplantation follow-up, and in July 2010, one such analysis again showed mixed chimerism (64.25%). Based on this finding, a second donor lymphocyte infusion was administered, but failed to eradicate the disease. In September 2011, the patient presented with relapsed disease, and a second related-donor allogeneic hematopoietic stem cell transplantation was performed. Subsequent short tandem repeat analyses revealed full chimerism, indicating complete bone marrow reconstitution. We conclude that quantitative detection of mixed chimerism is an important diagnostic tool that can guide early therapeutic intervention.
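    For context, percent donor chimerism at an informative STR locus is commonly computed as the donor peak area over the total peak area. A minimal sketch follows; the peak-area numbers are hypothetical, chosen only to reproduce the 52.62% figure mentioned in the report:

```python
def percent_donor(donor_peak_area, recipient_peak_area):
    """Donor chimerism (%) at one informative STR locus, from peak areas."""
    total = donor_peak_area + recipient_peak_area
    return 100.0 * donor_peak_area / total

# Hypothetical peak areas; a value well below 100% signals mixed chimerism.
pct = percent_donor(5262.0, 4738.0)  # -> 52.62
```

    In practice several informative loci are averaged, and a falling donor percentage across successive follow-up analyses is the trigger for interventions such as donor lymphocyte infusion.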

  19. Quantification of physical activity using the QAPACE Questionnaire: a two stage cluster sample design survey of children and adolescents attending urban school.

    Science.gov (United States)

    Barbosa, Nicolas; Sanchez, Carlos E; Patino, Efrain; Lozano, Benigno; Thalabard, Jean C; LE Bozec, Serge; Rieu, Michel

    2016-05-01

    Quantification of physical activity as energy expenditure is important from youth onward for the prevention of chronic non-communicable diseases in adulthood. It is necessary to quantify physical activity, expressed as daily energy expenditure (DEE), in school children and adolescents between 8 and 16 years, by age, gender and socioeconomic level (SEL), in Bogotá. This is a two-stage cluster sample survey drawn from a universe of 4700 schools and 760000 students from the three existing socioeconomic levels in Bogotá (low, medium and high). The random sample was 20 schools and 1840 students (904 boys and 936 girls). Anticipating participant dropout and inconsistency in the questionnaire responses, the sample size was increased: 6 individuals of each gender for each of the nine age groups were selected, resulting in a total sample of 2160 individuals. Selected students filled in the QAPACE questionnaire under supervision. The data were analyzed by comparing means with a multivariate general linear model. Fixed factors were gender (boys and girls), age (8 to 16 years old) and tri-strata SEL (low, medium and high); the independent variables assessed were height, weight and leisure time, expressed in hours/day; the dependent variables were daily energy expenditure (DEE, kJ.kg-1.day-1) during leisure time (DEE-LT), during school time (DEE-ST), during vacation time (DEE-VT), and total mean DEE per year (DEEm-TY). Results: In boys, differences in DEE with SEL were significant for LT and all DEE variables; age-SEL was only significant in DEE-VT. In girls, all variables were significant with SEL. Post hoc multiple comparisons using Fisher's Least Significant Difference (LSD) test were significant with age for all variables. For both genders and all SELs, girls had the higher values, except in the high SEL (5-6). Boys had higher values in DEE-LT, DEE-ST and DEE-VT, except for DEEm-TY in SEL (5-6). In SEL (5-6), all DEEs for both genders were highest.
For SEL

  20. Superposition Quantification

    Science.gov (United States)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.
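    One established coherence quantifier to which such a figure of merit can be connected (shown here only as a related example from the literature, not as the paper's own measure) is the ℓ₁-norm of coherence of a density matrix ρ in a fixed basis:

```latex
C_{\ell_1}(\rho) \;=\; \sum_{i \neq j} \bigl|\rho_{ij}\bigr|
```

    For the equal superposition |ψ⟩ = (|0⟩ + |1⟩)/√2, the off-diagonal elements of ρ = |ψ⟩⟨ψ| are both 1/2, giving C_{ℓ₁} = 1, the maximum attainable for a qubit, while any basis state gives 0.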

  1. Allowable stress in piles.

    Science.gov (United States)

    1983-12-01

    "This study presents methods for establishing allowable stresses in steel, concrete, and timber piles using load/resistance factor concepts. These methods take into account not only the material properties of the pile itself but also the individual e...

  2. Utilization of tax allowances

    OpenAIRE

    Gunnar Forsling

    1998-01-01

    Swedish tax-paying firms have systematically failed to take full advantage of the allowances granted by the government. The average utilization level varied between 62 and 86 percent in the years 1979-1993. The Swedish tax-cut cum basebroadening tax reform in 1991 meant that the amount eligible for appropriation to untaxed reserves was much reduced. Our results show that the proportion of firms that fully utilize the allowances has increased since the reform. One interpretation of this is tha...

  3. Allowing repeat winners

    OpenAIRE

    Marco D. Huesch; Richard Brady

    2010-01-01

    Unbiased lotteries seem the least unfair and simplest procedures to allocate scarce indivisible resources to those with equal claims. But, when lotteries are repeated, it is not immediately obvious whether prior winners should be included or excluded. As in design questions surrounding single-shot lotteries, considerations of self-interest and distributive social preferences may interact. We investigate preferences for allowing participation of earlier winners in sequential lotteries. We foun...

  4. Allowing repeat winners

    Directory of Open Access Journals (Sweden)

    Marco D. Huesch

    2010-08-01

    Unbiased lotteries seem the least unfair and simplest procedures to allocate scarce indivisible resources to those with equal claims. But, when lotteries are repeated, it is not immediately obvious whether prior winners should be included or excluded. As in design questions surrounding single-shot lotteries, considerations of self-interest and distributive social preferences may interact. We investigate preferences for allowing participation of earlier winners in sequential lotteries. We found a strong preference for exclusion, both in settings where subjects were involved, and those where they were not. Subjects who answered questions about both settings did not differ in their tendency to prefer exclusion. Stated rationales significantly predicted choice but did not predict switching of choices between the two settings.

  5. SURVEY

    DEFF Research Database (Denmark)

    SURVEY is a widely used method, employed within the social sciences, the humanities, psychology and health research, among other fields. Outside the research world, too, many organizations, such as consulting firms and public institutions as well as marketing departments in private companies, work...

  6. 40 CFR 35.2025 - Allowance and advance of allowance.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Allowance and advance of allowance. 35... ASSISTANCE STATE AND LOCAL ASSISTANCE Grants for Construction of Treatment Works § 35.2025 Allowance and advance of allowance. (a) Allowance. Step 2+3 and Step 3 grant agreements will include an allowance for...

  7. Validation of a food quantification picture book targeting children of 0–10 years of age for pan-European and national dietary surveys

    DEFF Research Database (Denmark)

    Trolle, Ellen; Vandevijvere, Stefanie; Ruprich, Jiří

    2013-01-01

    and children for future pan-European and national dietary surveys. Identical validation sessions were conducted in three European countries. In each country, forty-five foods were evaluated; thirty-eight foods were the same as the depicted foods, and seven foods were different, but meant to be quantified...... by the use of one of the thirty-eight picture series. Each single picture within a picture series was evaluated six times by means of predefined portions. Therefore, thirty-six pre-weighed portions of each food were evaluated by convenience samples of parents having children aged from 3 months to 10 years...

  8. Quantification of 20-hydroxyeicosatetraenoic acid by colorimetric ...

    Indian Academy of Sciences (India)

    Quantification of 20-hydroxyeicosatetraenoic acid by colorimetric competitive enzyme linked immunosorbent assay. Harry E Grates Richard M Mc Gowen ... Assays were developed with and without a proprietary enhancer solution which allows for the extraction-free measurement of 20-HETE in urine samples. The bound ...

  9. Quantification of Endogenous Retinoids

    Science.gov (United States)

    Kane, Maureen A.; Napoli, Joseph L.

    2014-01-01

    Numerous physiological processes require retinoids, including development, nervous system function, immune responsiveness, proliferation, differentiation, and all aspects of reproduction. Reliable retinoid quantification requires suitable handling and, in some cases, resolution of geometric isomers that have different biological activities. Here we describe procedures for reliable and accurate quantification of retinoids, including detailed descriptions for handling retinoids, preparing standard solutions, collecting samples and harvesting tissues, extracting samples, resolving isomers, and detecting with high sensitivity. Sample-specific strategies are provided for optimizing quantification. Approaches to evaluate assay performance also are provided. Retinoid assays described here for mice also are applicable to other organisms including zebrafish, rat, rabbit, and human and for cells in culture. Retinoid quantification, especially that of retinoic acid, should provide insight into many diseases, including Alzheimer’s disease, type 2 diabetes, obesity, and cancer. PMID:20552420

  10. The Tangle of Student Allowances.

    Science.gov (United States)

    Thomson, Norman J.

    1980-01-01

    A discussion of the distribution of student financial aid in Australia focuses on these issues: direct vs. indirect payment to students; inequality in living allowances given to secondary and postsecondary students; and distribution of expense allowances by state government and living allowances by the Commonwealth. (MSE)

  11. Quantification of Noise Sources in EMI Surveys

    Science.gov (United States)

    2012-04-09

    Upper Pleistocene-aged Kent Island Formation occurs only on a peninsula south of Goose Creek. This unit overlies the Maryland Point Formation, and...munitions or debris, and the vegetation (grass) was mowed prior to our data collection. Figure 3-5 – Active Range Test Location, L-Range

  12. Introduction to uncertainty quantification

    CERN Document Server

    Sullivan, T J

    2015-01-01

    Uncertainty quantification is a topic of increasing practical importance at the intersection of applied mathematics, statistics, computation, and numerous application areas in science and engineering. This text provides a framework in which the main objectives of the field of uncertainty quantification are defined, and an overview of the range of mathematical methods by which they can be achieved. Complete with exercises throughout, the book will equip readers with both theoretical understanding and practical experience of the key mathematical and algorithmic tools underlying the treatment of uncertainty in modern applied mathematics. Students and readers alike are encouraged to apply the mathematical methods discussed in this book to their own favourite problems to understand their strengths and weaknesses, also making the text suitable as a self-study. This text is designed as an introduction to uncertainty quantification for senior undergraduate and graduate students with a mathematical or statistical back...
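
The basic forward problem of uncertainty quantification (push uncertain inputs through a model and summarize the output distribution) can be illustrated with plain Monte Carlo. A minimal sketch, not taken from the book; the model and input distributions are invented for illustration:

```python
import random
import math

def forward_uq(model, sample_inputs, n=100_000, seed=0):
    """Propagate input uncertainty through `model` by plain Monte Carlo,
    returning the sample mean and standard deviation of the output."""
    rng = random.Random(seed)
    outputs = [model(*sample_inputs(rng)) for _ in range(n)]
    mean = sum(outputs) / n
    var = sum((y - mean) ** 2 for y in outputs) / (n - 1)
    return mean, math.sqrt(var)

# Illustrative model y = x1 * x2 with independent normal inputs.
mean, sd = forward_uq(
    lambda x1, x2: x1 * x2,
    lambda rng: (rng.gauss(2.0, 0.1), rng.gauss(3.0, 0.2)),
)
print(mean, sd)  # mean close to 6.0
```

More sophisticated methods covered in such texts (polynomial chaos, Bayesian inverse problems) reduce the sampling cost, but the Monte Carlo estimate above is the usual baseline.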

  13. Quantification of micro stickies

    Science.gov (United States)

    Mahendra. Doshi; Jeffrey. Dyer; Salman. Aziz; Kristine. Jackson; Said M. Abubakr

    1997-01-01

    The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...

  14. Rapid quantification of DNA libraries for next-generation sequencing.

    Science.gov (United States)

    Buehler, Bernd; Hogrefe, Holly H; Scott, Graham; Ravi, Harini; Pabón-Peña, Carlos; O'Brien, Scott; Formosa, Rachel; Happe, Scott

    2010-04-01

    Next-generation DNA sequencing workflows require accurate quantification of the DNA molecules to be sequenced, which assures optimal performance of the instrument. Here, we demonstrate the use of qPCR for quantification of DNA libraries used in next-generation sequencing. In addition, we find that qPCR quantification may allow improvements to current NGS workflows, including reducing the amount of library DNA required, increasing the accuracy of quantifying amplifiable DNA, and avoiding amplification bias by reducing or eliminating the need to amplify DNA before sequencing. Copyright 2010. Published by Elsevier Inc.
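
As a rough illustration of the qPCR approach (not the authors' protocol), a library concentration is interpolated from a standard curve fitted to a serial dilution of a known standard. The dilution series and Cq values below are hypothetical:

```python
import math

def fit_standard_curve(concs, cqs):
    """Least-squares fit of Cq = slope*log10(conc) + intercept
    from a serial dilution of a known standard."""
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(cqs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cqs))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def quantify(cq, slope, intercept):
    """Interpolate an unknown library's concentration from its Cq."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical 10-fold dilution series (pM) and measured Cq values:
slope, intercept = fit_standard_curve([10.0, 1.0, 0.1, 0.01],
                                      [12.0, 15.3, 18.6, 21.9])
print(round(quantify(17.0, slope, intercept), 3))  # 0.305
```

A slope near -3.3 corresponds to roughly 100% amplification efficiency, which is why that value is used in this invented series.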

  15. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Joachim, E-mail: Joachim.Berger@Monash.edu [Australian Regenerative Medicine Institute, EMBL Australia, Monash University, Clayton (Australia); Sztal, Tamar; Currie, Peter D. [Australian Regenerative Medicine Institute, EMBL Australia, Monash University, Clayton (Australia)

    2012-07-13

    Highlights: • Report of an unbiased quantification of the birefringence of muscle of fish larvae. • Quantification method readily identifies level of overall muscle damage. • Compares zebrafish muscle mutants for level of phenotype severity. • Proposed tool to survey treatments that aim to ameliorate muscular dystrophy. -- Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including the dystrophin-deficient zebrafish mutant dmd, which models Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.
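
The densitometric step can be sketched as a background-corrected mean-intensity measurement over a muscle region of interest. A minimal illustration with a synthetic image; this is not the authors' protocol, and the array shapes and intensities are invented:

```python
import numpy as np

def birefringence_score(image, roi_mask, background_mask):
    """Densitometric birefringence score: mean pixel intensity over the
    muscle region, corrected for background illumination. Higher values
    indicate a more intact sarcomeric (pseudo-crystalline) array."""
    muscle = image[roi_mask].mean()
    background = image[background_mask].mean()
    return muscle - background

# Synthetic example: a bright "muscle" band on a dark field.
img = np.zeros((100, 100))
img[40:60, 10:90] = 200.0
roi = np.zeros_like(img, dtype=bool)
roi[40:60, 10:90] = True
bg = ~roi

healthy = birefringence_score(img, roi, bg)
dystrophic = birefringence_score(img * 0.4, roi, bg)  # reduced birefringence
print(healthy, dystrophic)  # dystrophic score is lower
```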

  16. First combined flux chamber survey of mercury and CO2 emissions from soil diffuse degassing at Solfatara of Pozzuoli crater, Campi Flegrei (Italy): Mapping and quantification of gas release

    Science.gov (United States)

    Bagnato, E.; Barra, M.; Cardellini, C.; Chiodini, G.; Parello, F.; Sprovieri, M.

    2014-12-01

    There have been limited studies to date targeting gaseous elemental mercury (GEM) flux from soil emission in enriched volcanic substrates and its relation with CO2 release and tectonic structures. In order to evaluate and understand the processes of soil-air exchange involved at Solfatara of Pozzuoli volcano, the most active zone of the Campi Flegrei caldera (Italy), an intensive field measurement survey was carried out in September 2013 using high-time-resolution techniques. Soil-air exchange fluxes of GEM and CO2 were measured simultaneously at 116 points, widely distributed within the crater. Gas fluxes were quantified using the field accumulation chamber method, in conjunction with a Lumex®-RA 915+ portable mercury vapor analyzer for GEM and a LICOR analyzer for CO2 determination. The spatial distribution of GEM and CO2 emissions correlated quite closely with the hydrothermal and geological features of the studied area. The highest GEM fluxes (from 4.04 to 5.9 × 10⁻⁵ g m⁻² d⁻¹) were encountered close to the southern part of the crater, affected by intense fumarolic activity, and along the SE-SW tectonic fracture (1.26 × 10⁻⁶ to 6.91 × 10⁻⁵ g GEM m⁻² d⁻¹). Conversely, the lowest values were detected all along the western rim of the crater, characterized by a weak gas flux and lush vegetation on a largely sealed clay soil, which likely inhibited mercury emission (range: 1.5 × 10⁻⁷ to 7.18 × 10⁻⁶ g GEM m⁻² d⁻¹). Results indicate that the GEM exchange between soil and air inside the Solfatara crater is about 2-3 orders of magnitude stronger than that in the background areas (10⁻⁸ to 10⁻⁷ g m⁻² d⁻¹). CO2 soil diffuse degassing exhibited a spatial pattern analogous to the GEM fluxes, with emission rates ranging from about 15 to ~ 20,000 g CO2 m⁻² d⁻¹, from the outermost western zones to the south-eastern sector of the crater. The observed significant correlation between GEM and CO2 suggested that in volcanic system GEM
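
The accumulation chamber method infers flux from the initial rate of concentration rise inside a closed chamber, scaled by chamber volume over footprint area. A hedged sketch with hypothetical chamber dimensions and readings (not the survey's actual data):

```python
def chamber_flux(times_s, concs_g_m3, volume_m3, area_m2):
    """Soil-gas flux from accumulation-chamber data: fit the initial
    linear rise dC/dt by least squares, then scale by chamber
    volume / footprint area. Returns flux in g m^-2 s^-1."""
    n = len(times_s)
    mt = sum(times_s) / n
    mc = sum(concs_g_m3) / n
    slope = (sum((t - mt) * (c - mc) for t, c in zip(times_s, concs_g_m3))
             / sum((t - mt) ** 2 for t in times_s))  # dC/dt in g m^-3 s^-1
    return slope * volume_m3 / area_m2

# Hypothetical chamber (3 L over 0.03 m^2) and a linear CO2 rise:
flux = chamber_flux([0, 30, 60, 90], [0.70, 0.76, 0.82, 0.88], 0.003, 0.03)
print(flux * 86400)  # convert to g m^-2 d^-1
```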

  17. 76 FR 70883 - Clothing Allowance

    Science.gov (United States)

    2011-11-16

    ... orthopedic appliance worn or used by a veteran for a service-connected disability or disabilities that wears... uses more than one qualifying prosthetic or orthopedic appliance, physician- prescribed medication for... entitled to a clothing allowance for each qualifying prosthetic or orthopedic appliance worn or used by a...

  18. Wiki surveys: Open and quantifiable social data collection

    CERN Document Server

    Salganik, Matthew J

    2012-01-01

    Research about attitudes and opinions is central to social science and relies on two common methodological approaches: surveys and interviews. While surveys enable the quantification of large amounts of information quickly and at a reasonable cost, they are routinely criticized for being "top-down" and rigid. In contrast, interviews allow unanticipated information to "bubble up" directly from respondents, but are slow, expensive, and difficult to quantify. Advances in computing technology now enable a hybrid approach that combines the quantifiability of a survey and the openness of an interview; we call this new class of data collection tools wiki surveys. Drawing on principles underlying successful information aggregation projects, such as Wikipedia, we propose three general criteria that wiki surveys should satisfy: they should be greedy, collaborative, and adaptive. We then present results from www.allourideas.org, a free and open-source website we created that enables groups all over the world to deploy w...

  19. Fluorescent quantification of melanin

    OpenAIRE

    Fernandes, Bruno Pacheco; Matamá, Maria Teresa; Guimarães, Diana Isabel Pereira; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-01-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Theref...

  20. Future directions in dialysis quantification.

    Science.gov (United States)

    Lindsay, R M; Sternby, J

    2001-01-01

    The influence of dialysis prescription on outcome is well established, and currently the amount of dialysis prescribed is based on small molecular weight toxin removal as represented by the clearance of urea. The "normalized dose of dialysis" (Kt/V(urea)) concept is well established. Most techniques for dialysis quantification require that blood samples be taken at the beginning and after the completion of dialysis. The postdialysis sample, however, gives cause for concern because of the "rebound phenomenon" due to nonuniform distribution of urea among body compartments. Blood samples give "indirect" measures of dialysis quantification. Thus direct urea concentration measurements in dialysate may be superior in urea kinetic modeling and these may be made "real time" during dialysis. It is with real-time monitoring that future advances in dialysis quantification will take place. These will be of two types. The first will analyze blood water or dialysate samples for urea content multiple times throughout the treatment; the second will assess the on-line clearance of urea using surrogate molecules such as sodium chloride, the clearance being determined by conductivity measurements. On-line urea monitoring is based on the action of urease on urea in a water solution and measurement of the resultant ammonium ions, which are measured directly by a specific electrode or indirectly by conductivity changes. Differences in blood-side versus dialysate-side urea monitors exist which reflect the parameters they can provide, but with both, the standard urea kinetic measurements of Kt/V and nPCR (nPNA) are easily obtainable. A range of additional parameters can be derived from dialysate-side monitoring such as "whole-body Kt/V," "pretreatment urea mass" and "whole-body urea clearance," which are worthy of future studies to determine their roles in adequacy assessment. Conductivity clearance measurements are made by examining the conductivity differences between dialysate inlet
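
For reference, the single-pool Kt/V from pre- and post-dialysis blood samples is commonly computed with the second-generation Daugirdas formula; the sketch below uses that standard formula with illustrative numbers not taken from this paper:

```python
import math

def sp_ktv(pre_bun, post_bun, hours, uf_liters, post_weight_kg):
    """Single-pool Kt/V from pre- and post-dialysis urea, using the
    second-generation Daugirdas formula:
        spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W
    where R = post/pre urea ratio, t = session length (h),
    UF = ultrafiltration volume (L), W = post-dialysis weight (kg)."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg

# Example: pre 80 mg/dL, post 25 mg/dL, 4 h session, 2 L UF, 70 kg:
print(round(sp_ktv(80, 25, 4.0, 2.0, 70.0), 2))  # 1.35
```

The rebound phenomenon discussed above means this single-pool value overestimates equilibrated Kt/V, which is one motivation for the dialysate-side, real-time approaches the paper describes.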

  1. Novel Synovial Fluid Recovery Method Allows for Quantification of a Marker of Arthritis in Mice

    Science.gov (United States)

    Seifer, Daniel R; Furman, Bridgette D; Guilak, Farshid; Olson, Steve A; Brooks, S. Carroll; Kraus, Virginia Byers

    2008-01-01

    Objective We evaluated three methodologies - a calcium sodium alginate compound (CSAC), polyacrylate beads (PAB), and Whatman paper (WPR) - for the ability to recover synovial fluid from mouse knees in a manner that facilitated biochemical marker analysis. Methods Pilot testing of each of these recovery vehicles was conducted using small volumes of waste human synovial fluid. CSAC emerged as the method of choice, and was used to recover and quantify SF from the knees of C57BL/6 mice (n=12), six of which were given left-knee articular fractures. Synovial fluid concentrations of Cartilage Oligomeric Matrix Protein (COMP) were measured by ELISA. Results The mean concentration ratio ([COMP left knee] / [COMP right knee]) was higher in the mice subjected to articular fracture when compared to the non-fracture mice (p=0.026). The mean total COMP ratio (taking into account the quantitative recovery of synovial fluid) best discriminated between fracture and non-fracture knees (p=0.004). Conclusions Our results provide the first direct evidence of accelerated joint tissue turnover in a mouse model responding to acute joint injury. These data strongly suggest that mouse synovial fluid recovery is feasible and that biomarker analysis of collected synovial fluid samples can augment traditional histological analyses in mouse models of arthritis. PMID:18538588

  2. Novel stretch-sensor technology allows quantification of adherence and quality of home-exercises

    DEFF Research Database (Denmark)

    Rathleff, Michael Skovdal; Bandholm, Thomas Quaade; Ahrendt, Peter

    2014-01-01

    OBJECTIVE: To investigate if a new stretch sensor attached to an elastic exercise band can assist health professionals in evaluating adherence to home exercises. More specifically, the study investigated whether health professionals can differentiate elastic band exercises performed as prescribed......, from exercises not performed as prescribed. METHODS: 10 participants performed four different shoulder-abduction exercises in two rounds (80 exercise scenarios in total). The scenarios were (1) low contraction speed, full range of motion (0-90°), (2) high contraction speed, full range of motion (0...... shoulder abduction strength exercises (scenarios 1-3). The next two raters were asked to identify the four different exercise scenarios (scenarios 1-4). RESULTS: The first two raters were able to differentiate between unsystematic pull (scenario 4) from shoulder abduction strength exercises (scenarios 1...

  3. Photoactivatable Drug-Caged Fluorophore Conjugate Allows Direct Quantification of Intracellular Drug Transport

    Science.gov (United States)

    Kohler, Rainer H.; Weissleder, Ralph

    2013-01-01

    We report here a method that utilizes photoactivatable drug-caged fluorophore conjugate to quantify intracellular drug trafficking processes at single cell resolution. Photoactivation is performed in labeled cellular compartments to visualize intracellular drug exchange at physiologic conditions, without the need for washing, facilitating its translation to in vivo cancer models. PMID:24135896

  4. Quantification of Permafrost Creep by Remote Sensing

    Science.gov (United States)

    Roer, I.; Kaeaeb, A.

    2008-12-01

    Rockglaciers and frozen talus slopes are distinct landforms representing the occurrence of permafrost conditions in high mountain environments. The interpretation of ongoing permafrost creep and its reaction times is still limited due to the complex setting of interrelating processes within the system. Therefore, detailed monitoring of rockglaciers and frozen talus slopes seems advisable, both to better understand the system and to assess possible consequences such as rockfall hazards or debris-flow starting zones. In this context, remote sensing techniques are increasingly important. High-accuracy techniques and data with high spatial and temporal resolution are required for the quantification of rockglacier movement. Digital Terrain Models (DTMs) derived from optical stereo, synthetic aperture radar (SAR) or laser scanning data are the most important data sets for the quantification of permafrost-related mass movements. Correlation image analysis of multitemporal orthophotos allows for the quantification of horizontal displacements, while vertical changes in landform geometry are computed by DTM comparisons. In the European Alps the movement of rockglaciers has been monitored over a period of several decades by the combined application of remote sensing and geodetic methods. The resulting kinematics (horizontal and vertical displacements) as well as spatio-temporal variations thereof are considered in terms of rheology. Distinct changes in process rates or landform failures - probably related to permafrost degradation - are analysed in combination with data on surface and subsurface temperatures and internal structures (e.g., ice content, unfrozen water content).

  5. Adjusting for unrecorded consumption in survey and per capita sales data: quantification of impact on gender- and age-specific alcohol-attributable fractions for oral and pharyngeal cancers in Great Britain.

    Science.gov (United States)

    Meier, Petra Sylvia; Meng, Yang; Holmes, John; Baumberg, Ben; Purshouse, Robin; Hill-McManus, Daniel; Brennan, Alan

    2013-01-01

    Large discrepancies are typically found between per capita alcohol consumption estimated via survey data compared with sales, excise or production figures. This may lead to significant inaccuracies when calculating levels of alcohol-attributable harms. Using British data, we demonstrate an approach to adjusting survey data to give more accurate estimates of per capita alcohol consumption. First, sales and survey data are adjusted to account for potential biases (e.g. self-pouring, under-sampled populations) using evidence from external data sources. Secondly, survey and sales data are aligned using different implementations of Rehm et al.'s method [in (2010) Statistical modeling of volume of alcohol exposure for epidemiological studies of population health: the US example. Pop Health Metrics 8, 1-12]. Thirdly, the impact of our approaches is tested by using our revised survey dataset to calculate alcohol-attributable fractions (AAFs) for oral and pharyngeal cancers. British sales data under-estimate per capita consumption by 8%, primarily due to illicit alcohol. Adjustments to survey data increase per capita consumption estimates by 35%, primarily due to under-sampling of dependent drinkers and under-estimation of home-poured spirits volumes. Before aligning sales and survey data, the revised survey estimate remains 22% lower than the revised sales estimate. Revised AAFs for oral and pharyngeal cancers are substantially larger with our preferred method for aligning data sources, yielding increases in an AAF from the original survey dataset of 0.47-0.60 (males) and 0.28-0.35 (females). It is possible to use external data sources to adjust survey data to reduce the under-estimation of alcohol consumption and then account for residual under-estimation using a statistical calibration technique. These revisions lead to markedly higher estimated levels of alcohol-attributable harm.
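
The AAF computation underlying these results follows the standard Levin formula for categorical exposure groups; a minimal sketch with invented prevalences and relative risks (the paper's actual exposure categories and risk functions are more detailed):

```python
def attributable_fraction(prevalences, relative_risks):
    """Levin-type alcohol-attributable fraction for categorical
    exposure groups:
        AAF = sum(p_i * (RR_i - 1)) / (sum(p_i * (RR_i - 1)) + 1)
    where p_i is the prevalence of exposure group i and RR_i its
    relative risk versus non-drinkers."""
    excess = sum(p * (rr - 1) for p, rr in zip(prevalences, relative_risks))
    return excess / (excess + 1)

# Hypothetical drinking categories as (prevalence, relative risk) pairs:
aaf = attributable_fraction([0.50, 0.30, 0.10], [1.0, 2.0, 5.0])
print(round(aaf, 3))  # 0.412
```

Because upward adjustment of survey consumption shifts drinkers into higher-RR categories, the revised estimates of per capita consumption mechanically raise the AAF, as the paper reports.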

  6. Disease quantification in dermatology

    DEFF Research Database (Denmark)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B E

    2013-01-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very...... useful in quantifying disease severity, they require an extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objective and noninvasive method for local disease severity assessment in 31 psoriasis patients in whom...... selected plaques were scored clinically. A partial least squares (PLS) regression model was used to analyze and predict the severity scores on the NIR spectra of psoriatic and uninvolved skin. The correlation between predicted and clinically assigned scores was R=0.94 (RMSE=0.96), suggesting that in vivo...

  7. Advancing agricultural greenhouse gas quantification*

    Science.gov (United States)

    Olander, Lydia; Wollenberg, Eva; Tubiello, Francesco; Herold, Martin

    2013-03-01

    descriptive trends are sufficient or an understanding of drivers and causes is needed. While there are certainly similar needs across uses and users, the necessary methods, data, and models for quantifying GHGs may vary. Common challenges for quantification noted in an informal survey of users of GHG information by Olander et al (2013) include the following.

    3.1. Need for user-friendly methods that work across scales, regions, and systems

    Much of the data gathered and models developed by the research community provide high confidence in data or indicators computed at one place or for one issue; thus they are relevant for only specific uses, not transparent, or not comparable. These research approaches need to be translated to practitioners through the development of farmer-friendly, transparent, comparable, and broadly applicable methods. Many users noted the need for quantification data and methods that work and are accurate across regions and scales. One of the interviewed users, Charlotte Streck, summed it up nicely: 'A priority would be to produce comparable datasets for agricultural GHG emissions of particular agricultural practices for a broad set of countries ... with a gradual increase in accuracy'.

    3.2. Need for lower-cost, feasible approaches

    Concerns about the cost and complexity of existing quantification methods were raised by a number of users interviewed in the survey. In the field it is difficult to measure changes in GHGs from agricultural management due to spatial and temporal variability, and the scale of the management-induced changes relative to background pools and fluxes. Many users noted data gaps and inconsistencies and insufficient technical capacity and infrastructure to generate necessary information, particularly in developing countries. The need for creative approaches to data collection and analysis, such as crowd-sourcing and mobile technology, was noted.

    3.3. Need for methods that can crosswalk between emission-reduction strategies and inventories

  8. Verb aspect, alternations and quantification

    Directory of Open Access Journals (Sweden)

    Svetla Koeva

    2015-11-01

    Full Text Available Verb aspect, alternations and quantification. In this paper we briefly discuss the nature of Bulgarian verb aspect and argue that verb aspect pairs are different lexical units with different (although related) meanings, different argument structures (reflecting the categories, explicitness and referential status of arguments) and different sets of semantic and syntactic alternations. The verb prefixes that derive perfective verbs can in some cases also be interpreted as lexical quantifiers. Thus Bulgarian verb aspect is related (in different ways) both to the potential for the generation of alternations and to prefixal lexical quantification. It is shown that the scope of lexical quantification by means of verbal prefixes is the quantified verb phrase, and that the scope remains constant in all derived alternations. The paper addresses the basic issues of these complex problems, while a detailed description of the conditions satisfying a particular alternation or a particular lexical quantification is the subject of a more detailed study.

  9. Rapid and parallel quantification of small and large RNA species.

    Science.gov (United States)

    Speth, Corinna; Laubinger, Sascha

    2014-01-01

    Quantitative real-time PCR (qRT-PCR) is a common technique for mRNA quantification. Several methods have been developed in the past few years in order to adapt qRT-PCR also for small non-coding RNAs (sRNA). We here provide a simple and sensitive protocol that allows quantification of mRNAs, selected sRNAs, and long non-coding RNAs (lncRNA) in one cDNA sample by qRT-PCR.
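
Relative quantification in qRT-PCR is commonly reported with the 2^-ddCt calculation; the sketch below shows that standard calculation with invented Ct values, not the specific protocol of this paper:

```python
def ddct_fold_change(target_ct_treated, ref_ct_treated,
                     target_ct_control, ref_ct_control):
    """Relative quantification by the 2^-ddCt method: normalize the
    target's Ct to a reference gene within each sample, then compare
    treated vs control. Assumes ~100% amplification efficiency."""
    dct_treated = target_ct_treated - ref_ct_treated
    dct_control = target_ct_control - ref_ct_control
    return 2 ** -(dct_treated - dct_control)

# Example: target comes up 2 cycles earlier relative to the reference
# in the treated sample => 4-fold up-regulation.
print(ddct_fold_change(22.0, 18.0, 24.0, 18.0))  # 4.0
```

The same arithmetic applies whether the template is an mRNA, a selected sRNA or an lncRNA, which is what makes parallel quantification from one cDNA sample convenient.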

  10. Training load quantification in triathlon

    OpenAIRE

    Cejuela Anta, Roberto; Esteve-Lanao, Jonathan

    2011-01-01

    There are different indices of training stress, of varying complexity, for quantifying training load. Examples include the training impulse (TRIMP), the session RPE, Lucia's TRIMP and the summated zone score. But triathlon, a combined sport in which there are interactions between the different segments, complicates training quantification. The aim of this paper is to review current methods of quantification, and to propose a scale to quantify the training load in triathl...
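
Of the indices listed, the session-RPE method is the simplest: perceived exertion multiplied by duration, which puts swim, bike and run sessions on one arbitrary-unit scale. A minimal sketch with invented sessions (the paper's proposed triathlon-specific scale is more elaborate):

```python
def session_rpe_load(rpe_cr10, duration_min):
    """Session-RPE training load (Foster method): perceived exertion on
    the CR-10 scale multiplied by session duration in minutes."""
    return rpe_cr10 * duration_min

def weekly_load(sessions):
    """Sum session loads over a training week; each session is a
    (discipline, RPE, minutes) tuple, so swim, bike and run segments
    contribute on a common arbitrary-unit scale."""
    return sum(session_rpe_load(rpe, mins) for _, rpe, mins in sessions)

week = [("swim", 4, 60), ("bike", 6, 120), ("run", 7, 45)]
print(weekly_load(week))  # 4*60 + 6*120 + 7*45 = 1275
```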

  11. 38 CFR 21.5822 - Subsistence allowance.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Subsistence allowance. 21... and Subsistence Allowance § 21.5822 Subsistence allowance. (a) Subsistence allowance. Except as provided in paragraph (a)(2) of this section, VA will pay subsistence allowance to a veteran, spouse...

  12. 5 CFR 591.305 - Allowance rates.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Allowance rates. 591.305 Section 591.305 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS ALLOWANCES AND DIFFERENTIALS Allowance Based on Duty at Remote Worksites § 591.305 Allowance rates. (a) General. An allowance rate may...

  13. 44 CFR 208.41 - Administrative allowance.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Administrative allowance. 208... Cooperative Agreements § 208.41 Administrative allowance. (a) The administrative allowance is intended to... administrative allowance will be equal to the following: (1) If total allowable costs are less than $100,000, 3...

  14. Statistical quantification of methylation levels by next-generation sequencing.

    Directory of Open Access Journals (Sweden)

    Guodong Wu

    Full Text Available Recently, next-generation sequencing-based technologies have enabled DNA methylation profiling at high resolution and low cost. Methyl-Seq and Reduced Representation Bisulfite Sequencing (RRBS) are two such technologies that interrogate methylation levels at CpG sites throughout the entire human genome. With the rapid reduction of sequencing costs, these technologies will enable epigenotyping of large cohorts for phenotypic association studies. Existing quantification methods for sequencing-based methylation profiling are simplistic and do not deal with the noise due to the random sampling nature of sequencing and various experimental artifacts. Therefore, there is a need to investigate the statistical issues related to the quantification of methylation levels for these emerging technologies, with the goal of developing an accurate quantification method. In this paper, we propose two methods for Methyl-Seq quantification. The first method, the maximum likelihood estimate, is both conceptually intuitive and computationally simple. However, this estimate is biased at extreme methylation levels and does not provide variance estimation. The second method, based on a Bayesian hierarchical model, allows variance estimation of methylation levels, and provides a flexible framework to adjust technical bias in the sequencing process. We compare the previously proposed binary method, the maximum likelihood (ML) method, and the Bayesian method. In both simulation and real data analysis of Methyl-Seq data, the Bayesian method offers the most accurate quantification. The ML method is slightly less accurate than the Bayesian method, but both our proposed methods outperform the original binary method in Methyl-Seq. In addition, we applied these quantification methods to simulation data and showed that, with sequencing depth above 40-300 (which varies with different tissue samples) per cleavage site, Methyl-Seq offers a quantification consistency comparable to microarrays.
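
The two estimators contrasted in the abstract can be sketched as follows: the ML estimate is the raw methylated-read fraction, while a simple Beta-Binomial model (a minimal stand-in for the paper's full hierarchical model, with an assumed uniform prior) yields both a posterior mean and a variance:

```python
def ml_methylation(methylated, total):
    """Maximum-likelihood estimate: the fraction of methylated reads.
    Simple, but biased at extreme levels and gives no variance."""
    return methylated / total

def bayes_methylation(methylated, total, alpha=1.0, beta=1.0):
    """Beta-Binomial posterior under a Beta(alpha, beta) prior:
    returns the posterior mean and variance of the methylation level.
    Shrinks extreme estimates away from 0 and 1."""
    a = alpha + methylated
    b = beta + total - methylated
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

print(ml_methylation(38, 40))     # 0.95
print(bayes_methylation(38, 40))  # mean shrunk toward 0.5, plus variance
```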

  15. Flexibility in the HCFC Allowance System

    Science.gov (United States)

    The rule that established the HCFC allowance system also created an allowance transfer mechanism to provide flexibility. This fact sheet highlights the flexibilities incorporated into the HCFC allowance system.

  16. Evolution of allowable stresses in shear for lumber

    Science.gov (United States)

    Robert L. Ethington; William L. Galligan; Henry M. Montrey; Alan D. Freas

    1979-01-01

    This paper surveys research leading to allowable shear stress parallel to grain for lumber. In early flexure tests of lumber, some pieces failed in shear. The estimated shear stress at time of failure was generally lower than shear strength measured on small, clear, straight-grained specimens. This and other engineering observations gave rise to adjustments that...

  17. 46 CFR 154.440 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.440 Section 154.440 Shipping COAST... Tank Type A § 154.440 Allowable stress. (a) The allowable stresses for an independent tank type A must... Commandant (CG-522). (b) A greater allowable stress than required in paragraph (a)(1) of this section may be...

  18. 46 CFR 154.421 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.421 Section 154.421 Shipping COAST... § 154.421 Allowable stress. The allowable stress for the integral tank structure must meet the American Bureau of Shipping's allowable stress for the vessel's hull published in “Rules for Building and Classing...

  19. 20 CFR 617.46 - Travel allowance.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Travel allowance. 617.46 Section 617.46... FOR WORKERS UNDER THE TRADE ACT OF 1974 Relocation Allowances § 617.46 Travel allowance. (a) Computation. The amount of travel allowance (including lodging and meals) payable under § 617.45(a)(1) shall...

  20. 20 CFR 617.47 - Moving allowance.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Moving allowance. 617.47 Section 617.47... FOR WORKERS UNDER THE TRADE ACT OF 1974 Relocation Allowances § 617.47 Moving allowance. (a) Computation. The amount of a moving allowance payable under § 617.45(a)(2) shall be 90 percent of the total of...

  1. 30 CFR 220.012 - Overhead allowance.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false Overhead allowance. 220.012 Section 220.012... § 220.012 Overhead allowance. (a) During the capital recovery period the overhead allowance shall be... overhead allowance shall be debited to the NPSL capital account in accordance with § 220.021(b)(2). (b) For...

  2. Convex geometry of quantum resource quantification

    Science.gov (United States)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof–extended negativity, and a measure of k-coherence which generalises the …
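
    One standard member of the robustness family mentioned above is the generalized robustness; as a sketch (the standard textbook definition, not necessarily the paper's notation), for a convex resource theory with free-state set $F$ and set of all states $\mathcal{D}$:

```latex
R_F(\rho) \;=\; \min_{\sigma \in \mathcal{D}}\, \min\Bigl\{\, s \ge 0 \;:\; \frac{\rho + s\,\sigma}{1 + s} \in F \,\Bigr\}
```

    By construction $R_F$ is convex and vanishes exactly on $F$ (taking $s = 0$), which is the faithfulness property discussed in the abstract.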

  3. Quantification of prebiotics in commercial infant formulas.

    Science.gov (United States)

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS, and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Quantification of complex modular architecture in plants.

    Science.gov (United States)

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-02-22

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
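
    The node-and-edge description of a thallus can be illustrated with a toy tree of skeleton segments; MorphoSnake's actual data structures and measurements are richer, so all names here are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Segment:
    """One edge of the thallus skeleton graph (illustrative fields)."""
    length: float
    width: float
    angle: float  # branching angle relative to the parent, degrees
    children: list = field(default_factory=list)

def stats_by_level(seg, level=0, acc=None):
    """Mean length and width per branching order (hierarchy level)."""
    if acc is None:
        acc = {}
    acc.setdefault(level, []).append((seg.length, seg.width))
    for child in seg.children:
        stats_by_level(child, level + 1, acc)
    return {lvl: (sum(l for l, _ in v) / len(v),
                  sum(w for _, w in v) / len(v))
            for lvl, v in acc.items()}

# A main axis bearing two lateral branches:
main_axis = Segment(10.0, 1.2, 0.0, children=[
    Segment(4.0, 0.6, 35.0), Segment(5.0, 0.7, 40.0)])
print(stats_by_level(main_axis))
```

    Measurements collected per level of organization like this are what make the descriptions homologous and repeatable across ramified plants.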

  5. Quantification of ontogenetic allometry in ammonoids.

    Science.gov (United States)

    Korn, Dieter

    2012-01-01

    Ammonoids are well-known objects used for studies on ontogeny and phylogeny, but a quantification of ontogenetic change has not yet been carried out. Their planispirally coiled conchs allow for the study of "longitudinal" ontogenetic data, that is, data on ontogenetic trajectories that can be obtained from a single specimen. Therefore, they provide a good model for ontogenetic studies of geometry in other shelled organisms. Using modifications of three cardinal conch dimensions, computer simulations can model artificial conchs. The trajectories of ontogenetic allometry of these simulations can be analyzed in great detail in a theoretical morphospace. A method for the classification of conch ontogeny and the quantification of the degree of allometry is proposed. Using high-precision cross-sections, the allometric conch growth of real ammonoids can be documented and compared. The members of the Ammonoidea show a wide variety of allometric growth, ranging from near isometry to monophasic, biphasic, or polyphasic allometry. Selected examples of Palaeozoic and Mesozoic ammonoids are shown with respect to their degree of change during conch ontogeny. © 2012 Wiley Periodicals, Inc.
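
    The idea of classifying conch ontogeny by whether a growth ratio stays constant along the trajectory can be sketched as follows; the single per-half-whorl whorl expansion rate used here is a simplified stand-in for the three cardinal conch dimensions, not Korn's actual scheme:

```python
import math

def trajectory(d0, half_whorls, rate_fn):
    """Conch diameters along ontogeny. rate_fn(d) is the whorl
    expansion rate at diameter d; per half whorl the diameter
    grows by the square root of that rate."""
    diams = [d0]
    for _ in range(half_whorls):
        diams.append(diams[-1] * math.sqrt(rate_fn(diams[-1])))
    return diams

def is_isometric(diams, rate_fn, tol=1e-9):
    """Isometry: the expansion rate stays constant over ontogeny;
    any drift of the rate along the trajectory is allometry."""
    rates = [rate_fn(d) for d in diams]
    return max(rates) - min(rates) < tol

def constant(d):
    return 2.0                # isometric growth

def drifting(d):
    return 1.6 + 0.02 * d     # rate changes with size: allometric

print(is_isometric(trajectory(1.0, 10, constant), constant))  # True
print(is_isometric(trajectory(1.0, 10, drifting), drifting))  # False
```

    Monophasic, biphasic, or polyphasic allometry would correspond to one, two, or several distinct drift regimes along such a trajectory.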

  6. Aspect-Oriented Programming is Quantification and Implicit Invocation

    Science.gov (United States)

    Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)

    2001-01-01

    We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are meta: they are sufficiently expressive to allow an AOP system to be programmed straightforwardly within them.
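
    The quantification-plus-implicit-invocation view can be illustrated with a toy advice mechanism: advice is attached by a predicate over functions (the quantifier), and the advised functions carry no local notation invoking it. All names here are invented for illustration:

```python
import functools

_aspects = []  # (predicate over function, advice) pairs

def aspect(predicate):
    """Register advice for every function matching `predicate` --
    a quantified assertion with no notation at the join points."""
    def register(advice):
        _aspects.append((predicate, advice))
        return advice
    return register

def weave(fn):
    """Wrap fn so matching advice runs implicitly before each call."""
    @functools.wraps(fn)
    def woven(*args, **kwargs):
        for pred, advice in _aspects:
            if pred(fn):
                advice(fn, args, kwargs)
        return fn(*args, **kwargs)
    return woven

calls = []

@aspect(lambda f: f.__name__.startswith("get_"))
def log_call(fn, args, kwargs):
    calls.append(fn.__name__)

@weave
def get_user(uid):     # no local notation mentioning log_call
    return {"id": uid}

@weave
def save_user(user):
    return True

get_user(7)
save_user({})
print(calls)  # only get_user matched the quantifier
```

    Only get_user satisfies the quantifier, so only its calls trigger the advice, even though neither function's body mentions log_call: quantification plus implicit invocation.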

  7. Micro-RNA quantification using DNA polymerase and pyrophosphate quantification.

    Science.gov (United States)

    Yu, Hsiang-Ping; Hsiao, Yi-Ling; Pan, Hung-Yin; Huang, Chih-Hung; Hou, Shao-Yi

    2011-12-15

    A rapid quantification method for micro-RNA based on DNA polymerase activity and pyrophosphate quantification has been developed. The tested micro-RNA serves as the primer, unlike the DNA primer in all DNA sequencing methods, and the DNA probe serves as the template for DNA replication. After the DNA synthesis, pyrophosphate detection and quantification indicate the existence and quantity of the tested miRNA. Five femtomoles of the synthetic RNA could be detected. In 20-100 μg RNA samples purified from SiHa cells, the proposed assay measured hsa-miR-16 and hsa-miR-21 at 0.34 and 0.71 fmol/μg RNA, respectively. This simple and inexpensive assay takes less than 5 min after total RNA purification and preparation. The quantification is not affected by the pre-miRNA, which cannot serve as the primer for the DNA synthesis in this assay. The assay is general for the detection of any target RNA or DNA with a known matched DNA template probe, and could therefore be widely used for the detection of small RNA, messenger RNA, RNA viruses, and DNA. Copyright © 2011 Elsevier Inc. All rights reserved.
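
    The stoichiometry implied by the assay (one pyrophosphate released per nucleotide incorporated) suggests a simple back-calculation; the helper below and its 20-base extension length are invented for illustration, not the paper's probe design:

```python
def mirna_from_ppi(ppi_fmol, extension_bases):
    """Each nucleotide the polymerase incorporates releases one
    pyrophosphate (PPi), so the amount of primer (= tested miRNA)
    is roughly PPi detected / bases extended per primer.
    Hypothetical helper for illustration."""
    return ppi_fmol / extension_bases

# e.g. 100 fmol of PPi from a probe allowing a 20-base extension:
print(mirna_from_ppi(100.0, 20))  # 5.0 fmol of miRNA
```

    The longer the templated extension, the more PPi is produced per miRNA molecule, which is what gives the assay its signal amplification.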

  8. Allowance Holdings and Transfers Data Inventory

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Allowance Holdings and Transfers Data Inventory contains measured data on holdings and transactions of allowances under the NOx Budget Trading Program (NBP), a...

  9. Clean Air Markets - Allowances Query Wizard

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Allowances Query Wizard is part of a suite of Clean Air Markets-related tools that are accessible at http://camddataandmaps.epa.gov/gdm/index.cfm. The Allowances...

  10. 5 CFR 180.104 - Allowable claims.

    Science.gov (United States)

    2010-01-01

    ... loss of personal property incident to service with OPM may be considered and allowed. The following are... mobile homes may be allowed only in cases of collision, theft, or vandalism. (5) Money. Claims for money...

  11. 44 CFR 11.73 - Allowable claims.

    Science.gov (United States)

    2010-10-01

    ... to service with FEMA may be considered and allowed. The following are examples of the principal types... be allowed only in cases of collision, theft, or vandalism. (5) Money. Claims for money in an amount...

  12. Child allowances, fertility, and chaotic dynamics

    Science.gov (United States)

    Chen, Hung-Ju; Li, Ming-Chia

    2013-06-01

    This paper analyzes the dynamics in an overlapping generations model with the provision of child allowances. Fertility is an increasing function of child allowances, and the marginal effect of child allowances on fertility exhibits a threshold. We show that if the effectiveness of child allowances is sufficiently high, an intermediate-sized tax rate is enough to generate chaotic dynamics. Moreover, a decrease in the inter-temporal elasticity of substitution prevents the occurrence of irregular cycles.

  13. 20 CFR 632.258 - Allowable activities.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable activities. 632.258 Section 632.258... EMPLOYMENT AND TRAINING PROGRAMS Summer Youth Employment and Training Programs § 632.258 Allowable activities. Allowable activities are those listed in § 632.78-80 except that community service employment is not...

  14. 46 CFR 154.428 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.428 Section 154.428 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CERTAIN BULK DANGEROUS CARGOES SAFETY STANDARDS FOR... § 154.428 Allowable stress. The membrane tank and the supporting insulation must have allowable stresses...

  15. 46 CFR 154.447 - Allowable stress.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Allowable stress. 154.447 Section 154.447 Shipping COAST... Tank Type B § 154.447 Allowable stress. (a) An independent tank type B designed from bodies of revolution must have allowable stresses 3 determined by the following formulae: 3 See Appendix B for stress...

  16. 24 CFR 85.22 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable...

  17. 36 CFR 1207.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable...

  18. 32 CFR 33.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined...

  19. 38 CFR 21.260 - Subsistence allowance.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Subsistence allowance. 21... Monetary Assistance Services § 21.260 Subsistence allowance. (a) General. A veteran participating in a rehabilitation program under 38 U.S.C. Chapter 31 will receive a monthly subsistence allowance at the rates in...

  20. 27 CFR 40.472 - Allowance.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Allowance. 40.472 Section... PROCESSED TOBACCO Manufacture of Cigarette Papers and Tubes Claims by Manufacturers § 40.472 Allowance... allowance of the tax where the cigarette papers and tubes, after removal from the factory upon determination...

  1. 19 CFR 191.101 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.101 Section 191.101... Preparations (Including Perfumery) Manufactured From Domestic Tax-Paid Alcohol § 191.101 Drawback allowance. (a....C. 7653(c)). However, there is no authority of law for the allowance of drawback of internal-revenue...

  2. 20 CFR 211.9 - Dismissal allowance.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Dismissal allowance. 211.9 Section 211.9 Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD RETIREMENT ACT CREDITABLE RAILROAD COMPENSATION § 211.9 Dismissal allowance. Dismissal allowances paid to an employee under a...

  3. 38 CFR 3.954 - Burial allowance.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Burial allowance. 3.954..., Compensation, and Dependency and Indemnity Compensation Protection § 3.954 Burial allowance. When any person... benefits dies, the burial allowance will be paid, if otherwise in order, even though such status does not...

  4. 75 FR 4098 - Utility Allowance Adjustments

    Science.gov (United States)

    2010-01-26

    ... URBAN DEVELOPMENT Utility Allowance Adjustments AGENCY: Office of the Chief Information Officer, HUD... are required to advise the Secretary of the need for and request of a new utility allowance for... lists the following information: Title of Proposal: Utility Allowance Adjustments. OMB Approval Number...

  5. 20 CFR 211.8 - Displacement allowance.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Displacement allowance. 211.8 Section 211.8 Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD RETIREMENT ACT CREDITABLE RAILROAD COMPENSATION § 211.8 Displacement allowance. An allowance paid to an employee because he has been...

  6. 19 CFR 191.121 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.121 Section 191.121... TREASURY (CONTINUED) DRAWBACK Meats Cured With Imported Salt § 191.121 Drawback allowance. Section 313(f) of the Act, as amended (19 U.S.C. 1313(f)), provides for the allowance of drawback upon the...

  7. 34 CFR 642.40 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Allowable costs. 642.40 Section 642.40 Education...? § 642.40 Allowable costs. Allowable project costs may include the following costs reasonably related to carrying out a Training Program project: (a) Rental of space, if space is not available at a sponsoring...

  8. 34 CFR 675.33 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... costs. (a)(1) Allowable and unallowable costs. Except as provided in paragraph (a)(2) of this section, costs reasonably related to carrying out the programs described in § 675.32 are allowable. (2) Costs... 34 Education 3 2010-07-01 2010-07-01 false Allowable costs. 675.33 Section 675.33 Education...

  9. 5 CFR Appendix A to Subpart B of... - Places and Rates at Which Allowances Are Paid

    Science.gov (United States)

    2010-01-01

    ... CIVIL SERVICE REGULATIONS ALLOWANCES AND DIFFERENTIALS Cost-of-Living Allowance and Post Differential... Allowances Are Paid This appendix lists the places approved for a cost-of-living allowance and shows the... rate of basic pay. The rates are subject to change based on the results of future surveys. Geographic...

  10. NGS Survey Control Map

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NGS Survey Control Map provides a map of the US which allows you to find and display geodetic survey control points stored in the database of the National...

  11. SOURCE SPECIFIC QUANTIFICATION, CHARACTERISATION ...

    African Journals Online (AJOL)

    Osondu

    2013-09-02

    Sep 2, 2013 … efficient and sustainable waste management. This study is the quantification, characterisation … For efficient and sustainable solid waste management in Lapai it is recommended that Lapai Local Government Area Council … a shop, a market stall, an eatery/restaurant, a hotel or any commercial enterprise.

  12. Adenovirus Particle Quantification in Cell Lysates Using Light Scattering.

    Science.gov (United States)

    Hohl, Adrian; Ramms, Anne Sophie; Dohmen, Christian; Mantwill, Klaus; Bielmeier, Andrea; Kolk, Andreas; Ruppert, Andreas; Nawroth, Roman; Holm, Per Sonne

    2017-10-01

    Adenoviral vector production for therapeutic applications is a well-established routine process. However, current methods for measuring adenovirus particle titers as a quality characteristic require highly purified virus preparations. While purified virus is typically obtained in the last step of downstream purification, rapid and reliable methods for adenovirus particle quantification in intermediate products and crude lysates, which would allow optimization and validation of cell cultures and intermediate downstream processing steps, are currently not at hand. Light scattering is an established method for measuring the size of virus particles, but because of cellular impurities, adequate quantification of adenovirus particles in cell lysates by light scattering has been impossible until today. This report describes a new light scattering method for measuring virus concentration in nonpurified, enzymatically conditioned cell lysates. Samples are incubated with phospholipase A2 and benzonase and filtered through a 0.22 μm filter cartridge prior to quantification by light scattering. Our results show that this treatment provides a precise method for fast and easy determination of total adenovirus particle numbers in cell lysates and is useful for monitoring virus recovery throughout downstream processing.

  13. Rapid Quantification and Validation of Lipid Concentrations within Liposomes

    Directory of Open Access Journals (Sweden)

    Carla B. Roces

    2016-09-01

    Quantification of the lipid content in liposomal adjuvants for subunit vaccine formulation is of extreme importance, since this concentration impacts both efficacy and stability. In this paper, we outline a high-performance liquid chromatography-evaporative light scattering detector (HPLC-ELSD) method that allows for the rapid and simultaneous quantification of lipid concentrations within liposomal systems prepared by three liposomal manufacturing techniques (lipid film hydration, high shear mixing, and microfluidics). The ELSD system was used to quantify four lipids: 1,2-dimyristoyl-sn-glycero-3-phosphocholine (DMPC), cholesterol, dimethyldioctadecylammonium (DDA) bromide, and ᴅ-(+)-trehalose 6,6′-dibehenate (TDB). The developed method offers rapidity, high sensitivity, direct linearity, and good consistency of the responses (R² > 0.993) for the four lipids tested. The corresponding limits of detection (LOD) and quantification (LOQ) were 0.11 and 0.36 mg/mL (DMPC), 0.02 and 0.80 mg/mL (cholesterol), 0.06 and 0.20 mg/mL (DDA), and 0.05 and 0.16 mg/mL (TDB), respectively. HPLC-ELSD was shown to be a rapid and effective method for the quantification of lipids within liposome formulations without the need for lipid extraction processes.
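
    LOD/LOQ pairs like those above are conventionally derived from a calibration curve via the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S the slope; the paper's exact procedure is not given here, so the numbers below are purely illustrative:

```python
def lod_loq(sigma, slope):
    """ICH Q2(R1)-style estimates from a calibration curve:
    sigma = standard deviation of the response (e.g. residual SD),
    slope = slope of the calibration line."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Purely illustrative numbers, not taken from the paper:
lod, loq = lod_loq(sigma=0.5, slope=15.0)
print(round(lod, 2), round(loq, 2))  # 0.11 0.33
```

    A steeper calibration slope or lower response noise tightens both limits, which is why ELSD response consistency (R² > 0.993) matters for the reported sensitivities.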

  14. Comparison of DNA Quantification Methods for Next Generation Sequencing.

    Science.gov (United States)

    Robin, Jérôme D; Ludlow, Andrew T; LaRanger, Ryan; Wright, Woodring E; Shay, Jerry W

    2016-04-06

    Next Generation Sequencing (NGS) is a powerful tool that depends on loading a precise amount of DNA onto a flowcell. NGS strategies have expanded our ability to investigate genomic phenomena by referencing mutations in cancer and diseases through large-scale genotyping, developing methods to map rare chromatin interactions (4C, 5C, and Hi-C), and identifying chromatin features associated with regulatory elements (ChIP-seq, Bis-Seq, ChIA-PET). While many methods are available for DNA library quantification, there is no unambiguous gold standard. Most techniques use PCR to amplify DNA libraries to obtain sufficient quantities for optical density measurement. However, increased PCR cycles can distort the library's heterogeneity and prevent the detection of rare variants. In this analysis, we compared new digital PCR technologies (droplet digital PCR: ddPCR, ddPCR-Tail) with standard methods for the titration of NGS libraries. ddPCR-Tail is comparable to qPCR and fluorometry (QuBit) and allows sensitive quantification by analysis of barcode repartition after sequencing of multiplexed samples. This study provides a direct comparison between quantification methods throughout a complete sequencing experiment and provides the impetus to use ddPCR-based quantification for improvement of NGS quality.
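
    Droplet digital PCR turns the fraction of positive droplets into an absolute concentration through a Poisson correction; the sketch below uses the standard formula with a typical droplet volume assumed for illustration:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Poisson-corrected ddPCR concentration (copies/uL).
    p is the fraction of positive droplets; lam = -ln(1 - p) is the
    mean number of template copies per droplet. The 0.85 nL droplet
    volume is a typical value, assumed here rather than taken from
    the paper."""
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL

# 5,000 positive droplets out of 20,000 read:
print(round(ddpcr_copies_per_ul(5000, 20000), 1))
```

    The correction accounts for droplets that captured more than one template copy, which is why digital PCR can quantify absolutely without a standard curve.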

  15. 20 CFR 632.37 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ..., weatherization and rehabilitation are allowable when the work is performed on low income housing as defined in... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable costs. 632.37 Section 632.37 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR INDIAN AND NATIVE AMERICAN...

  16. 40 CFR 35.6245 - Allowable activities.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Allowable activities. 35.6245 Section... Actions Support Agency Cooperative Agreements § 35.6245 Allowable activities. Support agency activities are those activities conducted by the recipient to ensure its meaningful and substantial involvement...

  17. 21 CFR 1303.24 - Inventory allowance.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Inventory allowance. 1303.24 Section 1303.24 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE QUOTAS Individual Manufacturing Quotas § 1303.24 Inventory allowance. (a) For the purpose of determining individual manufacturing quotas...

  18. 40 CFR 31.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... the grantee or sub-grantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in...

  19. 15 CFR 24.22 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... Part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply with...) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in accordance with the cost principles...

  20. 45 CFR 2541.220 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in...

  1. 38 CFR 43.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... accounting standards that comply with cost principles acceptable to the Federal agency. ... the grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in...

  2. 45 CFR 92.22 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... to that circular 48 CFR Part 31. Contract Cost Principles and Procedures, or uniform cost accounting... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in accordance with...

  3. 21 CFR 1403.22 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    .... Contract Cost Principles and Procedures, or uniform cost accounting standards that comply with cost...) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in accordance with the cost principles...

  4. 13 CFR 143.22 - Allowable costs.

    Science.gov (United States)

    2010-01-01

    ... to that circular 48 CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in accordance with...

  5. 28 CFR 66.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply with...) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in accordance with the cost principles...

  6. 29 CFR 1470.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... to that circular 48 CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting... grantee or subgrantee. (b) Applicable cost principles. For each kind of organization, there is a set of Federal principles for determining allowable costs. Allowable costs will be determined in accordance with...

  7. 20 CFR 631.84 - Allowable projects.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable projects. 631.84 Section 631.84... THE JOB TRAINING PARTNERSHIP ACT Disaster Relief Employment Assistance § 631.84 Allowable projects...) Shall be used exclusively to provide employment on projects that provide food, clothing, shelter and...

  8. 33 CFR 136.235 - Compensation allowable.

    Science.gov (United States)

    2010-07-01

    ... allowable. The amount of compensation allowable is limited to the actual net reduction or loss of earnings or profits suffered. Calculations for net reductions or losses must clearly reflect adjustments for— (a) All income resulting from the incident; (b) All income from alternative employment or business...

  9. 27 CFR 28.334 - Credit allowance.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Credit allowance. 28.334 Section 28.334 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY LIQUORS EXPORTATION OF ALCOHOL Action on Claims § 28.334 Credit allowance. Where the...

  10. 19 CFR 191.111 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.111 Section 191.111 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) DRAWBACK Supplies for Certain Vessels and Aircraft § 191.111 Drawback allowance...

  11. 19 CFR 191.151 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.151 Section 191.151 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE... allowance. (a) Eligibility of entered or withdrawn merchandise—(1) Under 19 U.S.C. 1557(a). Section 557(a...

  12. 19 CFR 191.181 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.181 Section 191.181 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE....181 Drawback allowance. The fourth proviso of § 3 of the Foreign Trade Zones Act of June 18, 1934, as...

  13. 19 CFR 191.131 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.131 Section 191.131 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE... Foreign Ownership and Account § 191.131 Drawback allowance. Section 313(g) of the Act, as amended (19 U.S...

  14. 21 CFR 1315.24 - Inventory allowance.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Inventory allowance. 1315.24 Section 1315.24 Food and Drugs DRUG ENFORCEMENT ADMINISTRATION, DEPARTMENT OF JUSTICE IMPORTATION AND PRODUCTION QUOTAS FOR... allowance. (a) For the purpose of determining individual manufacturing quotas pursuant to § 1315.23, each...

  15. 19 CFR 191.141 - Drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Drawback allowance. 191.141 Section 191.141 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE... Drawback allowance. Section 313(h) of the Act, as amended (19 U.S.C. 1313(h)), provides for drawback on the...

  16. 34 CFR 304.21 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... DEVELOPMENT TO IMPROVE SERVICES AND RESULTS FOR CHILDREN WITH DISABILITIES Conditions That Must be Met by...) Tuition and fees. (2) An allowance for books, supplies, transportation, and miscellaneous personal expenses. (3) An allowance for room and board. (b) Stipends. (c) Travel in conjunction with training...

  17. RNA quantification using gold nanoprobes - application to cancer diagnostics

    Directory of Open Access Journals (Sweden)

    Baptista Pedro V

    2010-02-01

Molecular nanodiagnostics applied to cancer may provide rapid and sensitive detection of cancer-related molecular alterations, which would enable early detection even when those alterations occur only in a small percentage of cells. The use of gold nanoparticles derivatized with thiol-modified oligonucleotides (Au-nanoprobes) for the detection of specific nucleic acid targets has been gaining momentum as an alternative to more traditional methodologies. Here, we present an Au-nanoparticle-based approach for the molecular recognition and quantification of the BCR-ABL fusion transcript (mRNA), which is responsible for chronic myeloid leukemia (CML); to the best of our knowledge, this is the first reported quantification of a specific mRNA directly in cancer cells. This inexpensive and easy-to-perform Au-nanoprobe-based method allows quantification of unamplified total human RNA and specific detection of the oncogene transcript. The sensitivity provided by the Au-nanoprobes allows differential gene expression analysis from 10 ng/μl of total RNA and takes less than 30 min to complete after total RNA extraction, minimizing RNA degradation. Also, at later stages, accumulation of malignant mutations may lead to resistance to chemotherapy and consequently poor outcome. Such a method, allowing fast and direct detection and quantification of the chimeric BCR-ABL mRNA, could speed up diagnostics and, if appropriate, revision of therapy. This assay may constitute a promising tool in early diagnosis of CML and could easily be extended to further target genes with proven involvement in cancer development.

  18. Generalizing cell segmentation and quantification.

    Science.gov (United States)

    Wang, Zhenzhou; Li, Haixing

    2017-03-23

In recent years, microscopy technology for imaging cells has developed greatly and rapidly, and the accompanying requirements for automatic segmentation and quantification of the imaged cells have grown with it. After being studied widely in both scientific research and industrial applications for many decades, cell segmentation has achieved great progress, especially in segmenting some specific types of cells, e.g. muscle cells. However, the field still lacks a framework that addresses cell segmentation generally; instead, different segmentation methods have been proposed for different types of cells, which makes the research work divergent. In addition, most of the popular segmentation and quantification tools still require a substantial amount of manual work. To make cell segmentation more convergent, we propose in this paper a framework that is able to segment different kinds of cells automatically and robustly. This framework evolves our previously proposed method for segmenting muscle cells and generalizes it to segment and quantify a variety of cell images by adding more union cases. Compared to the previous method, the segmentation and quantification accuracy of the proposed framework is also improved by three novel procedures: (1) a simplified calibration method is added for the threshold selection process; (2) a noise blob filter removes noise blobs; and (3) a boundary smoothing filter reduces the false seeds produced by iterative erosion. With these changes, the quantification accuracy of the proposed framework increases from 93.4% to 96.8% relative to the previous method. In addition, the proposed framework quantifies muscle cells more accurately than two available state-of-the-art methods, and it can automatically segment and quantify more types of cells than they can.
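The noise-blob filter in step (2) is described only at a high level in the abstract; one common way to realize the idea is to discard connected components below an area threshold. A minimal sketch under that assumption (the scipy-based helper and the `min_area` value are illustrative, not the paper's implementation):

```python
import numpy as np
from scipy import ndimage

def filter_noise_blobs(mask, min_area=5):
    """Remove connected components smaller than min_area pixels.

    Sketch of a noise-blob filter; `min_area` is a hypothetical
    tuning parameter, not a value from the paper.
    """
    labeled, n = ndimage.label(mask)
    if n == 0:
        return mask.copy()
    # Pixel count of each labeled component (labels run from 1 to n)
    areas = np.asarray(ndimage.sum(mask, labeled, index=range(1, n + 1)))
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = areas >= min_area
    # Map each pixel's label to its keep/discard decision
    return keep[labeled]

# Toy binary mask: a 25-pixel "cell" and a 1-pixel noise speck
mask = np.zeros((10, 10), dtype=bool)
mask[2:7, 2:7] = True
mask[9, 9] = True
cleaned = filter_noise_blobs(mask, min_area=5)
```

The same labeling pass can feed the quantification step, since component count and per-component area fall out of it for free.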

  19. 29 CFR 95.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... JURISDICTION OF FOREIGN GOVERNMENTS, AND INTERNATIONAL ORGANIZATIONS Post-Award Requirements Financial and Program Management § 95.27 Allowable costs. For each kind of recipient, there is a set of Federal...

  20. Higher Education Tax Allowances: An Analysis

    Science.gov (United States)

    Leslie, Larry L.

    1976-01-01

    Tax allowances are receiving renewed attention at the federal level. Various forms are evaluated that would aid middle-income students and private institutions, and specific bills and proposals are examined. (Editor/LBH)

  1. 42 CFR 61.8 - Benefits: Stipends; dependency allowances; travel allowances; vacation.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Benefits: Stipends; dependency allowances; travel allowances; vacation. 61.8 Section 61.8 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN...; dependency allowances; travel allowances; vacation. Individuals awarded regular fellowships shall be entitled...

  2. 76 FR 16629 - Federal Travel Regulation (FTR); Relocation Allowances-Relocation Income Tax Allowance (RITA) Tables

    Science.gov (United States)

    2011-03-24

    ... ADMINISTRATION Federal Travel Regulation (FTR); Relocation Allowances-- Relocation Income Tax Allowance (RITA... provides the annual changes to the RIT allowance tables necessary for calculating the amount of a... RIT allowance tables are located at http://www.gsa.gov/relocationpolicy . DATES: This notice is...

  3. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    Science.gov (United States)

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on the sequencing platforms developed by the current market leader, Illumina. Several approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow the exact quantification that can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
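As a rough illustration of the qPCR quantification step, a library concentration is typically read off a standard curve fit of Ct against log10(concentration). All numbers below are invented for illustration and are not from the protocol:

```python
import numpy as np

# Hypothetical standard dilution series: known concentrations (pM)
# and their measured Ct values (illustrative numbers only)
std_conc = np.array([20.0, 2.0, 0.2, 0.02])
std_ct = np.array([12.1, 15.4, 18.8, 22.1])

# Linear standard curve: Ct = m * log10(conc) + b
m, b = np.polyfit(np.log10(std_conc), std_ct, 1)

# Amplification efficiency implied by the slope: E = 10^(-1/m) - 1
# (close to 1.0 for a well-behaved assay)
efficiency = 10 ** (-1.0 / m) - 1.0

def quantify(ct, dilution_factor=1.0):
    """Interpolate an unknown library's concentration from its Ct.

    `dilution_factor` is a hypothetical correction for any dilution
    made before the qPCR run.
    """
    return dilution_factor * 10 ** ((ct - b) / m)

sample_conc = quantify(17.0)  # concentration in pM, on this toy curve
```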

  4. Quantification of multidimensional entanglement stored in a crystal

    Science.gov (United States)

    Tiranov, Alexey; Designolle, Sébastien; Cruzeiro, Emmanuel Zambrini; Lavoie, Jonathan; Brunner, Nicolas; Afzelius, Mikael; Huber, Marcus; Gisin, Nicolas

    2017-10-01

    The use of multidimensional entanglement opens new perspectives for quantum information processing. However, an important challenge in practice is to certify and characterize multidimensional entanglement from measurement data that are typically limited. Here, we report the certification and quantification of two-photon multidimensional energy-time entanglement between many temporal modes, after one photon has been stored in a crystal. We develop a method for entanglement quantification which makes use of only sparse data obtained with limited resources. This allows us to efficiently certify an entanglement of formation of 1.18 ebits after performing quantum storage. The theoretical methods we develop can be readily extended to a wide range of experimental platforms, while our experimental results demonstrate the suitability of energy-time multidimensional entanglement for a quantum repeater architecture.

  5. Improved perfusion quantification in FAIR imaging by offset correction

    DEFF Research Database (Denmark)

    Sidaros, K; Andersen, I K; Gesmar, H

    2001-01-01

Perfusion quantification using pulsed arterial spin labeling has been shown to be sensitive to the RF pulse slice profiles. Therefore, in Flow-sensitive Alternating-Inversion Recovery (FAIR) imaging the slice-selective (ss) inversion slab is usually three to four times thicker than the imaging slice. However, this reduces perfusion sensitivity due to the increased transit delay of the incoming blood with unperturbed spins. In the present article, the dependence of the magnetization on the RF pulse slice profiles is inspected both theoretically and experimentally. A perfusion quantification model is presented that allows the use of thinner ss inversion slabs by taking into account the offset of RF slice profiles between ss and nonselective inversion slabs. This model was tested in both phantom and human studies. Magn Reson Med 46:193-197, 2001.

  6. Standardless quantification by parameter optimization in electron probe microanalysis

    Energy Technology Data Exchange (ETDEWEB)

    Limandri, Silvina P. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Bonetto, Rita D. [Centro de Investigacion y Desarrollo en Ciencias Aplicadas Dr. Jorge Ronco (CINDECA), CONICET, 47 Street 257, (1900) La Plata (Argentina); Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1 and 47 Streets (1900) La Plata (Argentina); Josa, Victor Galvan; Carreras, Alejo C. [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina); Trincavelli, Jorge C., E-mail: trincavelli@famaf.unc.edu.ar [Instituto de Fisica Enrique Gaviola (IFEG), CONICET (Argentina); Facultad de Matematica, Astronomia y Fisica, Universidad Nacional de Cordoba, Medina Allende s/n, (5016) Cordoba (Argentina)

    2012-11-15

A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively. - Highlights: • A method for standardless quantification in EPMA is presented. • It gives better results than the commercial software GENESIS Spectrum. • It gives better results than the software DTSA. • It allows the determination of the conductive coating thickness. • It gives an estimation for the concentration uncertainties.
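The core of the method, minimizing the quadratic differences between an observed spectrum and a parametric analytical model, can be sketched on a toy spectrum of Gaussian peaks over a linear background. Peak positions, widths, and all numbers are illustrative and do not represent POEMA's actual spectrum model:

```python
import numpy as np
from scipy.optimize import least_squares

energies = np.linspace(1.0, 10.0, 400)  # keV grid

def model(params, e):
    """Toy EPMA-like spectrum: linear bremsstrahlung background plus
    two Gaussian characteristic peaks whose areas stand in for the
    concentration-related parameters (illustrative only)."""
    bg0, bg1, a1, a2 = params
    background = bg0 + bg1 * e
    peak1 = a1 * np.exp(-0.5 * ((e - 3.69) / 0.08) ** 2)
    peak2 = a2 * np.exp(-0.5 * ((e - 6.40) / 0.09) ** 2)
    return background + peak1 + peak2

# Synthetic "experimental" spectrum with noise
true_params = np.array([50.0, -2.0, 400.0, 250.0])
rng = np.random.default_rng(0)
observed = model(true_params, energies) + rng.normal(0.0, 2.0, energies.size)

# Minimize the quadratic differences between observation and prediction
fit = least_squares(lambda p: model(p, energies) - observed,
                    x0=[10.0, 0.0, 100.0, 100.0])
```

In the real method the free parameters are physically meaningful (concentrations, coating thickness, etc.), and the covariance of the fit is what supplies the concentration uncertainties the abstract mentions.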

  7. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    Science.gov (United States)

    2017-11-27

... Computational costs of stochastic collocation at the highest sparse grid level used, given an emulator of the refined computational model ... Research and Development, Phimeca Engineering, and Ingénierie Mathématique et Calcul Scientifique. Users utilize OpenTURNS by writing Python scripts ... levels, that is, the number of discrete values that each parameter can take on. For each parameter, the algorithm computes 2 indices: µ∗, a first-order ...

  8. 45 CFR 1183.22 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Allowable costs. 1183.22 Section 1183.22 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL FOUNDATION ON THE ARTS AND THE... Part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply with...

  9. 45 CFR 1174.22 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false Allowable costs. 1174.22 Section 1174.22 Public Welfare Regulations Relating to Public Welfare (Continued) NATIONAL FOUNDATION ON THE ARTS AND THE... Part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply with...

  10. 77 FR 34218 - Clothing Allowance; Correction

    Science.gov (United States)

    2012-06-11

    ...)(2)(ii) explained that a veteran who uses more than one prosthetic or orthopedic appliance or medication would be eligible for a clothing allowance for each such appliance or medication if each appliance... that a veteran who uses more than one appliance or medication would be eligible for a clothing...

  11. 29 CFR 15.41 - Allowable claims.

    Science.gov (United States)

    2010-07-01

    ... Arising Out of the Operation of the Job Corps § 15.41 Allowable claims. (a)(1) A claim for damage to persons or property arising out of an act or omission of a student enrolled in the Job Corps may be... student enrolled in the Job Corps. (b) A claim for damage to person or property hereunder may not be paid...

  12. Allowance trading: Market operations and regulatory response

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, K.A.; South, D.W.; McDermott, K.A.

    1992-01-01

The use of the SO₂ allowance system as defined by Title IV of the 1990 Clean Air Act Amendments offers utilities greater compliance flexibility than EPA technology standards, State Implementation Plan (SIP) performance standards, or EPA bubble/offset strategies. Traditional methods at best offered a utility the ability to trade emissions between different units at a particular plant. The SO₂ emissions trading system advocated under Title IV will allow a utility to trade emissions across its system, and/or trade emissions between utilities to take advantage of interfirm differences in control costs. The use of transferable emission allowances offers utilities greater flexibility in choosing how to control emissions: the choices include fuel switching, flue gas scrubbing, environmental dispatch, repowering, and even the choice not to control emissions [as long as the New Source Performance Standards (NSPS) and Prevention of Significant Deterioration (PSD) requirements are met]. The added flexibility allows utilities to choose the least-cost manner of compliance with Title IV requirements. It is intended that pollution-control cost minimization by individual utilities will in turn reduce the cost of controlling SO₂ for the electric utility industry in aggregate. In addition, through the use of NOₓ emission averaging, a utility would average NOₓ emissions from different point sources in order to comply with the prescribed emission standard.
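The NOx emission-averaging idea reduces to comparing a heat-input-weighted average emission rate across units against the applicable limit. A toy calculation, where all the unit data and the 0.40 lb/MMBtu limit are assumed for illustration (the actual Title IV averaging rules are more detailed):

```python
# Hypothetical fleet of three boilers at one utility
rates = [0.50, 0.38, 0.28]            # NOx emission rate, lb/MMBtu
heat_input = [800.0, 1200.0, 1000.0]  # heat input, MMBtu/hr

# Heat-input-weighted average NOx rate across the averaged units
avg = sum(r * h for r, h in zip(rates, heat_input)) / sum(heat_input)

# Compliance check against an assumed 0.40 lb/MMBtu standard: the
# fleet complies on average even though unit 1 alone would not
meets_standard = avg <= 0.40
```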

  13. 28 CFR 70.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... indirect costs on the same basis as the allocation of indirect costs to sponsored research and development. The costs of independent research and development, including its proportionate share of indirect costs... allowable as indirect costs. Bid and proposal costs of past accounting periods are unallowable in the...

  14. 44 CFR 13.22 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... HOMELAND SECURITY GENERAL UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND COOPERATIVE AGREEMENTS TO STATE AND LOCAL GOVERNMENTS Post-Award Requirements Financial Administration § 13.22 Allowable costs. (a.... For the costs of a— Use the principles in— State, local or Indian tribal government OMB Circular A-87...

  15. 22 CFR 135.22 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... Procedures, or uniform cost accounting standards that comply with cost principles acceptable to the Federal... costs. Allowable costs will be determined in accordance with the cost principles applicable to the... principles. For the costs of a— Use the principles in— State, local or Indian tribal government OMB Circular...

  16. 34 CFR 80.22 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... CFR part 31. Contract Cost Principles and Procedures, or uniform cost accounting standards that comply... kind of organization, there is a set of Federal principles for determining allowable costs. For the costs of a State, local, or Indian tribal government, the Secretary applies the cost principles in OMB...

  17. 43 CFR 12.62 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... uniform cost accounting standards that comply with cost principles acceptable to the Federal agency. ... COST PRINCIPLES FOR ASSISTANCE PROGRAMS Uniform Administrative Requirements for Grants and Cooperative... increment above allowable costs) to the grantee or subgrantee. (b) Applicable cost principles. For each kind...

  18. 24 CFR 84.27 - Allowable costs.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Allowable costs. 84.27 Section 84.27 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development UNIFORM ADMINISTRATIVE REQUIREMENTS FOR GRANTS AND AGREEMENTS WITH INSTITUTIONS OF HIGHER EDUCATION...

  19. 34 CFR 74.27 - Allowable costs.

    Science.gov (United States)

    2010-07-01

    ... OF HIGHER EDUCATION, HOSPITALS, AND OTHER NON-PROFIT ORGANIZATIONS Post-Award Requirements Financial...— Private nonprofit organization other than (1) An institution of higher education; (2) a hospital; or (3... 34 Education 1 2010-07-01 2010-07-01 false Allowable costs. 74.27 Section 74.27 Education Office...

  20. 77 FR 46987 - Utility Allowances Submetering

    Science.gov (United States)

    2012-08-07

    ... commentator requested that ratio utility billing systems (commonly known as RUBS) be treated like submetering. Unlike submetering, RUBS use a formula that allocates a property's utility bill among its units based on... proposed regulations do not permit utility allowances for RUBS. A commentator recommended that the...

  1. International carbon trade with constrained allowance choices

    NARCIS (Netherlands)

    Yu, S.; Weikard, H.P.; Zhu, X.; Ierland, van E.C.

    2017-01-01

    International carbon markets are advocated in order to involve more countries in an agreement for the mitigation of greenhouse gas emissions and to reduce the costs of mitigation. In this paper we develop a model where allowances are endogenously determined by each member of a carbon trade

  2. Children and Family Finances. Kid's Allowance

    OpenAIRE

    Jones, Sheree; Hayhoe, Celia Ray

    2009-01-01

    There are seven main categories on which the USDA bases its calculations for raising a child: housing, food, transportation, clothing, health care, childcare, education, and miscellaneous goods and services. This is an overview of kid's allowance. This is one fact sheet in a series entitled Children and Family Finances.

  3. Family allowance and family planning in Chile.

    Science.gov (United States)

    Plank, S J

    1978-10-01

Family allowances designed to promote maternal and child health and welfare could be self-defeating if they stimulated otherwise unwanted births, as often assumed. That assumption, with its public health and demographic implications, needs testing. An attempt to test it was made in Chile in 1969-1970 through interviews with 945 wives receiving an allowance and 690 non-recipients. Recipients practiced contraception significantly more than did non-recipients. This was not explained by wives' educational attainment or employment, the couples' earnings, or number of living children, but was associated with a 50 per cent greater utilization of professional prenatal care by recipients during the most recent pregnancy; women with such care (regardless of allowance status) were 75 per cent more likely than others to control their fertility. Prenatal care was probably sought more by recipients in part because an additional stipend was provided as soon as pregnancy was confirmed, usually at clinics with integrated family planning. Greater family income, attributable to the allowance, probably also contributed to the recipients' better prenatal attention and to contraceptive practice. Noteworthy, too, was the finding that with the number of living children controlled, contraceptive practice was significantly greater among couples who had never lost a child.

  4. 42 CFR 417.534 - Allowable costs.

    Science.gov (United States)

    2010-10-01

    ... typical “provider” costs, and costs (such as marketing, enrollment, membership, and operation of the HMO... principles applicable to provider costs, as set forth in § 417.536. (2) The allowability of other costs is determined in accordance with principles set forth in §§ 417.538 through 417.550. (3) Costs for covered...

  5. Measuring and managing ratio compression for accurate iTRAQ/TMT quantification.

    Science.gov (United States)

    Savitski, Mikhail M; Mathieson, Toby; Zinn, Nico; Sweetman, Gavain; Doce, Carola; Becher, Isabelle; Pachl, Fiona; Kuster, Bernhard; Bantscheff, Marcus

    2013-08-02

    Isobaric mass tagging (e.g., TMT and iTRAQ) is a precise and sensitive multiplexed peptide/protein quantification technique in mass spectrometry. However, accurate quantification of complex proteomic samples is impaired by cofragmentation of peptides, leading to systematic underestimation of quantitative ratios. Label-free quantification strategies do not suffer from such an accuracy bias but cannot be multiplexed and are less precise. Here, we compared protein quantification results obtained with these methods for a chemoproteomic competition binding experiment and evaluated the utility of measures of spectrum purity in survey spectra for estimating the impact of cofragmentation on measured TMT-ratios. While applying stringent interference filters enables substantially more accurate TMT quantification, this came at the expense of 30%-60% fewer proteins quantified. We devised an algorithm that corrects experimental TMT ratios on the basis of determined peptide interference levels. The quantification accuracy achieved with this correction was comparable to that obtained with stringent spectrum filters but limited the loss in coverage to <10%. The generic applicability of the fold change correction algorithm was further demonstrated by spiking of chemoproteomics samples into excess amounts of E. coli tryptic digests.
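The correction algorithm itself is not spelled out in the abstract. Under a simplified two-component mixing assumption (co-isolated background at a 1:1 ratio diluting the true signal), ratio decompression looks like this — a sketch of the idea, not the published algorithm:

```python
def correct_tmt_ratio(measured_ratio, interference):
    """Reverse ratio compression from co-isolated background.

    Assumed mixing model (not the published algorithm): with
    co-isolation fraction i and background channels at 1:1,
        measured = (1 - i) * true + i
    so the true ratio is recovered as (measured - i) / (1 - i).
    """
    if not 0.0 <= interference < 1.0:
        raise ValueError("interference must be in [0, 1)")
    return (measured_ratio - interference) / (1.0 - interference)

# Forward-simulate compression of a true 2-fold change at 25%
# interference, then undo it with the correction
compressed = 0.75 * 2.0 + 0.25   # the ratio the TMT scan would report
restored = correct_tmt_ratio(compressed, 0.25)
```

The practical difficulty, as the paper's comparison with spectrum-purity filters suggests, lies in estimating the per-spectrum interference level that feeds such a correction.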

  6. 75 FR 14442 - Federal Travel Regulation (FTR); Relocation Allowances-Relocation Income Tax Allowance (RITA) Tables

    Science.gov (United States)

    2010-03-25

    ... From the Federal Register Online via the Government Publishing Office GENERAL SERVICES ADMINISTRATION Federal Travel Regulation (FTR); Relocation Allowances-- Relocation Income Tax Allowance (RITA) Tables AGENCY: Office of Governmentwide Policy, General Services Administration (GSA). ] ACTION: Notice...

  7. What parents say about the allowance: Function of the allowance for parents of different economic incomes

    OpenAIRE

    Irani Lauer Lellis; Celina Maria Colino Magalhães; Iani Dias Lauer Leite

    2012-01-01

The practice of giving an allowance is used by parents in different parts of the world and can contribute to the economic education of children. This study aimed to investigate the purposes of the allowance with 32 parents of varying incomes. We used the focus group technique and the Alceste software to analyze the data. The results involved two classes related to the process of using the allowance. These classes covered aspects of the allowance's role in socialization and education, serv...

  8. Comparative cytotoxic and spectrophotometric quantification of ...

    African Journals Online (AJOL)

    The comparative cytotoxic and spectrophotometric quantification of phytochemicals of the methanol extracts of the leaf and root bark of Securinega virosa was carried out. Phytochemical screening and spectrophotometric quantification of total flavonoids and phenolics of the extracts were carried out using standard reported ...

  9. Judicial Deference Allows European Consensus to Emerge

    DEFF Research Database (Denmark)

    Dothan, Shai

    2018-01-01

conceived as competing doctrines: the more there is of one, the less there is of another. This paper suggests a novel rationale for the emerging consensus doctrine: the doctrine can allow the ECHR to make good policies by drawing on the independent decision-making of many similar countries. In light of that, the paper demonstrates that a correct application of the margin of appreciation doctrine actually helps emerging consensus reach optimal results, by giving countries an incentive to make their policies independently.

  10. Making It Personal: Per Capita Carbon Allowances

    DEFF Research Database (Denmark)

    Fawcett, Tina; Hvelplund, Frede; Meyer, Niels I

    2009-01-01

The chapter highlights the importance of introducing new, efficient schemes for mitigation of global warming. One such scheme is Personal Carbon Allowances (PCA), whereby individuals are allotted a tradable ration of CO2 emission per year. This chapter reviews the fundamentals of PCA and analyzes its merits and problems. The United Kingdom and Denmark have been chosen as case studies because the energy situation and the institutional setup are quite different between the two countries.

  11. Standardless quantification by parameter optimization in electron probe microanalysis

    Science.gov (United States)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.

  12. Cross recurrence quantification for cover song identification

    Energy Technology Data Exchange (ETDEWEB)

    Serra, Joan; Serra, Xavier; Andrzejak, Ralph G [Department of Information and Communication Technologies, Universitat Pompeu Fabra, Roc Boronat 138, 08018 Barcelona (Spain)], E-mail: joan.serraj@upf.edu

    2009-09-15

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we here propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with a higher accuracy as compared to previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Roessler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.
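The cross recurrence plot underlying the measure has a standard definition: R(i, j) = 1 when the i-th state of one series lies within a threshold ε of the j-th state of the other. A minimal sketch of that construction (the trivial 1-D embedding below is for illustration only; the paper embeds musical descriptor time series):

```python
import numpy as np

def cross_recurrence_plot(x, y, eps):
    """Binary CRP: R[i, j] = 1 when ||x_i - y_j|| <= eps.

    x, y: arrays of shape (n, d) and (m, d) of state-space points
    (e.g. delay embeddings of two descriptor time series); eps is
    the recurrence threshold.
    """
    # Pairwise Euclidean distances via broadcasting: shape (n, m)
    d = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    return (d <= eps).astype(int)

# Two short "descriptor" series, trivially embedded in 1 dimension
x = np.array([[0.0], [1.0], [2.0]])
y = np.array([[0.1], [2.1]])
crp = cross_recurrence_plot(x, y, eps=0.2)
```

The paper's contribution then operates on such a matrix, tracking diagonal-like traces that may be curved or disrupted, which plain diagonal-line statistics would miss.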

  13. Quantification of biological aging in young adults.

    Science.gov (United States)

    Belsky, Daniel W; Caspi, Avshalom; Houts, Renate; Cohen, Harvey J; Corcoran, David L; Danese, Andrea; Harrington, HonaLee; Israel, Salomon; Levine, Morgan E; Schaefer, Jonathan D; Sugden, Karen; Williams, Ben; Yashin, Anatoli I; Poulton, Richie; Moffitt, Terrie E

    2015-07-28

    Antiaging therapies show promise in model organism research. Translation to humans is needed to address the challenges of an aging global population. Interventions to slow human aging will need to be applied to still-young individuals. However, most human aging research examines older adults, many with chronic disease. As a result, little is known about aging in young humans. We studied aging in 954 young humans, the Dunedin Study birth cohort, tracking multiple biomarkers across three time points spanning their third and fourth decades of life. We developed and validated two methods by which aging can be measured in young adults, one cross-sectional and one longitudinal. Our longitudinal measure allows quantification of the pace of coordinated physiological deterioration across multiple organ systems (e.g., pulmonary, periodontal, cardiovascular, renal, hepatic, and immune function). We applied these methods to assess biological aging in young humans who had not yet developed age-related diseases. Young individuals of the same chronological age varied in their "biological aging" (declining integrity of multiple organ systems). Already, before midlife, individuals who were aging more rapidly were less physically able, showed cognitive decline and brain aging, self-reported worse health, and looked older. Measured biological aging in young adults can be used to identify causes of aging and evaluate rejuvenation therapies.

  14. Cross recurrence quantification for cover song identification

    Science.gov (United States)

    Serrà, Joan; Serra, Xavier; Andrzejak, Ralph G.

    2009-09-01

    There is growing evidence that nonlinear time series analysis techniques can be used to successfully characterize, classify, or process signals derived from real-world dynamics even though these are not necessarily deterministic and stationary. In the present study, we proceed in this direction by addressing an important problem our modern society is facing, the automatic classification of digital information. In particular, we address the automatic identification of cover songs, i.e. alternative renditions of a previously recorded musical piece. For this purpose, we propose a recurrence quantification analysis measure that allows the tracking of potentially curved and disrupted traces in cross recurrence plots (CRPs). We apply this measure to CRPs constructed from the state space representation of musical descriptor time series extracted from the raw audio signal. We show that our method identifies cover songs with higher accuracy than previously published techniques. Beyond the particular application proposed here, we discuss how our approach can be useful for the characterization of a variety of signals from different scientific disciplines. We study coupled Rössler dynamics with stochastically modulated mean frequencies as one concrete example to illustrate this point.

  15. Low cost high performance uncertainty quantification

    KAUST Repository

    Bekas, C.

    2009-01-01

    Uncertainty quantification in risk analysis has become a key application. In this context, computing the diagonal of inverse covariance matrices is of paramount importance. Standard techniques, that employ matrix factorizations, incur a cubic cost which quickly becomes intractable with the current explosion of data sizes. In this work we reduce this complexity to quadratic with the synergy of two algorithms that gracefully complement each other and lead to a radically different approach. First, we turned to stochastic estimation of the diagonal. This allowed us to cast the problem as a linear system with a relatively small number of multiple right hand sides. Second, for this linear system we developed a novel, mixed precision, iterative refinement scheme, which uses iterative solvers instead of matrix factorizations. We demonstrate that the new framework not only achieves the much needed quadratic cost but in addition offers excellent opportunities for scaling at massively parallel environments. We based our implementation on BLAS 3 kernels that ensure very high processor performance. We achieved a peak performance of 730 TFlops on 72 BG/P racks, with a sustained performance 73% of theoretical peak. We stress that the techniques presented in this work are quite general and applicable to several other important applications. Copyright © 2009 ACM.
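    The first ingredient, stochastic estimation of the diagonal of an inverse, can be sketched with Rademacher probe vectors: each probe costs one linear solve, so the probes together form a small block of right-hand sides. This toy uses a dense Gauss-Jordan solve in place of the paper's mixed-precision iterative refinement:

```python
import random

def solve(A, b):
    """Dense Gauss-Jordan solve of A x = b (illustration only; at scale
    the paper replaces factorizations with iterative solvers)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def diag_inverse_estimate(A, samples=200, rng=None):
    """Estimate diag(A^-1) as the average of v .* (A^-1 v) over random
    Rademacher probes v (entries +/-1); each probe needs one solve."""
    rng = rng or random.Random(0)
    n = len(A)
    acc = [0.0] * n
    for _ in range(samples):
        v = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        x = solve(A, v)
        acc = [a + vi * xi for a, vi, xi in zip(acc, v, x)]
    return [a / samples for a in acc]

A = [[4.0, 0.0], [0.0, 2.0]]     # diagonal case: the estimator is exact
print(diag_inverse_estimate(A))  # → [0.25, 0.5]
```

    For non-diagonal matrices the estimate carries sampling variance that decays with the number of probes, which is why a modest number of right-hand sides suffices in practice.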

  16. Aspect-Oriented Programming is Quantification and Obliviousness

    Science.gov (United States)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.

  17. ASVCP guidelines: Allowable total error hematology.

    Science.gov (United States)

    Nabity, Mary B; Harr, Kendal E; Camus, Melinda S; Flatland, Bente; Vap, Linda M

    2018-02-11

    The purpose of this document is to provide total allowable error (TEa) recommendations for commonly analyzed hematology measurands for veterinary personnel. These guidelines define relevant terminology and highlight considerations specific to hematology measurands. They also provide reasons and guidelines for using TEa in instrument performance evaluation, including recommendations for when the total observed error exceeds the recommended TEa. Biological variation-based quality specifications are briefly discussed. The appendix describes the derivation of the hematology TEa recommendations and provides resources for external quality assurance/proficiency testing programs and a worksheet for implementation of the guidelines. © 2018 American Society for Veterinary Clinical Pathology.

  18. What parents say about the allowance: Function of the allowance for parents of different economic incomes

    Directory of Open Access Journals (Sweden)

    Irani Lauer Lellis

    2012-06-01

    Full Text Available The practice of giving an allowance is used by parents in different parts of the world and can contribute to the economic education of children. This study aimed to investigate the purposes of the allowance with 32 parents of varying incomes. We used the focus group technique and the Alceste software to analyze the data. The results involved two classes related to the process of using the allowance. These classes covered the socializing and educational role of the allowance, which serves as an instrument of reward but sometimes encourages bad habits in children. Parents' justifications concerning the amount of money to give to children and when to stop giving an allowance were also highlighted.   Keywords: allowance; economic socialization; parenting practices.

  19. Source-Code Instrumentation and Quantification of Events

    Science.gov (United States)

    Filman, Robert E.; Havelund, Klaus; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Aspect Oriented Programming (AOP) is making quantified programmatic assertions over programs that otherwise are not annotated to receive these assertions. Varieties of AOP systems are characterized by which quantified assertions they allow, what they permit in the actions of the assertions (including how the actions interact with the base code), and what mechanisms they use to achieve the overall effect. Here, we argue that all quantification is over dynamic events, and describe our preliminary work in developing a system that maps dynamic events to transformations over source code. We discuss possible applications of this system, particularly with respect to debugging concurrent systems.
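    The idea that quantified assertions range over dynamic events can be illustrated with a small Python sketch: a hypothetical `advise` helper weaves an advice function around every call event of base functions matching a predicate, while the base code stays oblivious:

```python
import functools

def advise(predicate, advice):
    """Quantified assertion: weave `advice` around every call event of
    any function whose name satisfies `predicate`; the base code stays
    oblivious to the weaving."""
    def weave(fn):
        if not predicate(fn.__name__):
            return fn
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            advice("before", fn.__name__)
            result = fn(*args, **kwargs)
            advice("after", fn.__name__)
            return result
        return wrapper
    return weave

events = []
trace = advise(lambda name: name.startswith("compute"),
               lambda kind, name: events.append((kind, name)))

@trace
def compute_total(xs):  # oblivious base code
    return sum(xs)

total = compute_total([1, 2, 3])
print(total, events)  # → 6 [('before', 'compute_total'), ('after', 'compute_total')]
```

    Here the call events are intercepted at run time; the source-code instrumentation the paper describes achieves the corresponding effect by transforming the program text itself.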

  20. Development of a test method that will allow evaluation and quantification of the effects of healing on asphalt mixture [summary].

    Science.gov (United States)

    2012-01-01

    Top-down cracking in flexible pavement is one of the most common and crucial modes of pavement distress in Florida, reducing both service quality and life of flexible pavement. The process begins with micro-cracks (micro-damage), which grow and merge...

  1. AxonQuant: A Microfluidic Chamber Culture-Coupled Algorithm That Allows High-Throughput Quantification of Axonal Damage

    National Research Council Canada - National Science Library

    Li, Yang; Yang, Mengxue; Huang, Zhuo; Chen, Xiaoping; Maloney, Michael T; Zhu, Li; Liu, Jianghong; Yang, Yanmin; Du, Sidan; Jiang, Xingyu; Wu, Jane Y

    2014-01-01

    Published methods for imaging and quantitatively analyzing morphological changes in neuronal axons have serious limitations because of their small sample sizes, and their time-consuming and nonobjective nature...

  2. Quantification of residual host cell DNA in adenoviral vectors produced on PER.C6 cells

    NARCIS (Netherlands)

    Gijsbers, Linda; Koel, Björn; Weggeman, Miranda; Goudsmit, Jaap; Havenga, Menzo; Marzio, Giuseppe

    2005-01-01

    Recombinant adenoviral vectors for gene therapy and vaccination are routinely prepared on cultures of immortalized cells, allowing the production of vector batches of high titer and consistent quality. Quantification of residual DNA from the producing cell line is part of the purity tests for

  3. Spectrophotometric Quantification of Reactive Oxygen, Nitrogen and Sulfur Species in Plant Samples.

    Science.gov (United States)

    Antoniou, Chrystalla; Savvides, Andreas; Georgiadou, Egli C; Fotopoulos, Vasileios

    2018-01-01

    Reactive oxygen, nitrogen and sulfur species are key signalling molecules involved in multiple physiological processes that can be examined in qualitative and quantitative manners. Here, we describe simple spectrophotometric assays that allow the quantification of hydrogen peroxide, nitrite-derived nitric oxide and hydrogen sulphide from plant tissues.

  4. Quantification of virus syndrome in chili peppers | González-Pérez ...

    African Journals Online (AJOL)

    One of the most important problems in producing chili crops is the presence of diseases caused by pathogenic agents such as viruses; there is therefore a substantial need to better predict the behavior of the diseases of these crops by determining a more precise quantification of the disease's syndrome that allows the ...

  5. A K-sample Homogeneity Test based on the Quantification of the p-p Plot

    NARCIS (Netherlands)

    Hinloopen, Jeroen; Wagenvoort, Rien; Marrewijk, van Charles

    2008-01-01

    We propose a quantification of the p-p plot that assigns equal weight to all distances between the respective distributions: the surface between the p-p plot and the diagonal. This surface is labelled the Harmonic Weighted Mass (HWM) index. We introduce the diagonal-deviation (d-d) plot that allows

  6. Spacecraft Maximum Allowable Concentrations for Airborne Contaminants

    Science.gov (United States)

    James, John T.

    2008-01-01

    The enclosed table lists official spacecraft maximum allowable concentrations (SMACs), which are guideline values set by the NASA/JSC Toxicology Group in cooperation with the National Research Council Committee on Toxicology (NRCCOT). These values should not be used for situations other than human space flight without careful consideration of the criteria used to set each value. The SMACs take into account a number of unique factors such as the effect of space-flight stress on human physiology, the uniform good health of the astronauts, and the absence of pregnant or very young individuals. Documentation of the values is given in a 5 volume series of books entitled "Spacecraft Maximum Allowable Concentrations for Selected Airborne Contaminants" published by the National Academy Press, Washington, D.C. These books can be viewed electronically at http://books.nap.edu/openbook.php?record_id=9786&page=3. Short-term (1 and 24 hour) SMACs are set to manage accidental releases aboard a spacecraft and permit risk of minor, reversible effects such as mild mucosal irritation. In contrast, the long-term SMACs are set to fully protect healthy crewmembers from adverse effects resulting from continuous exposure to specific air pollutants for up to 1000 days. Crewmembers with allergies or unusual sensitivity to trace pollutants may not be afforded complete protection, even when long-term SMACs are not exceeded. Crewmember exposures involve a mixture of contaminants, each at a specific concentration (C_n). These contaminants could interact to elicit symptoms of toxicity even though individual contaminants do not exceed their respective SMACs. The air quality is considered acceptable when the toxicity index (T_grp) for each toxicological group of compounds is less than 1, where T_grp is calculated as follows: T_grp = C_1/SMAC_1 + C_2/SMAC_2 + ... + C_n/SMAC_n.
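    As a minimal sketch, the group toxicity index is a sum of concentration-to-limit ratios; the concentrations and SMAC values below are hypothetical, not taken from the published tables:

```python
def toxicity_index(concentrations, smacs):
    """T_grp = sum over n of C_n / SMAC_n for one toxicological group;
    the air quality is considered acceptable when T_grp < 1."""
    return sum(c / s for c, s in zip(concentrations, smacs))

# Hypothetical concentrations and SMAC limits for a two-compound group
T = toxicity_index([0.2, 0.1], [1.0, 0.5])
print(T, "acceptable" if T < 1 else "exceeded")  # → 0.4 acceptable
```

    Note that each compound here is individually well below its SMAC, yet the ratios still accumulate; with enough compounds in a group the index can exceed 1 even when no single limit is violated.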

  7. 42 CFR 61.9 - Payments: Stipends; dependency allowances; travel allowances.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Payments: Stipends; dependency allowances; travel allowances. 61.9 Section 61.9 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.9 Payments: Stipends; dependency...

  8. 40 CFR 82.8 - Grant of essential use allowances and critical use allowances.

    Science.gov (United States)

    2010-07-01

    ... use allowances for pre-plant uses *(kilograms) 2010 critical use allowances for post-harvest uses... of Class I, Group VI controlled substance exclusively for the Pre-Plant or Post-Harvest uses... Calendar Year 2010 (i) Metered Dose Inhalers (for oral inhalation) for Treatment of Asthma and Chronic...

  9. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  10. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  11. Uncertainty Quantification in Aerodynamics Simulations Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  12. Quantification of biofilm exopolysaccharides using an in situ assay with periodic acid-Schiff reagent.

    Science.gov (United States)

    Randrianjatovo-Gbalou, I; Girbal-Neuhauser, E; Marcato-Romain, C-E

    2016-05-01

    A novel approach to the quantification of extracellular polysaccharides in miniaturized biofilms presenting a wide variety of extracellular matrices was developed. The assay used the periodic acid-Schiff reagent and was first calibrated on dextran and alginate solutions. Then it was implemented on 24-h and 48-h biofilms from three strains known to produce different exopolymeric substances (Pseudomonas aeruginosa, Bacillus licheniformis, Weissella confusa). The assay allowed quantification of the total exopolysaccharides, taking into account possible interferences due to cells or other main exopolymers of the matrix (eDNA, proteins). Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Reproducible Research, Uncertainty Quantification, and Verification & Validation

    OpenAIRE

    Barba, Lorena A.

    2014-01-01

    Slides used with my presentation in the SIAM Uncertainty Quantification Conference 2014, Minisymposium on "The Reliability of Computational Research Findings: Reproducible Research, Uncertainty Quantification, and Verification & Validation." The talk used an audience response system to collect True/False or Yes/No opinions on 13 statements/questions: 1) Computer simulations create scientific knowledge.  2) Simulation is a method 3) A reproducible simulation does not need to be acc...

  14. Mixture quantification using PLS in plastic scintillation measurements

    Energy Technology Data Exchange (ETDEWEB)

    Bagan, H.; Tarancon, A.; Rauret, G. [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain); Garcia, J.F., E-mail: jfgarcia@ub.ed [Departament de Quimica Analitica, Universitat de Barcelona, Diagonal 647, E-08028 Barcelona (Spain)

    2011-06-15

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares; PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made with this purpose in mind using liquid scintillation (LS), no attempt was made using PS, which has the great advantage of not producing mixed waste after the measurements are performed. Following this objective, ternary mixtures of alpha and beta emitters (241Am, 137Cs and 90Sr/90Y) have been quantified. Procedure optimisation evaluated the use of the net spectra or the sample spectra, the inclusion of different spectra obtained at different values of the Pulse Shape Analysis parameter and the application of the PLS1 or PLS2 algorithms. The conclusions show that the use of PS+PLS2 applied to the sample spectra, without the use of any pulse shape discrimination, allows quantification of the activities with relative errors below 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required) and does not require detectors that include the pulse shape analysis parameter.
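    The regression step can be illustrated with a minimal single-component PLS1 fit. The paper applies multi-component PLS1/PLS2 to measured scintillation spectra; the toy "spectra" and helper names below are assumptions for illustration:

```python
def pls1_one_component(X, y):
    """One NIPALS step of PLS1: weight w proportional to X^T y
    (normalized), score t = X w, inner coefficient b = (t.y)/(t.t)."""
    n, m = len(X), len(X[0])
    w = [sum(X[i][j] * y[i] for i in range(n)) for j in range(m)]
    norm = sum(wj * wj for wj in w) ** 0.5
    w = [wj / norm for wj in w]
    t = [sum(X[i][j] * w[j] for j in range(m)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, y)) / sum(ti * ti for ti in t)
    return w, b

def predict(x, w, b):
    """Predicted activity for one spectrum x."""
    return b * sum(xj * wj for xj, wj in zip(x, w))

# Toy "spectra" (rows = samples, columns = channels) whose activity
# depends only on channel 0; channel 1 is orthogonal to the signal
X = [[1.0, 1.0], [2.0, -1.0], [3.0, 0.0], [4.0, 0.25]]
y = [2.0, 4.0, 6.0, 8.0]
w, b = pls1_one_component(X, y)
print(predict([5.0, 7.0], w, b))  # → 10.0
```

    Real spectra need several latent components (and PLS2 when several activities are predicted jointly), obtained by deflating X and y and repeating this step.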

  15. Accessible quantification of multiparticle entanglement

    Science.gov (United States)

    Cianciaruso, Marco; Bromley, Thomas R.; Adesso, Gerardo

    2016-10-01

    Entanglement is a key ingredient for quantum technologies and a fundamental signature of quantumness in a broad range of phenomena encompassing many-body physics, thermodynamics, cosmology and life sciences. For arbitrary multiparticle systems, entanglement quantification typically involves nontrivial optimisation problems, and it may require demanding tomographical techniques. Here, we develop an experimentally feasible approach to the evaluation of geometric measures of multiparticle entanglement. Our framework provides analytical results for particular classes of mixed states of N qubits, and computable lower bounds to global, partial, or genuine multiparticle entanglement of any general state. For global and partial entanglement, useful bounds are obtained with minimum effort, requiring local measurements in just three settings for any N. For genuine entanglement, a number of measurements scaling linearly with N are required. We demonstrate the power of our approach to estimate and quantify different types of multiparticle entanglement in a variety of N-qubit states useful for quantum information processing and recently engineered in laboratories with quantum optics and trapped ion setups.

  16. Inverse Problems and Uncertainty Quantification

    KAUST Repository

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  17. Inverse problems and uncertainty quantification

    KAUST Repository

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  18. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider an uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.

  19. Identification of Spectral Regions for Quantification of Red Wine Tannins with Fourier Transform Mid-Infrared Spectroscopy

    DEFF Research Database (Denmark)

    Jensen, Jacob Skibsted; Egebo, Max; Meyer, Anne S.

    2008-01-01

    Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included ... to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69−79 mg of CE/L; r = 0...

  20. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    Science.gov (United States)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method, which ultimately complicates existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs commonly exist at relatively low concentrations, amplification methods (e.g. PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  1. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  2. Normalizing computed tomography data reconstructed with different filter kernels: effect on emphysema quantification.

    Science.gov (United States)

    Gallardo-Estrella, Leticia; Lynch, David A; Prokop, Mathias; Stinson, Douglas; Zach, Jordan; Judy, Philip F; van Ginneken, Bram; van Rikxoort, Eva M

    2016-02-01

    To propose and evaluate a method to reduce variability in emphysema quantification among different computed tomography (CT) reconstructions by normalizing CT data reconstructed with varying kernels. We included 369 subjects from the COPDGene study. For each subject, spirometry and a chest CT reconstructed with two kernels were obtained using two different scanners. Normalization was performed by frequency band decomposition with hierarchical unsharp masking to standardize the energy in each band to a reference value. Emphysema scores (ES), the percentage of lung voxels below -950 HU, were computed before and after normalization. Bland-Altman analysis and correlation between ES and spirometry before and after normalization were compared. Two mixed cohorts, containing data from all scanners and kernels, were created to simulate heterogeneous acquisition parameters. The average difference in ES between kernels decreased for the scans obtained with both scanners after normalization (7.7 ± 2.7 to 0.3 ± 0.7; 7.2 ± 3.8 to -0.1 ± 0.5). Correlation coefficients between ES and FEV1, and FEV1/FVC increased significantly for the mixed cohorts. Normalization of chest CT data reduces variation in emphysema quantification due to reconstruction filters and improves correlation between ES and spirometry. • Emphysema quantification is sensitive to the reconstruction kernel used. • Normalization allows comparison of emphysema quantification from images reconstructed with varying kernels. • Normalization allows comparison of emphysema quantification obtained with scanners from different manufacturers. • Normalization improves correlation of emphysema quantification with spirometry. • Normalization can be used to compare data from different studies and centers.
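    The normalization idea, splitting a signal into frequency bands by repeated smoothing and rescaling each band's energy to a reference, can be sketched in 1-D. The moving-average filter and the parameter values are illustrative assumptions, not the published method:

```python
import math

def smooth(signal, radius):
    """Moving-average low-pass filter (a stand-in for a smoothing kernel)."""
    n, out = len(signal), []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def band_decompose(signal, radii):
    """Hierarchical unsharp masking: each band is the detail removed by
    one more smoothing step; all bands plus the final low-pass component
    sum back to the input signal."""
    bands, current = [], list(signal)
    for r in radii:
        low = smooth(current, r)
        bands.append([c - l for c, l in zip(current, low)])
        current = low
    bands.append(current)
    return bands

def normalize_band_energies(bands, ref_energies):
    """Rescale every band so its L2 energy matches a reference value."""
    out = []
    for band, ref in zip(bands, ref_energies):
        e = math.sqrt(sum(b * b for b in band))
        out.append([b * (ref / e) for b in band] if e else list(band))
    return out

signal = [math.sin(0.5 * i) + 0.05 * i for i in range(64)]
bands = band_decompose(signal, radii=[1, 4])
normed = normalize_band_energies(bands, ref_energies=[1.0, 1.0, 1.0])
normalized_signal = [sum(vals) for vals in zip(*normed)]
```

    Because every scan is mapped to the same per-band energies, images reconstructed with sharp and soft kernels become directly comparable before thresholding at -950 HU.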

  3. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    Science.gov (United States)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. Accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and enhanced conventional toxicology research by providing a direct correlation between GBN uptake at the single-cell level and cell viability status.

  4. Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.

    Science.gov (United States)

    Mujika, Iñigo

    2017-04-01

    Training quantification is basic to evaluate an endurance athlete's responses to training loads, ensure adequate stress/recovery balance, and determine the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.

  5. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) play increasingly important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients occurring in a complex nuclear reactor system. Traditional V&V in reactor system analysis has focused more on the validation part, or has not differentiated verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach: the simulation tool is treated as an unknown signal generator, a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties, and it is also inefficient for sensitivity analysis. In contrast to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In such approaches, equations for the propagation of uncertainty are constructed and the sensitivities are solved for directly as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended into a method for quantifying numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time-step and spatial-step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification. By knowing the relative sensitivity of time and space steps with other interested physical
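As a minimal illustration of the forward sensitivity idea described above — not the paper's reactor-system code — the sensitivity s = dx/dp of the state with respect to a parameter can be integrated alongside the state itself. For dx/dt = f(x, p), s obeys ds/dt = (∂f/∂x)·s + ∂f/∂p. A toy decay equation makes this concrete:

```python
import math

def forward_sensitivity_decay(x0, p, t_end, n_steps):
    """Integrate dx/dt = -p*x together with its sensitivity s = dx/dp,
    which obeys ds/dt = (df/dx)*s + df/dp = -p*s - x (forward Euler)."""
    dt = t_end / n_steps
    x, s = x0, 0.0  # x(0) = x0 does not depend on p, so s(0) = 0
    for _ in range(n_steps):
        x, s = x + dt * (-p * x), s + dt * (-p * s - x)
    return x, s

x, s = forward_sensitivity_decay(x0=1.0, p=0.5, t_end=1.0, n_steps=10000)
# Analytical values: x = exp(-p*t), s = dx/dp = -t*exp(-p*t)
print(abs(x - math.exp(-0.5)) < 1e-3, abs(s + math.exp(-0.5)) < 1e-3)  # -> True True
```

Treating the time step dt itself as an extra sensitivity parameter, as the abstract proposes, follows the same pattern with an additional sensitivity equation.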

  6. 3D automatic quantification applied to optically sectioned images to improve microscopy analysis

    Directory of Open Access Journals (Sweden)

    JE Diaz-Zamboni

    2009-08-01

    Full Text Available New fluorescence microscopy techniques, such as confocal or digital deconvolution microscopy, make it easy to obtain three-dimensional (3D) information from specimens. However, there are few 3D quantification tools for extracting information from these volumes, so the amount of information acquired by these techniques is difficult to manipulate and analyze manually. The present study describes a model-based method which, for the first time, shows 3D visualization and quantification of fluorescent apoptotic body signals from optical serial sections of porcine hepatocyte spheroids, correlating them to their morphological structures. The method consists of an algorithm that counts apoptotic bodies in a spheroid structure and extracts information from them, such as their centroids in Cartesian and radial coordinates relative to the spheroid centre, and their integrated intensity. 3D visualization of the extracted information allowed us to quantify the distribution of apoptotic bodies in three different zones of the spheroid.
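The centroid bookkeeping described above — Cartesian positions converted to radial distances from the spheroid centre — can be sketched as follows (the coordinates are illustrative, not the study's data):

```python
import math

def radial_coordinates(centroids, centre):
    """Convert apoptotic-body centroids (x, y, z) to radial distances from
    the spheroid centre, as used for zone-wise distribution counts."""
    cx, cy, cz = centre
    return [math.dist((x, y, z), (cx, cy, cz)) for x, y, z in centroids]

# Illustrative centroids around a spheroid centred at the origin
bodies = [(3.0, 4.0, 0.0), (0.0, 0.0, 5.0), (1.0, 2.0, 2.0)]
print(radial_coordinates(bodies, (0.0, 0.0, 0.0)))  # -> [5.0, 5.0, 3.0]
```

Binning these radial distances into three shells would give the per-zone apoptotic-body counts the abstract mentions.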

  7. Separation and quantification of microalgal carbohydrates.

    Science.gov (United States)

    Templeton, David W; Quinn, Matthew; Van Wychen, Stefanie; Hyman, Deborah; Laurens, Lieve M L

    2012-12-28

    Structural carbohydrates can constitute a large fraction of the dry weight of algal biomass, and thus accurate identification and quantification is important for summative mass closure. Two limitations to the accurate characterization of microalgal carbohydrates are the lack of a robust analytical procedure to hydrolyze polymeric carbohydrates to their respective monomers and the subsequent identification and quantification of those monosaccharides. We address the second limitation, chromatographic separation of monosaccharides, here by identifying optimum conditions for the resolution of a synthetic mixture of 13 microalgae-specific monosaccharides, comprising 8 neutral sugars, 2 amino sugars, 2 uronic acids and 1 alditol (myo-inositol as an internal standard). The synthetic 13-carbohydrate mix showed incomplete resolution across 11 traditional high performance liquid chromatography (HPLC) methods, but showed improved resolution and accurate quantification using high-performance anion exchange chromatography (HPAEC), as well as alditol acetate derivatization followed by gas chromatography (for the neutral and amino sugars only). We demonstrate the application of monosaccharide quantification using optimized chromatography conditions after sulfuric acid analytical hydrolysis for three model algae strains, and compare the quantification and complexity of monosaccharides in analytical hydrolysates relative to a typical terrestrial feedstock, sugarcane bagasse. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Technological and Analytical Methods for Arabinoxylan Quantification from Cereals.

    Science.gov (United States)

    Döring, Clemens; Jekle, Mario; Becker, Thomas

    2016-01-01

    Arabinoxylan (AX) is the major non-starch polysaccharide contained in various types of grain. AX consists of a backbone of β-(1,4)-D-xylopyranosyl residues with randomly linked α-L-arabinofuranosyl units. Once isolated and included as a food additive, AX affects foodstuff attributes and has positive effects on human health. AX can be classified into water-extractable and water-unextractable AX. For isolating AX from its natural matrix, a range of methods has been developed, adapted, and improved. This review presents a survey of the commonly used extraction methods for AX and the influence of the different techniques. It also provides a brief overview of the structural and technological impact of AX as a dough additive. A concluding section summarizes different detection methods for the analysis and quantification of AX.

  9. 2012 Global Management Education Graduate Survey. Survey Report

    Science.gov (United States)

    Leach, Laura

    2012-01-01

    Each year for the past 13 years, the Graduate Management Admission Council (GMAC) has conducted a survey of graduate management education students in their final year of business school. The Global Management Education Graduate Survey is distributed to students at participating schools. The survey allows students to express their opinions about…

  10. Global Management Education Graduate Survey, 2011. Survey Report

    Science.gov (United States)

    Schoenfeld, Gregg

    2011-01-01

    Each year for the past 12 years, the Graduate Management Admission Council[R] (GMAC[R]) has conducted a survey of graduate management education students in their final year of business school. This Global Management Education Graduate Survey is distributed to students at participating business schools. The survey allows students to express their…

  11. Comparison of five DNA quantification methods

    DEFF Research Database (Denmark)

    Nielsen, Karsten; Mogensen, Helle Smidt; Hedman, Johannes

    2008-01-01

    Six commercial preparations of human genomic DNA were quantified using five quantification methods: UV spectrometry, SYBR-Green dye staining, slot blot hybridization with the probe D17Z1, the Quantifiler Human DNA Quantification kit and RB1 rt-PCR. All methods measured higher DNA concentrations than expected based on the information given by the manufacturers. UV spectrometry, SYBR-Green dye staining, slot blot and RB1 rt-PCR gave 39, 27, 11 and 12%, respectively, higher concentrations than expected based on the manufacturers' information. The DNA preparations were quantified using the Quantifiler Human DNA Quantification kit in two experiments. The measured DNA concentrations with Quantifiler were 125 and 160% higher than expected based on the manufacturers' information. When the Quantifiler human DNA standard (Raji cell line) was replaced by the commercial human DNA preparation G147A (Promega) to generate the DNA...

  12. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms based on wavefield data analysis. Trapped guided waves in the delamination region are observed in the wavefield data and further quantitatively interpreted using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only of detecting the presence of delamination damage but also of estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
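The frequency-wavenumber representation referred to above is typically obtained with a two-dimensional Fourier transform of the measured wavefield u(t, x). A minimal sketch on a synthetic single-mode wavefield (the paper's actual data and processing chain are not reproduced here):

```python
import numpy as np

# Synthetic wavefield u(t, x): one propagating mode with 5 spatial cycles
# and 10 temporal cycles across the sampling window (unit sample spacing).
nt, nx = 64, 64
m, n = np.arange(nt)[:, None], np.arange(nx)[None, :]
u = np.cos(2 * np.pi * (5 * n / nx - 10 * m / nt))

# Frequency-wavenumber representation: 2-D FFT over time and space.
U = np.abs(np.fft.fft2(u))
freqs = np.fft.fftfreq(nt)        # temporal frequency, cycles/sample
wavenumbers = np.fft.fftfreq(nx)  # spatial frequency, cycles/sample

# The dominant spectral peak recovers the mode's frequency and wavenumber.
i, j = np.unravel_index(np.argmax(U), U.shape)
print(abs(freqs[i]) * nt, abs(wavenumbers[j]) * nx)  # -> 10.0 5.0
```

In a delaminated region, additional peaks (new wavenumbers) would appear in U at the same frequency, which is the signature the paper exploits for detection and sizing.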

  13. Cell-based quantification of molecular biomarkers in histopathology specimens.

    Science.gov (United States)

    Al-Kofahi, Yousef; Lassoued, Wiem; Grama, Kedar; Nath, Sumit K; Zhu, Jianliang; Oueslati, Ridha; Feldman, Michael; Lee, William M F; Roysam, Badrinath

    2011-07-01

    To investigate the use of a computer-assisted technology for objective, cell-based quantification of molecular biomarkers in specified cell types in histopathology specimens, with the aim of advancing current visual estimation and pixel-level (rather than cell-based) quantification methods. Tissue specimens were multiplex-immunostained to reveal cell structures, cell type markers, and analytes, and imaged with multispectral microscopy. The image data were processed with novel software that automatically delineates and types each cell in the field, measures morphological features, and quantifies analytes in different subcellular compartments of specified cells. The methodology was validated with the use of cell blocks composed of differentially labelled cultured cells mixed in known proportions, and evaluated on human breast carcinoma specimens for quantifying human epidermal growth factor receptor 2, estrogen receptor, progesterone receptor, Ki67, phospho-extracellular signal-related kinase, and phospho-S6. Automated cell-level analyses closely matched human assessments, but, predictably, differed from pixel-level analyses of the same images. Our method reveals the type, distribution, morphology and biomarker state of each cell in the field, and allows multiple biomarkers to be quantified over specified cell types, regardless of their abundance. It is ideal for studying specimens from patients in clinical trials of targeted therapeutic agents, for investigating minority stromal cell subpopulations, and for phenotypic characterization to personalize therapy and prognosis. © 2011 Blackwell Publishing Limited.

  14. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application, and uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words makes PIRT flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory, which should be considered and performed in the earliest stages of UQ.

  15. Detection and quantification of hydrocarbons in sediments

    Science.gov (United States)

    Wynn, Jeff; Williamson, Mike; Frank, Jeff

    2016-01-01

    A new technology developed by the US Geological Survey now allows for fast, direct detection of hydrocarbon plumes both in rivers and drifting in the deep ocean. Recent experiments show that the method can also detect and quantify hydrocarbons buried in river sediments and estuaries. This approach uses a variant of induced polarization, a surface-sensitive physical property of certain polarizable materials immersed in an electrolyte that can accept and adsorb charge under an inducing voltage. Known polarizable materials include most sulfides, ilmenite (FeTiO3), metallic objects such as buried wrecks and pipelines, and now hydrocarbons. The hydrocarbon-in-water response to induced polarization is in fact nearly two orders of magnitude greater than the IP response of any of the hard minerals. The oil:water detection limit for hydrocarbons so far is down to 0.0002% in the laboratory.

  16. Predicting human age with bloodstains by sjTREC quantification.

    Directory of Open Access Journals (Sweden)

    Xue-ling Ou

    Full Text Available The age-related decline of signal joint T-cell receptor rearrangement excision circles (sjTRECs) in human peripheral blood has been demonstrated in our previous study and other reports. Until now, only a few studies on sjTREC detection in bloodstain samples have been reported, and these were based on small samples of subjects of a limited age range, although bloodstains are much more frequently encountered in forensic practice. In the present study, we adopted the sensitive TaqMan real-time quantitative polymerase chain reaction (qPCR) method to perform sjTREC quantification in bloodstains from individuals ranging from 0-86 years old (n = 264). The results revealed that sjTREC contents in human bloodstains declined in an age-dependent manner (r = -0.8712). The formula for age estimation was Age = -7.1815Y - 42.458 ± 9.42, where Y = dCt(TBP - sjTREC) and 9.42 is the standard error. Furthermore, we tested for the influence of short or long storage time by analyzing fresh and stored bloodstains from the same individuals. Remarkably, no statistically significant difference in sjTREC contents was found between the fresh and old DNA samples over a 4-week storage time. However, a significant loss (0.16-1.93 dCt) in sjTREC contents was detected after 1.5 years of storage in 31 samples. Moreover, preliminary sjTREC quantification from bloodstains up to 20 years old showed that, though the sjTREC contents were detectable in all samples and highly correlated with donor age, a time-dependent decrease in the correlation coefficient r was found, suggesting that the predictive accuracy of the described assay would deteriorate in aged samples. Our findings show that sjTREC quantification may also be suitable for age prediction in bloodstains, and future research into the time-dependent or other potential impacts on sjTREC quantification might allow further improvement of the predictive accuracy.
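The reported regression can be applied directly; a small sketch using the coefficients and standard error quoted in the abstract (the input dCt value is illustrative, not a measured sample):

```python
def estimate_age_from_sjtrec(dct_tbp_sjtrec, se=9.42):
    """Age estimate from the regression reported in the abstract:
    Age = -7.1815*Y - 42.458 (+/- 9.42 years standard error),
    where Y is dCt(TBP - sjTREC)."""
    age = -7.1815 * dct_tbp_sjtrec - 42.458
    return age, (age - se, age + se)

# Illustrative dCt value of -10.0
age, (lo, hi) = estimate_age_from_sjtrec(-10.0)
print(round(age, 2))  # -> 29.36
```

The ± interval reflects only the regression's standard error; as the abstract notes, accuracy degrades further for long-stored bloodstains.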

  17. Quantification model for energy consumption in edification

    Directory of Open Access Journals (Sweden)

    Mercader, Mª P.

    2012-12-01

    Full Text Available The research conducted in this paper focuses on the generation of a model for the quantification of energy consumption in building construction, through one of the most relevant environmental impact indicators associated with the weight per m2 of construction: the energy consumption resulting from the manufacturing process of the materials used. The practical application of the proposed model to different building typologies in Seville provides information regarding the most significant building materials, subsystems and construction elements, making it possible to observe the impact the built surface has on the environment. The results obtained aim to serve as a reference for the scientific community, providing quantitative data comparable to other building types and geographical areas. They may also allow the analysis and characterization of feasible solutions to reduce the environmental impact generated by the different materials, subsystems and construction elements commonly used in the building types defined in this study.


  18. 49 CFR 230.24 - Maximum allowable stress.

    Science.gov (United States)

    2010-10-01

    § 230.24 Maximum allowable stress. (a) Maximum allowable stress value. The maximum allowable stress value on any component of a steam locomotive boiler shall not exceed 1/4 of the ultimate...

  19. 40 CFR 73.27 - Special allowance reserve.

    Science.gov (United States)

    2010-07-01

    § 73.27 Special allowance reserve. (a) Establishment of Reserve. (1) The Administrator will allocate 150,000 allowances annually for calendar years...

  20. 40 CFR 96.42 - NOX allowance allocations.

    Science.gov (United States)

    2010-07-01

    § 96.42 NOX allowance allocations. (a)(1) The heat input (in mmBtu) used for calculating NOX allowance allocations for each NOX Budget unit under § 96.4 will be: (i) For a NOX allowance...

  1. 40 CFR 73.30 - Allowance tracking system accounts.

    Science.gov (United States)

    2010-07-01

    § 73.30 Allowance tracking system accounts. ... for all affected sources pursuant to § 73.31 (a) and (b). All allocations of allowances pursuant to...

  2. 20 CFR 218.30 - Separation, displacement or dismissal allowance.

    Science.gov (United States)

    2010-04-01

    § 218.30 Separation, displacement or dismissal allowance. (a) General. When an employee receives a separation, displacement or dismissal allowance from a... ... allowance, the employee gives up his or her job rights. Regardless of whether a separation allowance is paid...

  3. 5 CFR 591.307 - Payment of allowance rate.

    Science.gov (United States)

    2010-01-01

    § 591.307 Payment of allowance rate. ... regardless of eligibility for the transportation expense part of the allowance rate when the employee is...

  4. Survey results for oblique field magnetic flux leakage survey in comparison to axial field

    Energy Technology Data Exchange (ETDEWEB)

    Simek, James [T.D. Williamson, Inc., Tulsa, OK (United States)

    2012-07-01

    Pipeline operators worldwide have implemented integrity management programs in an effort to improve operation and maintenance efficiency along with continued safe operation of pipeline systems. Several types of monitoring and data collection activities are incorporated into these programs, with in line inspection (ILI) tools providing data for detection and quantification of features that may impact the integrity of the pipeline system. Magnetic flux leakage (MFL) ILI tools are among the most widely used in pipeline systems. Primarily used for metal loss detection and quantification, these tools are extremely robust, performing successfully in the harsh environments found in operating pipelines, with the majority of MFL tools in service today relying upon axially oriented magnetic fields. For feature classes whose principal axis is aligned parallel to the pipe axis, the use of an axially applied magnetic field may quite often result in decreased performance due to difficulties in detection and sizing. Through the use of fields applied either perpendicular or in an oblique direction to the principal axis, the magnetic leakage levels generated at feature locations are increased, providing usable signal levels. When used concurrently with an axially oriented magnetizer, an obliquely applied magnetic field may provide the ability to detect, quantify, or otherwise aid in discrimination of volumetric versus non-volumetric features. Providing the ability to collect both of these data sets in a single survey would allow operators to minimize the number of surveys required to address all categories of metal loss features that may be present within pipeline systems. This paper will discuss some of the variables that affect detection and sizing of metal loss zones with respect to the applied field direction, including graphs and tables to quantify the effects of angular displacement for specific feature shapes. Several classes of features have been chosen for evaluation

  5. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    Science.gov (United States)

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are recognized as a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber counts have been found to be underestimated with this approach. We propose modifications of these methods that allow microplastics in bottom sediments, including small fibers, to be analyzed. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as reported in neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Quantification of benzoxazinoids and their metabolites in Nordic breads

    DEFF Research Database (Denmark)

    Dihm, Katharina; Vendelbo Lind, Mads; Sundén, Henrik

    2017-01-01

    ) and 2-hydroxy-N-(2-hydroxyphenyl) acetamide (HHPAA-d4) were synthesized, to allow quantification of nine Bx and their metabolites in 30 breads and flours from Nordic countries by UHPLC-MS/MS. Samples containing rye had larger amounts of Bx (143–3560 µg/g DM) than the ones containing wheat (11–449 µg/g DM). More Bx were found in whole grain wheat (57–449 µg/g DM) compared to refined wheat (11–92 µg/g DM) breads. Finnish sourdough rye breads were notably high in their 2-hydroxy-N-(2-hydroxyphenyl) acetamide (HHPAA) concentration (40–48 µg/g DM). This new information on Bx content in flours...

  7. Differential reproductive timing in Echinocardium spp.: the first Mediterranean survey allows interoceanic and interspecific comparisons.

    Science.gov (United States)

    Egea, Emilie; Mérigot, Bastien; Mahé-Bézac, Chantal; Féral, Jean-Pierre; Chenuil, Anne

    2011-01-01

    Echinocardium cordatum had long been considered cosmopolitan, but molecular data revealed that it is a complex of cryptic species, with two non-hybridizing species (B1 and B2) in the Mediterranean Sea living in syntopy with Echinocardium mediterraneum. Histological analyses of the gonads over a 17-month sampling period revealed a statistically significant time lag between the Maturity Indices of E. cordatum and E. mediterraneum. The main environmental stimulus may differ between the two nominal species: possibly seawater temperature for E. cordatum and chlorophyll a concentration for E. mediterraneum. Within the E. cordatum complex, spawning timing and synchrony differ according to major geographic areas (Atlantic/Pacific/Mediterranean) and/or the corresponding genetic subdivision [A/P/(B1 & B2)]. In contrast, the effects of temperature on the reproductive cycle seem to mirror the genetic lineages rather than the environmental similarities of the different localities. Between the sister species (B1 & B2) no differences could be detected, possibly due to small sample sizes. Copyright © 2010 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  8. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    Energy Technology Data Exchange (ETDEWEB)

    Paulson, Patrick R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Purohit, Sumit [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rodriguez, Luke R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We also describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  9. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm provides information about emphysema from CT. We therefore propose a new, non-density-based measure of the curvature of the diaphragm that allows for further quantification methods in a robust manner. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of lung height and diaphragm width to diaphragm height as curvature estimates, with the emphysema index used for comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
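The comparison above rests on Pearson correlation coefficients between curvature estimates and pulmonary function values; a minimal sketch of that computation (the paired numbers are illustrative, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Illustrative paired measurements (curvature estimate vs. DLCO%)
curvature = [1.0, 2.0, 3.0, 4.0]
dlco = [2.0, 4.0, 6.0, 8.0]
print(pearson_r(curvature, dlco))  # -> 1.0
```

In practice `scipy.stats.pearsonr` would also return the associated p-value.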

  10. Quantification of virus syndrome in chili peppers

    African Journals Online (AJOL)

    Jane

    2011-06-15

    Jun 15, 2011 ... One of the most important problems in producing chili crops is the presence of diseases caused by pathogenic agents, such as viruses; therefore, there is a substantial need to better predict the behavior of the diseases of these crops by determining a more precise quantification of the disease's syndrome that ...

  11. The quantificational asymmetry: A comparative look

    NARCIS (Netherlands)

    van Koert, M.; Koeneman, O.; Weerman, F.; Hulk, A.

    2015-01-01

    The traditional account of the Delay of Principle B Effect (DPBE) predicts that all languages that show a DPBE will also reveal a Quantificational Asymmetry (QA). Children's performance on object-pronouns must therefore improve when a QP-subject replaces the NP-subject. These QA results have been

  12. Quantification of glycyrrhizin biomarker in Glycyrrhiza glabra ...

    African Journals Online (AJOL)

    Background: A simple and sensitive thin-layer chromatographic method has been established for quantification of glycyrrhizin in Glycyrrhiza glabra rhizome and baby herbal formulations by validated Reverse Phase HPTLC method. Materials and Methods: RP-HPTLC Method was carried out using glass coated with RP-18 ...

  13. Colour thresholding and objective quantification in bioimaging

    Science.gov (United States)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
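
    The colour-thresholding idea described above can be sketched as a per-channel bounds test that produces a binary mask (an illustrative sketch only, not the authors' system; the pixel tuples and bounds are hypothetical):

```python
def colour_threshold(pixels, lo, hi):
    """Return a binary mask: True where each (R, G, B) channel of a pixel
    falls within its [lo, hi] bound, as in colour thresholding of
    histochemically stained tissue."""
    return [all(l <= c <= h for c, l, h in zip(px, lo, hi))
            for px in pixels]
```

    Unlike monochrome densitometry, which collapses hues to one intensity axis, the three independent bounds can separate closely related stains.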

  14. Quantification of Cannabinoid Content in Cannabis

    Science.gov (United States)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
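
    The band-selection step can be illustrated with a simple correlation screen (a hedged stand-in for the stepwise multivariate regression actually used in the study; the data layout and values are hypothetical):

```python
def best_band(reflectance_by_band, thc):
    """Pick the waveband whose reflectance correlates most strongly
    (in absolute value) with the measured THC content of the samples.
    reflectance_by_band maps a wavelength (nm) to per-sample reflectances."""
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = (sum((x - mx) ** 2 for x in xs)
               * sum((y - my) ** 2 for y in ys)) ** 0.5
        return num / den
    return max(reflectance_by_band,
               key=lambda b: abs(corr(reflectance_by_band[b], thc)))
```

    A full stepwise regression would additionally test bands jointly and drop redundant ones; the single-band screen above only ranks them individually.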

  15. FRANX. Application for analysis and quantification of the APS fire; FRANX. Aplicacion para el analisis y cuantificacion de los APS de incendios

    Energy Technology Data Exchange (ETDEWEB)

    Sánchez, A.; Osorio, F.; Ontoso, N.

    2014-07-01

    The FRANX application has been developed by EPRI within the Risk and Reliability User Group in order to facilitate the process of quantifying and updating the fire APS (the application also covers floods and earthquakes). Using the application, fire scenarios are quantified at the plant, integrating the tasks performed during the fire APS. This paper describes the main features of the program that allow quantification of a fire APS. (Author)

  16. 17 CFR 190.07 - Calculation of allowed net equity.

    Science.gov (United States)

    2010-04-01

    ... equity. 190.07 Section 190.07 Commodity and Securities Exchanges COMMODITY FUTURES TRADING COMMISSION BANKRUPTCY § 190.07 Calculation of allowed net equity. Allowed net equity shall be computed as follows: (a) Allowed claim. The allowed net equity claim of a customer shall be equal to the aggregate of the funded...

  17. 25 CFR 117.4 - Disbursement of allowance funds.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Disbursement of allowance funds. 117.4 Section 117.4... COMPETENCY § 117.4 Disbursement of allowance funds. Except as provided in § 117.5, all allowance funds shall... may recognize a power of attorney executed by the Indian and may disburse the allowance funds of the...

  18. 7 CFR 97.102 - Amendments after allowance.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Amendments after allowance. 97.102 Section 97.102... PLANT VARIETY AND PROTECTION Examinations, Allowances, and Denials § 97.102 Amendments after allowance. Amendments to the application, after the notice of allowance is issued, may be made, if the certificate has...

  19. 38 CFR 21.4145 - Work-study allowance.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Work-study allowance. 21...; Educational Assistance Allowance § 21.4145 Work-study allowance. (a) Eligibility. (1) A veteran or reservist... rate of three-quarter time or full time is eligible to receive a work-study allowance. (2) An eligible...

  20. 38 CFR 21.5131 - Educational assistance allowance.

    Science.gov (United States)

    2010-07-01

    ... allowance. 21.5131 Section 21.5131 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... 38 U.S.C. Chapter 32 Payments; Educational Assistance Allowance § 21.5131 Educational assistance allowance. (a) General. Statements in this section concerning payments of educational assistance allowance...

  1. 7 CFR 97.101 - Notice of allowance.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Notice of allowance. 97.101 Section 97.101 Agriculture... PLANT VARIETY AND PROTECTION Examinations, Allowances, and Denials § 97.101 Notice of allowance. If, on examination, it shall appear that the applicant is entitled to a certificate, a notice of allowance shall be...

  2. 38 CFR 21.268 - Employment adjustment allowance.

    Science.gov (United States)

    2010-07-01

    ... allowance. 21.268 Section 21.268 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS.... Chapter 31 Monetary Assistance Services § 21.268 Employment adjustment allowance. (a) General. A veteran... employment adjustment allowance for a period of two months at the full-time subsistence allowance rate for...

  3. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    Directory of Open Access Journals (Sweden)

    Ruolin Liu

    2017-11-01

    Full Text Available We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but share the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of the splicing graphs to transcripts. Strawberry simultaneously estimates transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry had the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14 and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
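
    The core difficulty the quantification module addresses, assigning reads that are compatible with several transcripts, can be illustrated with a generic EM sketch (this is not Strawberry's latent class model; the compatibility sets are hypothetical):

```python
def em_abundance(read_compat, n_iter=100):
    """Generic EM for transcript abundance: in the E-step each read is
    fractionally assigned to its compatible transcripts in proportion to
    the current abundances; the M-step re-estimates abundances from the
    fractional counts. read_compat[i] is the set of transcript ids that
    read i is compatible with."""
    tids = sorted({t for compat in read_compat for t in compat})
    theta = {t: 1.0 / len(tids) for t in tids}  # uniform start
    for _ in range(n_iter):
        counts = {t: 0.0 for t in tids}
        for compat in read_compat:             # E-step
            z = sum(theta[t] for t in compat)
            for t in compat:
                counts[t] += theta[t] / z
        total = sum(counts.values())           # M-step
        theta = {t: c / total for t, c in counts.items()}
    return theta
```

    Reads unique to one transcript anchor its abundance, which in turn pulls the shared reads toward it over the iterations.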

  4. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms to allow external developers to implement their own quantification methods easily and without the need of paying for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold-standard was obtained (R(2)>0.8 and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
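
    For illustration of the non-fitted case mentioned above, a common perfusion summary is the area under the first-pass contrast concentration curve; a minimal sketch (not the plugin's code; the time and concentration arrays are hypothetical):

```python
def relative_cbv(times, concentration):
    """Relative cerebral blood volume estimated as the trapezoidal area
    under the first-pass contrast concentration curve (no gamma-variate
    fitting applied)."""
    return sum((t1 - t0) * (c0 + c1) / 2
               for (t0, c0), (t1, c1) in zip(zip(times, concentration),
                                             zip(times[1:], concentration[1:])))
```

    A gamma-variate fit would instead integrate a fitted curve, suppressing recirculation effects at the cost of a model assumption.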

  5. Optical coherence tomography assessment and quantification of intracoronary thrombus: Status and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Porto, Italo, E-mail: italo.porto@gmail.com [Interventional Cardiology Unit, San Donato Hospital, Arezzo (Italy); Mattesini, Alessio; Valente, Serafina [Interventional Cardiology Unit, Careggi Hospital, Florence (Italy); Prati, Francesco [Interventional Cardiology San Giovanni Hospital, Rome (Italy); CLI foundation (Italy); Crea, Filippo [Department of Cardiovascular Sciences, Catholic University of the Sacred Heart, Rome (Italy); Bolognese, Leonardo [Interventional Cardiology Unit, San Donato Hospital, Arezzo (Italy)

    2015-04-15

    Coronary angiography is the “gold standard” imaging technique in interventional cardiology and is still widely used to guide interventions. A major drawback of this technique, however, is that it is inaccurate in the evaluation and quantification of intracoronary thrombus burden, a critical prognosticator and predictor of intraprocedural complications in acute coronary syndromes. The introduction of optical coherence tomography (OCT) holds the promise of overcoming this important limitation, as near-infrared light is uniquely sensitive to hemoglobin, the pigment of the red blood cells trapped in the thrombus. This narrative review focuses on the use of OCT for the assessment, evaluation and quantification of intracoronary thrombosis. - Highlights: • Thrombotic burden in acute coronary syndromes is not adequately evaluated by standard coronary angiography, whereas optical coherence tomography is exquisitely sensitive to the hemoglobin contained in red blood cells and can be used to precisely quantify thrombus. • Both research and clinical applications have been developed using OCT-based evaluation of thrombus. Whereas precise quantification scores are useful for comparing antithrombotic therapies, both pharmacological and mechanical, in randomized trials, the most important practical applications of OCT-based thrombus assessment are the identification of culprit lesions in the context of diffuse atheromata in acute coronary syndromes and so-called “delayed stenting” strategies. • Improvements in 3D rendering techniques are on the verge of revolutionizing OCT-based thrombus assessment, allowing extremely precise quantification of the thrombotic burden.

  6. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    Science.gov (United States)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
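
    The nonintrusive "black box" idea described above can be sketched generically as Monte Carlo propagation of input uncertainty through an unmodified solver (an illustrative sketch only; the solver and input sampler here are hypothetical stand-ins for a CFD code and its uncertain inputs):

```python
import random
import statistics

def propagate_uncertainty(solver, sample_inputs, n=2000, seed=0):
    """Nonintrusive Monte Carlo UQ: draw n samples of the uncertain
    inputs, run the black-box solver on each, and summarize the output
    distribution. The solver is never modified, only called."""
    rng = random.Random(seed)
    outputs = [solver(sample_inputs(rng)) for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)
```

    Aleatoric inputs would be drawn from probability distributions as above, while epistemic (lack-of-knowledge) inputs are typically swept over intervals instead; the sketch shows only the aleatoric case.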

  7. Validated method for phytohormone quantification in plants

    Directory of Open Access Journals (Sweden)

    Marilia eAlmeida-Trapp

    2014-08-01

    Full Text Available Phytohormones have long been known as important components of signalling cascades in plant development and in plant responses to various abiotic and biotic challenges. Quantification of phytohormone levels in plants is typically carried out using GC- or LC-MS/MS systems, owing to their high sensitivity and specificity and the fact that little sample preparation is needed. However, mass spectrometer-based analyses are often affected by the particular sample type (different matrices), the extraction procedure, and the experimental setup, i.e. the chromatographic separation system and/or the mass spectrometer analyser (triple quadrupole, ion trap, TOF, Orbitrap). For these reasons, a validated method is required in order to enable comparison of data generated in different laboratories, under different experimental setups, and in different matrices. So far, many phytohormone quantification studies have been done using either QTRAP or triple-quadrupole mass spectrometers, and none was performed under the regime of a fully validated method. We therefore developed and established such a validated method for the quantification of stress-related phytohormones (jasmonates, abscisic acid, salicylic acid, IAA) in the model plant Arabidopsis thaliana and the fruit crop Citrus sinensis, using an ion trap mass spectrometer. All parameters recommended by the FDA (US Food and Drug Administration) or EMEA (European Medicines Evaluation Agency) for validation of analytical methods were evaluated: sensitivity, selectivity, repeatability and reproducibility (accuracy and precision).
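
    Of the validation parameters listed, precision is commonly summarized as a coefficient of variation across replicate measurements; a minimal illustrative sketch (a generic metric, not this paper's specific procedure):

```python
import statistics

def precision_cv(replicates):
    """Coefficient of variation (%) across replicate measurements of the
    same sample, a standard precision metric in analytical method
    validation (lower is better)."""
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)
```

    Repeatability is this CV computed within one run; reproducibility is the same statistic across days, operators, or instruments.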

  8. Developmental validation of the Quantifiler(®) HP and Trio Kits for human DNA quantification in forensic samples.

    Science.gov (United States)

    Holt, Allison; Wootton, Sharon Chao; Mulero, Julio J; Brzoska, Pius M; Langit, Emanuel; Green, Robert L

    2016-03-01

    The quantification of human genomic DNA is a necessary first step in the DNA casework sample analysis workflow. DNA quantification determines optimal sample input amounts for subsequent STR (short tandem repeat) genotyping procedures, as well as being a useful screening tool to identify samples most likely to provide probative genotypic evidence. To better mesh with the capabilities of newest-generation STR analysis assays, the Quantifiler(®) HP and Quantifiler(®) Trio DNA Quantification Kits were designed for greater detection sensitivity and more robust performance with samples that contain PCR inhibitors or degraded DNA. The new DNA quantification kits use multiplex TaqMan(®) assay-based fluorescent probe technology to simultaneously quantify up to three human genomic targets, allowing samples to be assessed for total human DNA, male contributor (i.e., Y-chromosome) DNA, as well as a determination of DNA degradation state. The Quantifiler HP and Trio Kits use multiple-copy loci to allow for significantly improved sensitivity compared to earlier-generation kits that employ single-copy target loci. The kits' improved performance provides better predictive ability for results with downstream, newest-generation STR assays, and their shortened time-to-result allows more efficient integration into the forensic casework analysis workflow. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
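
    Real-time qPCR kits of this kind quantify DNA by comparing a sample's Ct value against a standard curve fitted to dilutions of known quantity. A generic sketch of that calculation (illustrative only, not the kit's software; the example slope of about -3.3 cycles per decade corresponds to roughly 100% PCR efficiency):

```python
def fit_standard_curve(log10_qty, ct):
    """Least-squares fit of Ct = slope * log10(quantity) + intercept
    from standards of known DNA quantity."""
    n = len(ct)
    mx = sum(log10_qty) / n
    my = sum(ct) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_qty, ct))
             / sum((x - mx) ** 2 for x in log10_qty))
    return slope, my - slope * mx

def quantity_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate the input DNA quantity
    corresponding to an observed Ct value."""
    return 10 ** ((ct - intercept) / slope)
```

    Comparing quantities from the total-human and Y-chromosome targets, as the kits do, additionally indicates the male contributor fraction, and the ratio of a long to a short target indicates degradation.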

  9. 41 CFR 101-27.503 - Allowable credit.

    Science.gov (United States)

    2010-07-01

    ... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Allowable credit. 101-27...-Return of GSA Stock Items § 101-27.503 Allowable credit. Allowable credit for activities returning... condition of the material received. (a) Credit will be granted at the rate of 80 percent of the current GSA...

  10. 14 CFR 151.125 - Allowable advance planning costs.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Allowable advance planning costs. 151.125... (CONTINUED) AIRPORTS FEDERAL AID TO AIRPORTS Rules and Procedures for Advance Planning and Engineering Proposals § 151.125 Allowable advance planning costs. (a) The United States' share of the allowable costs of...

  11. 7 CFR 3560.205 - Rent and utility allowance changes.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 15 2010-01-01 2010-01-01 false Rent and utility allowance changes. 3560.205 Section..., DEPARTMENT OF AGRICULTURE DIRECT MULTI-FAMILY HOUSING LOANS AND GRANTS Rents § 3560.205 Rent and utility allowance changes. (a) General. Borrowers must fully document that changes to rents and utility allowances...

  12. 48 CFR 2152.231-70 - Accounting and allowable cost.

    Science.gov (United States)

    2010-10-01

    ... allowable cost. As prescribed in 2131.270, insert the following clause: Accounting and Allowable Cost (OCT... cost; (ii) Incurred with proper justification and accounting support; (iii) Determined in accordance... 48 Federal Acquisition Regulations System 6 2010-10-01 2010-10-01 true Accounting and allowable...

  13. 46 CFR 54.25-5 - Corrosion allowance.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Corrosion allowance. 54.25-5 Section 54.25-5 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PRESSURE VESSELS Construction With Carbon, Alloy, and Heat Treated Steels § 54.25-5 Corrosion allowance. The corrosion allowance...

  14. 46 CFR 154.412 - Cargo tank corrosion allowance.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 5 2010-10-01 2010-10-01 false Cargo tank corrosion allowance. 154.412 Section 154.412... Containment Systems § 154.412 Cargo tank corrosion allowance. A cargo tank must be designed with a corrosion...) carries a cargo that corrodes the tank material. Note: Corrosion allowance for independent tank type C is...

  15. 42 CFR 50.504 - Allowable cost of drugs.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Allowable cost of drugs. 50.504 Section 50.504... APPLICABILITY Maximum Allowable Cost for Drugs § 50.504 Allowable cost of drugs. (a) The maximum amount which may be expended from program funds for the acquisition of any drug shall be the lowest of (1) The...

  16. 20 CFR 627.435 - Cost principles and allowable costs.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Cost principles and allowable costs. 627.435... principles and allowable costs. (a) General. To be allowable, a cost shall be necessary and reasonable for... treatment through application of generally accepted accounting principles appropriate to the JTPA program...

  17. 32 CFR 842.35 - Depreciation and maximum allowances.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 6 2010-07-01 2010-07-01 false Depreciation and maximum allowances. 842.35... LITIGATION ADMINISTRATIVE CLAIMS Personnel Claims (31 U.S.C. 3701, 3721) § 842.35 Depreciation and maximum allowances. The military services have jointly established the “Allowance List-Depreciation Guide” to...

  18. 45 CFR 1801.43 - Allowance for books.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Allowance for books. 1801.43 Section 1801.43... HARRY S. TRUMAN SCHOLARSHIP PROGRAM Payments to Finalists and Scholars § 1801.43 Allowance for books. The cost allowance for a Scholar's books is $1000 per year, or such higher amount published on the...

  19. 26 CFR 1.6411-3 - Allowance of adjustments.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 13 2010-04-01 2010-04-01 false Allowance of adjustments. 1.6411-3 Section 1... (CONTINUED) INCOME TAXES Abatements, Credits, and Refunds § 1.6411-3 Allowance of adjustments. (a) Time... payable on or after the date of the allowance of the decrease; and (iii) An amount (not including an...

  20. 27 CFR 20.24 - Allowance of claims.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Allowance of claims. 20.24 Section 20.24 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT... Authorities § 20.24 Allowance of claims. The appropriate TTB officer is authorized to allow claims for losses...

  1. 19 CFR 191.171 - General; drawback allowance.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false General; drawback allowance. 191.171 Section 191...; drawback allowance. (a) General. Section 313(p) of the Act, as amended (19 U.S.C. 1313(p)), provides for... drawback under the manufacturing drawback law (19 U.S.C. 1313(a) or (b)). (b) Allowance of drawback...

  2. 75 FR 54069 - U.S. Paralympics Monthly Assistance Allowance

    Science.gov (United States)

    2010-09-03

    .... Paralympics Monthly Assistance Allowance AGENCY: Department of Veterans Affairs. ACTION: Proposed rule. SUMMARY: This document proposes to establish regulations for the payment of a monthly assistance allowance... of a monthly assistance allowance to a veteran with a service-connected or nonservice-connected...

  3. 37 CFR 1.312 - Amendments after allowance.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Amendments after allowance. 1..., DEPARTMENT OF COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Allowance and Issue of Patent § 1.312 Amendments after allowance. No amendment may be made as a matter of right in an...

  4. 40 CFR 97.42 - NOX allowance allocations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false NOX allowance allocations. 97.42... (CONTINUED) FEDERAL NOX BUDGET TRADING PROGRAM AND CAIR NOX AND SO2 TRADING PROGRAMS NOX Allowance Allocations § 97.42 NOX allowance allocations. (a)(1) The heat input (in mmBtu) used for calculating NOX...

  5. 19 CFR 144.3 - Allowance for damage.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Allowance for damage. 144.3 Section 144.3 Customs... (CONTINUED) WAREHOUSE AND REWAREHOUSE ENTRIES AND WITHDRAWALS General Provisions § 144.3 Allowance for damage. No abatement or allowance of duties shall be made on account of damage, loss, or deterioration of the...

  6. 27 CFR 22.23 - Allowance of claims.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Allowance of claims. 22.23 Section 22.23 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT... Authorities § 22.23 Allowance of claims. The appropriate TTB officer is authorized to allow claims for losses...

  7. 40 CFR 97.142 - CAIR NOX allowance allocations.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false CAIR NOX allowance allocations. 97.142... (CONTINUED) FEDERAL NOX BUDGET TRADING PROGRAM AND CAIR NOX AND SO2 TRADING PROGRAMS CAIR NOX Allowance Allocations § 97.142 CAIR NOX allowance allocations. (a)(1) The baseline heat input (in mmBtu) used with...

  8. 14 CFR 1261.109 - Computation of allowance.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Computation of allowance. 1261.109 Section... CLAIMS (GENERAL) Employees' Personal Property Claims § 1261.109 Computation of allowance. (a) The amount... exchange). There will be no allowance for replacement cost or for appreciation in the value of the property...

  9. 38 CFR 38.629 - Outer Burial Receptacle Allowance.

    Science.gov (United States)

    2010-07-01

    ... Allowance. 38.629 Section 38.629 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS... Allowance. (a) Definitions—Outer burial receptacle. For purposes of this section, an outer burial receptacle... section provides for payment of a monetary allowance for an outer burial receptacle for any interment in a...

  10. 27 CFR 40.282 - Allowance of tax.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Allowance of tax. 40.282... PROCESSED TOBACCO Claims by Manufacturers General § 40.282 Allowance of tax. Relief from the payment of tax on tobacco products may be extended to a manufacturer by allowance of the tax where the tobacco...

  11. 38 CFR 21.5130 - Payments; educational assistance allowance.

    Science.gov (United States)

    2010-07-01

    ... assistance allowance. 21.5130 Section 21.5130 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS... Assistance Under 38 U.S.C. Chapter 32 Payments; Educational Assistance Allowance § 21.5130 Payments; educational assistance allowance. VA will apply the following sections in administering benefits payable under...

  12. 32 CFR 584.7 - Basic allowance for quarters.

    Science.gov (United States)

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Basic allowance for quarters. 584.7 Section 584.7... CUSTODY, AND PATERNITY § 584.7 Basic allowance for quarters. (a) Eligibility. (1) Soldiers entitled to... dependents” under certain conditions. The Department of Defense Military Pay and Allowances Entitlements...

  13. 47 CFR 32.1171 - Allowance for doubtful accounts.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Allowance for doubtful accounts. 32.1171....1171 Allowance for doubtful accounts. (a) This account shall be credited with amounts charged to... amounts covered thereby which have been found to be impracticable of collection. (b) If no such allowance...

  14. 40 CFR 72.95 - Allowance deduction formula.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Allowance deduction formula. 72.95... (CONTINUED) PERMITS REGULATION Compliance Certification § 72.95 Allowance deduction formula. The following formula shall be used to determine the total number of allowances to be deducted for the calendar year...

  15. 76 FR 14282 - U.S. Paralympics Monthly Assistance Allowance

    Science.gov (United States)

    2011-03-16

    ... AFFAIRS 38 CFR Part 76 RIN 2900-AN43 U.S. Paralympics Monthly Assistance Allowance AGENCY: Department of...) regulations regarding the payment of a monthly assistance allowance to veterans training to make the United.... The rule requires submission of an application to establish eligibility for the allowance and...

  16. 34 CFR 673.7 - Administrative cost allowance.

    Science.gov (United States)

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false Administrative cost allowance. 673.7 Section 673.7... Federal Perkins Loan, FWS, and FSEOG Programs § 673.7 Administrative cost allowance. (a) An institution... allowance for an award year if it advances funds under the Federal Perkins Loan Program, provides FWS...

  17. 37 CFR 1.311 - Notice of allowance.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Notice of allowance. 1.311... COMMERCE GENERAL RULES OF PRACTICE IN PATENT CASES National Processing Provisions Allowance and Issue of Patent § 1.311 Notice of allowance. (a) If, on examination, it appears that the applicant is entitled to...

  18. 38 CFR 21.332 - Payments of subsistence allowance.

    Science.gov (United States)

    2010-07-01

    ... allowance. 21.332 Section 21.332 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS.... Chapter 31 Authorization of Subsistence Allowance and Training and Rehabilitation Services § 21.332 Payments of subsistence allowance. (a) Eligibility. At the end of the month, VA shall pay to an eligible...

  19. 38 CFR 21.9670 - Work-study allowance.

    Science.gov (United States)

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Work-study allowance. 21...-study allowance. An eligible individual pursuing a program of education under 38 U.S.C. chapter 33 at a rate of pursuit of at least 75 percent may receive a work-study allowance in accordance with the...

  20. 26 CFR 1.6425-3 - Allowance of adjustments.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 13 2010-04-01 2010-04-01 false Allowance of adjustments. 1.6425-3 Section 1... (CONTINUED) INCOME TAXES Abatements, Credits, and Refunds § 1.6425-3 Allowance of adjustments. (a) Limitation... application and the allowance of the adjustment shall not prejudice any right of the Service to claim later...

  1. 20 CFR 702.507 - Vocational rehabilitation; maintenance allowance.

    Science.gov (United States)

    2010-04-01

    ... allowance. 702.507 Section 702.507 Employees' Benefits EMPLOYMENT STANDARDS ADMINISTRATION, DEPARTMENT OF... PROCEDURE Vocational Rehabilitation § 702.507 Vocational rehabilitation; maintenance allowance. (a) An... section 44 of the Act, 33 U.S.C. 944. The maximum maintenance allowance shall not be provided on an...

  2. 20 CFR 638.524 - Allowances and allotments.

    Science.gov (United States)

    2010-04-01

    ... medical termination, he/she shall be eligible for the accrued readjustment allowance, regardless of length... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowances and allotments. 638.524 Section... PROGRAM UNDER TITLE IV-B OF THE JOB TRAINING PARTNERSHIP ACT Center Operations § 638.524 Allowances and...

  3. OVERSEAS HOUSING ALLOWANCE FOR GUAM: A NEW WAY FORWARD

    Science.gov (United States)

    2016-04-01

    before changes can be implemented. Perhaps the Service members wanted equality rather than inequality in housing allowances. Regardless, it is...AU/ACSC/2016 AIR COMMAND AND STAFF COLLEGE AIR UNIVERSITY OVERSEAS HOUSING ALLOWANCE FOR GUAM: A NEW WAY FORWARD by Veronica...4 Brief Overview of OHA and BAH Military Housing Allowances ......................................... 4 Law and Policies

  4. 50 CFR 665.227 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Fisheries § 665.227 Allowable gear and gear restrictions. (a) Hawaii coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...

  5. 50 CFR 665.627 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Island Area Fisheries § 665.627 Allowable gear and gear restrictions. (a) Coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...

  6. 50 CFR 665.127 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Fisheries § 665.127 Allowable gear and gear restrictions. (a) American Samoa coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp gun; (4...

  7. 50 CFR 665.427 - Allowable gear and gear restrictions.

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Allowable gear and gear restrictions. 665... Archipelago Fisheries § 665.427 Allowable gear and gear restrictions. (a) Mariana coral reef ecosystem MUS may be taken only with the following allowable gear and methods: (1) Hand harvest; (2) Spear; (3) Slurp...

  8. Self-digitization microfluidic chip for absolute quantification of mRNA in single cells.

    Science.gov (United States)

    Thompson, Alison M; Gansen, Alexander; Paguirigan, Amy L; Kreutz, Jason E; Radich, Jerald P; Chiu, Daniel T

    2014-12-16

    Quantification of mRNA in single cells provides direct insight into how intercellular heterogeneity plays a role in disease progression and outcomes. Quantitative polymerase chain reaction (qPCR), the current gold standard for evaluating gene expression, is insufficient for providing absolute measurement of single-cell mRNA transcript abundance. Challenges include difficulties in handling small sample volumes and the high variability in measurements. Microfluidic digital PCR provides far better sensitivity for minute quantities of genetic material, but the typical format of this assay does not allow for counting of the absolute number of mRNA transcripts in samples taken from single cells. Furthermore, a large fraction of the sample is often lost during sample handling in microfluidic digital PCR. Here, we report the absolute quantification of single-cell mRNA transcripts by digital, one-step reverse transcription PCR in a simple microfluidic array device called the self-digitization (SD) chip. By performing the reverse transcription step in digitized volumes, we find that the assay exhibits a linear signal across a wide range of total RNA concentrations and agrees well with standard curve qPCR. The SD chip is found to digitize a high percentage (86.7%) of the sample for single-cell experiments. Moreover, quantification of transferrin receptor mRNA in single cells agrees well with single-molecule fluorescence in situ hybridization experiments. The SD platform for absolute quantification of single-cell mRNA can be optimized for other genes and may be useful as an independent control method for the validation of mRNA quantification techniques.
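The Poisson statistics behind absolute quantification in digital PCR arrays such as the SD chip can be sketched as follows. This is a generic illustration of the standard digital PCR calculation, not the authors' analysis pipeline; the partition count, positive count and partition volume below are hypothetical.

```python
import math

def digital_pcr_copies(positive, total, partition_volume_nl):
    """Estimate absolute template copies from a digital PCR readout.

    Uses the Poisson correction: the mean copies per partition is
    lambda = -ln(1 - p), where p is the fraction of positive partitions.
    """
    p = positive / total
    lam = -math.log(1.0 - p)           # mean copies per partition
    copies = lam * total               # total copies across the array
    volume_ul = total * partition_volume_nl * 1e-3
    return copies, copies / volume_ul  # copies, copies per microliter

# Hypothetical readout: 312 of 1024 partitions positive, 5 nL each.
copies, conc = digital_pcr_copies(positive=312, total=1024, partition_volume_nl=5.0)
```

Because some partitions receive more than one transcript, the Poisson-corrected copy number is always somewhat larger than the raw count of positive partitions.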

  9. Dietary Sugars Analysis: Quantification of Fructooligosaccharides during Fermentation by HPLC-RI Method.

    Science.gov (United States)

    Correia, Daniela M; Dias, Luís G; Veloso, Ana C A; Dias, Teresa; Rocha, Isabel; Rodrigues, Lígia R; Peres, António M

    2014-01-01

    In this work, a simple chromatographic method is proposed and in-house validated for the quantification of total and individual fructooligosaccharides (e.g., 1-kestose, nystose, and 1(F)-fructofuranosylnystose). It was shown that high-performance liquid chromatography with a refractive index detector could be used to monitor the dynamics of fructooligosaccharide production via sucrose fermentation using Aspergillus aculeatus. This analytical technique may be easily implemented at laboratory or industrial scale for monitoring fructooligosaccharide mass production, while also allowing control of the main substrate (sucrose) and of the secondary by-products (glucose and fructose). The proposed chromatographic method had satisfactory intra- and inter-day variability (in general, with a relative standard deviation lower than 5%), high sensitivity for each sugar (usually, with a relative error lower than 5%), and low detection (lower than 0.06 ± 0.04 g/L) and quantification (lower than 0.2 ± 0.1 g/L) limits. The correct quantification of fructooligosaccharides in fermentation media may allow a more precise nutritional formulation of new functional foods, since different fructooligosaccharides are reported to exhibit different biological activities and effects.

  10. Quantification of Nociceptive Escape Response in C.elegans

    Science.gov (United States)

    Leung, Kawai; Mohammadi, Aylia; Ryu, William; Nemenman, Ilya

    2013-03-01

    Animals cannot rank and communicate their pain consciously. Thus, in pain studies on animal models, one must infer the pain level from high-precision experimental characterization of behavior. This is not trivial, since behaviors are very complex and multidimensional. Here we explore the feasibility of C.elegans as a model for pain transduction. The nematode has a robust neurally mediated noxious escape response, which we show to be partially decoupled from other sensory behaviors. We develop a nociceptive behavioral response assay that allows us to apply controlled levels of pain by locally heating worms with an IR laser. The worms' motions are captured by machine vision at high spatiotemporal resolution. The resulting behavioral quantification allows us to build a statistical model for inference of the experienced pain level from the behavioral response. Based on the measured nociceptive escape of over 400 worms, we conclude that none of the simple characteristics of the response are reliable indicators of the laser pulse strength. Nonetheless, a more reliable statistical inference of the pain stimulus level from the measured behavior is possible based on a complexity-controlled regression model that takes into account the entire worm behavioral output. This work was partially supported by NSF grant No. IOS/1208126 and HFSP grant No. RGY0084/2011.
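The "complexity-controlled regression" idea, choosing model complexity by held-out predictive error rather than training fit, can be sketched in miniature. This toy uses a one-parameter ridge estimator on synthetic stimulus-response data; it illustrates the principle only and is not the authors' model of worm behavior.

```python
import random

random.seed(0)

def ridge_1d(xs, ys, alpha):
    """Closed-form ridge estimate for a single predictor (no intercept):
    beta = sum(x*y) / (sum(x^2) + alpha)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + alpha)

# Synthetic data: stimulus strength -> response, true slope 2, Gaussian noise.
train_x = [i / 10 for i in range(1, 21)]
train_y = [2.0 * x + random.gauss(0, 0.3) for x in train_x]
val_x = [i / 10 + 0.05 for i in range(1, 21)]
val_y = [2.0 * x + random.gauss(0, 0.3) for x in val_x]

def val_error(alpha):
    """Squared error of the alpha-regularized fit on held-out data."""
    beta = ridge_1d(train_x, train_y, alpha)
    return sum((y - beta * x) ** 2 for x, y in zip(val_x, val_y))

# Model complexity is controlled by alpha; pick the value that
# generalizes best to the held-out set.
best_alpha = min([0.0, 0.1, 1.0, 10.0], key=val_error)
beta = ridge_1d(train_x, train_y, best_alpha)
```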

  11. Temporal Delineation and Quantification of Short Term Clustered Mining Seismicity

    Science.gov (United States)

    Woodward, Kyle; Wesseloo, Johan; Potvin, Yves

    2017-07-01

    The assessment of the temporal characteristics of seismicity is fundamental to understanding and quantifying the seismic hazard associated with mining, the effectiveness of strategies and tactics used to manage seismic hazard, and the relationship between seismicity and changes to the mining environment. This article aims to improve the accuracy and precision with which the temporal dimension of seismic responses can be quantified and delineated. We present a review and discussion of the occurrence of time-dependent mining seismicity, with a specific focus on temporal modelling and the modified Omori law (MOL). This forms the basis for the development of a simple weighted metric that allows for the consistent temporal delineation and quantification of a seismic response. The optimisation of this metric allows for the selection of the most appropriate modelling interval given the temporal attributes of time-dependent mining seismicity. We evaluate the performance of the weighted metric for the modelling of a synthetic seismic dataset. This assessment shows that seismic responses can be quantified and delineated by the MOL, with reasonable accuracy and precision, when the modelling is optimised by evaluating the weighted MLE metric. Furthermore, this assessment highlights that decreased weighted-MLE-metric performance can be expected when there is a lack of contrast between the temporal characteristics of events associated with different processes.
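The modified Omori law referenced above models the decaying rate of triggered seismicity as n(t) = K/(c + t)^p. A minimal sketch of the rate function and its analytic integral, with hypothetical parameter values (K, c, p are fit to data in practice, e.g. by maximum likelihood):

```python
import math

def mol_rate(t, K, c, p):
    """Modified Omori law: event rate n(t) = K / (c + t)**p after a trigger."""
    return K / (c + t) ** p

def mol_expected_count(t1, t2, K, c, p):
    """Expected number of events in [t1, t2]: the analytic integral of n(t)."""
    if p == 1.0:
        return K * (math.log(c + t2) - math.log(c + t1))

    def antideriv(t):
        return (c + t) ** (1.0 - p) / (1.0 - p)

    return K * (antideriv(t2) - antideriv(t1))
```

For example, with K = 10, c = 0.1 and p = 1, about 24 events (10·ln(11)) are expected in the first unit of time after the trigger, with the rate decaying monotonically.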

  12. Quantification of sea ice production in Weddell Sea polynyas

    Science.gov (United States)

    Zentek, Rolf; Heinemann, Günther; Paul, Stephan; Stulic, Lukrecia; Timmermann, Ralph

    2017-04-01

    The regional climate model COSMO-CLM was used to perform simulations of the Weddell Sea region in Antarctica for the period 2002-2015, with a focus on atmosphere-ocean-sea ice interactions. The original model was adapted to polar regions by the use of a thermodynamic sea ice module with snow cover and a temperature-dependent albedo scheme for sea ice. The recently published topography RTopo2 was used. The model was run nested in ERA-Interim data in forecast mode. Sea ice concentrations were taken from satellite measurements (AMSR-E, SSMI/S, AMSR2) and were updated daily to allow for a close-to-reality hindcast. Simulations were performed at 15 km resolution for the whole period 2002-2015 with the goal of forcing the sea ice-ocean model FESOM. In a second step, a 5 km simulation was one-way nested for the winter periods (April-September) of 2002-2015 to allow for a better quantification of sea ice production in the Weddell Sea. Estimates of sea ice production and comparisons of the results to remote sensing data will be presented.

  13. Dielectrophoretic immobilization of proteins: Quantification by atomic force microscopy.

    Science.gov (United States)

    Laux, Eva-Maria; Knigge, Xenia; Bier, Frank F; Wenger, Christian; Hölzel, Ralph

    2015-09-01

    The combination of alternating electric fields with nanometer-sized electrodes allows the permanent immobilization of proteins by dielectrophoretic force. Here, atomic force microscopy is introduced as a quantification method, and results are compared with fluorescence microscopy. Experimental parameters, for example, the applied voltage and the duration of field application, are varied systematically, and their influence on the amount of immobilized protein is investigated. A linear correlation with the duration of field application was found by atomic force microscopy, and both microscopical methods yield a square dependence of the amount of immobilized protein on the applied voltage. While fluorescence microscopy allows real-time imaging, atomic force microscopy reveals immobilized proteins obscured in fluorescence images due to low S/N. Furthermore, the higher spatial resolution of the atomic force microscope enables the visualization of the protein distribution on single nanoelectrodes. The electric field distribution is calculated and compared to experimental results, with very good agreement with the atomic force microscopy measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Quantification of metformin and glyburide in blood for paediatric endocrinology

    Directory of Open Access Journals (Sweden)

    Radamés Alemón-Medina

    2014-10-01

    Background: The recent use of antidiabetic drugs such as metformin and glyburide for the treatment and control of childhood obesity, insulin resistance and type II diabetes mellitus in children and adolescents has encouraged physicians to determine plasma levels of these drugs for proper dose adjustment. Objective: To implement and validate a UPLC-UV method to quantify metformin and glyburide in blood samples. Materials and methods: Only a 0.1 mL blood sample is required. Both drugs are extracted by protein precipitation with methanol. Quantitation was carried out with a mobile phase of 4.6 mM potassium phosphate monobasic (KH2PO4) 0.1 M, pH = 6.5, sodium dodecyl sulphate (SDS) and acetonitrile (63:7:30), at 0.8 mL/min through a VARIAN Pursuit® C8 150 x 3.9 mm column at 40°C, 236 nm. Results: The method allows the measurement of 20 to 600 nanograms of metformin and 100 to 2,000 nanograms of glyburide per milliliter of blood. Both drugs are physicochemically stable in blood samples for up to 30 days at 4°C. Conclusion: Our method allows quantification of metformin and glyburide in paediatric blood samples, supporting clinicians in monitoring treatment compliance, bioavailability and pharmacokinetic profiles.

  15. Computer-assisted quantification of CD3+ T cells in follicular lymphoma.

    Science.gov (United States)

    Abas, Fazly S; Shana'ah, Arwa; Christian, Beth; Hasserjian, Robert; Louissaint, Abner; Pennell, Michael; Sahiner, Berkman; Chen, Weijie; Niazi, Muhammad Khalid Khan; Lozanski, Gerard; Gurcan, Metin

    2017-06-01

    The advance of high-resolution digital scans of pathology slides has allowed the development of computer-based image analysis algorithms that may help pathologists in quantifying IHC stains. While very promising, these methods require further refinement before they are implemented in a routine clinical setting. It is particularly critical to evaluate algorithm performance in a setting similar to current clinical practice. In this article, we present a pilot study that evaluates the use of a computerized cell quantification method in the clinical estimation of CD3 positive (CD3+) T cells in follicular lymphoma (FL). Our goal is to demonstrate the degree to which computerized quantification is comparable to the practice of estimation by a panel of expert pathologists. The computerized quantification method uses entropy-based histogram thresholding to separate brown (CD3+) and blue (CD3-) regions after a color space transformation. A panel of four board-certified hematopathologists evaluated a database of 20 FL images using two different reading methods: visual estimation and manual marking of each CD3+ cell in the images. These image data and the readings provided a reference standard and the range of variability among readers. Sensitivity and specificity measures of the computer's segmentation of CD3+ and CD3- T cells were recorded. Across all four pathologists, mean sensitivity and specificity were 90.97 and 88.38%, respectively. The computerized quantification method agrees more with the manual cell marking than with the visual estimations. Statistical comparison between the computerized quantification method and the pathologist readings demonstrated good agreement, with correlation coefficient values of 0.81 and 0.96 in terms of Lin's concordance correlation and Spearman's correlation coefficient, respectively. These values are higher than most of those calculated among the pathologists.
In the future, the computerized quantification method may be used to investigate

  16. "Suntelligence" Survey

    Science.gov (United States)

    ... the American Academy of Dermatology's "Suntelligence" sun-smart survey. Please answer the following questions to measure your ... how you incorporate it into your life. The survey will take 5 to 7 minutes to complete. ...

  17. Quantification of structural uncertainties in multi-scale models; case study of the Lublin Basin, Poland

    Science.gov (United States)

    Małolepszy, Zbigniew; Szynkaruk, Ewa

    2015-04-01

    The multiscale static modeling of the regional structure of the Lublin Basin is carried out at the Polish Geological Institute, in accordance with the principles of integrated 3D geological modelling. The model is based on all available geospatial data from Polish digital databases and analogue archives. The mapped regional structure covers an area of 260x80 km located between Warsaw and the Polish-Ukrainian border, along the NW-SE-trending margin of the East European Craton. Within the basin, the Paleozoic beds, with coal-bearing Carboniferous and older formations containing hydrocarbons and unconventional prospects, are covered unconformably by Permo-Mesozoic and younger rocks. The vertical extent of the regional model is set from the topographic surface to 6000 m ssl and at the bottom includes some Proterozoic crystalline formations of the craton. The project focuses on the internal consistency of the models built at different scales - from basin (small) scale to field (large) scale. The models, nested in a common structural framework, are being constructed with regional geological knowledge, ensuring a smooth transition in 3D model resolution and in the amount of geological detail. A major challenge of the multiscale approach to subsurface modelling is the assessment and consistent quantification of the various types of geological uncertainty tied to the sub-models at different scales. The decreasing amount of information with depth, particularly the very limited data collected below exploration targets, together with the accuracy and quality of the data, have the most critical impact on the modelled structure. In the deeper levels of the Lublin Basin model, seismic interpretation of 2D surveys is only sparsely tied to well data. Therefore, time-to-depth conversion carries one of the major uncertainties in the modelling of structures, especially below 3000 m ssl. Furthermore, as all models at different scales are based on the same dataset, we must deal with different levels of generalization of geological structures.
The

  18. Quantification of viral DNA during HIV-1 infection: A review of relevant clinical uses and laboratory methods.

    Science.gov (United States)

    Alidjinou, E K; Bocket, L; Hober, D

    2015-02-01

    Effective antiretroviral therapy usually leads to undetectable HIV-1 RNA in the plasma. However, the virus persists in some cells of infected patients as various DNA forms, both integrated and unintegrated. This reservoir represents the greatest challenge to the complete cure of HIV-1 infection, and its characteristics strongly impact the course of the disease. The quantification of HIV-1 DNA in blood samples currently constitutes the most practical approach to measuring this residual infection. Real-time quantitative PCR (qPCR) is the most common method used for HIV-DNA quantification, and many strategies have been developed to measure the different forms of HIV-1 DNA. In the literature, several "in-house" PCR methods have been used, and standardization is needed to obtain comparable results. In addition, background noise limits the ability of qPCR to precisely quantify low levels. Among new assays in development, digital PCR has been shown to allow accurate quantification of HIV-1 DNA. Total HIV-1 DNA is most commonly measured in clinical routine. The absolute quantification of proviruses and unintegrated forms is more often used for research purposes. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  19. Rapid capillary electrophoresis approach for the quantification of ewe milk adulteration with cow milk.

    Science.gov (United States)

    Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico

    2017-10-13

    The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis (CE) method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which achieved the separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum detectable amount of fraudulent cow milk of 5%. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. 40 CFR 82.10 - Availability of consumption allowances in addition to baseline consumption allowances for class I...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Availability of consumption allowances in addition to baseline consumption allowances for class I controlled substances. 82.10 Section 82.10... STRATOSPHERIC OZONE Production and Consumption Controls § 82.10 Availability of consumption allowances in...

  1. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
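Step (4), the simultaneous treatment of aleatory and epistemic uncertainty by ensemble modelling, reduces in its simplest form to weighted statistics over alternative model outcomes. A toy sketch with hypothetical hazard values and epistemic weights (not the Ionian Sea results):

```python
# Hypothetical ensemble: each alternative model formulation gives a
# probability of exceeding a given tsunami intensity at the target site,
# with an epistemic weight (weights sum to 1).
models = [
    {"weight": 0.5, "p_exceed": 0.012},
    {"weight": 0.3, "p_exceed": 0.020},
    {"weight": 0.2, "p_exceed": 0.005},
]

# Ensemble mean hazard: aleatory probabilities averaged over epistemic weights.
mean_p = sum(m["weight"] * m["p_exceed"] for m in models)

def weighted_percentile(ens, q):
    """Smallest hazard value whose cumulative epistemic weight reaches q."""
    acc = 0.0
    for m in sorted(ens, key=lambda m: m["p_exceed"]):
        acc += m["weight"]
        if acc >= q:
            return m["p_exceed"]
    return ens[-1]["p_exceed"]

median_p = weighted_percentile(models, 0.5)
```

The spread between such weighted percentiles (e.g. 16th to 84th) is what expresses the epistemic uncertainty around the mean hazard curve.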

  2. Survey Researchers and Minority Communities

    Science.gov (United States)

    Weiss, Carol H.

    1977-01-01

    Survey research has not tried hard to benefit poor people by allowing community members to help shape the design and interpretation of research, upgrading the job skills and employability of indigenous interviewers, or providing referrals or other services to survey respondents. (Author)

  3. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses methods for efficient quantification of margins and uncertainties (QMU) for models that couple multiple, large-scale commercial or...

  4. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  5. Efficient Quantification of Uncertainties in Complex Computer Code Results Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Propagation of parameter uncertainties through large computer models can be very resource intensive. Frameworks and tools for uncertainty quantification are...

  6. Quantification of competitive value of documents

    Directory of Open Access Journals (Sweden)

    Pavel Šimek

    2009-01-01

    The majority of Internet users use the global network to search for information using full-text search engines such as Google, Yahoo!, or Seznam. Web presentation operators try, with the help of different optimization techniques, to reach the top places in full-text search engine results. Herein lies the great importance of Search Engine Optimization and Search Engine Marketing, because typical users usually follow links only on the first few pages of full-text search results for given keywords, and in catalogs they primarily use hierarchically higher-placed links in each category. The key to success is the application of optimization methods that deal with keywords, the structure and quality of content, domain names, individual pages, and the quantity and reliability of backward links. The process is demanding, long-lasting and without a guaranteed outcome. A website operator without advanced analytical tools cannot identify the contribution of the individual documents from which the entire web site consists. If web presentation operators want an overview of their documents and of the web site as a whole, it is appropriate to quantify these positions in a specific way, depending on specific keywords. For this purpose serves the quantification of the competitive value of documents, which in turn yields the global competitive value of a web site. Quantification of competitive values is performed on a specific full-text search engine; each full-text search engine can, and often does, produce different results. According to published reports by the ClickZ agency and Market Share, Google is the most widely used search engine by number of searches among English-speaking users, with a market share of more than 80%. The overall procedure for quantifying competitive values is common to all engines; however, the initial step, keyword analysis, depends on the choice of full-text search engine.

  7. Advances in forensic DNA quantification: a review.

    Science.gov (United States)

    Lee, Steven B; McCord, Bruce; Buel, Eric

    2014-11-01

    This review focuses upon a critical step in forensic biology: detection and quantification of human DNA from biological samples. Determination of the quantity and quality of human DNA extracted from biological evidence is important for several reasons. Firstly, depending on the source and extraction method, the quality (purity and length), and quantity of the resultant DNA extract can vary greatly. This affects the downstream method as the quantity of input DNA and its relative length can determine which genotyping procedure to use-standard short-tandem repeat (STR) typing, mini-STR typing or mitochondrial DNA sequencing. Secondly, because it is important in forensic analysis to preserve as much of the evidence as possible for retesting, it is important to determine the total DNA amount available prior to utilizing any destructive analytical method. Lastly, results from initial quantitative and qualitative evaluations permit a more informed interpretation of downstream analytical results. Newer quantitative techniques involving real-time PCR can reveal the presence of degraded DNA and PCR inhibitors, that provide potential reasons for poor genotyping results and may indicate methods to use for downstream typing success. In general, the more information available, the easier it is to interpret and process the sample resulting in a higher likelihood of successful DNA typing. The history of the development of quantitative methods has involved two main goals-improving precision of the analysis and increasing the information content of the result. This review covers advances in forensic DNA quantification methods and recent developments in RNA quantification. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Unintended Consequences of Price Controls: An Application to Allowance Markets

    OpenAIRE

    Stocking, Andrew

    2010-01-01

    Price controls established in an emissions allowance market to constrain allowance prices between a ceiling and a floor offer a mechanism to reduce cost uncertainty in a cap-and-trade program; however, they could provide opportunities for strategic actions by firms that would result in lower government revenue and greater emissions than in the absence of controls. In particular, when the ceiling price is supported by introducing new allowances into the market, firms could choose to buy allowa...

  9. Perfusion Quantification Using Gaussian Process Deconvolution

    DEFF Research Database (Denmark)

    Andersen, Irene Klærke; Have, Anna Szynkowiak; Rasmussen, Carl Edward

    2002-01-01

    The quantification of perfusion using dynamic susceptibility contrast MRI (DSC-MRI) requires deconvolution to obtain the residual impulse response function (IRF). In this work, a method using the Gaussian process for deconvolution (GPD) is proposed. The fact that the IRF is smooth is incorporated....... GPD provides a better estimate of the entire IRF. As the signal-to-noise ratio (SNR) increases or the time resolution of the measurements increases, GPD is shown to be superior to SVD. This is also found for large distribution volumes....

  10. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
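
    The sensitivity coefficients produced by such adjoint calculations are typically folded with a nuclear-data covariance matrix via the first-order "sandwich rule" to yield the response uncertainty. A minimal numeric sketch of that final step (toy sensitivities and covariances, not values from the report):

```python
import numpy as np

# First-order "sandwich rule": given a sensitivity vector S (relative
# change in a response per relative change in each nuclear-data
# parameter) and a relative covariance matrix Sigma for those
# parameters, the relative response variance is S^T Sigma S.
def sandwich_rule(S, Sigma):
    S = np.asarray(S, dtype=float)
    Sigma = np.asarray(Sigma, dtype=float)
    return float(S @ Sigma @ S)

# Toy numbers (invented for illustration): two correlated cross-section
# parameters with 1% and 2% relative standard deviations.
S = [0.5, 0.8]                 # sensitivities of the figure of merit
corr = 0.3                     # assumed correlation between parameters
sd = np.array([0.01, 0.02])
Sigma = np.outer(sd, sd) * np.array([[1.0, corr], [corr, 1.0]])

var = sandwich_rule(S, Sigma)
rel_uncertainty = np.sqrt(var)  # relative 1-sigma uncertainty
```

    With these invented inputs the relative uncertainty comes out near 1.8%, the same order as the small (< 2%) estimates quoted in the abstract.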

  11. Tutorial examples for uncertainty quantification methods.

    Energy Technology Data Exchange (ETDEWEB)

    De Bord, Sarah [Univ. of California, Davis, CA (United States)

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.

  12. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  13. The Modest, or Quantificational, Account of Truth

    Directory of Open Access Journals (Sweden)

    Wolfgang Künne

    2008-12-01

    Full Text Available Truth is a stable, epistemically unconstrained property of propositions, and the concept of truth admits of a non-reductive explanation: that, in a nutshell, is the view for which I argued in Conceptions of Truth. In this paper I try to present that explanation in a more detailed and, hopefully, more perspicuous way than I did in Ch. 6.2 of the book and to defend its use of sentential quantification against some of the criticisms it has come in for.

  14. Preclinical imaging characteristics and quantification of Platinum-195m SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Aalbersberg, E.A.; Wit-van der Veen, B.J. de; Vegt, E.; Vogel, Wouter V. [The Netherlands Cancer Institute (NKI-AVL), Department of Nuclear Medicine, Amsterdam (Netherlands); Zwaagstra, O.; Codee-van der Schilden, K. [Nuclear Research and Consultancy Group (NRG), Petten (Netherlands)

    2017-08-15

    In vivo biodistribution imaging of platinum-based compounds may allow better patient selection for treatment with chemo(radio)therapy. Radiolabeling with Platinum-195m ({sup 195m}Pt) allows SPECT imaging, without altering the chemical structure or biological activity of the compound. We have assessed the feasibility of {sup 195m}Pt SPECT imaging in mice, with the aim of determining the image quality and accuracy of quantification for current preclinical imaging equipment. Enriched (>96%) {sup 194}Pt was irradiated in the High Flux Reactor (HFR) in Petten, The Netherlands (NRG). A 0.05 M HCl {sup 195m}Pt-solution with a specific activity of 33 MBq/mg was obtained. Image quality was assessed for the NanoSPECT/CT (Bioscan Inc., Washington DC, USA) and U-SPECT{sup +}/CT (MILabs BV, Utrecht, the Netherlands) scanners. A radioactivity-filled rod phantom (rod diameter 0.85-1.7 mm) filled with 1 MBq {sup 195m}Pt was scanned with different acquisition durations (10-120 min). Four healthy mice were injected intravenously with 3-4 MBq {sup 195m}Pt. Mouse images were acquired with the NanoSPECT for 120 min at 0, 2, 4, or 24 h after injection. Organs were delineated to quantify {sup 195m}Pt concentrations. Immediately after scanning, the mice were sacrificed, and the platinum concentration was determined in organs using a gamma counter and graphite furnace atomic absorption spectroscopy (GF-AAS) as reference standards. A 30-min acquisition of the phantom provided visually adequate image quality for both scanners. The smallest visible rods were 0.95 mm in diameter on the NanoSPECT and 0.85 mm in diameter on the U-SPECT{sup +}. The image quality in mice was visually adequate. Uptake was seen in the kidneys with excretion to the bladder, and in the liver, blood, and intestine. No uptake was seen in the brain. The Spearman correlation between SPECT and gamma counter was 0.92, between SPECT and GF-AAS it was 0.84, and between GF-AAS and gamma counter it was 0.97 (all p < 0

  15. Bathymetric survey and estimation of the water balance of Lake ...

    African Journals Online (AJOL)

    Quantification of the water balance components and bathymetric survey is very crucial for sustainable management of lake waters. This paper focuses on the bathymetry and the water balance of the crater Lake Ardibo, recently utilized for irrigation. The bathymetric map of the lake is established at a contour interval of 10 ...

  16. 40 CFR 86.1725-01 - Allowable maintenance.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Allowable maintenance. 86.1725-01... Trucks § 86.1725-01 Allowable maintenance. This section includes text that specifies requirements that... are subject to the applicable Otto-cycle or diesel engine maintenance requirements of § 86.1834-01(b...

  17. Radical reconciliation: The TRC should have allowed Zacchaeus to ...

    African Journals Online (AJOL)

    2016-06-30

    Jun 30, 2016 ... Lephakga 2015). Therefore, this article will argue that if Jesus Christ and the Holy Spirit were invited into the processes of this commission, then Zacchaeus too should have been allowed to testify, so to speak.

  18. 46 CFR 310.62 - Allowances and expenses; required deposit.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 8 2010-10-01 2010-10-01 false Allowances and expenses; required deposit. 310.62... Allowances and expenses; required deposit. (a) Items furnished. Each midshipman shall receive: Free tuition... orders. (b) Required Deposit. Prior to admission to the Academy, each midshipman shall make a specified...

  19. 20 CFR 631.51 - Allowable substate program activities.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable substate program activities. 631.51 Section 631.51 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR PROGRAMS UNDER TITLE III OF THE JOB TRAINING PARTNERSHIP ACT Substate Programs § 631.51 Allowable substate...

  20. 30 CFR 33.33 - Allowable limits of dust concentration.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Allowable limits of dust concentration. 33.33... MINES Test Requirements § 33.33 Allowable limits of dust concentration. (a) The concentration of dust determined by the control sample shall be subtracted from the average concentration of dust determined by the...

  1. Federal Aid to Postsecondary Students: Tax Allowances and Alternative Subsidies.

    Science.gov (United States)

    Congress of the U.S., Washington, DC. Congressional Budget Office.

    Various aspects of tax allowances for the expenses of higher education, and alternative subsidies are analyzed. A tax allowance for education is presented as one way to give more financial relief to middle-income families. The current distribution of student aid among income groups is discussed and data on college enrollment rates, family incomes,…

  2. 42 CFR 447.54 - Maximum allowable and nominal charges.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Maximum allowable and nominal charges. 447.54 Section 447.54 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... Deductible, Coinsurance, Co-Payment Or Similar Cost-Sharing Charge § 447.54 Maximum allowable and nominal...

  3. 46 CFR 64.15 - Allowable stress; framework.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 2 2010-10-01 2010-10-01 false Allowable stress; framework. 64.15 Section 64.15... AND CARGO HANDLING SYSTEMS Standards for an MPT § 64.15 Allowable stress; framework. The calculated stress for the framework must be 80 percent or less of the minimum yield stress of the framework material...

  4. 27 CFR 70.131 - Conditions to allowance.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Conditions to allowance. 70.131 Section 70.131 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU... Excise and Special (Occupational) Tax Rule of Special Application § 70.131 Conditions to allowance. (a...

  5. 25 CFR 117.6 - Allowance for minors.

    Science.gov (United States)

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Allowance for minors. 117.6 Section 117.6 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES DEPOSIT AND EXPENDITURE OF... Allowance for minors. The superintendent may disburse from the surplus funds of an Indian under 21 years of...

  6. 47 CFR 32.1191 - Accounts receivable allowance-other.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Accounts receivable allowance-other. 32.1191....1191 Accounts receivable allowance—other. (a) This account shall be credited with amounts charged to... collection. (b) If no such allowance is maintained, uncollectible amounts shall be charged directly to...

  7. 27 CFR 44.232 - Allowance of claim.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 2 2010-04-01 2010-04-01 false Allowance of claim. 44.232 Section 44.232 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT..., WITHOUT PAYMENT OF TAX, OR WITH DRAWBACK OF TAX Drawback of Tax § 44.232 Allowance of claim. On receipt of...

  8. 14 CFR 14.05 - Allowance fees and expenses.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Allowance fees and expenses. 14.05 Section 14.05 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION PROCEDURAL RULES RULES IMPLEMENTING THE EQUAL ACCESS TO JUSTICE ACT OF 1980 General Provisions § 14.05 Allowance...

  9. 27 CFR 17.148 - Allowance of claims.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Allowance of claims. 17.148 Section 17.148 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU... PRODUCTS Claims for Drawback § 17.148 Allowance of claims. (a) General. Except in the case of fraudulent...

  10. 42 CFR 489.31 - Allowable charges: Blood.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Allowable charges: Blood. 489.31 Section 489.31... Allowable charges: Blood. (a) Limitations on charges. (1) A provider may charge the beneficiary (or other person on his or her behalf) only for the first three pints of blood or units of packed red cells...

  11. Poisson Plus Quantification for Digital PCR Systems.

    Science.gov (United States)

    Majumdar, Nivedita; Banerjee, Swapnonil; Pallas, Michael; Wessel, Thomas; Hegerich, Patricia

    2017-08-29

    Digital PCR, a state-of-the-art nucleic acid quantification technique, works by spreading the target material across a large number of partitions. The average number of molecules per partition is estimated using Poisson statistics and then converted into a concentration by dividing by the partition volume. In this standard approach, identical partition sizing is assumed. Violations of this assumption result in underestimation of the target quantity under Poisson modeling, especially at higher concentrations. The Poisson-Plus Model compensates for this underestimation, provided the statistics of the volume variation are well characterized. The volume variation was measured on the chip-array-based QuantStudio 3D Digital PCR System using the ROX fluorescence level as a proxy for effective load volume per through-hole. Monte Carlo simulations demonstrate the efficacy of the proposed correction. Empirical measurement of model parameters characterizing the effective load volume on QuantStudio 3D Digital PCR chips is presented. The model was used to analyze digital PCR experiments and showed improved accuracy in quantification. At higher concentrations, the modeling must take effective fill-volume variation into account to produce accurate estimates. The extent of the difference between the standard and the new modeling is positively correlated with the extent of fill-volume variation in the effective load of the reactions.
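
    The standard Poisson step described above, and the bias that motivates the Poisson-Plus correction, can be sketched numerically. This is a toy Monte Carlo illustration under assumed parameters, not the paper's model:

```python
import numpy as np

# Standard dPCR Poisson estimate: with a fraction p of positive
# partitions, the mean copies per partition is lambda = -ln(1 - p),
# and concentration = lambda / partition_volume.
def poisson_concentration(n_positive, n_total, volume_nl):
    p = n_positive / n_total
    lam = -np.log(1.0 - p)
    return lam / volume_nl  # copies per nanolitre

# Toy simulation (invented numbers): partitions whose true volumes
# vary around the nominal value, quantified with the fixed-volume
# formula to expose the underestimation at high load.
rng = np.random.default_rng(0)
true_conc = 2.0             # copies/nl, a fairly high concentration
nominal_v = 0.8             # nl, nominal partition volume
n = 200_000
volumes = nominal_v * rng.normal(1.0, 0.15, n).clip(0.5, 1.5)
copies = rng.poisson(true_conc * volumes)
est = poisson_concentration((copies > 0).sum(), n, nominal_v)
# est comes out below true_conc: volume variation biases the
# fixed-volume Poisson estimate low.
```

    Rerunning with a larger volume spread or a higher true concentration widens the gap, matching the abstract's observation that the correction matters most at higher concentrations.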

  12. CT quantification of central airway in tracheobronchomalacia

    Energy Technology Data Exchange (ETDEWEB)

    Im, Won Hyeong; Jin, Gong Yong; Han, Young Min; Kim, Eun Young [Dept. of Radiology, Chonbuk National University Hospital, Jeonju (Korea, Republic of)

    2016-05-15

    To determine which factors help to diagnose tracheobronchomalacia (TBM) using CT quantification of the central airway. From April 2013 to July 2014, 19 patients (68.0 ± 15.0 years; 6 male, 13 female) were diagnosed with TBM on CT. As case-matched controls, 38 normal subjects (65.5 ± 21.5 years; 6 male, 13 female) were selected. All 57 subjects underwent CT at end-inspiration and end-expiration. Airway parameters of the trachea and both main bronchi were assessed using software (VIDA diagnostic). Airway parameters of TBM patients and normal subjects were compared using the Student t-test. In expiration, both wall perimeter and wall thickness in TBM patients were significantly smaller than in normal subjects (wall perimeter: trachea, 43.97 mm vs. 49.04 mm, p = 0.020; right main bronchus, 33.52 mm vs. 42.69 mm, p < 0.001; left main bronchus, 26.76 mm vs. 31.88 mm, p = 0.012; wall thickness: trachea, 1.89 mm vs. 2.22 mm, p = 0.017; right main bronchus, 1.64 mm vs. 1.83 mm, p = 0.021; left main bronchus, 1.61 mm vs. 1.75 mm, p = 0.016). Wall thinning and a decreased perimeter of the central airway on expiration, as measured by CT quantification, could be new diagnostic indicators of TBM.

  13. Sky Surveys

    OpenAIRE

    Djorgovski, S. G.; Mahabal, A. A.; Drake, A.J.; Graham, M. J.; C. Donalek

    2012-01-01

    Sky surveys represent a fundamental data basis for astronomy. We use them to map in a systematic way the universe and its constituents, and to discover new types of objects or phenomena. We review the subject, with an emphasis on the wide-field imaging surveys, placing them in a broader scientific and historical context. Surveys are the largest data generators in astronomy, propelled by the advances in information and computation technology, and have transformed the ways in which astronomy is...

  14. Development of a VHH-Based Erythropoietin Quantification Assay

    DEFF Research Database (Denmark)

    Kol, Stefan; Beuchert Kallehauge, Thomas; Adema, Simon

    2015-01-01

    human EPO was evaluated as a capturing antibody in a label-free biolayer interferometry-based quantification assay. Human recombinant EPO can be specifically detected in Chinese hamster ovary cell supernatants in a sensitive and pH-dependent manner. This method enables rapid and robust quantification...

  15. Molecular quantification of genes encoding for green-fluorescent proteins

    DEFF Research Database (Denmark)

    Felske, A; Vandieken, V; Pauling, B V

    2003-01-01

    A quantitative PCR approach is presented to analyze the amount of recombinant green fluorescent protein (gfp) genes in environmental DNA samples. The quantification assay is a combination of specific PCR amplification and temperature gradient gel electrophoresis (TGGE). Gene quantification is pro...... PCR strategy is a highly specific and sensitive way to monitor recombinant DNA in environments like the efflux of a biotechnological plant....

  16. 40 CFR 82.20 - Availability of consumption allowances in addition to baseline consumption allowances for class...

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Availability of consumption allowances in addition to baseline consumption allowances for class II controlled substances. 82.20 Section 82...) PROTECTION OF STRATOSPHERIC OZONE Production and Consumption Controls § 82.20 Availability of consumption...

  17. Localization and in situ absolute quantification of chlordecone in the mouse liver by MALDI imaging.

    Science.gov (United States)

    Lagarrigue, Mélanie; Lavigne, Régis; Tabet, Elise; Genet, Valentine; Thomé, Jean-Pierre; Rondel, Karine; Guével, Blandine; Multigner, Luc; Samson, Michel; Pineau, Charles

    2014-06-17

    Chlordecone is an organochlorine pesticide that was extensively used in the French West Indies to fight weevils in banana plantations from 1973 to 1993. This has led to persistent pollution of the environment and to contamination of the local population for several decades, with demonstrated effects on human health. Chlordecone accumulates mainly in the liver, where it is known to potentiate the action of hepatotoxic agents. However, there is currently no information on its in situ localization in the liver. We have thus evaluated a matrix-assisted laser desorption ionization (MALDI) imaging quantification method based on labeled normalization for the in situ localization and quantification of chlordecone. After validating the linearity and reproducibility of this method, quantitative MALDI imaging was used to study the accumulation of chlordecone in the mouse liver. Our results revealed that normalized intensities measured by MALDI imaging could first be converted into quantitative units. These quantities differed from the absolute quantities of chlordecone determined by gas chromatography (GC), but the two were almost perfectly correlated (R² = 0.995). The equation of the corresponding correlation curve was therefore used to convert quantities measured by MALDI imaging into absolute quantities. Our method, combining labeled normalization and calibration against an orthogonal technique, allowed the in situ absolute quantification of chlordecone by MALDI imaging. Finally, our results on the pathological mouse liver illustrate the advantages of quantitative MALDI imaging, which preserves information on in situ localization without radioactive labeling and with simple sample preparation.
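
    The conversion step described above, calibrating quantities from one technique against an orthogonal reference, amounts to fitting and applying a linear correlation curve. A hedged sketch with invented sample data (only the idea of a GC-referenced calibration comes from the abstract; the paper's reported R² is 0.995):

```python
import numpy as np

# Hypothetical calibration data: MALDI-derived quantities for a set of
# liver samples (arbitrary quantitative units) and matched absolute
# chlordecone amounts measured by GC (reference method). All values
# below are invented for illustration.
maldi_q = np.array([0.8, 1.9, 4.1, 7.8, 16.2])
gc_ng = np.array([2.0, 5.0, 10.0, 20.0, 40.0])

# Least-squares correlation curve: GC amount = a * MALDI quantity + b.
a, b = np.polyfit(maldi_q, gc_ng, 1)

def maldi_to_absolute(q):
    """Convert a MALDI-imaging quantity to an absolute amount (ng)."""
    return a * q + b

# Goodness of the linear correlation for this toy data set.
r = np.corrcoef(maldi_q, gc_ng)[0, 1]
r_squared = r ** 2
```

    Once fitted, `maldi_to_absolute` can be applied pixel-wise to a normalized MALDI image to produce an absolute-quantity map.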

  18. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Lines, Amanda M. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Adami, Susan R. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Sinkov, Sergey I. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Lumetta, Gregg J. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States; Bryan, Samuel A. [Energy and Environment Directorate, Pacific Northwest National Laboratory, Richland, Washington 99352, United States

    2017-08-09

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species that displays significant spectral variation with changing nitric acid concentration. Univariate analysis (i.e., Beer's law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
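
    The contrast between univariate and multivariate quantification can be illustrated with synthetic spectra. This is a classical-least-squares toy, not the chemometric model used in the paper; the band shapes and concentrations are invented:

```python
import numpy as np

# Toy two-component spectra: a "Pu(IV)-like" band and an overlapping
# "acid-dependent" band. A single wavelength (Beer's-law style) cannot
# separate overlapping species, but regressing the whole spectrum on
# the pure-component spectra can.
wavelengths = np.linspace(400, 900, 251)

def gauss(center, width):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

pure_pu = gauss(660, 30) + 0.4 * gauss(780, 40)
pure_acid = gauss(700, 60)          # overlaps the Pu(IV) band

# Measured mixture: "unknown" concentrations plus a little noise.
c_pu, c_acid = 0.8, 1.5
rng = np.random.default_rng(1)
measured = (c_pu * pure_pu + c_acid * pure_acid
            + rng.normal(0, 1e-3, wavelengths.size))

# Classical least squares over all wavelengths recovers both
# concentrations despite the spectral overlap.
K = np.column_stack([pure_pu, pure_acid])
conc, *_ = np.linalg.lstsq(K, measured, rcond=None)
```

    Practical chemometric models such as PLS go further, handling condition-dependent band shapes learned from a training set rather than assuming fixed pure-component spectra.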

  19. Application of adaptive hierarchical sparse grid collocation to the uncertainty quantification of nuclear reactor simulators

    Energy Technology Data Exchange (ETDEWEB)

    Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)

    2013-07-01

    Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity, using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant converges to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to the methods currently used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)
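
    A one-dimensional sketch of the adaptive hierarchical idea may help: collocation points are added only where the hierarchical surplus of the linear hat-function interpolant exceeds a tolerance. This is my own simplification, not the authors' code, and it omits the Smolyak combination of such rules into a sparse multi-dimensional grid:

```python
import numpy as np

# Adaptive hierarchical interpolation in 1-D with linear hat functions:
# at each dyadic level, candidate midpoints are kept only where the
# "hierarchical surplus" (mismatch between the function and the current
# interpolant) exceeds a tolerance.
def adaptive_interpolate(f, tol=1e-3, max_level=12):
    xs, ys = [0.0, 1.0], [f(0.0), f(1.0)]       # level-0 boundary nodes
    for level in range(1, max_level + 1):
        h = 0.5 ** level
        candidates = np.arange(1, 2 ** level, 2) * h   # new dyadic midpoints
        order = np.argsort(xs)
        grid = np.array(xs)[order]
        vals = np.array(ys)[order]
        refined = False
        for x in candidates:
            surplus = f(x) - np.interp(x, grid, vals)  # hierarchical surplus
            if abs(surplus) > tol:                     # refine only where needed
                xs.append(float(x))
                ys.append(f(x))
                refined = True
        if not refined:                                # interpolant converged
            break
    order = np.argsort(xs)
    grid, vals = np.array(xs)[order], np.array(ys)[order]
    return lambda x: np.interp(x, grid, vals), len(xs)

# Runge-type test function: steep near x = 0, flat near x = 1, so the
# adaptive grid clusters points on the left.
runge = lambda x: 1.0 / (1.0 + 25.0 * x * x)
surrogate, n_points = adaptive_interpolate(runge)
```

    The surrogate can then be sampled cheaply (e.g., by Monte Carlo over the input distribution) to extract statistics that would be expensive to obtain from the underlying code directly.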

  20. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    Science.gov (United States)

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available software package (Adobe Photoshop), run on a Macintosh G3 computer with a built-in graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification of (i) microvessel diameters, (ii) the functional capillary density, and (iii) postischemic leakage of FITC-labeled high-molecular-weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive, commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  1. Identification and Quantification of Celery Allergens Using Fiber Optic Surface Plasmon Resonance PCR

    Directory of Open Access Journals (Sweden)

    Devin Daems

    2017-07-01

    Full Text Available Accurate identification and quantification of allergens is key in healthcare, biotechnology, and food quality and safety. Celery (Apium graveolens) is one of the most important elicitors of food-allergic reactions in Europe. Currently, the gold standards to identify, quantify, and discriminate celery in a biological sample are immunoassays and two-step molecular detection assays in which quantitative PCR (qPCR) is followed by a high-resolution melting analysis (HRM). In order to provide a DNA-based, rapid, and simple detection method suitable for one-step quantification, a fiber optic PCR melting assay (FO-PCR-MA) was developed to determine different concentrations of celery DNA (1 pM–0.1 fM). The presented method is based on the hybridization and melting of DNA-coated gold nanoparticles to the FO sensor surface in the presence of the target gene (mannitol dehydrogenase, Mtd). The concept was not only able to reveal the presence of celery DNA, but also allowed for the cycle-to-cycle quantification of the target sequence through melting analysis. Furthermore, the developed bioassay was benchmarked against qPCR followed by HRM, showing excellent agreement (R² = 0.96). In conclusion, this innovative and sensitive diagnostic test could further improve food quality control and thus have a large impact on allergen-induced healthcare problems.

  2. Cell Image Velocimetry (CIV): boosting the automated quantification of cell migration in wound healing assays.

    Science.gov (United States)

    Milde, Florian; Franco, Davide; Ferrari, Aldo; Kurtcuoglu, Vartan; Poulikakos, Dimos; Koumoutsakos, Petros

    2012-11-01

    Cell migration is commonly quantified by tracking the speed of the cell-layer interface in wound healing assays. This quantification is often hampered by a low signal-to-noise ratio, in particular when complex substrates are employed to emulate in vivo cell migration in geometrically complex environments. Moreover, information about the cell motion readily available inside the migrating cell layers is not usually harvested. We introduce Cell Image Velocimetry (CIV), a combination of cell-layer segmentation and image velocimetry algorithms, to drastically enhance the quantification of cell migration in wound healing assays. The resulting software analyses the speed of the interface as well as the detailed velocity field inside the cell layers in an automated fashion. CIV is shown to be highly robust for images with a low signal-to-noise ratio, low contrast, and frame shifting, and it is portable across various experimental settings. The modular design and parametrization of CIV are not restricted to wound healing assays and allow for the exploration and quantification of flow phenomena in any optical microscopy dataset. Here, we demonstrate the capabilities of CIV in wound healing assays over topographically engineered surfaces and quantify the relative merits of differently aligned gratings on cell migration.

  3. Atomic Resolution Imaging and Quantification of Chemical Functionality of Surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Udo D. [Yale Univ., New Haven, CT (United States). Dept. of Mechanical Engineering and Materials Science; Altman, Eric I. [Yale Univ., New Haven, CT (United States). Dept. of Chemical and Environmental Engineering

    2014-12-10

    The work carried out from 2006-2014 under DoE support was targeted at developing new approaches to the atomic-scale characterization of surfaces that include species-selective imaging and an ability to quantify chemical surface interactions with site-specific accuracy. The newly established methods were subsequently applied to gain insight into the local chemical interactions that govern the catalytic properties of model catalysts of interest to DoE. The foundation of our work was the development of three-dimensional atomic force microscopy (3D-AFM), a new measurement mode that allows the mapping of the complete surface force and energy fields with picometer resolution in space (x, y, and z) and piconewton/millielectronvolt resolution in force/energy. We then expanded this experimental platform by adding the simultaneous recording of tunneling current (3D-AFM/STM) using chemically well-defined tips. Through comparison with simulations, we were able to achieve precise quantification and assignment of local chemical interactions to exact positions within the lattice. During the course of the project, the novel techniques were applied to surface-oxidized copper, titanium dioxide, and silicon oxide. On these materials, defect-induced changes to the chemical surface reactivity and electronic charge density were characterized with site-specific accuracy.

  4. Quantification of deep medullary veins at 7 T brain MRI

    Energy Technology Data Exchange (ETDEWEB)

    Kuijf, Hugo J.; Viergever, Max A.; Vincken, Koen L. [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands); Bouvy, Willem H.; Razoux Schultz, Tom B.; Biessels, Geert Jan [University Medical Center Utrecht, Department of Neurology, Brain Center Rudolf Magnus, Utrecht (Netherlands); Zwanenburg, Jaco J.M. [University Medical Center Utrecht, Image Sciences Institute, Utrecht (Netherlands); University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands)

    2016-10-15

    Deep medullary veins support the venous drainage of the brain and may display abnormalities in the context of different cerebrovascular diseases. We present and evaluate a method to automatically detect and quantify deep medullary veins at 7 T. Five participants were scanned twice, to assess the robustness and reproducibility of manual and automated vein detection. Additionally, the method was evaluated on 24 participants to demonstrate its application. Deep medullary veins were assessed within an automatically created region-of-interest around the lateral ventricles, defined such that all veins must intersect it. A combination of vesselness, tubular tracking, and hysteresis thresholding located individual veins, which were quantified by counting and computing (3-D) density maps. Visual assessment was time-consuming (2 h/scan), with an intra-/inter-observer agreement on absolute vein count of ICC = 0.76 and 0.60, respectively. The automated vein detection showed excellent inter-scan reproducibility before (ICC = 0.79) and after (ICC = 0.88) visually censoring false positives. It had a positive predictive value of 71.6 %. Imaging at 7 T allows visualization and quantification of deep medullary veins. The presented method offers fast and reliable automated assessment of deep medullary veins. (orig.)

  5. Investigation of Nonlinear Pupil Dynamics by Recurrence Quantification Analysis

    Directory of Open Access Journals (Sweden)

    L. Mesin

    2013-01-01

    Full Text Available The pupil is controlled by the autonomic nervous system (ANS). It shows complex movements and changes of size even under constant stimulation. The possibility of extracting information on the ANS by processing data recorded during a short experiment using a low-cost system for pupil investigation is studied. Moreover, the significance of the nonlinear information contained in the pupillogram is investigated. We examined 13 healthy subjects in different stationary conditions, considering habitual dental occlusion (HDO) as a weak stimulation of the ANS with respect to the maintenance of the rest position (RP) of the jaw. Images of the pupil captured by infrared cameras were processed to estimate position and size on each frame. From such time series, we extracted linear indexes (e.g., average size, average displacement, and spectral parameters) and nonlinear information using recurrence quantification analysis (RQA). Data were classified using multilayer perceptrons and support vector machines trained using different sets of input indexes: the best classification performance was obtained when nonlinear indexes were included in the input features. These results indicate that RQA nonlinear indexes provide additional information on pupil dynamics with respect to linear descriptors, allowing the discrimination of even a slight stimulation of the ANS. Their use in the investigation of pathology is suggested.

  6. Quantification of Focal Outflow Enhancement Using Differential Canalograms

    Science.gov (United States)

    Loewen, Ralitsa T.; Brown, Eric N.; Scott, Gordon; Parikh, Hardik; Schuman, Joel S.; Loewen, Nils A.

    2016-01-01

    Purpose To quantify regional changes of conventional outflow caused by ab interno trabeculectomy (AIT). Methods Gonioscopic, plasma-mediated AIT was established in enucleated pig eyes. We developed a program to automatically quantify outflow changes (R, package eye-canalogram, github.com) using a fluorescent tracer reperfusion technique. Trabecular meshwork (TM) ablation was demonstrated with fluorescent spheres in six eyes before formal outflow quantification with two-dye reperfusion canalograms in six additional eyes. Eyes were perfused with a central, intracameral needle at 15 mm Hg. Canalograms and histology were correlated for each eye. Results The pig eye provided a model with high similarity to AIT in human patients. Histology indicated ablation of TM and unroofing of most Schlemm's canal segments. Spheres highlighted additional circumferential and radial outflow beyond the immediate area of ablation. Differential canalograms showed that AIT caused an increase of outflow of 17 ± 5-fold inferonasally and 14 ± 3-fold superonasally, and also an increase in the opposite quadrants, with a 2 ± 1-fold increase superotemporally and a 3 ± 3-fold increase inferotemporally. Perilimbal specific flow image analysis showed an accelerated nasal filling with an additional perilimbal flow direction into adjacent quadrants. Conclusions A quantitative, differential canalography technique was developed that allows us to quantify supraphysiological outflow enhancement by AIT. PMID:27227352

  7. Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals.

    Science.gov (United States)

    Crooks, Kevin R; Burdett, Christopher L; Theobald, David M; King, Sarah R B; Di Marco, Moreno; Rondinini, Carlo; Boitani, Luigi

    2017-07-18

    Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world's terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world's terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation.

  8. Quantification of (1)H NMR Spectra from Human Plasma.

    Science.gov (United States)

    de Graaf, Robin A; Prinsen, Hetty; Giannini, Cosimo; Caprio, Sonia; Herzog, Raimund I

    2015-12-01

    Human plasma is a biofluid that is high in information content, making it an excellent candidate for metabolomic studies. (1)H NMR has been a popular technique to detect several dozen metabolites in blood plasma. In order for (1)H NMR to become an automated, high-throughput method, challenges related to (1) the large signal from lipoproteins and (2) spectral overlap between different metabolites have to be addressed. Here diffusion-weighted (1)H NMR is used to separate lipoprotein and metabolite signals based on their large difference in translational diffusion. The metabolite (1)H NMR spectrum is then quantified through spectral fitting utilizing full prior knowledge on the metabolite spectral signatures. Extension of the scan time by 3 minutes or 15% per sample allowed the acquisition of a (1)H NMR spectrum with high diffusion weighting. The metabolite (1)H NMR spectra could reliably be modeled with 28 metabolites. Excellent correlation was found between results obtained with diffusion NMR and ultrafiltration. The combination of minimal sample preparation together with minimal user interaction during processing and quantification provides a metabolomics technique for automated, quantitative (1)H NMR of human plasma.
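
    The "spectral fitting utilizing full prior knowledge" step is, at its core, a linear inverse problem: the measured spectrum is modeled as a weighted sum of known metabolite signatures. A toy sketch with synthetic Gaussian "signatures" (all shapes and concentrations are invented for illustration):

```python
import numpy as np

# hypothetical 1-D "spectra": each metabolite has a known signature
# (the prior knowledge), and the measurement is their weighted sum
points = np.linspace(0, 10, 500)

def peak(center, width=0.2):
    return np.exp(-((points - center) ** 2) / (2 * width ** 2))

# basis of known metabolite signatures (one column per metabolite)
basis = np.column_stack([peak(2.0), peak(4.5) + 0.5 * peak(5.0), peak(8.0)])
true_conc = np.array([1.5, 0.8, 2.2])
measured = basis @ true_conc

# spectral fitting = linear least squares against the basis
fitted, *_ = np.linalg.lstsq(basis, measured, rcond=None)
print(np.allclose(fitted, true_conc))  # → True (noise-free case)
```

In practice the fit also handles baseline, lineshape, and noise terms; this sketch shows only the core projection onto prior-knowledge signatures.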

  9. Uncertainty quantification for large-scale ocean circulation predictions.

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik

    2010-09-01

    Uncertainty quantification in climate models is challenged by the sparsity of the available climate data due to the high computational cost of the model runs. Another feature that prevents classical uncertainty analyses from being easily applicable is the bifurcative behavior in the climate data with respect to certain parameters. A typical example is the Meridional Overturning Circulation in the Atlantic Ocean. The maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO{sub 2} forcing. We develop a methodology that performs uncertainty quantification in the presence of limited data that have discontinuous character. Our approach is two-fold. First, we detect the discontinuity location with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve location in the presence of arbitrarily distributed input parameter values. Second, we develop a spectral approach that relies on Polynomial Chaos (PC) expansions on each side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification and propagation. The methodology is tested on synthetic examples of discontinuous data with adjustable sharpness and structure.
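
    The piecewise Polynomial Chaos idea can be sketched in a few lines: on each side of the detected discontinuity, the smooth response is expanded in orthogonal polynomials fitted by least squares. A minimal single-side illustration with an invented model response and a Legendre basis (the natural choice for a uniform input on [-1, 1]):

```python
import numpy as np
from numpy.polynomial import legendre

# invented smooth model response on one side of the discontinuity,
# with the uncertain parameter mapped to [-1, 1]
def model(xi):
    return np.exp(0.5 * xi) + 0.1 * xi ** 2

# sample the parameter and evaluate the (expensive) model
xi = np.linspace(-1, 1, 50)
y = model(xi)

# fit Legendre polynomial chaos coefficients by least squares
order = 6
coeffs = legendre.legfit(xi, y, order)

# the surrogate is cheap to evaluate and close to the model
xi_test = np.array([-0.7, 0.0, 0.4])
err = np.abs(legendre.legval(xi_test, coeffs) - model(xi_test)).max()
print(err < 1e-4)  # → True
```

Under a uniform input, the zeroth Legendre coefficient is the surrogate's mean, so output moments propagate essentially for free once the expansion is built.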

  10. Serendipity: Global Detection and Quantification of Plant Stress

    Science.gov (United States)

    Schimel, D.; Verma, M.; Drewry, D.

    2016-12-01

    Detecting and quantifying plant stress is a grand challenge for remote sensing, and is important for understanding climate impacts on ecosystems broadly and also for early warning systems supporting food security. The long record from moderate resolution sensors providing frequent data has allowed using phenology to detect stress in forest and agroecosystems, but can fail or give ambiguous results when stress occurs during later phases of growth and in high leaf area systems. The recent recognition that greenhouse gas satellites such as GOSAT and OCO-2 observe Solar-Induced Fluorescence (SIF) has added a new and complementary tool for the quantification of stress, but algorithms to detect and quantify stress using SIF are in their infancy. Here we report new results showing a more complex response of SIF to stress by evaluating spaceborne SIF against in situ eddy covariance data. The response observed is as predicted by theory, and shows that SIF, used in conjunction with moderate resolution remote sensing, can detect and likely quantify stress by indexing the nonlinear part of the SIF-GPP relationship using the photochemical reflectance index and remotely observed light absorption. There are several exciting opportunities on the near horizon for the implementation of SIF, together with synergistic measurements such as PRI and evapotranspiration, which suggest the next few years will be a golden age for global ecology. Advancing the science and associated algorithms now is essential to fully exploiting the next wave of missions.

  11. EEI contests N.Y. concerns with allowance trading scheme

    Energy Technology Data Exchange (ETDEWEB)

    Lobsenz, G.

    1993-07-07

    A utility industry analyst has challenged charges that sulfur dioxide emissions allowance trading may be environmentally damaging, citing studies showing no harmful increases in acid deposition due to simulated interstate allowance transactions. Contrary to allegations by New York state officials, John Kinsman of the Edison Electric Institute said existing studies indicate the Adirondacks and other sensitive Northeast ecosystems actually may benefit from SO{sub 2} emissions allowance trading. In particular, studies done by the National Acid Precipitation Assessment Program suggest allowance trading could marginally reduce acid deposition in the Adirondacks below levels already expected under the federal SO{sub 2} emissions reduction program for utilities, said Kinsman, an environmental scientist at EEI.

  12. Translation-dependent bioassay for amino acid quantification using auxotrophic microbes as biocatalysts of protein synthesis.

    Science.gov (United States)

    Kameya, Masafumi; Asano, Yasuhisa

    2017-03-01

    Bioassay for amino acid quantification is an important technology for a variety of fields, which allows for easy, inexpensive, and high-throughput analyses. Here, we describe a novel translation-dependent bioassay for the quantification of amino acids. For this, the gene encoding firefly luciferase was introduced into Lactococcus lactis auxotrophic to Glu, His, Ile, Leu, Pro, Val, and Arg. After a preculture where luciferase expression was repressed, the cells were mixed with analytes, synthetic medium, and an inducer for luciferase expression. Luminescence response to the target amino acid appeared just after mixing, and linear standard curves for these amino acids were obtained during 15-60-min incubation periods. The rapid quantification of amino acids has neither been reported in previous works on bioassays nor is it theoretically feasible with conventional methods, which require incubation times of more than 4 h to allow for the growth of the microbe used. In contrast, our assay was shown to depend on protein translation, rather than on cell growth. Furthermore, replacement of the luciferase gene with that of the green fluorescent protein (GFP) or β-galactosidase allowed for fluorescent and colorimetric detection of the amino acids, respectively. Significantly, when a Gln-auxotrophic Escherichia coli mutant was created and transformed by a luciferase expression plasmid, a linear standard curve for Gln was observed in 15 min. These results demonstrate that this methodology can provide versatile bioassays by adopting various combinations of marker genes and host strains according to the analytes and experimental circumstances.

  13. Strategy study of quantification harmonization of SUV in PET/CT images; Estudo da estrategia de harmonizacao da quantificacao do SUV em imagens de PET/CT

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andreia Caroline Fischer da Silveira

    2014-07-01

    In clinical practice, PET/CT images are often analyzed qualitatively by visual comparison of tumor lesions and normal tissues uptake; and semi-quantitatively by means of a parameter called SUV (Standardized Uptake Value). To ensure that longitudinal studies acquired on different scanners are interchangeable, and information of quantification is comparable, it is necessary to establish a strategy to harmonize the quantification of SUV. The aim of this study is to evaluate the strategy to harmonize the quantification of PET/CT images, performed with different scanner models and manufacturers. For this purpose, a survey of the technical characteristics of equipment and acquisition protocols of clinical images of different services of PET/CT in the state of Rio Grande do Sul was conducted. For each scanner, the accuracy of SUV quantification, and the Recovery Coefficient (RC) curves were determined, using the reconstruction parameters clinically relevant and available. From these data, harmonized performance specifications among the evaluated scanners were identified, as well as the algorithm that produces, for each one, the most accurate quantification. Finally, the most appropriate reconstruction parameters to harmonize the SUV quantification in each scanner, either regionally or internationally were identified. It was found that the RC values of the analyzed scanners proved to be overestimated by up to 38%, particularly for objects larger than 17mm. These results demonstrate the need for further optimization, through the reconstruction parameters modification, and even the change of the reconstruction algorithm used in each scanner. It was observed that there is a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies. 
Thus, the choice of reconstruction method should be tied to the purpose of the PET/CT study in question, since the same reconstruction algorithm is not adequate, in one scanner, for qualitative
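
    For reference, the SUV being harmonized is a simple normalization of measured activity, and the recovery coefficient (RC) compares measured to true uptake. A small sketch of the body-weight SUV, with hypothetical numbers:

```python
def suv(activity_kbq_per_ml, injected_dose_mbq, weight_kg):
    """Body-weight SUV: tissue concentration over injected dose per gram.
    Assumes 1 g/mL tissue density."""
    dose_kbq = injected_dose_mbq * 1000.0
    weight_g = weight_kg * 1000.0
    return activity_kbq_per_ml / (dose_kbq / weight_g)

def recovery_coefficient(measured_suv, true_suv):
    """RC > 1 means the scanner overestimates uptake."""
    return measured_suv / true_suv

# a 70 kg patient, 350 MBq injected, 10 kBq/mL measured in the lesion
s = suv(10.0, 350.0, 70.0)
print(round(s, 2))                        # → 2.0
print(recovery_coefficient(1.38 * s, s))  # → 1.38, i.e. 38 % overestimation
```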

  14. Development of a new HPLC-based method for 3-nitrotyrosine quantification in different biological matrices.

    Science.gov (United States)

    Teixeira, Dulce; Prudêncio, Cristina; Vieira, Mónica

    2017-03-01

    The nitration of tyrosine residues in proteins is associated with nitrosative stress, resulting in the formation of 3-nitrotyrosine (3-NT). 3-NT levels in biological samples have been associated with numerous physiological and pathological conditions. Hence, several attempts have been made to develop methods that accurately quantify 3-NT in these matrices. The aim of this study was to develop a simple, rapid, low-cost and sensitive high-performance liquid chromatography (HPLC)-based 3-NT quantification method. All experiments were performed on a Hitachi LaChrom Elite® HPLC system. The method was validated according to International Conference on Harmonisation (ICH) guidelines for serum samples. Additionally, other biological matrices were tested, namely whole blood, urine, the B16-F10 melanoma cell line, growth medium conditioned with the same cell line, and bacterial and yeast suspensions. Of all the protocols tested, the best results were obtained using 0.5% CH3COOH:MeOH:H2O (15:15:70) as mobile phase, with detection at wavelengths of 215, 276 and 356 nm, at 25°C, and using a flow rate of 1 mL/min. With this protocol, it was possible to obtain a linear calibration curve, limits of detection and quantification on the order of μg/L, and a short analysis time. The developed protocol allowed the successful detection and quantification of 3-NT in all biological matrices tested, with detection at 356 nm. This method, successfully developed and validated for 3-NT quantification, is simple, cheap and fast. These features render it a suitable option for the analysis of a wide range of biological matrices, and a promising tool for both research and diagnostic activities. Copyright © 2017 Elsevier B.V. All rights reserved.
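
    ICH-style validation derives detection limits from the calibration curve: LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope. A sketch with invented calibration data (not the study's actual measurements):

```python
import numpy as np

# invented calibration: detector response vs 3-NT concentration (ug/L)
conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
area = np.array([52.0, 126.0, 249.0, 502.0, 998.0])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # residual standard deviation of the fit

# ICH Q2 approach: LOD = 3.3*sigma/S, LOQ = 10*sigma/S
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(lod < loq)  # → True, LOQ is always the higher threshold
```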

  15. Assessment of probiotic viability during Cheddar cheese manufacture and ripening using propidium monoazide-PCR quantification

    Directory of Open Access Journals (Sweden)

    Emilie eDesfossés-Foucault

    2012-10-01

    Full Text Available The use of a suitable food carrier such as cheese could significantly enhance probiotic viability during storage. The main goal of this study was to assess the viability of commercial probiotic strains during Cheddar cheesemaking and ripening (four to six months) by comparing the efficiency of microbiological and molecular approaches. Molecular methods such as quantitative PCR (qPCR) allow bacterial quantification, and DNA-blocking molecules such as propidium monoazide (PMA) select only the living cells' DNA. Cheese samples were manufactured with a lactococci starter and with one of three probiotic strains (Bifidobacterium animalis subsp. lactis BB-12, Lactobacillus rhamnosus RO011 or Lactobacillus helveticus RO052) or a mixed culture containing B. animalis subsp. lactis BB-12 and L. helveticus RO052 (MC1), both lactobacilli strains (MC2) or all three strains (MC3). DNA extractions were then carried out on PMA-treated and non-treated cell pellets in order to assess PMA treatment efficiency, followed by quantification using the 16S rRNA gene, the elongation factor Tu gene (tuf) or the transaldolase gene (tal). Results with intact/dead ratios of bacteria showed that PMA-treated cheese samples had a significantly lower bacterial count than non-treated DNA samples (P<0.005), confirming that PMA did eliminate dead bacteria from PCR quantification. For both quantification methods, the addition of probiotic strains seemed to accelerate the loss of lactococci viability in comparison to control cheese samples, especially when L. helveticus RO052 was added. Viability of all three probiotic strains was also significantly reduced in mixed culture cheese samples (P<0.0001), B. animalis subsp. lactis BB-12 being the most sensitive to the presence of other strains. However, all probiotic strains did retain their viability (log 9 cfu/g of cheese) throughout ripening.
This study was successful in monitoring living probiotic species in Cheddar cheese samples through PMA-qPCR.
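
    The intact/dead ratios above come from comparing PMA-treated and untreated qPCR counts against a standard curve. A minimal sketch (the curve's slope and intercept are invented; a slope of -3.32 corresponds to ~100 % amplification efficiency):

```python
def copies_from_cq(cq, slope, intercept):
    """Standard curve: Cq = slope * log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def intact_fraction(cq_pma, cq_total, slope=-3.32, intercept=38.0):
    """PMA-treated qPCR counts only intact cells;
    the untreated reaction counts all DNA (intact + dead)."""
    intact = copies_from_cq(cq_pma, slope, intercept)
    total = copies_from_cq(cq_total, slope, intercept)
    return intact / total

# one extra cycle under PMA ~ half the amplifiable copies
f = intact_fraction(cq_pma=21.0, cq_total=20.0)
print(round(f, 2))  # → 0.5
```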

  16. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    Science.gov (United States)

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
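
    Per sample, the standard-curve approach with reference-gene normalization that quantGenius implements reduces to two interpolations and a ratio; the application's value is the QC and decision logic layered on top. A schematic sketch (curves and Cq values invented, not taken from the tool):

```python
def quantity(cq, slope, intercept):
    """Interpolate quantity from a standard curve Cq = slope*log10(q) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def normalized_quantity(cq_target, cq_ref, curve_target, curve_ref):
    """Target quantity normalized to a reference gene from the same sample."""
    return quantity(cq_target, *curve_target) / quantity(cq_ref, *curve_ref)

# invented standard curves (slope, intercept) from serial dilutions
target_curve = (-3.45, 36.0)
ref_curve = (-3.30, 34.0)

rel = normalized_quantity(24.0, 22.0, target_curve, ref_curve)
print(0.0 < rel < 1.0)  # → True: target less abundant than reference here
```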

  17. Comparison of biofilm cell quantification methods for drinking water distribution systems.

    Science.gov (United States)

    Waller, Sharon A; Packman, Aaron I; Hausner, Martina

    2018-01-01

    Drinking water quality typically degrades after treatment, during conveyance through the distribution system. Potential causes include biofilm growth in distribution pipes, which may result in pathogen retention, inhibited disinfectant diffusion, and proliferation of bad tastes and odors. However, there is no standard method for direct measurement of biofilms or quantification of biofilm cells in drinking water distribution systems. Three methods are compared here for quantification of biofilm cells grown in pipe-loop samplers: biofilm heterotrophic plate count (HPC), biofilm biovolume by confocal laser scanning microscopy (CLSM), and biofilm total cell count by flow cytometry (FCM) paired with Syto 9. Both biofilm biovolume by CLSM and biofilm total cell count by FCM were evaluated for quantification of whole biofilms (including non-viable cells and viable but non-culturable cells). Signal-to-background ratios and overall performance of biofilm biovolume by CLSM and biofilm total cell count by FCM were found to vary with the pipe material. Biofilm total cell count by FCM had a low signal-to-background ratio on all materials, indicating that further development is recommended before application in drinking water environments. Biofilm biovolume by CLSM showed the highest signal-to-background ratio for cement and cast iron, which suggests promise for wider application in full-scale systems. Biofilm biovolume by CLSM with Syto 9 staining allowed in-situ biofilm cell quantification, thus eliminating the variability associated with cell detachment, but had limitations associated with non-specific staining of cement and, to a lesser degree, auto-fluorescence of both cement and polyvinyl chloride materials. Due to variability in the results obtained from each method, multiple methods are recommended to assess biofilm growth in drinking water distribution systems. Of the methods investigated here, HPC and CLSM are recommended for further development towards full-scale application.

  18. Automated renal histopathology: digital extraction and quantification of renal pathology

    Science.gov (United States)

    Sarder, Pinaki; Ginley, Brandon; Tomaszewski, John E.

    2016-03-01

    The branch of pathology concerned with excess blood serum proteins being excreted in the urine pays particular attention to the glomerulus, a small intertwined bunch of capillaries located at the beginning of the nephron. Normal glomeruli allow moderate amounts of blood proteins to be filtered; proteinuric glomeruli allow large amounts of blood proteins to be filtered. Diagnosis of proteinuric diseases requires time-intensive manual examination of the structural compartments of the glomerulus from renal biopsies. Pathological examination includes cellularity of individual compartments, Bowman's and luminal space segmentation, cellular morphology, glomerular volume, capillary morphology, and more. Long examination times may lead to increased diagnosis time and/or reduced precision of the diagnostic process. Automatic quantification holds strong potential to reduce renal diagnostic time. We have developed a computational pipeline capable of automatically segmenting relevant features from renal biopsies. Our method first segments glomerular compartments from renal biopsies by isolating regions with high nuclear density. Gabor texture segmentation is used to accurately define glomerular boundaries. Bowman's and luminal spaces are segmented using morphological operators. Nuclei structures are segmented using color deconvolution, morphological processing, and bottleneck detection. Average computation time of feature extraction for a typical biopsy, comprising ~12 glomeruli, is ~69 s using an Intel(R) Core(TM) i7-4790 CPU, and is ~65X faster than manual processing. Using images from rat renal tissue samples, automatic glomerular structural feature estimation was reproducibly demonstrated for 15 biopsy images, which contained 148 individual glomeruli images. The proposed method holds immense potential to enhance information available while making clinical diagnoses.

  19. Reporter gene assay for the quantification of the activity and neutralizing antibody response to TNFα antagonists

    DEFF Research Database (Denmark)

    Lallemand, Christophe; Kavrochorianou, Nadia; Steenholdt, Casper

    2011-01-01

    A cell-based assay has been developed for the quantification of the activity of TNFα antagonists, based on human erythroleukemic K562 cells transfected with an NFκB-regulated firefly luciferase reporter-gene construct. Both drug activity and anti-drug neutralizing antibodies can be quantified … with a high degree of precision within 2 h, and without interference from cytokines and other factors known to activate NFκB. The assay cells also contain the Renilla luciferase reporter gene under the control of a constitutive promoter that allows TNFα-induced firefly luciferase activity to be normalized …

  20. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently abstract in order to handle a large spectrum of models, (b) be algorithmically extensible, allowing an easy insertion of new and improved algorithms, and (c) take advantage of parallel computing, in order to handle realistic models. Such objectives demand a combination of an object-oriented design with robust software engineering practices. QUESO is written in C++, uses MPI, and leverages libraries already available to the scientific community. We describe some UQ concepts, present QUESO, and list planned enhancements.

  1. Quantification of Structure from Medical Images

    DEFF Research Database (Denmark)

    Qazi, Arish Asif

    In this thesis, we present automated methods that quantify information from medical images; information that is intended to assist and enable clinicians to gain a better understanding of the underlying pathology. The first part of the thesis presents methods that analyse the articular cartilage …, and information beyond that of traditional morphometric measures. The thesis also proposes a fully automatic and generic statistical framework for identifying biologically interpretable regions of difference (ROD) between two groups of biological objects, attributed by anatomical differences or changes relating … to pathology, without a priori knowledge about the location, extent, or topology of the ROD. Based on quantifications from both morphometric and textural imaging markers, our method has identified the most pathological regions in the articular cartilage. The remaining part of the thesis presents methods …

  2. Recurrence quantification analysis theory and best practices

    CERN Document Server

    Webber, Charles L., Jr.; Marwan, Norbert

    2015-01-01

    The analysis of recurrences in dynamical systems by using recurrence plots and their quantification is still an emerging field. Over the past decades recurrence plots have proven to be valuable data visualization and analysis tools in the theoretical study of complex, time-varying dynamical systems as well as in various applications in biology, neuroscience, kinesiology, psychology, physiology, engineering, physics, geosciences, linguistics, finance, economics, and other disciplines. This multi-authored book intends to comprehensively introduce and showcase recent advances as well as established best practices concerning both theoretical and practical aspects of recurrence plot based analysis. Edited and authored by leading researchers in the field, the various chapters address an interdisciplinary readership, ranging from theoretical physicists to application-oriented scientists in all data-providing disciplines.

  3. Uncertainty quantification in wind farm flow models

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo

    This thesis formulates a framework to perform uncertainty quantification within wind energy. This framework has been applied to some of the most common models used to estimate the annual energy production in the planning stages of a wind energy project. Efficient methods to propagate input uncertainties through a model chain are presented and applied to several wind energy related problems such as: annual energy production estimation, wind turbine power curve estimation, wake model calibration and validation, and estimation of lifetime equivalent fatigue loads on a wind turbine. Statistical … the uncertainty in the lifetime performance of a wind turbine under realistic inflow conditions. Operational measurements of several large offshore wind farms are used to perform model calibration and validation of several stationary wake models. These results provide a guideline to identify the regions in which …

  4. Multispectral image analysis for algal biomass quantification.

    Science.gov (United States)

    Murphy, Thomas E; Macon, Keith; Berberoglu, Halil

    2013-01-01

    This article reports a novel multispectral image processing technique for rapid, noninvasive quantification of biomass concentration in attached and suspended algae cultures. Monitoring the biomass concentration is critical for efficient production of biofuel feedstocks, food supplements, and bioactive chemicals. Particularly, noninvasive and rapid detection techniques can significantly aid in providing delay-free process control feedback in large-scale cultivation platforms. In this technique, three-band spectral images of Anabaena variabilis cultures were acquired and separated into their red, green, and blue components. A correlation between the magnitude of the green component and the areal biomass concentration was generated. The correlation predicted the biomass concentrations of independently prepared attached and suspended cultures with errors of 7 and 15%, respectively, and the effect of varying lighting conditions and background color were investigated. This method can provide necessary feedback for dilution and harvesting strategies to maximize photosynthetic conversion efficiency in large-scale operation. © 2013 American Institute of Chemical Engineers.
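
    The reported technique reduces, per image, to a summary statistic (the magnitude of the green component) regressed against known areal biomass. A toy sketch with invented calibration numbers (not the article's data):

```python
import numpy as np

def green_mean(rgb):
    """Mean green-channel intensity of an RGB image array (H, W, 3)."""
    return float(rgb[:, :, 1].mean())

# invented calibration set: green signal vs known areal biomass (g/m^2)
greens = np.array([30.0, 55.0, 80.0, 110.0])
biomass = np.array([5.0, 10.0, 15.0, 21.0])
slope, intercept = np.polyfit(greens, biomass, 1)

# predict biomass for a new culture image (uniform synthetic image here)
img = np.zeros((4, 4, 3))
img[:, :, 1] = 80.0
pred = slope * green_mean(img) + intercept
print(round(pred, 1))  # → 15.0
```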

  5. Quantification Methods of Management Skills in Shipping

    Directory of Open Access Journals (Sweden)

    Riana Iren RADU

    2012-04-01

    Full Text Available Romania cannot overcome the financial crisis without business growth, without finding opportunities for economic development, and without attracting investment into the country. Successful managers find ways to overcome situations of uncertainty. The purpose of this paper is to assess the managerial skills developed by the Romanian fluvial shipping company NAVROM (hereinafter CNFR NAVROM SA), compared with ten other major competitors in the same domain, using the financial information of these companies for the years 2005-2010. Quantification methods of managerial skills are applied to CNFR NAVROM SA Galati, Romania, for example the analysis of financial performance management based on profitability ratios, net profit margin, supplier management, and turnover.

  6. Uncertainty quantification of acoustic emission filtering techniques

    Science.gov (United States)

    Zárate, Boris A.; Caicedo, Juan M.; Ziehl, Paul

    2012-04-01

    This paper compares six different filtering protocols used in Acoustic Emission (AE) monitoring of fatigue crack growth. The filtering protocols are combinations of three different filtering techniques based on Swansong-like filters and load filters. The filters are compared deterministically and probabilistically. The deterministic comparison is based on the coefficient of determination of the resulting AE data, while the probabilistic comparison is based on the quantification of the uncertainty of the different filtering protocols. The uncertainty of each filtering protocol is quantified by calculating the entropy of the probability distribution of selected AE and fracture mechanics parameters under that protocol. The methodology is useful in cases where several filtering protocols are available and there is no reason to choose one over the others. Acoustic Emission data from a compact tension specimen tested under cyclic load are used for the comparison.
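
    The entropy-based uncertainty measure can be sketched as the Shannon entropy of a discrete probability distribution estimated under a given filtering protocol; a minimal, generic implementation (not the paper's code):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution.

    The input is normalised to sum to 1, and zero-probability bins are
    dropped (0 * log 0 is taken as 0). Higher entropy indicates a more
    spread-out, i.e. more uncertain, distribution.
    """
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))
```

    Comparing protocols then amounts to histogramming the AE parameter of interest under each protocol and ranking the resulting entropies.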

  7. Towards objective quantification of the Tinetti test.

    Science.gov (United States)

    Panella, Lorenzo; Lombardi, Remo; Buizza, Angelo; Gandolfi, Roberto; Pizzagalli, Paola

    2002-01-01

    The Tinetti test is a widespread test for assessing motor control in the elderly, which could also be usefully applied in neurology. At present it uses a qualitative measurement scale. As a first step towards its objective quantification, trunk inclination was measured during the test by two inclinometers and quantified by descriptive parameters. The 95th or 5th percentiles of parameter distributions in normal subjects (no.=150) were taken as limits of normality, and parameters computed on 130 institutionalised elderly people were compared to these limits, to test the parameters' discriminatory power. The distributions of many parameters were statistically different in normal subjects and patients. These results suggest that this approach is a promising tool for objective evaluation of the Tinetti test.
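
    The normality-limit logic described above (95th or 5th percentiles of parameter distributions in the normal group, used to flag patients) can be sketched as follows; the names are illustrative, not from the study:

```python
import numpy as np

def normality_limit(values, upper=True):
    """95th (or 5th) percentile of a parameter in the normal group."""
    return float(np.percentile(values, 95 if upper else 5))

def outside_limit(x, limit, upper=True):
    """Flag a subject's parameter value falling outside the normal limit."""
    return x > limit if upper else x < limit
```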

  8. Recurrence quantification analysis of global stock markets

    Science.gov (United States)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
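
    As a minimal illustration of RQA, the recurrence rate (the fraction of recurrent point pairs) can be computed from a scalar series and tracked in sliding windows. This sketch omits time-delay embedding and the diagonal/vertical line measures the study relies on; names and thresholds are illustrative:

```python
import numpy as np

def recurrence_matrix(series, eps):
    """Binary recurrence matrix: R[i, j] = 1 if |x_i - x_j| <= eps."""
    x = np.asarray(series, dtype=float)
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

def recurrence_rate(series, eps):
    """Fraction of recurrent off-diagonal pairs (RR), the simplest RQA measure."""
    r = recurrence_matrix(series, eps)
    n = r.shape[0]
    return (r.sum() - n) / (n * (n - 1))  # exclude the main diagonal

def sliding_rr(series, eps, window, step):
    """Recurrence rate evaluated in sliding windows along the series."""
    return [recurrence_rate(series[i:i + window], eps)
            for i in range(0, len(series) - window + 1, step)]
```

    Sliding-window evaluation of this kind is what reveals the temporary changes around crisis events described above.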

  9. Sky Surveys

    Science.gov (United States)

    Djorgovski, S. George; Mahabal, Ashish; Drake, Andrew; Graham, Matthew; Donalek, Ciro

    Sky surveys represent a fundamental data basis for astronomy. We use them to map in a systematic way the universe and its constituents and to discover new types of objects or phenomena. We review the subject, with an emphasis on the wide-field, imaging surveys, placing them in a broader scientific and historical context. Surveys are now the largest data generators in astronomy, propelled by the advances in information and computation technology, and have transformed the ways in which astronomy is done. This trend is bound to continue, especially with the new generation of synoptic sky surveys that cover wide areas of the sky repeatedly and open a new time domain of discovery. We describe the variety and the general properties of surveys, illustrated by a number of examples, the ways in which they may be quantified and compared, and offer some figures of merit that can be used to compare their scientific discovery potential. Surveys enable a very wide range of science, and that is perhaps their key unifying characteristic. As new domains of the observable parameter space open up thanks to the advances in technology, surveys are often the initial step in their exploration. Some science can be done with the survey data alone (or a combination of data from different surveys), and some requires a targeted follow-up of potentially interesting sources selected from surveys. Surveys can be used to generate large, statistical samples of objects that can be studied as populations or as tracers of larger structures to which they belong. They can also be used to discover or generate samples of rare or unusual objects and may lead to discoveries of some previously unknown types. We discuss a general framework of parameter spaces that can be used for an assessment and comparison of different surveys and the strategies for their scientific exploration. As we are moving into the Petascale regime and beyond, an effective processing and scientific exploitation of such large data sets and data streams pose

  10. Aerosol-type retrieval and uncertainty quantification from OMI data

    Science.gov (United States)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve for the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and the characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine the AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by the posterior probability distribution reflects the difficulty in model
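
    The Bayesian model averaging step can be sketched as follows, assuming each candidate model's AOD posterior is summarized by a mean and variance. The mixture mean/variance formulas are standard; the function and variable names are illustrative, not the authors' code:

```python
import numpy as np

def bma_estimate(evidences, aod_means, aod_vars):
    """Bayesian model averaging over candidate aerosol microphysical models.

    Weights are the normalised model evidences. The averaged AOD is the
    weighted mean; the averaged variance includes both the within-model
    variances and the between-model spread of the means.
    """
    w = np.asarray(evidences, dtype=float)
    w = w / w.sum()
    mu = np.asarray(aod_means, dtype=float)
    var = np.asarray(aod_vars, dtype=float)
    mean = float(np.sum(w * mu))
    variance = float(np.sum(w * (var + (mu - mean) ** 2)))
    return mean, variance
```

    Note that even when all models agree on their individual variances, disagreement among the means inflates the averaged uncertainty, which is the desired behaviour when no single LUT clearly fits best.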

  11. 3C-digital PCR for quantification of chromatin interactions.

    Science.gov (United States)

    Du, Meijun; Wang, Liang

    2016-12-06

    Chromosome conformation capture (3C) is a powerful and widely used technique for detecting the physical interactions between chromatin regions in vivo. The principle of 3C is to convert physical chromatin interactions into specific DNA ligation products, which are then detected by quantitative polymerase chain reaction (qPCR). However, 3C-qPCR assays are often complicated by the necessity of normalization controls to correct for amplification biases. In addition, qPCR is often limited to a certain cycle number, making it difficult to detect fragment ligations with low frequency. Recently, digital PCR (dPCR) technology has become available, which allows for highly sensitive nucleic acid quantification. The main advantage of dPCR is its high precision of absolute nucleic acid quantification without requiring normalization controls. To demonstrate the utility of dPCR in quantifying chromatin interactions, we examined two prostate cancer risk loci at 8q24 and 2p11.2 for their interaction target genes MYC and CAPG in the LNCaP cell line. We designed anchor and testing primers at known regulatory element fragments and target gene regions, respectively. dPCR results showed that the interaction frequency between the regulatory element and the MYC gene promoter was 0.7 (95% CI 0.40-1.10) copies per 1000 genome copies, while other regions showed relatively low ligation frequencies. The dPCR results also showed that the ligation frequencies between the regulatory element and two EcoRI fragments containing the CAPG gene promoter were 1.9 copies (95% CI 1.41-2.47) and 1.3 copies per 1000 genome copies (95% CI 0.76-1.92), respectively, while the interaction signals were reduced on either side of the promoter region of the CAPG gene. Additionally, we observed comparable results from 3C-dPCR and 3C-qPCR at 2p11.2 in another cell line (DU145). Compared to traditional 3C-qPCR, our results show that 3C-dPCR is much simpler and more sensitive in detecting weak chromatin interactions. It may eliminate
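
    The reported unit, ligation copies per 1000 genome copies, is a simple normalization of two absolute dPCR concentrations measured in the same units; a sketch with a hypothetical function name:

```python
def interaction_frequency(ligation_copies_per_ul, genome_copies_per_ul):
    """Express an absolute dPCR ligation-product concentration per
    1000 genome copies, so samples with different DNA input are comparable.
    Both inputs must be in the same concentration unit (e.g. copies/uL)."""
    if genome_copies_per_ul <= 0:
        raise ValueError("genome copy concentration must be positive")
    return 1000.0 * ligation_copies_per_ul / genome_copies_per_ul
```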

  12. Quantification of in-channel large wood recruitment through a 3-D probabilistic approach

    Science.gov (United States)

    Cislaghi, Alessio; Rigon, Emanuel; Aristide Lenzi, Mario; Battista Bischetti, Gian

    2017-04-01

    Large wood (LW) is a relevant factor in the physical, chemical, environmental and biological aspects of low-order mountain stream systems. LW recruitment, in turn, is driven by many physical processes, such as debris flows, shallow landslides, bank erosion, and snow- and windthrow, and it increases the potential hazard for downstream populations and infrastructure during intense flood events. In spite of that, the quantification of LW recruitment and the modelling of the related processes have received attention only in recent years, with particular reference to hillslope instabilities, which are the dominant source of LW recruitment in mountainous terrain at the regional scale. Models based on the infinite slope approach, commonly adopted for slope stability analysis, can be used for estimating the probable LW volume and for identifying the most hazardous areas of wood input, transport and deposition. Such models, however, generally require robust calibration on a landslide inventory and tend to overestimate unstable areas, and hence LW recruitment volumes. On this background, this work proposes a new LW estimation procedure that combines the forest stand characteristics of the entire catchment with a three-dimensional probabilistic slope stability model. The slope stability model overcomes the limits of the infinite slope approach and considers the spatial variability and uncertainty of the model input parameters through a Monte Carlo analysis. The forest stand characteristics allow root reinforcement to be included in the stability model as a stochastic input parameter, and provide the necessary information to evaluate the forest wood volume prone to be recruited as LW and its position on the hillslopes. The procedure was tested on a small mountainous headwater catchment in the Eastern Italian Alps, covered with pasture and coniferous forest and prone to shallow landslide and debris flow phenomena, especially during late spring and early autumn.
The results
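
    A toy version of the probabilistic-stability idea: sample uncertain soil parameters, evaluate a factor of safety, and report the fraction of failing realizations. This sketch uses the classical dry infinite-slope formula, i.e. the simpler model the abstract says its 3-D approach improves on; all parameter values and ranges are assumptions for illustration only:

```python
import math
import random

def fs_infinite_slope(c, phi_deg, gamma, z, beta_deg):
    """Factor of safety for a dry infinite slope (illustrative form).

    c: cohesion (kPa), phi_deg: friction angle (deg),
    gamma: unit weight (kN/m^3), z: failure depth (m), beta_deg: slope (deg).
    """
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    tau = gamma * z * math.sin(beta) * math.cos(beta)   # driving shear stress
    sigma = gamma * z * math.cos(beta) ** 2             # normal stress
    return (c + sigma * math.tan(phi)) / tau

def failure_probability(n, beta_deg, gamma=18.0, z=1.5, seed=0):
    """Monte Carlo probability of FS < 1 with uncertain c and phi."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        c = rng.uniform(0.0, 5.0)       # kPa, assumed uncertainty range
        phi = rng.uniform(25.0, 40.0)   # degrees, assumed uncertainty range
        if fs_infinite_slope(c, phi, gamma, z, beta_deg) < 1.0:
            failures += 1
    return failures / n
```

    In the paper's procedure the deterministic parameters would instead be spatially distributed, root reinforcement would enter as an additional stochastic cohesion term, and the slope stability model is three-dimensional rather than the infinite-slope form used here.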

  13. Uncertainty Quantification of Equilibrium Climate Sensitivity

    Science.gov (United States)

    Lucas, D. D.; Brandon, S. T.; Covey, C. C.; Domyancic, D. M.; Johannesson, G.; Klein, R.; Tannahill, J.; Zhang, Y.

    2011-12-01

    Significant uncertainties exist in the temperature response of the climate system to changes in the levels of atmospheric carbon dioxide. We report progress to quantify the uncertainties of equilibrium climate sensitivity using perturbed parameter ensembles of the Community Earth System Model (CESM). Through a strategic initiative at the Lawrence Livermore National Laboratory, we have been developing uncertainty quantification (UQ) methods and incorporating them into a software framework called the UQ Pipeline. We have applied this framework to generate a large number of ensemble simulations using Latin Hypercube and other schemes to sample up to three dozen uncertain parameters in the atmospheric (CAM) and sea ice (CICE) model components of CESM. The parameters sampled are related to many highly uncertain processes, including deep and shallow convection, boundary layer turbulence, cloud optical and microphysical properties, and sea ice albedo. An extensive ensemble database comprising more than 46,000 simulated climate-model-years of recent climate conditions has been assembled. This database is being used to train surrogate models of CESM responses and to perform statistical calibrations of the CAM and CICE models given observational data constraints. The calibrated models serve as a basis for propagating uncertainties forward through climate change simulations using a slab ocean model configuration of CESM. This procedure is being used to quantify the probability density function of equilibrium climate sensitivity accounting for uncertainties in climate model processes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013. (LLNL-ABS-491765)
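
    Latin hypercube sampling, mentioned above, stratifies each parameter's range into n equal-probability bins and draws exactly one sample per bin in every dimension; a minimal sketch on the unit cube (generic, not the UQ Pipeline implementation):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample on the unit cube [0, 1)^n_dims.

    For each dimension, the n strata of width 1/n are shuffled and one
    uniform draw is placed inside each, guaranteeing full marginal
    coverage with far fewer samples than a dense grid.
    """
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i in range(n_samples):
            samples[i][d] = (strata[i] + rng.random()) / n_samples
    return samples
```

    Samples on the unit cube are then mapped through each physical parameter's assumed range or distribution before being handed to the climate model.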

  14. Game engines: a survey

    OpenAIRE

    A. Andrade

    2015-01-01

    Due to hardware limitations at the origin of the video game industry, each new game was generally coded from the ground up. Years later, the evolution of hardware and the need for quick game development cycles spawned the concept of the game engine. A game engine is a reusable software layer allowing the separation of common game concepts from the game assets (levels, graphics, etc.). This paper surveys fourteen different game engines relevant today, ranging from the industry-level to the n...

  15. Technical Note: Clinical translation of the Rapid-Steady-State-T1 MRI method for direct cerebral blood volume quantification.

    Science.gov (United States)

    Perles-Barbacaru, Teodora-Adriana; Tropres, Irene; Sarraf, Michel G; Chechin, David; Zaccaria, Affif; Grand, Sylvie; Le Bas, Jean-François; Berger, François; Lahrech, Hana

    2015-11-01

    In preclinical studies, the Rapid-Steady-State-T1 (RSST1) MRI method has advantages over conventional MRI methods for blood volume fraction (BVf) mapping, since after contrast agent administration, the BVf is directly quantifiable from the signal amplitude corresponding to the vascular equilibrium magnetization. This study focuses on its clinical implementation and feasibility. Following sequence implementation on clinical Philips Achieva scanners, the RSST1-method is assessed at 1.5 and 3 T in the follow-up examination of neurooncological patients receiving 0.1-0.2 mmol/kg Gd-DOTA to determine the threshold dose needed for cerebral BVf quantification. Confounding effects on BVf quantification such as transendothelial water exchange, transverse relaxation, and contrast agent extravasation are evaluated. For a dose ≥ 0.13 mmol/kg at 1.5 T and ≥ 0.16 mmol/kg at 3 T, the RSST1-signal time course in macrovessels and brain tissue with Gd-DOTA impermeable vasculature reaches a steady state at maximum amplitude for about 8 s. In macrovessels, a BVf of 100% was obtained, validating cerebral microvascular BVf quantification (3.5%-4.5% in gray matter and 1.5%-2.0% in white matter). In tumor tissue, a continuously increasing signal is detected, necessitating signal modeling for tumor BVf calculation. Using approved doses of Gd-DOTA, the steady state RSST1-signal in brain tissue is reached during the first pass and corresponds to the BVf. The first-pass duration is sufficient to allow accurate BVf quantification. The RSST1-method is appropriate for serial clinical studies since it allows fast and straightforward BVf quantification without arterial input function determination. This quantitative MRI method is particularly useful to assess the efficacy of antiangiogenic agents.

  16. On the Methods for Calculating Annual Allowable Cut

    Directory of Open Access Journals (Sweden)

    V. А. Sokolov

    2014-10-01

    Full Text Available The crisis in supplying regions and the country with available forest resources, and the low profitability of the forest sector as a whole, indicate the failure of the existing model of forest management and forest use organization in Russia at the present time. Many Russian regions, traditionally considered forest industrial territories, face a shortage of economically accessible forests. The forests are decreasing against a background of underexploitation of the annual allowable cut. This situation occurs in Siberia as well. In many cases, using the calculated allowable cut will result in unsustainable harvest levels and a future decrease of accessible forest resources. Thus, the statement that «a volume of wood resource utilization is determined by allowable cut represented the scientifically grounded norm of sustainable forest use» is no more than a declarative proposition. Modeling the normal forest, and using a formula of allowable cut calculation estimated for some decades based on the modeling, is unreliable and unrealistic. A long-term forecast should use analog methods, but it will hardly be sufficiently accurate and adequate to set norms. In order to estimate the ecological and economic accessibility of forest resources, an algorithm, a method, and a model were developed. The model is based on a GIS database and makes it possible to estimate the accessibility of forest resources and to map it. Following the procedures for calculating the annual allowable cut, the conclusion was drawn that it should be determined in two varieties. The first is the silvicultural cut (according to the currently used methods); the other is the economically accessible allowable cut, which could provide economically effective use of tradable mature wood, taking into account the ecological and economic accessibility of forest resources.

  17. Methodological considerations in quantification of oncological FDG PET studies.

    Science.gov (United States)

    Vriens, Dennis; Visser, Eric P; de Geus-Oei, Lioe-Fee; Oyen, Wim J G

    2010-07-01

    This review aims to provide insight into the factors that influence quantification of glucose metabolism by FDG PET images in oncology as well as their influence on repeated measures studies (i.e. treatment response assessment), offering improved understanding both for clinical practice and research. Structural PubMed searches have been performed for the many factors affecting quantification of glucose metabolism by FDG PET. Review articles and references lists have been used to supplement the search findings. Biological factors such as fasting blood glucose level, FDG uptake period, FDG distribution and clearance, patient motion (breathing) and patient discomfort (stress) all influence quantification. Acquisition parameters should be adjusted to maximize the signal to noise ratio without exposing the patient to a higher than strictly necessary radiation dose. This is especially challenging in pharmacokinetic analysis, where the temporal resolution is of significant importance. The literature is reviewed on the influence of attenuation correction on parameters for glucose metabolism, the effect of motion, metal artefacts and contrast agents on quantification of CT attenuation-corrected images. Reconstruction settings (analytical versus iterative reconstruction, post-reconstruction filtering and image matrix size) all potentially influence quantification due to artefacts, noise levels and lesion size dependency. Many region of interest definitions are available, but increased complexity does not necessarily result in improved performance. Different methods for the quantification of the tissue of interest can introduce systematic and random inaccuracy. This review provides an up-to-date overview of the many factors that influence quantification of glucose metabolism by FDG PET.
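
    A common quantitative endpoint in FDG PET, underlying several of the factors reviewed above, is the body-weight-normalised standardised uptake value (SUV). A minimal sketch of its standard formula, plus decay correction to injection time for 18F (half-life about 109.77 min); function names are illustrative:

```python
def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalised standardised uptake value (SUV).

    SUV = tissue activity concentration / (injected dose / body weight),
    assuming a tissue density of 1 g/mL so the units cancel.
    """
    dose_kbq = injected_dose_mbq * 1000.0
    weight_g = body_weight_kg * 1000.0
    return tissue_kbq_per_ml / (dose_kbq / weight_g)

def decay_correct(activity, elapsed_min, half_life_min=109.77):
    """Decay-correct a measured activity back to injection time (18F default)."""
    return activity * 2.0 ** (elapsed_min / half_life_min)
```

    As the review stresses, factors such as blood glucose, uptake period, and reconstruction settings all perturb the inputs to this simple ratio, which is why SUVs from different protocols are not directly comparable.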

  18. Automated Processing of Zebrafish Imaging Data: A Survey

    Science.gov (United States)

    Dickmeis, Thomas; Driever, Wolfgang; Geurts, Pierre; Hamprecht, Fred A.; Kausler, Bernhard X.; Ledesma-Carbayo, María J.; Marée, Raphaël; Mikula, Karol; Pantazis, Periklis; Ronneberger, Olaf; Santos, Andres; Stotzka, Rainer; Strähle, Uwe; Peyriéras, Nadine

    2013-01-01

    Abstract Due to the relative transparency of its embryos and larvae, the zebrafish is an ideal model organism for bioimaging approaches in vertebrates. Novel microscope technologies allow the imaging of developmental processes in unprecedented detail, and they enable the use of complex image-based read-outs for high-throughput/high-content screening. Such applications can easily generate Terabytes of image data, the handling and analysis of which becomes a major bottleneck in extracting the targeted information. Here, we describe the current state of the art in computational image analysis in the zebrafish system. We discuss the challenges encountered when handling high-content image data, especially with regard to data quality, annotation, and storage. We survey methods for preprocessing image data for further analysis, and describe selected examples of automated image analysis, including the tracking of cells during embryogenesis, heartbeat detection, identification of dead embryos, recognition of tissues and anatomical landmarks, and quantification of behavioral patterns of adult fish. We review recent examples for applications using such methods, such as the comprehensive analysis of cell lineages during early development, the generation of a three-dimensional brain atlas of zebrafish larvae, and high-throughput drug screens based on movement patterns. Finally, we identify future challenges for the zebrafish image analysis community, notably those concerning the compatibility of algorithms and data formats for the assembly of modular analysis pipelines. PMID:23758125

  19. Development of a Taqman real-time PCR assay for rapid detection and quantification of Vibrio tapetis in extrapallial fluids of clams

    Directory of Open Access Journals (Sweden)

    Adeline Bidault

    2015-12-01

    Full Text Available The Gram-negative bacterium Vibrio tapetis is known as the causative agent of Brown Ring Disease (BRD) in the Manila clam Venerupis (=Ruditapes) philippinarum. This bivalve is the second most important species produced in aquaculture and has a high commercial value. In spite of the development of several molecular methods, no assay has yet been established to rapidly quantify the bacterium in the clam. In this study, we developed a Taqman real-time PCR assay targeting the virB4 gene for accurate and quantitative identification of V. tapetis strains pathogenic to clams. The sensitivity and reproducibility of the method were assessed using either filtered seawater or extrapallial fluids of clams injected with the CECT4600T V. tapetis strain. Quantification curves of the V. tapetis strain seeded in filtered seawater (FSW) or extrapallial fluid (EF) samples were equivalent, showing reliable qPCR efficacies. With this protocol, we were able to specifically detect V. tapetis strains down to 1.125 × 10¹ bacteria per mL of EF or FSW, taking into account the dilution factor used for appropriate template DNA preparation. This qPCR assay allowed us to monitor V. tapetis load in both experimentally and naturally infected Manila clams. This technique will be particularly useful for monitoring the kinetics of massive infections by V. tapetis and for designing appropriate control measures for aquaculture purposes.
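
    Absolute quantification from a real-time PCR standard curve follows a standard recipe: fit Ct against log10 copy number over a dilution series, invert the fit for unknown samples, and derive the amplification efficiency from the slope. A generic sketch with illustrative names (the study's actual calibration values are not reproduced here):

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares line Ct = slope * log10(copies) + intercept."""
    n = len(ct_values)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate copy number from a Ct value."""
    return 10.0 ** ((ct - intercept) / slope)

def pcr_efficiency(slope):
    """Amplification efficiency from the slope (1.0 = perfect doubling)."""
    return 10.0 ** (-1.0 / slope) - 1.0
```

    A slope near -3.32 corresponds to 100% efficiency; the reported detection limit would then be the lowest dilution on this curve still amplifying reproducibly.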

  20. Myoblots: dystrophin quantification by in-cell western assay for a streamlined development of Duchenne muscular dystrophy (DMD) treatments.

    Science.gov (United States)

    Ruiz-Del-Yerro, E; Garcia-Jimenez, I; Mamchaoui, K; Arechavala-Gomeza, V

    2017-10-31

    New therapies for neuromuscular disorders are often mutation specific and need to be studied in patients' cell cultures. In Duchenne muscular dystrophy (DMD), dystrophin restoration drugs are being developed, but as muscle cell cultures from DMD patients are scarce and do not grow or differentiate well, only a limited number of candidate drugs are tested. Moreover, dystrophin quantification by western blotting requires a large number of cultured cells, so fewer compounds are as thoroughly screened as is desirable. We aimed to develop a quantitative assessment tool that uses fewer cells to contribute to the study of dystrophin and to identify better drug candidates. An 'in-cell western' assay is a quantitative immunofluorescence assay performed in cell culture microplates that allows protein quantification directly in culture, allowing a higher number of experimental repeats and greater throughput. We have optimized the assay ('myoblot') to be applied to the study of differentiated myoblast cultures. After an exhaustive optimization of the technique to adapt it to the growth and differentiation rates of our cultures and the low intrinsic expression of our proteins of interest, our myoblot protocol allows the quantification of dystrophin and other muscle-associated proteins in muscle cell cultures. We are able to distinguish accurately between the different sets of patients based on their dystrophin expression and to detect dystrophin restoration after treatment. We expect that this new tool to quantify muscle proteins in DMD and other muscle disorders will aid in their diagnosis and in the development of new therapies. © 2017 British Neuropathological Society.

  1. Survey Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Cleaned and QCd data for the Fishing Effort Survey. Questions are asked on fishing and other outdoor activity, including fishing trips. Used for...

  2. Surveying Humanness

    DEFF Research Database (Denmark)

    Markussen, Randi; Gad, Christopher

    Christopher Gad, Ph.D., Dept. of Information and Media Studies; Randi Markussen, Associate Professor, Dept. of Information and Media Studies. rmark@imv.au.dk   Abstract: Surveying humanness - politics of care improvement. For various reasons we both were subjected to a specific survey procedure carried out in a Danish county in order to improve the treatment of people who have suffered from long-term illnesses. The surveys concern not only feedback on how people experience their present and past interaction with the social services and health care system; they also ask people to indicate the state and development of a large collection of biological and psychological symptoms and psycho-social problems. However, the surveys say nothing about how the information will be of use to the people who answer the procedure or how this scientific intervention will be put to use more specifically within the public...

  4. Allowing for crystalline structure effects in Geant4

    Science.gov (United States)

    Bagli, Enrico; Asai, Makoto; Dotti, Andrea; Pandola, Luciano; Verderi, Marc

    2017-07-01

    In recent years, the Geant4 toolkit for the Monte Carlo simulation of the interaction of radiation with matter has seen large growth in its diverse user community. A fundamental aspect of a successful physics experiment is the availability of a reliable and precise simulation code. Geant4 currently does not allow for the simulation of particle interactions with anything other than amorphous matter. To overcome this limitation, the GECO (GEant4 Crystal Objects) project developed a general framework for managing solid-state structures in the Geant4 kernel and validated it against experimental data. Accounting for detailed geometrical structures allows, for example, the simulation of diffraction from crystal planes or the channeling of charged particles.

  5. The First Micro-simulation Result on Distributional Effects of Introducing Child Allowance in Japan(in Japanese)

    OpenAIRE

    Takayama, Noriyuki; Shiraishi, Kousuke

    2010-01-01

    This is an empirical study of a policy package on child allowance. Using micro data from the 2007 Comprehensive Survey of Living Conditions of the People on Health and Welfare (Ministry of Health, Labour and Welfare), this paper analyzes the impact of the new child allowance in Japan. The estimations are made independently for the amount of income tax, payroll deduction, social security contribution deduction, and so on, which reflect the content of the 2009 system. Through the policy p...

  6. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    Science.gov (United States)

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2017-05-04

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review via Medline, PubMed, CINAHL, and the Cochrane database was performed using key terms for pelvic floor anatomy and function cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English having female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies for determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry; techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by the inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  7. Exploiting multicompartment effects in triple-echo steady-state T2 mapping for fat fraction quantification.

    Science.gov (United States)

    Liu, Dian; Steingoetter, Andreas; Curcic, Jelena; Kozerke, Sebastian

    2018-01-01

To investigate and exploit the effect of intravoxel off-resonance compartments in the triple-echo steady-state (TESS) sequence without fat suppression for T2 mapping and to leverage the results for fat fraction quantification. In multicompartment tissue, where at least one compartment is excited off-resonance, the total signal exhibits periodic modulations as a function of echo time (TE). Simulated multicompartment TESS signals were synthesized at various TEs. Fat emulsion phantoms were prepared and scanned at the same TE combinations using TESS. In vivo knee data were obtained with TESS to validate the simulations. The multicompartment effect was exploited for fat fraction quantification in the stomach by acquiring TESS signals at two TE combinations. Simulated and measured multicompartment signal intensities were in good agreement. Multicompartment effects caused erroneous T2 offsets, even at low water-fat ratios. The choice of TE caused T2 variations of as much as 28% in cartilage. The feasibility of fat fraction quantification to monitor the decrease of fat content in the stomach during digestion is demonstrated. Intravoxel off-resonance compartments are a confounding factor for T2 quantification using TESS, causing errors that are dependent on the TE. At the same time, off-resonance effects may allow for efficient fat fraction mapping using steady-state imaging. Magn Reson Med 79:423-429, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
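The periodic TE-dependence described above can be illustrated with a minimal two-compartment signal model. This is a sketch under stated assumptions, not the paper's acquisition: the water/fat amplitudes and the ~440 Hz water-fat shift below are illustrative values, and the function name is hypothetical.

```python
import cmath

def two_compartment_signal(water: float, fat: float, delta_f_hz: float, te_s: float) -> float:
    """Magnitude of a voxel signal with one on-resonance (water) and one
    off-resonance (fat) compartment: |W + F * exp(i * 2*pi * delta_f * TE)|.
    The magnitude oscillates periodically with TE, which is the modulation
    that confounds T2 fitting but enables fat fraction mapping."""
    return abs(water + fat * cmath.exp(2j * cmath.pi * delta_f_hz * te_s))

# Illustrative 20% fat fraction with an assumed ~440 Hz water-fat shift.
w, f, df = 0.8, 0.2, 440.0
in_phase = two_compartment_signal(w, f, df, 1.0 / df)        # compartments aligned: ~W + F
opposed  = two_compartment_signal(w, f, df, 1.0 / (2 * df))  # compartments opposed: ~W - F
```

From two such measurements, W + F and W - F can be solved for, giving the fat fraction F/(W + F); this is the intuition behind acquiring TESS signals at two TE combinations.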

  8. Housing Demand and Department of Defense Policy on Housing Allowances

    Science.gov (United States)

    1990-09-01

    34 in E. S. Mills (ed.), Handbook of Regional and Urban Economics , Vol. II, Elsevier Science Publishers, Amsterdam, 1987. Blackley, P., and J. Ondrich, "A...Chinloy, P. T., "An Empirical Model of the Market for Resale Homes," Journal of Urban Economics 7 (May 1980): 279-292. David, M. H., Family Composition...and E. Jimenez, "Estimating the Demand for Housing Characteristics: A Survey and Critique," Regional Science and Urban Economics 15 (1985): 77-107

  9. Assessing the Implications of Allowing Transgender Personnel to Serve Openly

    Science.gov (United States)

    2016-01-01

incidents of harassment must be dealt with according to the Canadian military’s discrimination and harassment policy. Finally, if the transgender ...transgender personnel were allowed to serve openly following a national policy revision that ended discrimination based on sexual orientation or gender ...memorandum, October 7, 2013. Office of Personnel Management, Addressing Sexual Orientation and Gender Identity Discrimination in Federal Civilian

  10. 7 CFR 3560.202 - Establishing rents and utility allowances.

    Science.gov (United States)

    2010-01-01

    ... housing tax credit (LIHTC) rents. (d) Utility allowances. In projects where tenants pay the utilities... contributions or rehabilitation loans will not be counted towards reducing rents. (f) Rents for resident manager... rental unit in a housing project when they are acting as a management agent or resident manager as...

  11. 20 CFR 617.21 - Reemployment services and allowances.

    Science.gov (United States)

    2010-04-01

    ... and techniques for finding a job. Such programs vary in design and operation and call for a carefully... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Reemployment services and allowances. 617.21... ADJUSTMENT ASSISTANCE FOR WORKERS UNDER THE TRADE ACT OF 1974 Reemployment Services § 617.21 Reemployment...

  12. Is It Safe to Allow Cell Phones in School?

    Science.gov (United States)

    Trump, Kenneth S.

    2009-01-01

    Cell phones were banned from most schools years ago, but after the Columbine High School and 9/11 tragedies, parents started pressuring some school boards and administrators to reverse the bans. On its surface, allowing students to have cell phones under the guise of improved school safety may seem like a "no-brainer" to many board members and…

  13. Mitigation of Global Warming with Focus on Personal Carbon Allowances

    DEFF Research Database (Denmark)

    Meyer, Niels I

    2008-01-01

    The mitigation of global warming requires new efficient systems and methods. The paper presents a new proposal called personal carbon allowances with caps on the CO2 emission from household heating and electricity and on emission from transport in private cars and in personal air flights. Results...

  14. 20 CFR 645.300 - What constitutes an allowable match?

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What constitutes an allowable match? 645.300 Section 645.300 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR PROVISIONS GOVERNING WELFARE-TO-WORK GRANTS Additional Formula Grant Administrative Standards and Procedures § 645.300...

  15. Allowable irreducible representations of the point groups with five ...

    Indian Academy of Sciences (India)

Allowable irreducible representations of the point groups with five-fold rotations – which represent the symmetry of the quasicrystals in two and three dimensions – are derived by employing the little group technique in conjunction with the solvability property. The point groups D5h (10̄m2) and Ih (2/m3̄5̄) are taken ...

  16. Conditions allowing the formation of biogenic amines in cheese

    NARCIS (Netherlands)

    Joosten, H.M.L.J.

    1988-01-01

    A study was undertaken to reveal the conditions that allow the formation of biogenic amines in cheese.

    The starters most commonly used in the Dutch cheese industry do not have decarboxylative properties. Only if the milk or curd is contaminated with non-starter bacteria, amine

  17. 40 CFR 86.1834-01 - Allowable maintenance.

    Science.gov (United States)

    2010-07-01

    ... Compliance Provisions for Control of Air Pollution From New and In-Use Light-Duty Vehicles, Light-Duty Trucks... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Allowable maintenance. 86.1834-01 Section 86.1834-01 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS...

  18. Pedagogy of social transformation in the Hebrew Bible: Allowing ...

    African Journals Online (AJOL)

    Pedagogy of social transformation in the Hebrew Bible: Allowing Scripture to inform our interpretive strategy for contemporary application. ... HTS Teologiese Studies / Theological Studies ... Lastly, the article considers how the narrative, Law, prophets and wisdom texts in the Hebrew Bible train in social critique. This article ...

  19. Spacecraft Maximum Allowable Concentrations for Selected Airborne Contaminants, Volume 5

    National Research Council Canada - National Science Library

    Committee on Spacecraft Exposure Guidelines; Board on Environmental Studies and Toxicology; Committee on Toxicology

    2008-01-01

    ... requested the National Research Council (NRC) to develop guidelines for establishing spacecraft maximum allowable concentrations (SMACs) for contaminants and to review SMACs for various spacecraft contaminants to determine whether NASA's recommended exposure limits are consistent with the guidelines recommended by the committee. In response to this...

  20. Spacecraft maximum allowable concentrations for selected airborne contaminants, volume 1

    Science.gov (United States)

    1994-01-01

As part of its efforts to promote safe conditions aboard spacecraft, NASA requested the National Research Council (NRC) to develop guidelines for establishing spacecraft maximum allowable concentrations (SMACs) for contaminants, and to review SMACs for various spacecraft contaminants to determine whether NASA's recommended exposure limits are consistent with the guidelines recommended by the subcommittee. In response to NASA's request, the NRC organized the Subcommittee on Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants within the Committee on Toxicology (COT). In the first phase of its work, the subcommittee developed the criteria and methods for preparing SMACs for spacecraft contaminants. The subcommittee's report, entitled Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants, was published in 1992. The executive summary of that report is reprinted as Appendix A of this volume. In the second phase of the study, the Subcommittee on Spacecraft Maximum Allowable Concentrations reviewed reports prepared by NASA scientists and contractors recommending SMACs for 35 spacecraft contaminants. The subcommittee sought to determine whether the SMAC reports were consistent with the 1992 guidelines. Appendix B of this volume contains the first 11 SMAC reports that have been reviewed for their application of the guidelines developed in the first phase of this activity and approved by the subcommittee.

  1. 20 CFR 631.41 - Allowable State activities.

    Science.gov (United States)

    2010-04-01

    ... 631.41 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR PROGRAMS UNDER TITLE III OF THE JOB TRAINING PARTNERSHIP ACT State Programs § 631.41 Allowable State activities. (a...) Activities shall be coordinated with other programs serving dislocated workers, including training under...

  2. 20 CFR 632.173 - Allowable program activities.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Allowable program activities. 632.173 Section 632.173 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR INDIAN AND... appropriate Federal or State agency including BIA, of a NAG's total section 401 allocation. For nonreservation...

  3. Pedagogy of social transformation in the Hebrew Bible: Allowing ...

    African Journals Online (AJOL)

    2016-05-12

May 12, 2016 ... This article helps Christians to develop a biblically based hermeneutic of the Hebrew Scripture's social transformation for application today.

  4. 34 CFR 645.40 - What are allowable costs?

    Science.gov (United States)

    2010-07-01

    ... for whom English language proficiency is necessary to succeed in postsecondary education. (l... 34 Education 3 2010-07-01 2010-07-01 false What are allowable costs? 645.40 Section 645.40 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY...

  5. 34 CFR 371.41 - What are allowable costs?

    Science.gov (United States)

    2010-07-01

    ... rehabilitation services. (2) Expenditures for services reflecting the cultural background of the American Indians... 34 Education 2 2010-07-01 2010-07-01 false What are allowable costs? 371.41 Section 371.41 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF SPECIAL EDUCATION...

  6. Education Maintenance Allowances: The Impact on Further Education. FEDA Reports.

    Science.gov (United States)

    Fletcher, Mick

    In September 1999, a pilot program of Education Maintenance Allowances (EMAs) was introduced in 15 local education authorities (LEAs) in England to provide payments to students aged 16-19 who are from low-income families and who are attending full-time courses in schools and colleges. Participants are entitled to 2 years' support and must be…

  7. 48 CFR 752.7028 - Differential and allowances.

    Science.gov (United States)

    2010-10-01

    ... from a school in the United States for secondary education (in lieu of an educational allowance) and... conditions of environment in the continental United States and warrant additional compensation as a... on the day of departure from post of assignment en route to the United States. Sick or vacation leave...

  8. 44 CFR 11.74 - Claims not allowed.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Claims not allowed. 11.74 Section 11.74 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF... arose during the conduct of personal business are not payable. (2) Subrogation claims. Claims based upon...

  9. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    and the final analysis. Other partitions did not suggest early stopping after adjustment for multiple testing due to one influential outlier and our small sample size. CONCLUSIONS: Group-sequential testing may enable early stopping of a trial, allowing for potential time and resource savings. The testing...

  10. Minimum Parental Allowance Payments Received by Finnish Mothers

    Directory of Open Access Journals (Sweden)

    Pentti Takala

    2005-01-01

Full Text Available In Finland, the number and the proportion of women receiving minimum maternity or parental allowance increased dramatically in the 1990s. Their share increased to a high point of 30 percent in 1996 and remained at over 25 percent to the end of the decade. The aim of this study was to describe some of the characteristics typical of these women, and to analyse how often and in what circumstances they had to rely on last-resort income support (social assistance and housing allowance. The material comprises data on the total working-aged population, retrieved from the income security registers maintained by the Social Insurance Institution and from the social assistance register maintained by Stakes (National Research and Development Centre for Welfare and Health. Both cross-sectional and longitudinal data were analysed by means of cross tabulations and means as well as logistic regression. We could differentiate two groups of mothers receiving minimum benefit: young mothers and middle-aged mothers with many children. Twenty percent of women on the minimum allowance also received social assistance and 38 percent received housing allowance. Reliance on social assistance was particularly common among mothers who had delivered their first baby and among young mothers who had delivered more than one baby

  11. Policy Lessons from Children's Allowances for Children's Savings Accounts.

    Science.gov (United States)

    Curley, Jami; Sherraden, Michael

    2000-01-01

    Examines the history and current structure of "children's allowances" (cash grants from the government to families with children) around the world and particularly in the United States, to provide a framework for children's savings accounts--long-term savings and asset accumulation for all children. Considers public policy directions for…

  12. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Science.gov (United States)

    De Langhe, Ellen; Vande Velde, Greetje; Hostens, Jeroen; Himmelreich, Uwe; Nemery, Benoit; Luyten, Frank P; Vanoirbeek, Jeroen; Lories, Rik J

    2012-01-01

In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice; an indicator of pulmonary fibrosis and emphysema severity. Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase n = 9, 0.5 U elastase n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly, until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes, compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. We report on a high resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This method is reproducible with low inherent measurement variability.
We show that
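The core of such an aerated-volume measurement can be sketched as a threshold-and-count over segmented lung voxels. This is a minimal sketch under assumptions, not the study's algorithm: the -200 HU cutoff, the voxel size, and the function name are all illustrative choices.

```python
import numpy as np

def aerated_lung_volume_mm3(ct_hu: np.ndarray, lung_mask: np.ndarray,
                            voxel_volume_mm3: float,
                            air_threshold_hu: float = -200.0) -> float:
    """Total volume of lung voxels darker than an air-like attenuation
    threshold. Fibrosis (denser tissue) lowers this volume; emphysema
    (tissue destruction) raises it."""
    aerated = (ct_hu < air_threshold_hu) & lung_mask.astype(bool)
    return float(aerated.sum()) * voxel_volume_mm3

# Toy 2x2x1 volume: two aerated voxels (-500 and -300 HU) inside the mask.
ct = np.array([[[-500.0], [-300.0]], [[-100.0], [40.0]]])
mask = np.ones_like(ct, dtype=bool)
volume = aerated_lung_volume_mm3(ct, mask, voxel_volume_mm3=0.02)  # 2 voxels * 0.02 mm^3
```

Repeating this measurement on weekly scans of the same animal yields the longitudinal disease-progression curves the study describes.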

  13. Quantification of lung fibrosis and emphysema in mice using automated micro-computed tomography.

    Directory of Open Access Journals (Sweden)

    Ellen De Langhe

Full Text Available BACKGROUND: In vivo high-resolution micro-computed tomography allows for longitudinal image-based measurements in animal models of lung disease. The combination of repetitive high resolution imaging with fully automated quantitative image analysis in mouse models of lung fibrosis benefits preclinical research. This study aimed to develop and validate such an automated micro-computed tomography analysis algorithm for quantification of aerated lung volume in mice; an indicator of pulmonary fibrosis and emphysema severity. METHODOLOGY: Mice received an intratracheal instillation of bleomycin (n = 8), elastase (0.25 U elastase n = 9, 0.5 U elastase n = 8) or saline control (n = 6 for fibrosis, n = 5 for emphysema). A subset of mice was scanned without intervention, to evaluate potential radiation-induced toxicity (n = 4). Some bleomycin-instilled mice were treated with imatinib for proof of concept (n = 8). Mice were scanned weekly, until four weeks after induction, when they underwent pulmonary function testing, lung histology and collagen quantification. Aerated lung volumes were calculated with our automated algorithm. PRINCIPAL FINDINGS: Our automated image-based aerated lung volume quantification method is reproducible with low intra-subject variability. Bleomycin-treated mice had significantly lower scan-derived aerated lung volumes, compared to controls. Aerated lung volume correlated with the histopathological fibrosis score and total lung collagen content. Inversely, a dose-dependent increase in lung volume was observed in elastase-treated mice. Serial scanning of individual mice is feasible and visualized dynamic disease progression. No radiation-induced toxicity was observed. Three-dimensional images provided critical topographical information. CONCLUSIONS: We report on a high resolution in vivo micro-computed tomography image analysis algorithm that runs fully automated and allows quantification of aerated lung volume in mice. This

  14. Spacecraft Maximum Allowable Concentrations for Selected Airborne Contaminants. Volume 2

    Science.gov (United States)

    1996-01-01

    The National Aeronautics and Space Administration (NASA) is aware of the potential toxicological hazards to humans that might be associated with prolonged spacecraft missions. Despite major engineering advances in controlling the atmosphere within spacecraft, some contamination of the air appears inevitable. NASA has measured numerous airborne contaminants during space missions. As the missions increase in duration and complexity, ensuring the health and well-being of astronauts traveling and working in this unique environment becomes increasingly difficult. As part of its efforts to promote safe conditions aboard spacecraft, NASA requested the National Research Council (NRC) to develop guidelines for establishing spacecraft maximum allowable concentrations (SMACs) for contaminants, and to review SMACs for various space-craft contaminants to determine whether NASA's recommended exposure limits are consistent with the guidelines recommended by the subcommittee. In response to NASA's request, the NRC organized the Subcommittee on Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants within the Committee On Toxicology (COT). In the first phase of its work, the subcommittee developed the criteria and methods for preparing SMACs for spacecraft contaminants. The subcommittee's report, entitled Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants, was published in 1992. The executive summary of that report is reprinted as Appendix A of this volume. In the second phase of the study, the Subcommittee on Spacecraft Maximum Allowable Concentrations reviewed reports prepared by NASA scientists and contractors recommending SMACs for approximately 35 spacecraft contaminants. The subcommittee sought to determine whether the SMAC reports were consistent with the 1992 guidelines. 
Appendix B of this volume contains the SMAC reports for 12 chemical contaminants that have been reviewed for

  15. Validation of a weather forecast model at radiance level against satellite observations allowing quantification of temperature, humidity, and cloud-related biases

    Science.gov (United States)

    Bani Shahabadi, Maziar; Huang, Yi; Garand, Louis; Heilliette, Sylvain; Yang, Ping

    2016-09-01

An established radiative transfer model (RTM) is adapted for simulating all-sky infrared radiance spectra from the Canadian Global Environmental Multiscale (GEM) model in order to validate its forecasts at the radiance level against Atmospheric InfraRed Sounder (AIRS) observations. Synthetic spectra are generated for 2 months from short-term (3-9 h) GEM forecasts. The RTM uses a monthly climatological land surface emissivity/reflectivity atlas. An updated ice particle optical property library was introduced for cloudy radiance calculations. Forward model brightness temperature (BT) biases are assessed to be of the order of ~1 K for both clear-sky and overcast conditions. To quantify biases in GEM forecast meteorological variables, spectral sensitivity kernels are generated and used to attribute radiance biases to surface and atmospheric temperatures, atmospheric humidity, and cloud biases. The kernel method, supplemented with retrieved profiles based on AIRS observations in collocation with a microwave sounder, achieves good closure in explaining clear-sky radiance biases, which are attributed mostly to surface temperature and upper tropospheric water vapor biases. Cloudy-sky radiance biases are dominated by cloud-induced radiance biases. Prominent GEM biases are identified as: (1) too low surface temperature over land, causing about -5 K bias in the atmospheric window region; (2) too high upper tropospheric water vapor, inducing about -3 K bias in the water vapor absorption band; (3) too few high clouds in the convective regions, generating about +10 K bias in the window band and about +6 K bias in the water vapor band.
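The kernel attribution step described above amounts to a first-order linear decomposition of the brightness-temperature bias. The sketch below assumes hypothetical variable names, layer counts, and toy kernel/bias values; it only illustrates the dot-product structure of the method, not the study's actual kernels.

```python
import numpy as np

def attribute_bt_bias(kernels: dict, state_biases: dict) -> dict:
    """First-order attribution of a brightness-temperature (BT) bias:
    each variable's contribution is the dot product of its sensitivity
    kernel (K per unit, per layer) with its profile bias (units per layer)."""
    return {name: float(np.dot(kernels[name], state_biases[name]))
            for name in kernels}

# Toy example: a cold-biased surface and a moist-biased upper troposphere.
# A moist bias cools window/absorption-band BT, hence the negative kernel.
kernels = {"T_surface": np.array([0.9]),
           "q_upper":   np.array([-0.5, -0.8, -0.3])}
biases  = {"T_surface": np.array([-5.0]),
           "q_upper":   np.array([1.0, 2.0, 1.0])}
contrib = attribute_bt_bias(kernels, biases)
# contrib["T_surface"] -> -4.5 K, contrib["q_upper"] -> -2.4 K
```

Summing the per-variable contributions and comparing against the observed radiance bias is the "closure" check the abstract refers to.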

  16. Definition of the "Drug-Angiogenic-Activity-Index" that allows the quantification of the positive and negative angiogenic active drugs: a study based on the chorioallantoic membrane model.

    Science.gov (United States)

    Demir, Resit; Peros, Georgios; Hohenberger, Werner

    2011-06-01

Since the introduction of antiangiogenic therapy by Folkman et al. in the 1970s, many antiangiogenic drugs have been identified, but only a few of them remain in clinical use. Vascular Endothelial Growth Factor (VEGF), the cytokine with the highest angiogenic activity, has also been identified; its antagonist, Bevacizumab, is produced and approved as first-line antiangiogenic therapy for metastatic colorectal cancer. Preclinical studies, however, lack in vivo models that define the "Drug-Angiogenic-Activity-Index" of angiogenic or antiangiogenic drugs. This work proposes a standardized procedure to define the "Drug-Angiogenic-Activity-Index" by counting the vascular intersections (VIS) on the chorioallantoic membrane after drug application. The index was defined as follows: (ΔVIS[Drug] - ΔVIS[Control]) / ΔVIS[Control]. For VEGF a Drug-Angiogenic-Activity-Index of 0.92 was found, and for Bevacizumab an index of -1. This means that VEGF achieved almost double the natural angiogenic activity on the chorioallantoic membrane, whereas complete blocking of natural angiogenic activity was observed after Bevacizumab application. Establishing the "Drug-Angiogenic-Activity-Index" in the preclinical phase will provide a measure of effectiveness for newly developed antiangiogenic drugs, analogous to the measure of effectiveness established for the cortisone family.
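The index defined above is straightforward to compute from vascular intersection counts. The ΔVIS values below are made-up numbers chosen only to reproduce the reported indices for a VEGF-like stimulator and a complete blocker:

```python
def drug_angiogenic_activity_index(delta_vis_drug: float, delta_vis_control: float) -> float:
    """Drug-Angiogenic-Activity-Index:
    (dVIS[Drug] - dVIS[Control]) / dVIS[Control].
    0 means no effect beyond natural angiogenesis, positive values mean
    stimulation, and -1 means complete blocking of natural angiogenesis."""
    return (delta_vis_drug - delta_vis_control) / delta_vis_control

# Hypothetical intersection gains over the observation window:
vegf_like = drug_angiogenic_activity_index(48.0, 25.0)  # (48-25)/25 = 0.92
blocker   = drug_angiogenic_activity_index(0.0, 25.0)   # (0-25)/25 = -1.0
```

Note that the index is a relative measure: it normalizes the drug-induced change in vascular intersections by the naturally occurring change in untreated controls, so results are comparable across experiments with different baseline growth.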

  17. Precise quantification of minimal residual disease at day 29 allows identification of children with acute lymphoblastic leukemia and an excellent outcome

    DEFF Research Database (Denmark)

    Nyvold, Charlotte; Madsen, Hans O; Ryder, Lars P

    2002-01-01

The postinduction level of minimal residual disease (MRD) was quantified with a competitive polymerase chain reaction (PCR) technique in 104 children with acute lymphoblastic leukemia (ALL) diagnosed between June 1993 and January 1998 and followed for a median of 4.2 years. A significant...

  18. Synthesis and Review: Advancing agricultural greenhouse gas quantification

    Science.gov (United States)

    Olander, Lydia P.; Wollenberg, Eva; Tubiello, Francesco N.; Herold, Martin

    2014-07-01

    Reducing emissions of agricultural greenhouse gases (GHGs), such as methane and nitrous oxide, and sequestering carbon in the soil or in living biomass can help reduce the impact of agriculture on climate change while improving productivity and reducing resource use. There is an increasing demand for improved, low cost quantification of GHGs in agriculture, whether for national reporting to the United Nations Framework Convention on Climate Change (UNFCCC), underpinning and stimulating improved practices, establishing crediting mechanisms, or supporting green products. This ERL focus issue highlights GHG quantification to call attention to our existing knowledge and opportunities for further progress. In this article we synthesize the findings of 21 papers on the current state of global capability for agricultural GHG quantification and visions for its improvement. We conclude that strategic investment in quantification can lead to significant global improvement in agricultural GHG estimation in the near term.

  19. Quantification of myocardial perfusion by cardiovascular magnetic resonance.

    Science.gov (United States)

    Jerosch-Herold, Michael

    2010-10-08

The potential of contrast-enhanced cardiovascular magnetic resonance (CMR) for a quantitative assessment of myocardial perfusion has been explored for more than a decade now, with encouraging results from comparisons with accepted "gold standards", such as microspheres used in the physiology laboratory. This has generated an increasing interest in the requirements and methodological approaches for the non-invasive quantification of myocardial blood flow by CMR. This review provides a synopsis of the current status of the field, and introduces the reader to the technical aspects of perfusion quantification by CMR. The field has reached a stage where quantification of myocardial perfusion is no longer a claim exclusive to nuclear imaging techniques. CMR may in fact offer important advantages, such as the absence of ionizing radiation, high spatial resolution, and an unmatched versatility to combine the interrogation of the perfusion status with a comprehensive tissue characterization. Further progress will depend on successful dissemination of the techniques for perfusion quantification among the CMR community.

  20. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  1. Quantification and Localization of Mast Cells in Periapical Lesions

    African Journals Online (AJOL)

    of the potential range of mast cell function and interactions in. Quantification and .... [4] A great variety of bacterial antigens may stimulate host immune responses ... the thickness of the cyst capsule indicated that they were more prevalent just ...

  2. Extraction, quantification and degree of polymerization of yacon ...

    African Journals Online (AJOL)

    Extraction, quantification and degree of polymerization of yacon (Smallanthus sonchifolia) fructans. EWN da Fonseca Contado, E de Rezende Queiroz, DA Rocha, RM Fraguas, AA Simao, LNS Botelho, A de Fatima Abreu, MABCMP de Abreu ...

  3. Mandibular asymmetry: a three-dimensional quantification of bilateral condyles

    National Research Council Canada - National Science Library

    Lin, Han; Zhu, Ping; Lin, Yi; Wan, Shuangquan; Shu, Xin; Xu, Yue; Zheng, Youhua

    2013-01-01

    .... In this study, a three-dimensional (3-D) quantification of bilateral asymmetrical condyles was firstly conducted to identify the specific role of 3-D condylar configuration for mandibular asymmetry...

  4. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective for the Phase II effort will be to develop a comprehensive, efficient, and flexible uncertainty quantification (UQ) framework implemented within a...

  5. Uncertainty Quantification for Production Navier-Stokes Solvers Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The uncertainty quantification methods developed under this program are designed for use with current state-of-the-art flow solvers developed by and in use at NASA....

  6. Quantification of propionic acid from Scutellaria baicalensis roots

    Directory of Open Access Journals (Sweden)

    Eunjung Son

    2017-03-01

    Conclusion: This study is the first to report that propionic acid exists in S. baicalensis roots and also provides a useful ultra performance liquid chromatography analysis method for its quantification.

  7. (1) H-MRS processing parameters affect metabolite quantification

    DEFF Research Database (Denmark)

    Bhogal, Alex A; Schür, Remmelt R; Houtepen, Lotte C

    2017-01-01

It is currently unknown to what extent variations in the analysis pipeline used to quantify (1) H-MRS data affect outcomes. The purpose of this study was to evaluate whether the quantification of identical (1) H-MRS scans across independent and experienced research groups would yield comparable results. We investigated the influence of model parameters and spectral quantification software on fitted metabolite concentration values. Sixty spectra in 30 individuals (repeated measures) were acquired using a 7-T MRI scanner. Data were processed by four independent research groups with the freedom to choose their own ... + NAAG/Cr + PCr and Glu/Cr + PCr, respectively. Metabolite quantification using identical (1) H-MRS data was influenced by processing parameters, basis sets and software choice. Locally preferred processing choices affected metabolite quantification, even when using identical software. Our results ...

  8. A quantification model for the structure of clay materials.

    Science.gov (United States)

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification for clay structure is explicitly explained, and the approach and goals of quantification are also discussed. The authors consider that the purpose of the quantification for clay structure is to determine some parameters that can be used to quantitatively characterize the impact of clay structure on the macro-mechanical behaviour. According to the system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established and three quantitative parameters (i.e., deformation structure potential, strength structure potential and comprehensive structure potential) are proposed. And the corresponding tests are conducted. The experimental results show that these quantitative parameters can accurately reflect the influence of clay structure on the deformation behaviour, strength behaviour and the relative magnitude of structural influence on the above two quantitative parameters, respectively. These quantitative parameters have explicit mechanical meanings, and can be used to characterize the structural influences of clay on its mechanical behaviour.

  9. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch [Division of Human Genetics, Departments of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Gallati, Sabina, E-mail: sabina.gallati@insel.ch [Division of Human Genetics, Departments of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland); Schaller, Andre, E-mail: andre.schaller@insel.ch [Division of Human Genetics, Departments of Pediatrics and Clinical Research, Inselspital, University of Berne, Freiburgstrasse, CH-3010 Berne (Switzerland)

    2012-07-06

    Highlights: • Serial qPCR accurately determines the fragmentation state of any given DNA sample. • Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genomes. • Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. • Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined in a tissue-specific manner. Thus the handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be defined more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample.
    By extrapolation of the measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA) we present an approach to possibly correct measurements in
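The degradation correction described in this record can be sketched numerically: fit an exponential decay constant λ from serial amplicons of increasing length, then rescale the measured mtDNA/nDNA ratio by the disparate decay of the two genomes. The function names and the simple log-linear fit below are illustrative assumptions, not the authors' implementation.

```python
import math

def decay_constant(lengths_bp, copies):
    """Fit lambda in N(L) = N0 * exp(-lambda * L) by log-linear
    least squares over serial amplicon lengths (bp)."""
    ys = [math.log(c) for c in copies]
    n = len(lengths_bp)
    mx = sum(lengths_bp) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(lengths_bp, ys)) / \
            sum((x - mx) ** 2 for x in lengths_bp)
    return -slope  # lambda, per bp

def corrected_ratio(mt_copies, n_copies, lam_mt, lam_n, amplicon_bp):
    """Correct a measured mtDNA/nDNA ratio for the different
    degradation of the two genomes at a given amplicon length."""
    mt0 = mt_copies * math.exp(lam_mt * amplicon_bp)  # back-extrapolated
    n0 = n_copies * math.exp(lam_n * amplicon_bp)
    return mt0 / n0
```

In a synthetic check, a sample whose true ratio is 2 but whose mtDNA decays faster than nDNA is under-measured, and the correction recovers the true ratio.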

  10. Quantification of aortic regurgitation by magnetic resonance velocity mapping

    DEFF Research Database (Denmark)

    Søndergaard, Lise; Lindvig, K; Hildebrandt, P

    1993-01-01

    The use of magnetic resonance (MR) velocity mapping in the quantification of aortic valvular blood flow was examined in 10 patients with angiographically verified aortic regurgitation. MR velocity mapping succeeded in identifying and quantifying the regurgitation in all patients, and the regurgit...

  11. Quantification of propionic acid from Scutellaria baicalensis roots

    OpenAIRE

    Eunjung Son; Ho Kyoung Kim; Hyun Sik Kim; Mee Ree Kim; Dong-Seon Kim

    2017-01-01

    Background: Propionic acid is a widely used preservative and is mainly produced by artificial synthesis or fermentation. In the case of natural products, the presence of propionic acid is viewed as a sign that an additive has been introduced for antimicrobial effects. Methods: In this work, the propionic acid that occurs in Scutellaria baicalensis roots was studied. A quantification method was developed and validated, and showed good linearity, a low limit of detection, and limit of quantif...

  12. Spacecraft Maximum Allowable Concentrations for Selected Airborne Contaminants. Volume 5

    Science.gov (United States)

    2008-01-01

    To protect space crews from air contaminants, NASA requested that the National Research Council (NRC) provide guidance for developing spacecraft maximum allowable concentrations (SMACs) and review NASA's development of exposure guidelines for specific chemicals. The NRC convened the Committee on Spacecraft Exposure Guidelines to address this task. The committee published Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants (NRC 1992). The reason for the review of chemicals in Volume 5 is that many of them have not been examined for more than 10 years, and new research necessitates examining the documents to ensure that they reflect current knowledge. New knowledge can be in the form of toxicologic data or in the application of new approaches for analysis of available data. In addition, because NASA anticipates longer space missions beyond low Earth orbit, SMACs for 1,000-d exposures have also been developed.

  13. Microscopic quantification of cell integrity in raw and processed onion parenchyma cells.

    Science.gov (United States)

    Gonzalez, M E; Jernstedt, J A; Slaughter, D C; Barrett, D M

    2010-09-01

    A cell viability assessment method based on computer vision analysis of the uptake of neutral red dye was used to quantify cell membrane integrity in raw and processed parenchyma cells of onion tissues. The presence of stained vacuoles was used as an indicator of tonoplast membrane integrity, and photomicrographs were acquired for microscopic image analysis and cell integrity quantification. Two different image analysis methods, involving the analysis of the saturation and green components of RGB (red, green, blue) images, were compared to the conventional cell count method. Use of the saturation component of RGB images allowed for the visualization and quantification of viable and inviable cells as well as extracellular air spaces. The combination of neutral red uptake, as visualized by light field microscopy, and saturation image analysis allowed for quantitative determination of the effects of high pressure processing on onion cell integrity. Preservation of vegetable tissues may involve heating or other methods that result in the loss of tissue integrity and, potentially, quality deterioration. In this study, we stained unprocessed and processed onion tissues with neutral red dye and then used a microscope and a computer imaging program to quantify how many cells were intact or ruptured.
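The saturation-based segmentation idea in this record can be sketched per pixel: stained (viable) cells are strongly coloured, so their saturation S = (max − min)/max is high, while unstained tissue and air spaces are near-grey. The 0.3 threshold and function names below are illustrative assumptions, not values from the study.

```python
def saturation(r, g, b):
    """Saturation of one RGB pixel (components in [0, 1]):
    S = (max - min) / max, with S = 0 for black."""
    cmax, cmin = max(r, g, b), min(r, g, b)
    return (cmax - cmin) / cmax if cmax > 0 else 0.0

def stained_fraction(pixels, threshold=0.3):
    """Fraction of pixels whose saturation exceeds the threshold --
    a proxy for the neutral-red-stained (intact-vacuole) area."""
    hits = sum(1 for (r, g, b) in pixels if saturation(r, g, b) > threshold)
    return hits / len(pixels)
```

A pure red pixel has saturation 1, a mid-grey pixel 0, so a half-stained field yields a stained fraction of 0.5.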

  14. Genome-wide Quantification of Translation in Budding Yeast by Ribosome Profiling.

    Science.gov (United States)

    Beaupere, Carine; Chen, Rosalyn B; Pelosi, William; Labunskyy, Vyacheslav M

    2017-12-21

    Translation of mRNA into proteins is a complex process involving several layers of regulation. It is often assumed that changes in mRNA transcription reflect changes in protein synthesis, but many exceptions have been observed. Recently, a technique called ribosome profiling (or Ribo-Seq) has emerged as a powerful method that allows identification, with high accuracy, of which regions of mRNA are translated into proteins, and quantification of translation at the genome-wide level. Here, we present a generalized protocol for genome-wide quantification of translation using Ribo-Seq in budding yeast. In addition, combining Ribo-Seq data with mRNA abundance measurements allows us to simultaneously quantify the translation efficiency of thousands of mRNA transcripts in the same sample and compare changes in these parameters in response to experimental manipulations or in different physiological states. We describe a detailed protocol for generation of ribosome footprints using nuclease digestion, isolation of intact ribosome-footprint complexes via sucrose gradient fractionation, and preparation of DNA libraries for deep sequencing, along with the appropriate quality controls necessary to ensure accurate analysis of in vivo translation.
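The combination of footprint and mRNA counts described here reduces to a length- and depth-normalised ratio. The RPKM-style normalisation below is a common convention assumed for illustration; the protocol itself may normalise differently.

```python
def rpkm(counts, gene_len_bp, total_mapped_reads):
    """Reads per kilobase of transcript per million mapped reads."""
    return counts * 1e9 / (gene_len_bp * total_mapped_reads)

def translation_efficiency(ribo_counts, rna_counts, gene_len_bp,
                           ribo_total, rna_total):
    """Translation efficiency as the ratio of ribosome-footprint
    density to mRNA abundance for one transcript."""
    return rpkm(ribo_counts, gene_len_bp, ribo_total) / \
           rpkm(rna_counts, gene_len_bp, rna_total)
```

A transcript with twice the footprint density of its mRNA level has TE = 2, i.e. it is translated more efficiently than its transcription alone would suggest.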

  15. Court allows marijuana clubs to raise medical necessity defense.

    Science.gov (United States)

    1999-10-01

    A ruling by the 9th U.S. Circuit Court of Appeals will allow California's medical marijuana clubs to defend themselves against an injunction against operating. The court ruled that U.S. District Judge Charles R. Breyer erred by failing to consider that marijuana was an indispensable part of treatment for the club's clients. The ruling has applicability in cases in Alaska, Arizona, Nevada, Oregon and Washington, which are all within the jurisdiction of the 9th Circuit Court.

  16. Online Homework Management Systems: Should We Allow Multiple Attempts?

    OpenAIRE

    Rhodes, M. Taylor; Sarbaum, Jeffrey K.

    2013-01-01

    Conventional pencil and paper wisdom suggests that allowing multiple attempts on homework will lead to more time spent on homework, higher homework grades, and better exam performance. For a variety of reasons, homework is increasingly being auto-administered online. This paper discusses the results of a quasi-experiment designed to evaluate student behavior under single and multiple attempt homework settings using an online homework management system. The paper explores whether multiple atte...

  17. Mass spectrometry allows direct identification of proteins in large genomes

    DEFF Research Database (Denmark)

    Küster, B; Mortensen, Peter V.; Andersen, Jens S.

    2001-01-01

    Proteome projects seek to provide systematic functional analysis of the genes uncovered by genome sequencing initiatives. Mass spectrometric protein identification is a key requirement in these studies but to date, database searching tools rely on the availability of protein sequences derived fro...... genome and allows identification, mapping, cloning and assistance in gene prediction of any protein for which minimal mass spectrometric information can be obtained. Several novel proteins from Arabidopsis thaliana and human have been discovered in this way....

  18. FINAL REMINDER: Extension/suppression of allowance for dependent child

    CERN Multimedia

    Human Resources Department

    2005-01-01

    Members of the personnel with dependent children aged 18 or above (or reaching 18 during the 2005/2006 school year) who have not yet provided a SCHOOL CERTIFICATE must do so as soon as possible. If we have not received this certificate by 2 December 2005 at the latest, the child allowance will be withdrawn retroactively as from 1 September 2005. Human Resources Department Tel. 72862

  19. Engineering surveying

    CERN Document Server

    Schofield, W

    2007-01-01

    Engineering surveying involves determining the position of natural and man-made features on or beneath the Earth's surface and utilizing these features in the planning, design and construction of works. It is a critical part of any engineering project. Without an accurate understanding of the size, shape and nature of the site the project risks expensive and time-consuming errors or even catastrophic failure.Engineering Surveying 6th edition covers all the basic principles and practice of this complex subject and the authors bring expertise and clarity. Previous editions of this classic text have given readers a clear understanding of fundamentals such as vertical control, distance, angles and position right through to the most modern technologies, and this fully updated edition continues that tradition.This sixth edition includes:* An introduction to geodesy to facilitate greater understanding of satellite systems* A fully updated chapter on GPS, GLONASS and GALILEO for satellite positioning in surveying* Al...

  20. Quantification of variability in trichome patterns

    Directory of Open Access Journals (Sweden)

    Bettina Greese

    2014-11-01

    Full Text Available While pattern formation is studied in various areas of biology, little is known about the intrinsic noise leading to variations between individual realizations of the pattern. One prominent example of de novo pattern formation in plants is the patterning of trichomes on Arabidopsis leaves, which involves genetic regulation and cell-to-cell communication. These processes are potentially variable due to, e.g., the abundance of cell components or environmental conditions. To advance the understanding of the regulatory processes underlying the pattern formation, it is crucial to quantitatively analyze the variability in naturally occurring patterns. Here, we review recent approaches towards the characterization of noise in trichome initiation. We present methods for the quantification of spatial patterns, which are the basis for data-driven mathematical modeling and enable the analysis of noise from different sources. Besides the insight gained on trichome formation, the examination of observed trichome patterns also shows that highly regulated biological processes can be substantially affected by variability.
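One simple spatial-pattern statistic of the kind reviewed here is the coefficient of variation of nearest-neighbour distances between trichome positions: zero for a perfect lattice, larger for noisier patterns. The measure and names below are an illustrative choice, not one of the paper's specific methods.

```python
import math

def nearest_neighbour_distances(points):
    """Nearest-neighbour distance for each (x, y) trichome position."""
    out = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        out.append(d)
    return out

def pattern_variability(points):
    """Coefficient of variation of nearest-neighbour distances:
    a scalar measure of pattern noise (0 for a regular lattice)."""
    d = nearest_neighbour_distances(points)
    mean = sum(d) / len(d)
    var = sum((x - mean) ** 2 for x in d) / len(d)
    return math.sqrt(var) / mean
```

For the four corners of a unit square every nearest-neighbour distance is 1, so the variability is 0; displacing points raises it.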

  1. Characterization and quantification of biochar alkalinity.

    Science.gov (United States)

    Fidel, Rivka B; Laird, David A; Thompson, Michael L; Lawrinenko, Michael

    2017-01-01

    Lack of knowledge regarding the nature of biochar alkalis has hindered understanding of pH-sensitive biochar-soil interactions. Here we investigate the nature of biochar alkalinity and present a cohesive suite of methods for its quantification. Biochars produced from cellulose, corn stover and wood feedstocks had significant low-pKa organic structural (0.03-0.34 meq g(-1)), other organic (0-0.92 meq g(-1)), carbonate (0.02-1.5 meq g(-1)), and other inorganic (0-0.26 meq g(-1)) alkalinities. All four categories of biochar alkalinity contributed to total biochar alkalinity and are therefore relevant to pH-sensitive soil processes. Total biochar alkalinity was strongly correlated with base cation concentration, but biochar alkalinity was not a simple function of elemental composition, soluble ash, fixed carbon, or volatile matter content. More research is needed to characterize soluble biochar alkalis other than carbonates and to establish predictive relationships among biochar production parameters and the composition of biochar alkalis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  3. Legionella spp. isolation and quantification from greywater.

    Science.gov (United States)

    Rodríguez-Martínez, Sara; Blanky, Marina; Friedler, Eran; Halpern, Malka

    2015-01-01

    Legionella, an opportunistic human pathogen whose natural environment is water, is transmitted to humans through inhalation of contaminated aerosols. Legionella has been isolated from a high diversity of water types. Due to its importance as a pathogen, two ISO protocols have been developed for its monitoring. However, these two protocols are not suitable for analyzing Legionella in greywater (GW). GW is domestic wastewater excluding the inputs from toilets and kitchen. It can serve as an alternative water source, mainly for toilet flushing and garden irrigation, both of which produce aerosols that can pose a risk of Legionella infection. Hence, before reuse, GW has to be treated and its quality needs to be monitored. The difficulty of Legionella isolation from GW stems from the very high load of contaminant bacteria. Here we describe a modification of ISO protocol 11731:1998 that enables the isolation and quantification of Legionella from GW samples. The following modifications were made: • To enable isolation of Legionella from greywater, a pre-filtration step that removes coarse matter is recommended. • Legionella can be isolated after a combined acid-thermic treatment that eliminates the high load of contaminant bacteria in the sample.

  4. Uncertainty Quantification for Cargo Hold Fires

    CERN Document Server

    DeGennaro, Anthony M; Martinelli, Luigi; Rowley, Clarence W

    2015-01-01

    The purpose of this study is twofold -- first, to introduce the application of high-order discontinuous Galerkin methods to buoyancy-driven cargo hold fire simulations; second, to explore statistical variation in the fluid dynamics of a cargo hold fire given parameterized uncertainty in the fire source location and temperature. Cargo hold fires represent a class of problems that require highly accurate computational methods to simulate faithfully. Hence, we use an in-house discontinuous Galerkin code to treat these flows. Cargo hold fires also exhibit a large amount of uncertainty with respect to the boundary conditions. Thus, the second aim of this paper is to quantify the resulting uncertainty in the flow, using tools from the uncertainty quantification community to ensure that our efforts require a minimal number of simulations. We expect that the results of this study will provide statistical insight into the effects of fire location and temperature on cargo fires, and also assist in the optimization of f...

  5. On uncertainty quantification in hydrogeology and hydrogeophysics

    Science.gov (United States)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  6. Allowable pillar to diameter ratio for strategic petroleum reserve caverns.

    Energy Technology Data Exchange (ETDEWEB)

    Ehgartner, Brian L.; Park, Byoung Yoon

    2011-05-01

    This report compiles 3-D finite element analyses performed to evaluate the stability of Strategic Petroleum Reserve (SPR) caverns over multiple leach cycles. When oil is withdrawn from a cavern in salt using freshwater, the cavern enlarges. As a result, the pillar separating caverns in the SPR fields is reduced over time due to usage of the reserve. The enlarged cavern diameters and smaller pillars reduce underground stability. Advances in geomechanics modeling enable the allowable pillar to diameter ratio (P/D) to be defined. Prior to such modeling capabilities, the allowable P/D was established as 1.78 based on some very limited experience in other cavern fields. While appropriate for 1980, the ratio conservatively limits the allowable number of oil drawdowns and hence limits the overall utility and life of the SPR cavern field. Analyses from all four cavern fields are evaluated along with operating experience gained over the past 30 years to define a new P/D for the reserve. A new ratio of 1.0 is recommended. This ratio is applicable only to existing SPR caverns.
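The geometry behind the P/D ratio in this record is simple enough to sketch: the pillar is the web of salt between adjacent caverns (centre-to-centre spacing minus one diameter), so each freshwater drawdown that enlarges the diameter shrinks P/D. The fixed per-cycle diameter growth below is an illustrative assumption; actual enlargement depends on leaching practice.

```python
def pillar_to_diameter(spacing_m, diameter_m):
    """P/D: salt web between adjacent caverns divided by cavern
    diameter, for caverns on centre-to-centre spacing spacing_m."""
    return (spacing_m - diameter_m) / diameter_m

def allowable_drawdowns(spacing_m, d0_m, growth_m, pd_limit=1.0):
    """Count leach cycles before P/D would fall below the limit,
    assuming each drawdown adds a fixed diameter increment."""
    n, d = 0, d0_m
    while pillar_to_diameter(spacing_m, d + growth_m) >= pd_limit:
        d += growth_m
        n += 1
    return n
```

Relaxing the limit from the historical 1.78 to the recommended 1.0 directly increases the number of drawdowns a cavern field can sustain.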

  7. Scientific substantiation of maximum allowable concentration of fluopicolide in water

    Directory of Open Access Journals (Sweden)

    Pelo I.M.

    2014-03-01

    Full Text Available In order to substantiate the maximum allowable concentration of fluopicolide in the water of water reservoirs, the research was carried out. Methods of study: laboratory hygienic experiment using organoleptic, sanitary-chemical, sanitary-toxicological, sanitary-microbiological and mathematical methods. The results of the influence of fluopicolide on the organoleptic properties of water and on the sanitary regimen of reservoirs for household purposes are given, and its subthreshold concentration in water by the sanitary-toxicological hazard index was calculated. The threshold concentration of the substance by the main hazard criteria was established, and the maximum allowable concentration in water was substantiated. The studies led to the following conclusions: the fluopicolide threshold concentration in water by the organoleptic hazard index (limiting criterion: smell) is 0.15 mg/dm3; by the general sanitary hazard index (limiting criteria: impact on the number of saprophytic microflora, biochemical oxygen demand and nitrification), 0.015 mg/dm3; the maximum non-effective concentration is 0.14 mg/dm3; and the maximum allowable concentration is 0.015 mg/dm3.

  8. Bacterial adhesion force quantification by fluidic force microscopy

    Science.gov (United States)

    Potthoff, Eva; Ossola, Dario; Zambelli, Tomaso; Vorholt, Julia A.

    2015-02-01

    Quantification of detachment forces between bacteria and substrates facilitates the understanding of the bacterial adhesion process that affects cell physiology and survival. Here, we present a method that allows for serial, single bacterial cell force spectroscopy by combining the force control of atomic force microscopy with microfluidics. Reversible bacterial cell immobilization under physiological conditions on the pyramidal tip of a microchanneled cantilever is achieved by underpressure. Using the fluidic force microscopy technology (FluidFM), we achieve immobilization forces greater than those of state-of-the-art cell-cantilever binding, as demonstrated by the detachment of Escherichia coli from polydopamine with recorded forces between 4 and 8 nN for many cells. The contact time and setpoint dependence of the adhesion forces of E. coli and Streptococcus pyogenes, as well as the sequential detachment of bacteria out of a chain, are shown, revealing distinct force patterns in the detachment curves. This study demonstrates the potential of the FluidFM technology for quantitative bacterial adhesion measurements of cell-substrate and cell-cell interactions that are relevant in biofilms and infection biology.

  9. Surveying Education

    DEFF Research Database (Denmark)

    Enemark, Stig

    2009-01-01

    In relation to surveying education there is one big question to be asked: Is the role of the surveyors changing? In a global perspective the answer will be "Yes". There is a big swing that could be entitled "From Measurement to Management". This does not imply that measurement is no longer....... In surveying education there are a range of other challenges to be faced. These relate to the focus on learning to learn; the need for flexible curriculum to deal with constant change; the move towards introducing virtual academy; the demand for creating a quality culture; and the perspective of lifelong...... on an efficient interaction between education, research, and professional practice....

  10. Progress Report on Alloy 617 Time Dependent Allowables

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Julie Knibloe [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-06-01

    Time dependent allowable stresses are required in the ASME Boiler and Pressure Vessel Code for design of components in the temperature range where time dependent deformation (i.e., creep) is expected to become significant. There are time dependent allowable stresses in Section IID of the Code for use in the non-nuclear construction codes; however, there are additional criteria that must be considered in developing time dependent allowables for nuclear components. These criteria are specified in Section III NH. St is defined as the lesser of three quantities: 100% of the average stress required to obtain a total (elastic, plastic, primary and secondary creep) strain of 1%; 67% of the minimum stress to cause rupture; and 80% of the minimum stress to cause the initiation of tertiary creep. The values are reported for a range of temperatures and for time increments up to 100,000 hours. These values are determined from uniaxial creep tests, which involve the elevated-temperature application of a relatively small constant load, resulting in deformation over a long time period prior to rupture. The minimum stress resulting from these criteria is the time dependent allowable stress St. In this report, data from a large number of creep and creep-rupture tests on Alloy 617 are analyzed using the ASME Section III NH criteria. Data used in the analysis come from the ongoing DOE-sponsored high temperature materials program, from the Korea Atomic Energy Institute through the Generation IV VHTR Materials Program, and from historical data from previous HTR research and vendor data generated in developing the alloy. It is found that the tertiary creep criterion determines St at the highest temperatures, while the stress to cause 1% total strain controls at low temperatures. The ASME Section III Working Group on Allowable Stress Criteria has recommended that the uncertainties associated with determining the onset of tertiary creep and the lack of significant
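The three Section III NH criteria quoted in this record reduce to a minimum over three scaled stresses, which can be sketched directly (the function name is illustrative; inputs are the three stresses, in consistent units, at one temperature and time increment):

```python
def time_dependent_allowable(avg_stress_1pct_strain,
                             min_rupture_stress,
                             min_tertiary_onset_stress):
    """St per the quoted criteria: the least of 100% of the average
    stress to 1% total strain, 67% of the minimum rupture stress,
    and 80% of the minimum stress to the onset of tertiary creep."""
    return min(1.00 * avg_stress_1pct_strain,
               0.67 * min_rupture_stress,
               0.80 * min_tertiary_onset_stress)
```

With, say, 30, 40 and 25 MPa for the three inputs, the tertiary creep criterion governs (0.80 × 25 = 20 MPa), matching the report's finding that tertiary creep controls when that stress is comparatively low, as at high temperature.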

  11. The quantification of fingerprint quality using a relative contrast index.

    Science.gov (United States)

    Humphreys, Jill D; Porter, Glenn; Bell, Michael

    2008-06-10

    Research into fingermark enhancement techniques has traditionally used visual comparisons and qualitative methods to assess their effectiveness based on the quality of the developed fingermark. However, with increasing research into the optimisation of these techniques the need for a quantitative evaluative method has arisen. Parameters for acceptable fingerprint quality are not well defined and generally encompass clear, sharp edges and high levels of contrast between the fingermark ridges and background material. Using these current parameters, a conclusive measurement of fingerprint quality and thus the effectiveness of development techniques cannot be achieved. This study presents a model through which an aspect of fingerprint quality can be objectively and impartially measured based on a relative contrast index, constructed through measuring the reflective intensity of the fingermark ridges against the background material. Using a fibre-optic spectrophotometer attached to a microscope with axial illumination, the intensity counts of the ridge detail and background material were measured and a logarithmic contrast index constructed. The microscope and spectrophotometer parameters were experimentally tested using a standard colour resolution chart with known reflective properties. The protocol was successfully applied to four sample groups: black inked fingerprints on white paper; latent fingermarks on white paper developed separately with ninhydrin and physical developer; and fingermarks in blood deposited on white tiles and enhanced with amido black. The contrast indices obtained quantitatively reflect the level of contrast and provide an indication of fingerprint quality through a numerical representation rather than previous qualitative methods. 
It has been suggested that the proposed method of fingerprint quantification may be viable for application in the forensic research arena as it allows the definitive measurement of contrast to aid the evaluation of
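A log-ratio contrast index of the kind this record describes can be sketched as follows; the paper's exact formulation is not reproduced here, so the base-10 log of background-to-ridge intensity is an assumed stand-in.

```python
import math

def relative_contrast_index(ridge_intensity, background_intensity):
    """Logarithmic contrast index from mean reflective intensity
    counts of fingermark ridge detail and background material
    (an assumed log-ratio form, not the paper's exact definition)."""
    return math.log10(background_intensity / ridge_intensity)
```

Dark ridges on a bright background give a large positive index (e.g. counts of 10 vs 1000 give 2.0), while equal intensities give 0, i.e. no usable contrast.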

  12. Detection and quantification system for monitoring instruments

    Science.gov (United States)

    Dzenitis, John M.; Hertzog, Claudia K.; Makarewicz, Anthony J.; Henderer, Bruce D.; Riot, Vincent J.

    2008-08-12

    A method of detecting real events by obtaining a set of recent signal results, calculating measures of the noise or variation based on the set of recent signal results, calculating an expected baseline value based on the set of recent signal results, determining sample deviation, calculating an allowable deviation by multiplying the sample deviation by a threshold factor, setting an alarm threshold from the baseline value plus or minus the allowable deviation, and determining whether the signal results exceed the alarm threshold.
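The thresholding steps enumerated above can be sketched in a few lines; the window values and the threshold factor of 3 are illustrative assumptions, and the patent's actual baseline and noise estimators may differ from the sample mean and standard deviation used here.

```python
from statistics import mean, stdev

def alarm_thresholds(recent, threshold_factor=3.0):
    """From a window of recent signal results: estimate the expected
    baseline, measure the variation, scale it by a threshold factor,
    and return (low, high) alarm thresholds."""
    baseline = mean(recent)                       # expected baseline value
    allowable = threshold_factor * stdev(recent)  # allowable deviation
    return baseline - allowable, baseline + allowable

def is_real_event(signal, recent, threshold_factor=3.0):
    low, high = alarm_thresholds(recent, threshold_factor)
    return signal < low or signal > high

window = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
assert not is_real_event(10.4, window)  # within normal variation
assert is_real_event(14.0, window)      # exceeds the alarm threshold
```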

  13. Identification and Quantification of Carbonate Species Using Rock-Eval Pyrolysis

    Directory of Open Access Journals (Sweden)

    Pillot D.

    2013-03-01

    This paper presents a new reliable and rapid method to characterise and quantify carbonates in solid samples, based on monitoring the CO2 flux emitted by the progressive thermal decomposition of carbonates during programmed heating. The distinct destabilisation peaks allow the different types of carbonates present in the analysed sample to be determined. The quantification of each peak gives the respective proportions of these different types of carbonates in the sample. In addition to the procedure presented in this paper, which uses a standard Rock-Eval 6 pyrolyser, characteristic calibration profiles are also presented for the carbonates most common in nature. This method should allow different types of application in different disciplines, either academic or industrial.
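The peak-by-peak quantification can be sketched as numerical integration of the CO2 flux over temperature windows. The synthetic two-peak flux curve and the window boundaries below are invented for illustration and are not the paper's calibration profiles.

```python
def integrate_peak(temps, flux, t_lo, t_hi):
    """Trapezoidal integral of the CO2 flux over one decomposition window."""
    area = 0.0
    for i in range(len(temps) - 1):
        if t_lo <= temps[i] and temps[i + 1] <= t_hi:
            area += 0.5 * (flux[i] + flux[i + 1]) * (temps[i + 1] - temps[i])
    return area

# Synthetic flux curve (arbitrary units) with two triangular peaks standing
# in for two carbonate types destabilising at different temperatures.
temps = list(range(300, 901, 10))
flux = [max(0.0, 5 - abs(t - 480) / 10) + max(0.0, 8 - abs(t - 750) / 10)
        for t in temps]
area_low = integrate_peak(temps, flux, 400, 560)
area_high = integrate_peak(temps, flux, 650, 850)
total = area_low + area_high
proportions = (area_low / total, area_high / total)
assert proportions[0] < proportions[1]  # the second carbonate dominates
```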

  14. Dual in vivo quantification of integrin-targeted and protease-activated agents in cancer using fluorescence molecular tomography (FMT).

    Science.gov (United States)

    Kossodo, Sylvie; Pickarski, Maureen; Lin, Shu-An; Gleason, Alexa; Gaspar, Renee; Buono, Chiara; Ho, Guojie; Blusztajn, Agnieszka; Cuneo, Garry; Zhang, Jun; Jensen, Jayme; Hargreaves, Richard; Coleman, Paul; Hartman, George; Rajopadhye, Milind; Duong, Le Thi; Sur, Cyrille; Yared, Wael; Peterson, Jeffrey; Bednar, Bohumil

    2010-10-01

    Integrins, especially α(v)β(3) and α(v)β(5), are upregulated in tumor cells and activated endothelial cells and, as such, serve as cancer biomarkers. We developed a novel near-infrared-labeled optical agent for the in vivo detection and quantification of α(v)β(3)/α(v)β(5). A small peptidomimetic α(v)β(3) antagonist was synthesized, coupled to a near-infrared fluorescent (NIRF) dye, and tested for binding specificity using integrin-overexpressing cells, inhibition of vitronectin-mediated cell attachment, binding to tumor and endothelial cells in vitro, and competition studies. Pharmacokinetics, biodistribution, specificity of tumor targeting, and the effect of an antiangiogenic treatment were assessed in vivo. The integrin NIRF agent showed strong selectivity towards α(v)β(3)/α(v)β(5) in vitro and predominant tumor distribution in vivo, allowing noninvasive and real-time quantification of integrin signal in tumors. Antiangiogenic treatment significantly inhibited integrin signal in vivo but had no effect on a cathepsin-cleavable NIR agent. Simultaneous imaging revealed different patterns of distribution reflecting the underlying differences in integrin and cathepsin biology during tumor progression. NIRF-labeled integrin antagonists allow noninvasive molecular fluorescent imaging and quantification of tumors in vivo, improving and providing more refined approaches for cancer detection and treatment monitoring.

  15. Informing adolescents about human papillomavirus vaccination: what will parents allow?

    Science.gov (United States)

    Vallely, Lorraine A; Roberts, Stephen A; Kitchener, Henry C; Brabin, Loretta

    2008-04-24

    With the introduction of human papillomavirus (HPV) vaccination an evidence base on effective adolescent educational interventions is urgently required. We undertook formative research to develop and evaluate a film on HPV and cervical cancer prevention for school children who will be offered HPV vaccination in the UK. The main outcome measures were the number of children allowed by parents to view the film and children's knowledge. Our results indicated that the film's four key messages were acceptable to parents and largely understood by adolescents but these messages will need reinforcing if the full potential of a prophylactic vaccine is to be realised.

  16. Bends in nanotubes allow electric spin control and coupling

    DEFF Research Database (Denmark)

    Flensberg, Karsten; Marcus, Charles Masamed

    2010-01-01

    We investigate combined effects of spin-orbit coupling and magnetic field in carbon nanotubes containing one or more bends along their length. We show how bends can be used to provide electrical control of confined spins, while spins confined in straight segments remain insensitive to electric fields. Device geometries that allow general rotation of single spins are presented and analyzed. In addition, capacitive coupling along bends provides coherent spin-spin interaction, including between otherwise disconnected nanotubes, completing a universal set of one- and two-qubit gates.

  17. Analysis of electric vehicle's trip cost allowing late arrival

    Science.gov (United States)

    Leng, Jun-Qiang; Liu, Wei-Yi; Zhao, Lin

    2017-05-01

    In this paper, we use a car-following model to study each electric vehicle's trip cost and the total trip cost allowing late arrival. The numerical results show that the electricity cost has great effects on each commuter's trip cost and the total trip cost, and that these effects depend on each commuter's time headway at the origin, but the electricity cost has no prominent impact on the minimum value of the total trip cost under each commuter's different time headway at the origin.

  18. Italian seismic databank allows on-line access

    Science.gov (United States)

    Barba, Salvatore; Giovambattista, Rita Di; Smriglio, Giuseppe

    Users can interactively query, search, and download parametric information and digital recordings collected by the Italian Telemetered Seismic Network (ITSN) through the Istituto Nazionale Di Geofisica Seismic Network Databank (ISND). The databank is completely menu-driven and easy to use. The ITSN comprises about 80 seismic stations (Figure 1) equipped with seismometers operating at 70% of critical damping and with a natural frequency of 1 Hz. The seismometers' signals are transmitted over telephone lines or radio relay systems and then demodulated and recorded by an automatic acquisition system developed in cooperation with the U.S. Geological Survey. Digital data are stored on magnetic disk and then processed through interactive procedures. The automatic selection of the seismic phases is checked daily and corrected when necessary by seismic analysts. New data are available to users within a week.

  19. Quantification and Propagation of Nuclear Data Uncertainties

    Science.gov (United States)

    Rising, Michael E.

    The use of several uncertainty quantification and propagation methodologies is investigated in the context of the prompt fission neutron spectrum (PFNS) uncertainties and their impact on critical reactor assemblies. First, the first-order, linear Kalman filter is used as a nuclear data evaluation and uncertainty quantification tool combining available PFNS experimental data and a modified version of the Los Alamos (LA) model. The experimental covariance matrices, not generally given in the EXFOR database, are computed using the GMA methodology used by the IAEA to establish more appropriate correlations within each experiment. Then, using systematics relating the LA model parameters across a suite of isotopes, the PFNS for both the uranium and plutonium actinides are evaluated, leading to a new evaluation including cross-isotope correlations. Next, an alternative evaluation approach, the unified Monte Carlo (UMC) method, is studied for the evaluation of the PFNS for the n(0.5 MeV)+Pu-239 fission reaction and compared to the Kalman filter. The UMC approach to nuclear data evaluation is implemented in a variety of ways to test convergence toward the Kalman filter results and to determine the nonlinearities present in the LA model. Ultimately, the UMC approach is shown to be comparable to the Kalman filter for a realistic data evaluation of the PFNS and is capable of capturing the nonlinearities present in the LA model. Next, the impact that the PFNS uncertainties have on important critical assemblies is investigated. Using the PFNS covariance matrices in the ENDF/B-VII.1 nuclear data library, the uncertainties of the effective multiplication factor, leakage, and spectral indices of the Lady Godiva and Jezebel critical assemblies are quantified. Principal component analysis of the PFNS covariance matrices shows that only 2-3 principal components are needed to retain the PFNS uncertainties. Then, using the polynomial chaos expansion (PCE) on the uncertain output
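The principal-component reduction mentioned above can be sketched as follows. The toy covariance matrix (two dominant correlated modes over 20 "energy groups" plus a small uncorrelated floor) is invented and only illustrates why a handful of components can retain most of the uncertainty.

```python
import numpy as np

def retained_variance(cov, n_components):
    """Fraction of total variance captured by the leading principal
    components (largest eigenvalues) of a covariance matrix."""
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # descending order
    return eigvals[:n_components].sum() / eigvals.sum()

n = 20
v1 = np.ones(n) / np.sqrt(n)                    # first dominant mode
v2 = np.tile([1.0, -1.0], n // 2) / np.sqrt(n)  # second, orthogonal mode
cov = 5.0 * np.outer(v1, v1) + 2.0 * np.outer(v2, v2) + 0.05 * np.eye(n)
assert retained_variance(cov, 3) > 0.85  # 3 components suffice here
```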

  20. Quantification of isotopic turnover in agricultural systems

    Science.gov (United States)

    Braun, A.; Auerswald, K.; Schnyder, H.

    2012-04-01

    The isotopic turnover, which is a proxy for the metabolic rate, is gaining scientific importance. It is quantified for an increasing range of organisms, from microorganisms through plants to animals, including agricultural livestock. Additionally, the isotopic turnover is analyzed on different scales, from organs to organisms to ecosystems and even to the biosphere. In particular, the quantification of the isotopic turnover of specific tissues within the same organism, e.g. organs like liver and muscle and products like milk and faeces, has brought new insights that improve the understanding of nutrient cycles and fluxes. Thus, the knowledge of isotopic turnover is important in many areas, including physiology, e.g. milk synthesis, ecology, e.g. soil retention time of water, and medical science, e.g. cancer diagnosis. So far, the isotopic turnover is quantified by applying time-, cost- and expertise-intensive tracer experiments. Usually, this comprises two isotopic equilibration periods. A first equilibration period with a constant isotopic input signal is followed by a second equilibration period with a distinct constant isotopic input signal. This yields a smooth signal change from the first to the second signal in the object under consideration. This approach suffers from at least three major problems. (i) The input signals must be controlled isotopically, which is almost impossible in many realistic cases like free-ranging animals. (ii) Both equilibration periods may be very long, especially when the turnover rate of the object under consideration is very slow, which aggravates the first problem. (iii) The detection of small or slow pools is improved by large isotopic signal changes, but large isotopic changes also involve a considerable change in the input material; e.g. animal studies are usually carried out as diet-switch experiments, where the diet is switched between C3 and C4 plants, since C3 and C4 plants differ strongly in their isotopic signal.
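The smooth signal change between the two equilibration periods is often modelled as a first-order exponential approach to the new input signal. The sketch below assumes a one-pool model with hypothetical δ13C values and turnover rate; real tissues may require multiple pools.

```python
from math import exp, log

def diet_switch_signal(t, delta_old, delta_new, k):
    """Isotopic signal t days after a diet switch: exponential approach
    from the old equilibrium value to the new one at turnover rate k."""
    return delta_new + (delta_old - delta_new) * exp(-k * t)

def half_life(k):
    """Time for half of the isotopic change to be completed."""
    return log(2) / k

# Hypothetical C3 -> C4 switch in delta 13C (permil), k = 0.1 per day:
start, end, k = -27.0, -12.0, 0.1
assert diet_switch_signal(0, start, end, k) == start
midpoint = diet_switch_signal(half_life(k), start, end, k)
assert abs(midpoint - (start + end) / 2) < 1e-9
```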

  1. Naturally occurring allele diversity allows potato cultivation in northern latitudes.

    Science.gov (United States)

    Kloosterman, Bjorn; Abelenda, José A; Gomez, María del Mar Carretero; Oortwijn, Marian; de Boer, Jan M; Kowitwanich, Krissana; Horvath, Beatrix M; van Eck, Herman J; Smaczniak, Cezary; Prat, Salomé; Visser, Richard G F; Bachem, Christian W B

    2013-03-14

    Potato (Solanum tuberosum L.) originates from the Andes and evolved short-day-dependent tuber formation as a vegetative propagation strategy. Here we describe the identification of a central regulator underlying a major-effect quantitative trait locus for plant maturity and initiation of tuber development. We show that this gene belongs to the family of DOF (DNA-binding with one finger) transcription factors and regulates tuberization and plant life cycle length, by acting as a mediator between the circadian clock and the StSP6A mobile tuberization signal. We also show that natural allelic variants evade post-translational light regulation, allowing cultivation outside the geographical centre of origin of potato. Potato is a member of the Solanaceae family and is one of the world's most important food crops. This annual plant originates from the Andean regions of South America. Potato develops tubers from underground stems called stolons. Its equatorial origin makes potato essentially short-day dependent for tuberization and potato will not make tubers in the long-day conditions of spring and summer in the northern latitudes. When introduced in temperate zones, wild material will form tubers in the course of the autumnal shortening of day-length. Thus, one of the first selected traits in potato leading to a European potato type is likely to have been long-day acclimation for tuberization. Potato breeders can exploit the naturally occurring variation in tuberization onset and life cycle length, allowing varietal breeding for different latitudes, harvest times and markets.

  2. Contested change: how Germany came to allow PGD

    Directory of Open Access Journals (Sweden)

    Bettina Bock von Wülfingen

    2016-12-01

    Until recently, German laws protecting the human embryo from the moment of conception were some of the strictest internationally. These laws had previously prevented any manipulation of the embryo, such as in preimplantation genetic diagnosis (PGD), and continue to affect stem cell research. In 2011, however, the German parliament voted in favour of allowing PGD in specific cases. While earlier analyses interpreted the modification of the law as being in keeping with the usual norms in Germany, this article argues instead that the reasoning behind the partial acceptance of PGD, rather than the legal decision itself, is indicative of a sociocultural change that needs to be acknowledged. Demonstrating that a significant change occurred, this article analyses the arguments that led to the amendment of the law: not only has the identity of the embryo been redefined towards a pragmatic concept but the notions of parenting and pregnancy have also changed. The focus on the mother and the moment of birth has given way to a focus on conception and ‘genetic couplehood’. The professional discourse preceding the decision allowing PGD suggested that the rights of the not-yet-implanted embryo should be negotiated with those of the two parents-to-be, a concept that may be called ‘in-vitro pregnancy’.

  3. [Niacin allowance of students of a sports college].

    Science.gov (United States)

    Borisov, I M

    1977-01-01

    In 227 students of the Institute for Physical Culture examined in the winter-spring and summer-fall seasons of the year, the excretion of N1-methylnicotinamide (MNA) with urine per hour on an empty stomach amounted to 245 +/- 15.9 and 311 +/- 14.6 microgram/hour (the difference between seasons is significant). These figures point to the dependence of MNA excretion with urine on the quantity of niacin equivalents supplied with the food. The content of such equivalents in the rations of the student sportsmen (7-9.5 mg per 1000 calories per day) proved insufficient to maintain MNA excretion with urine at the level accepted as a standard niacin allowance for the organism, i.e. 400-500 microgram/hour. Furthermore, the author shows changes in the niacin allowances of students engaged in different kinds of sporting activities, depending also upon the sporting qualification of the examinees, the work performed by them, the periods of training, and the conditions of their everyday life.

  4. Finding All Allowed Edges in a Bipartite Graph

    CERN Document Server

    Tassa, Tamir

    2011-01-01

    We consider the problem of finding all allowed edges in a bipartite graph $G=(V,E)$, i.e., all edges that are included in some maximum matching. We show that given any maximum matching in the graph, it is possible to perform this computation in linear time $O(n+m)$ (where $n=|V|$ and $m=|E|$). Hence, the time complexity of finding all allowed edges reduces to that of finding a single maximum matching, which is $O(n^{1/2}m)$ [Hopcroft and Karp 1973], or $O((n/\\log n)^{1/2}m)$ for dense graphs with $m=\\Theta(n^2)$ [Alt et al. 1991]. This time complexity improves upon that of the best known algorithms for the problem, which is $O(nm)$ ([Costa 1994] for bipartite graphs, and [Carvalho and Cheriyan 2005] for general graphs). Other algorithms for solving that problem are randomized algorithms due to [Rabin and Vazirani 1989] and [Cheriyan 1997], the runtime of which is $\\tilde{O}(n^{2.376})$. Our algorithm, apart from being deterministic, improves upon that time complexity for bipartite graphs when $m=O(n^r)$ and $...
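The underlying characterization can be checked by brute force: an edge (u, v) belongs to some maximum matching iff deleting both of its endpoints reduces the maximum matching size by exactly one. The sketch below uses this O(m·nm) test with Kuhn's augmenting-path algorithm; it is not the linear-time algorithm of the paper, only an illustration of what "allowed" means.

```python
def max_matching_size(adj, n_right, skip_left=None, skip_right=None):
    """Kuhn's augmenting-path maximum matching for a bipartite graph.
    adj[u] lists the right-side neighbours of left vertex u; optionally
    one left and one right vertex are removed from the graph."""
    match_right = [-1] * n_right

    def augment(u, seen):
        for v in adj[u]:
            if v == skip_right or v in seen:
                continue
            seen.add(v)
            if match_right[v] == -1 or augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    return sum(augment(u, set()) for u in range(len(adj)) if u != skip_left)

def allowed_edges(adj, n_right):
    """Edge (u, v) lies in some maximum matching iff deleting both of
    its endpoints lowers the maximum matching size by exactly one."""
    nu = max_matching_size(adj, n_right)
    return {(u, v)
            for u, nbrs in enumerate(adj)
            for v in nbrs
            if max_matching_size(adj, n_right, u, v) == nu - 1}

# Left vertices {0, 1}, right vertices {0, 1}: edge (1, 0) is in no
# maximum matching, while the other two edges are allowed.
assert allowed_edges([[0], [0, 1]], 2) == {(0, 0), (1, 1)}
```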

  5. Quantification of water in hydrous ringwoodite

    Directory of Open Access Journals (Sweden)

    Sylvia-Monique Thomas

    2015-01-01

    Ringwoodite, γ-(Mg,Fe)2SiO4, in the lower 150 km of Earth’s mantle transition zone (410-660 km depth) can incorporate up to 1.5-2 wt% H2O as hydroxyl defects. We present a mineral-specific IR calibration for the absolute water content in hydrous ringwoodite by combining results from Raman spectroscopy, secondary ion mass spectrometry (SIMS) and proton-proton (pp) scattering on a suite of synthetic Mg- and Fe-bearing hydrous ringwoodites. H2O concentrations in the crystals studied here range from 0.46 to 1.7 wt% H2O (absolute methods), with the maximum H2O in the same sample giving 2.5 wt% by SIMS calibration. Anchoring our spectroscopic results to absolute H-atom concentrations from pp-scattering measurements, we report frequency-dependent integrated IR-absorption coefficients for water in ringwoodite ranging from 78180 to 158880 L mol-1 cm-2, depending upon the frequency of the OH absorption. We further report a linear wavenumber IR calibration for H2O quantification in hydrous ringwoodite across the Mg2SiO4-Fe2SiO4 solid solution, which will lead to more accurate estimations of the water content in both laboratory-grown and naturally occurring ringwoodites. Re-evaluation of the IR spectrum for a natural hydrous ringwoodite inclusion in diamond from the study of Pearson et al. (2014) indicates the crystal contains 1.43 ± 0.27 wt% H2O, thus confirming near-maximum amounts of H2O for this sample from the transition zone.

  6. Quantification of nanowire uptake by live cells

    KAUST Repository

    Margineanu, Michael B.

    2015-05-01

    Nanostructures fabricated by different methods have become increasingly important for various applications at the cellular level. In order to understand how these nanostructures “behave” and for studying their internalization kinetics, several attempts have been made at tagging and investigating their interaction with living cells. In this study, magnetic iron nanowires with an iron oxide layer are coated with (3-Aminopropyl)triethoxysilane (APTES), and subsequently labeled with a fluorogenic pH-dependent dye pHrodo™ Red, covalently bound to the aminosilane surface. Time-lapse live imaging of human colon carcinoma HCT 116 cells interacting with the labeled iron nanowires is performed for 24 hours. As the pHrodo™ Red conjugated nanowires are non-fluorescent outside the cells but fluoresce brightly inside, internalized nanowires are distinguished from non-internalized ones and their behavior inside the cells can be tracked for the respective time length. A machine learning-based computational framework dedicated to automatic analysis of live cell imaging data, Cell Cognition, is adapted and used to classify cells with internalized and non-internalized nanowires and subsequently determine the uptake percentage by cells at different time points. An uptake of 85 % by HCT 116 cells is observed after 24 hours incubation at NW-to-cell ratios of 200. While the approach of using pHrodo™ Red for internalization studies is not novel in the literature, this study reports for the first time the utilization of a machine-learning based time-resolved automatic analysis pipeline for quantification of nanowire uptake by cells. This pipeline has also been used for comparison studies with nickel nanowires coated with APTES and labeled with pHrodo™ Red, and another cell line derived from the cervix carcinoma, HeLa. It has thus the potential to be used for studying the interaction of different types of nanostructures with potentially any live cell types.

  7. Fluorometric quantification of natural inorganic polyphosphate.

    Science.gov (United States)

    Diaz, Julia M; Ingall, Ellery D

    2010-06-15

    Polyphosphate, a linear polymer of orthophosphate, is abundant in the environment and a key component in wastewater treatment and many bioremediation processes. Despite the broad relevance of polyphosphate, current methods to quantify it possess significant disadvantages. Here, we describe a new approach for the direct quantification of inorganic polyphosphate in complex natural samples. The protocol relies on the interaction between the fluorochrome 4',6-diamidino-2-phenylindole (DAPI) and dissolved polyphosphate. With the DAPI-based approach we describe, polyphosphate can be quantified at concentrations ranging from 0.5 to 3 microM P in a neutral-buffered freshwater matrix with an accuracy of +/-0.03 microM P. The patterns of polyphosphate concentration versus fluorescence yielded by standards exhibit no chain-length dependence across polyphosphates ranging from 15 to 130 phosphorus units in size. Shorter polyphosphate molecules (e.g., polyphosphates of three and five phosphorus units in length) contribute little to no signal in this approach, as these molecules react only slightly or not at all with DAPI in the concentration range tested. The presence of salt suppresses fluorescence from intermediate polyphosphate chain lengths (e.g., 15 phosphorus units) at polyphosphate concentrations ranging from 0.5 to 3 microM P. For longer chain lengths (e.g., 45-130 phosphorus units), this salt interference is not evident at conductivities up to approximately 10 mS/cm. Our results indicate that standard polyphosphates should be stored frozen for no longer than 10-15 days to avoid inconsistent results associated with standard degradation. We have applied the fluorometric protocol to the analysis of five well-characterized natural samples to demonstrate the use of the method.
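Within the linear range reported above, quantification reduces to an ordinary least-squares calibration against polyphosphate standards. The standard concentrations and fluorescence counts below are hypothetical, not the paper's data.

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def concentration(fluorescence, slope, intercept):
    """Invert the calibration line to read a sample concentration."""
    return (fluorescence - intercept) / slope

# Hypothetical standards: microM P versus DAPI fluorescence counts.
std_conc = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
std_fluor = [120.0, 240.0, 360.0, 480.0, 600.0, 720.0]
slope, intercept = fit_line(std_conc, std_fluor)
sample = concentration(420.0, slope, intercept)
assert abs(sample - 1.75) < 1e-9
```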

  8. GPU-accelerated voxelwise hepatic perfusion quantification.

    Science.gov (United States)

    Wang, H; Cao, Y

    2012-09-07

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation while maintaining the same accuracy as the conventional method. Using the compute unified device architecture (CUDA), the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, nonlinear least-squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations of different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those of the CPU using simulated DCE data and experimental DCE MR images from patients. The computation speed is improved by 30 times using an NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626 400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while the two methods yield perfusion parameter differences of less than 10^-6. The method will be useful for generating liver perfusion images in clinical settings.
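The role of the FFT in the convolution step can be illustrated with NumPy rather than CUDA: multiplying zero-padded spectra reproduces the direct linear convolution, which is what makes an in-block FFT worthwhile for the compartmental model. The arterial input and impulse response below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
ca = rng.random(n)                # synthetic input-function samples
h = np.exp(-0.05 * np.arange(n))  # synthetic impulse response

# Direct (quadratic-time) linear convolution, truncated to n samples:
direct = np.convolve(ca, h)[:n]

# FFT-based convolution: zero-pad to 2n to avoid circular wrap-around.
m = 2 * n
fft_conv = np.fft.irfft(np.fft.rfft(ca, m) * np.fft.rfft(h, m), m)[:n]

assert np.allclose(direct, fft_conv)
```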

  9. Questions for Surveys

    Science.gov (United States)

    Schaeffer, Nora Cate; Dykema, Jennifer

    2011-01-01

    We begin with a look back at the field to identify themes of recent research that we expect to continue to occupy researchers in the future. As part of this overview, we characterize the themes and topics examined in research about measurement and survey questions published in Public Opinion Quarterly in the past decade. We then characterize the field more broadly by highlighting topics that we expect to continue or to grow in importance, including the relationship between survey questions and the total survey error perspective, cognitive versus interactional approaches, interviewing practices, mode and technology, visual aspects of question design, and culture. Considering avenues for future research, we advocate for a decision-oriented framework for thinking about survey questions and their characteristics. The approach we propose distinguishes among various aspects of question characteristics, including question topic, question type and response dimension, conceptualization and operationalization of the target object, question structure, question form, response categories, question implementation, and question wording. Thinking about question characteristics more systematically would allow study designs to take into account relationships among these characteristics and identify gaps in current knowledge. PMID:24970951

  10. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    Science.gov (United States)

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling and (4) matrix assisted laser desorption ionization time of flight mass spectrometry, MALDI analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with paralleled losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just a few minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase. Copyright (c) 2010 Elsevier B.V. All rights reserved.
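The comparison of direct and inverse labeling results can be sketched as a ratio check. The selection rule used here (direct and inverse ratios should be mutually reciprocal within a tolerance) is a simplifying assumption standing in for the DPD software's actual criteria, and the peptide names and ratios are invented.

```python
def consistent_peptides(direct, inverse, tolerance=0.1):
    """Keep peptides whose direct and inverse 18O-labeling ratios are
    reciprocal within a tolerance (direct * inverse close to 1),
    suggesting paralleled losses across both experiment sets."""
    keep = []
    for peptide, d_ratio in direct.items():
        if peptide in inverse:
            if abs(d_ratio * inverse[peptide] - 1.0) <= tolerance:
                keep.append(peptide)
    return keep

direct_ratios = {"PEP1": 2.00, "PEP2": 1.50, "PEP3": 0.80}
inverse_ratios = {"PEP1": 0.51, "PEP2": 1.40, "PEP3": 1.22}
assert consistent_peptides(direct_ratios, inverse_ratios) == ["PEP1", "PEP3"]
```

PEP2's direct and inverse ratios disagree (1.50 × 1.40 is far from 1), so it would be excluded from use as an internal standard.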

  11. In vivo absolute quantification for mouse muscle metabolites using an inductively coupled synthetic signal injection method and newly developed 1H/31P dual tuned probe.

    Science.gov (United States)

    Lee, Donghoon; Marro, Kenneth; Mathis, Mark; Shankland, Eric; Hayes, Cecil

    2014-04-01

    To obtain robust estimates of 31P metabolite content in mouse skeletal muscles using our recently developed MR absolute quantification method and a custom-built 1H/31P dual tuned radiofrequency (RF) coil optimized for the mouse leg. We designed and fabricated a probe consisting of two dual tuned 1H/31P solenoid coils: one leg was inserted into each solenoid. The mouse leg volume coil was incorporated with injector coils for MR absolute quantification. The absolute quantification method uses a synthetic reference signal injection approach and solves several challenges in MR absolute quantification, including changes of coil loading and receiver gains. The 1H/31P dual tuned probe was composed of two separate solenoid coils, one for each leg, to increase coil filling factors and signal-to-noise ratio. Each solenoid was equipped with a second coil to allow injection of reference signals. 31P metabolite concentrations determined for normal mice were well within the expected range reported in the literature. We developed an RF probe and an absolute quantification approach adapted for mouse skeletal muscle. Copyright © 2014 Wiley Periodicals, Inc.

  12. In vivo absolute quantification for mouse muscle metabolites using an inductively coupled synthetic signal injection method and newly developed 1H/31P dual tuned probe

    Science.gov (United States)

    Lee, Donghoon; Marro, Kenneth; Mathis, Mark; Shankland, Eric; Hayes, Cecil

    2013-01-01

    Purpose To obtain robust estimates of 31P metabolite content in mouse skeletal muscles using our recently developed MR absolute quantification method and a custom-built 1H/31P dual tuned radiofrequency (RF) coil optimized for mouse leg. Materials and Methods We designed and fabricated a probe consisting of two dual tuned 1H/31P solenoid coils: one leg was inserted to each solenoid. The mouse leg volume coil was incorporated with injector coils for MR absolute quantification. The absolute quantification method uses a synthetic reference signal injection approach and solves several challenges in MR absolute quantification including changes of coil loading and receiver gains. Results The 1H/31P dual tuned probe was composed of two separate solenoid coils, one for each leg, to increase coil filling factors and signal-to-noise ratio. Each solenoid was equipped with a second coil to allow injection of reference signals. 31P metabolite concentrations determined for normal mice were well within the expected range reported in the literature. Conclusion We developed an RF probe and an absolute quantification approach adapted for mouse skeletal muscle. PMID:24464912
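The principle behind synthetic reference signal injection reduces to a ratio: the injected reference experiences the same coil loading and receiver gain as the metabolite signal, so both factors cancel. The function and numbers below are a hypothetical sketch, not the authors' calibration procedure.

```python
def absolute_concentration(metabolite_signal, reference_signal,
                           reference_equivalent_conc):
    """Scale the metabolite signal by an injected synthetic reference of
    known equivalent concentration; coil loading and receiver gain are
    common to both signals and cancel in the ratio."""
    return reference_equivalent_conc * metabolite_signal / reference_signal

# Hypothetical: a reference equivalent to 10 mM appears at twice the
# amplitude of the metabolite peak, implying 5 mM of metabolite.
conc = absolute_concentration(metabolite_signal=2.0,
                              reference_signal=4.0,
                              reference_equivalent_conc=10.0)
assert conc == 5.0
```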

  13. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    Science.gov (United States)

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It should be highlighted that with the newly proposed analytical method the chromatographic analysis takes only eight minutes; the results obtained show the potential of this method, which allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
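The three validation metrics named above can be computed directly; the "true" and predicted olive-oil contents below are hypothetical.

```python
from math import sqrt
from statistics import median

def validation_errors(y_true, y_pred):
    """Return (RMSEV, MAEV, MdAEV) over a validation set."""
    errors = [abs(t - p) for t, p in zip(y_true, y_pred)]
    rmsev = sqrt(sum(e * e for e in errors) / len(errors))
    maev = sum(errors) / len(errors)
    mdaev = median(errors)
    return rmsev, maev, mdaev

# Hypothetical reference vs. predicted olive-oil contents (% w/w):
y_true = [10.0, 25.0, 40.0, 55.0, 70.0]
y_pred = [11.0, 24.0, 42.0, 54.5, 69.0]
rmsev, maev, mdaev = validation_errors(y_true, y_pred)
assert maev <= rmsev  # RMSE penalises large errors more heavily
assert mdaev == 1.0
```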

  14. Nuclear and cytoplasmic mRNA quantification by SYBR green based real-time RT-PCR.

    Science.gov (United States)

    Wang, Yaming; Zhu, Wei; Levy, David E

    2006-08-01

    Measurement of the steady-state abundance of nuclear and cytoplasmic RNA requires efficient subcellular fractionation and RNA recovery coupled with accurate quantification of individual RNA species. Detergent lysis of tissue culture cells provides a simple fractionation procedure that can be optimized to individual cell lines. The large dynamic range, extreme sensitivity, high sequence specificity and fast turnaround time have allowed real-time reverse transcription polymerase chain reaction (real-time RT-PCR) to become a standard tool for mRNA quantification. Among the different chemistries used for PCR product detection during amplification, DNA-binding dyes such as SYBR Green I are simple and versatile, yet highly reliable and the least expensive. With attention to primer design and cycling conditions, virtually any mRNA species can be accurately quantified from even minute quantities of starting RNA. This method provides an accurate and efficient procedure for estimating the relative ratios of nuclear and cytoplasmic RNA concentrations.
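
A common way to turn SYBR Green threshold cycles (Ct) into a nuclear/cytoplasmic ratio is the efficiency-corrected ΔCt calculation. The sketch below shows that standard calculation, not the authors' exact protocol (numbers are illustrative):

```python
def relative_abundance(ct_target, ct_reference, efficiency=2.0):
    """Relative amount of target vs. reference from threshold cycles,
    assuming the given per-cycle amplification efficiency
    (2.0 = perfect doubling each cycle)."""
    return efficiency ** (ct_reference - ct_target)

# The same transcript measured in the nuclear (Ct 24.0) and cytoplasmic
# (Ct 21.0) fractions: three extra cycles means ~8-fold less in the nucleus.
nuc_over_cyt = relative_abundance(24.0, 21.0)
```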

  15. Methodological considerations in quantification of oncological FDG PET studies

    Energy Technology Data Exchange (ETDEWEB)

    Vriens, Dennis; Visser, Eric P.; Geus-Oei, Lioe-Fee de; Oyen, Wim J.G. [Radboud University Nijmegen Medical Centre, Department of Nuclear Medicine, Nijmegen (Netherlands)

    2010-07-15

    This review aims to provide insight into the factors that influence quantification of glucose metabolism by FDG PET images in oncology as well as their influence on repeated measures studies (i.e. treatment response assessment), offering improved understanding both for clinical practice and research. Structured PubMed searches have been performed for the many factors affecting quantification of glucose metabolism by FDG PET. Review articles and reference lists have been used to supplement the search findings. Biological factors such as fasting blood glucose level, FDG uptake period, FDG distribution and clearance, patient motion (breathing) and patient discomfort (stress) all influence quantification. Acquisition parameters should be adjusted to maximize the signal-to-noise ratio without exposing the patient to a higher than strictly necessary radiation dose. This is especially challenging in pharmacokinetic analysis, where the temporal resolution is of significant importance. The literature is reviewed on the influence of attenuation correction on parameters for glucose metabolism, and the effect of motion, metal artefacts and contrast agents on quantification of CT attenuation-corrected images. Reconstruction settings (analytical versus iterative reconstruction, post-reconstruction filtering and image matrix size) all potentially influence quantification due to artefacts, noise levels and lesion size dependency. Many region of interest definitions are available, but increased complexity does not necessarily result in improved performance. Different methods for the quantification of the tissue of interest can introduce systematic and random inaccuracy. This review provides an up-to-date overview of the many factors that influence quantification of glucose metabolism by FDG PET. (orig.)
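
The simplest quantitative index affected by many of these factors is the body-weight-normalised standardised uptake value (SUV). A minimal sketch of that standard definition (example numbers are illustrative, not from the review):

```python
def suv_bw(activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalised SUV: tissue activity concentration divided
    by injected dose per kilogram. With activity in kBq/mL and dose in
    MBq/kg the units cancel, assuming 1 mL of tissue weighs 1 g."""
    return activity_kbq_per_ml / (injected_dose_mbq / body_weight_kg)

# A lesion at 5 kBq/mL after injecting 350 MBq into a 70 kg patient:
suv = suv_bw(5.0, 350.0, 70.0)
```

Both the tissue activity and the injected dose must be decay-corrected to a common time point, one of the pitfalls such reviews discuss.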

  16. Evaluation of the TLC quantification method and occurrence of deoxynivalenol in wheat flour of southern Brazil.

    Science.gov (United States)

    Rocha, Denise Felippin de Lima; Oliveira, Melissa Dos Santos; Furlong, Eliana Badiale; Junges, Alexander; Paroul, Natalia; Valduga, Eunice; Backes, Geciane Toniazzo; Zeni, Jamile; Cansian, Rogério Luis

    2017-12-01

    The study evaluated a QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) extraction method for use with a TLC quantification procedure for deoxynivalenol (DON). It also surveyed DON occurrence in wheat flour from the southern region of Brazil. Forty-eight wheat flour samples were analysed, divided into 2 different harvest lots, each consisting of 24 different brands. The detection and quantification limits of the method were 30 and 100 ng of DON on the TLC plate, respectively. The various concentrations of DON presented high linearity (R² = 0.99). A negative matrix effect (-28%) of the wheat flour was verified, with suppression of the chromatographic signal of DON, and 80.2-105.4% recovery. The TLC method was reliable for DON evaluation, with a coefficient of variation of less than 10%. High-performance liquid chromatography of lot 2 samples confirmed the presence of DON in all samples identified as DON-positive by the TLC technique. Of the 48 wheat flour samples in lots 1 and 2 analysed by TLC, 33.3 and 45.8% of the samples respectively were above the Brazilian legislation limit. Correlations were observed between the water activity and DON content, and between the fungal count and moisture content of the wheat flours.
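
The recovery and coefficient-of-variation figures quoted above are computed as follows; a minimal sketch (the example values are illustrative, not the study's raw data):

```python
from statistics import mean, stdev

def recovery_percent(measured, spiked):
    """Spike recovery: measured amount as a percentage of the amount added."""
    return 100.0 * measured / spiked

def cv_percent(replicates):
    """Coefficient of variation of replicate measurements, in percent."""
    return 100.0 * stdev(replicates) / mean(replicates)

# A sample spiked with 1000 ng/g DON from which 802 ng/g is recovered,
# and three replicate determinations of another sample (ng/g):
rec = recovery_percent(802.0, 1000.0)
cv = cv_percent([480.0, 500.0, 520.0])
```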

  17. What Are Probability Surveys?

    Science.gov (United States)

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  18. Group-sequential analysis may allow for early trial termination

    DEFF Research Database (Denmark)

    Gerke, Oke; Vilstrup, Mie H; Halekoh, Ulrich

    2017-01-01

    -PET/CT measurements, illuminating the possibility of early trial termination which implicates significant potential time and resource savings. METHODS: Primary lesion maximum standardised uptake value (SUVmax) was determined twice from preoperative FDG-PET/CTs in 45 ovarian cancer patients. Differences in SUVmax were...... and the final analysis. Other partitions did not suggest early stopping after adjustment for multiple testing due to one influential outlier and our small sample size. CONCLUSIONS: Group-sequential testing may enable early stopping of a trial, allowing for potential time and resource savings. The testing...... strategy must, though, be defined at the planning stage, and sample sizes must be reasonably large at interim analysis to ensure robustness against single outliers. Group-sequential testing may have a place in accuracy and agreement studies....
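
Group-sequential designs control the overall type-I error by spending only part of it at each interim look. A minimal sketch of the widely used Lan-DeMets O'Brien-Fleming-type spending function (a standard choice, not necessarily the exact boundary used in this study):

```python
from statistics import NormalDist

def of_spent_alpha(t, alpha=0.05):
    """Cumulative two-sided type-I error spent by information fraction t
    (0 < t <= 1) under the Lan-DeMets O'Brien-Fleming-type spending
    function: alpha(t) = 2 * (1 - Phi(z_{alpha/2} / sqrt(t)))."""
    nd = NormalDist()
    z = nd.inv_cdf(1.0 - alpha / 2.0)
    return 2.0 * (1.0 - nd.cdf(z / t ** 0.5))

# Very little alpha is spent at a halfway interim analysis, so an early
# stop requires strong evidence; the full 0.05 is available at the end.
early = of_spent_alpha(0.5)
final = of_spent_alpha(1.0)
```

This conservatism at interim looks is consistent with the abstract's point that early stopping decisions are fragile when interim sample sizes are small.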

  19. Short peptides allowing preferential detection of Candida albicans hyphae.

    Science.gov (United States)

    Kaba, Hani E J; Pölderl, Antonia; Bilitewski, Ursula

    2015-09-01

    Whereas the detection of pathogens via recognition of surface structures by specific antibodies and various types of antibody mimics is frequently described, the applicability of short linear peptides as sensor molecules or diagnostic tools is less well-known. We selected peptides which were previously reported to bind to recombinant S. cerevisiae cells, expressing members of the C. albicans Agglutinin-Like-Sequence (ALS) cell wall protein family. We slightly modified amino acid sequences to evaluate peptide sequence properties influencing binding to C. albicans cells. Among the selected peptides, decamer peptides with an "AP"-N-terminus were superior to shorter peptides. The new decamer peptide FBP4 stained viable C. albicans cells more efficiently in their mature hyphal form than in their yeast form. Moreover, it allowed distinction of C. albicans from other related Candida spp. and could thus be the basis for the development of a useful tool for the diagnosis of invasive candidiasis.

  20. Super-allowed Fermi beta-decay revisited

    CERN Document Server

    Wilkinson, D H

    2002-01-01

    Analysis of Jπ = 0+ → 0+ super-allowed Fermi transitions is limited, with respect to the precision of its outcome in terms of the Fermi coupling constant, neither by the accuracy of the experimental input data nor by the confidence with which the radiative corrections can be applied, but rather by knowledge of the nuclear mismatch: the subversion of isospin symmetry along the multiplets. Theoretical estimates of this mismatch differ considerably; their direct nuclide-by-nuclide application results in an apparent clear violation of the hypothesis of conservation of the vector current and an evident inconsistency with unitarity of the Cabibbo-Kobayashi-Maskawa matrix. This paper pursues and elaborates the earlier suggestion that, in these unsatisfactory circumstances, the best procedure is to look to the experimental data themselves to determine and eliminate the mismatch by appropriate extrapolation to Z ≈ 0, where the mismatch falls away. This is done: (i) without any prior correction for misma...

  1. Diet quality, overweight and daily monetary allowance of Greek adolescents.

    Science.gov (United States)

    Poulimeneas, Dimitrios; Vlachos, Dimitrios; Maraki, Maria I; Daskalou, Efstratia; Grammatikopoulou, Melpomene; Karathanou, Lenia; Kotsias, Emma; Tsofliou, Fotini; Tsigga, Maria; Grammatikopoulou, Maria G

    2017-07-14

    Objective To investigate cross-correlates of pocket money on diet quality and weight status of Greek adolescents. Methods A total of 172 adolescents (55.2% boys), aged between 10 and 15 years, were recruited. Body weight and height were measured and body mass index (BMI) was computed. Weight status was assessed according to the International Obesity Task Force criteria and diet quality was evaluated via the Healthy Eating Index (HEI) - 2010. Results Adolescents received a mean daily allowance of €4.63 ± 3.66. Among boy participants, BMI correlated with pocket money (r = 0.311, p ≤ 0.002), and normoweight boys received significantly less money than their overweight peers (p ≤ 0.019). In both sexes, normoweight was more prevalent in the lowest monetary quartiles. Pocket money was not associated with HEI. Among boys, moderate HEI was more prevalent in the third quartile of pocket money, significantly higher compared to all others (p ≤ 0.01 for all). For girls, the prevalence of moderate HEI declined with each ascending pocket money quartile (p ≤ 0.05 for all). Conclusion In our sample, adolescents exhibited high rates of pooled overweight including obesity. The majority of the participants followed a diet of moderate quality. Pocket money was associated with BMI only among boys. As pocket money was not associated with diet quality, it is highly possible that adolescents might choose to spend their money on items other than foods. Our study shows that pocket money should be controlled during adolescence and teenagers should be educated on spending their money on healthier food choices.

  2. Charge Density Quantification and Antimicrobial Efficacy

    Science.gov (United States)

    2008-08-01

    consist of a variety of linear and dendritic quaternary ammonium and phosphonium polymeric salts. Many of the polymers are covalently attached to a...
    Keywords: charge density, antimicrobial, XPS, UV-VIS, quaternary ammonium salt.
    Figure 5. XPS N/C ratios of method B controls and quaternary nitrogen samples. Figure 6. Survey spectrum of quaternary ammonium salt.

  3. Game engines: a survey

    Directory of Open Access Journals (Sweden)

    A. Andrade

    2015-11-01

    Due to hardware limitations at the origin of the video game industry, each new game was generally coded from the ground up. Years later, the evolution of hardware and the need for quick game development cycles spawned the concept of the game engine. A game engine is a reusable software layer allowing the separation of common game concepts from the game assets (levels, graphics, etc.). This paper surveys fourteen different game engines relevant today, ranging from industry-level engines to newcomer-friendly ones.

  4. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    Science.gov (United States)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
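
Richardson extrapolation and the GCI follow standard formulas (Roache's formulation). A minimal sketch assuming three systematically refined grids with a constant refinement ratio (the numbers in the example are synthetic, not from the software's test cases):

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from solutions on three grids
    related by a constant refinement ratio r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """Grid Convergence Index on the fine grid: a conservative relative
    error band derived from Richardson extrapolation (safety factor fs)."""
    rel_err = abs((f_medium - f_fine) / f_fine)
    return fs * rel_err / (r ** p - 1.0)

# Synthetic second-order behaviour (error ~ h^2) with r = 2:
p = observed_order(1.16, 1.04, 1.01, 2)
gci = gci_fine(1.04, 1.01, 2, p)
```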

  5. MR Spectroscopy in Prostate Cancer: New Algorithms to Optimize Metabolite Quantification.

    Directory of Open Access Journals (Sweden)

    Giovanni Bellomo

    Prostate cancer (PCa) is the most common non-cutaneous cancer in male subjects and the second leading cause of cancer-related death in developed countries. The need for a non-invasive technique for the diagnosis of PCa at an early stage has grown through the years. Proton magnetic resonance spectroscopy (1H-MRS) and proton magnetic resonance spectroscopy imaging (1H-MRSI) are advanced magnetic resonance techniques that can mark the presence of metabolites such as citrate, choline, creatine and polyamines in a selected voxel, or in an array of voxels (in MRSI), inside prostatic tissue. Abundance or lack of these metabolites can discriminate between pathological and healthy tissue. Although the use of magnetic resonance spectroscopy (MRS) is well established in brain and liver, with dedicated software for spectral analysis, quantification of metabolites in the prostate can be very difficult to achieve, due to the poor signal-to-noise ratio and strong J-coupling of citrate. The aim of this work is to develop a software prototype for automatic quantification of citrate, choline and creatine in the prostate. Its core is an original fitting routine that makes use of a fixed-step gradient descent minimization algorithm (FSGD) and MRS simulations developed with the GAMMA libraries in C++. The accurate simulation of the citrate spin systems allows prediction of the correct J-modulation under different NMR sequences and different coupling parameters. The accuracy of the quantifications was tested on measurements performed on a Philips Ingenia 3T scanner using homemade phantoms. Some acquisitions in healthy volunteers were also carried out to test the software performance in vivo.
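
The numerical core named above is fixed-step gradient descent (FSGD). A minimal, generic sketch of that minimizer on a toy quadratic (not the authors' spectral-fitting code):

```python
def fsgd(grad, x0, step=0.1, iters=200):
    """Fixed-step gradient descent: x <- x - step * grad(x), repeated
    for a fixed number of iterations."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Toy problem: minimise f(a, b) = (a - 3)^2 + (b + 1)^2,
# whose gradient is (2(a - 3), 2(b + 1)); minimum at (3, -1).
grad = lambda x: [2.0 * (x[0] - 3.0), 2.0 * (x[1] + 1.0)]
xmin = fsgd(grad, [0.0, 0.0])
```

In the actual application the objective would be the misfit between a measured spectrum and the GAMMA-simulated model, with the step size chosen for that problem's scaling.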

  6. Simultaneous quantification of multiple food- and waterborne pathogens by use of microfluidic quantitative PCR.

    Science.gov (United States)

    Ishii, Satoshi; Segawa, Takahiro; Okabe, Satoshi

    2013-05-01

    The direct quantification of multiple pathogens has been desired for diagnostic and public health purposes for a long time. In this study, we applied microfluidic quantitative PCR (qPCR) technology to the simultaneous detection and quantification of multiple food- and waterborne pathogens. In this system, multiple singleplex qPCR assays were run under identical detection conditions in nanoliter-volume chambers that are present in high densities on a chip. First, we developed 18 TaqMan qPCR assays that could be run in the same PCR conditions by using prevalidated TaqMan probes. Specific and sensitive quantification was achieved by using these qPCR assays. With the addition of two previously validated TaqMan qPCR assays, we used 20 qPCR assays targeting 10 enteric pathogens, a fecal indicator bacterium (general Escherichia coli), and a process control strain in the microfluidic qPCR system. We preamplified the template DNA to increase the sensitivity of the qPCR assays. Our results suggested that preamplification was effective for quantifying small amounts of the template DNA without any major impact on the sensitivity, efficiency, and quantitative performance of qPCR. This microfluidic qPCR system allowed us to detect and quantify multiple pathogens from fecal samples and environmental water samples spiked with pathogens at levels as low as 100 cells/liter. These results suggest that the routine monitoring of multiple pathogens in food and water samples is now technically feasible. This method may provide more reliable information for risk assessment than the current fecal contamination indicator approach.
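
Each singleplex TaqMan assay is quantified against a standard curve. The sketch below shows the standard log-linear calibration; the slope and intercept are illustrative (a perfect-doubling slope), not values from the study:

```python
import math

def copies_from_ct(ct, slope, intercept):
    """Starting copy number from a qPCR standard curve
    Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def efficiency_from_slope(slope):
    """Per-cycle amplification efficiency implied by the slope
    (1.0 means perfect doubling; slope ~ -3.32 is the textbook value)."""
    return 10 ** (-1.0 / slope) - 1.0

# Illustrative curve with perfect doubling (slope = -1/log10(2)):
slope = -1.0 / math.log10(2.0)
copies = copies_from_ct(28.0, slope, 38.0)  # Ct 10 cycles below intercept
```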

  7. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    Science.gov (United States)

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible quantitative histological analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis we developed automated software for histological image analysis, performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that tissue density indexes gave access to a very accurate and reliable quantification of morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was produced from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation was observed between the tissue density indexes and the standard evaluation methods, supporting automated image analysis as an accurate quantification of fibrosis in mice, which will be very valuable for future preclinical drug explorations.

  8. Methods for the physical characterization and quantification of extracellular vesicles in biological samples.

    Science.gov (United States)

    Rupert, Déborah L M; Claudio, Virginia; Lässer, Cecilia; Bally, Marta

    2017-01-01

    Our body fluids contain a multitude of cell-derived vesicles, secreted by most cell types, commonly referred to as extracellular vesicles. They have attracted considerable attention for their function as intercellular communication vehicles in a broad range of physiological processes and pathological conditions. Extracellular vesicles and especially the smallest type, exosomes, have also generated a lot of excitement in view of their potential as disease biomarkers or as carriers for drug delivery. In this context, state-of-the-art techniques capable of comprehensively characterizing vesicles in biological fluids are urgently needed. This review presents the arsenal of techniques available for quantification and characterization of physical properties of extracellular vesicles, summarizes their working principles, discusses their advantages and limitations and further illustrates their implementation in extracellular vesicle research. The small size and physicochemical heterogeneity of extracellular vesicles make their physical characterization and quantification an extremely challenging task. Currently, structure, size, buoyant density, optical properties and zeta potential have most commonly been studied. The concentration of vesicles in suspension can be expressed in terms of biomolecular or particle content depending on the method at hand. In addition, common quantification methods may either provide a direct quantitative measurement of vesicle concentration or solely allow for relative comparison between samples. The combination of complementary methods capable of detecting, characterizing and quantifying extracellular vesicles at a single particle level promises to provide new exciting insights into their modes of action and to reveal the existence of vesicle subpopulations fulfilling key biological tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Quantification of receptor-mediated endocytosis.

    Science.gov (United States)

    Wettey, Frank R; Jackson, Antony P

    2006-01-01

    This method allows measurement of the receptor-mediated internalization of 125I-labeled conalbumin, the chicken egg-white isoform of transferrin. Kinetic data, i.e. the rate constant k(i) for the initial internalization process, can be extracted from the data by linear curve fitting using an In/Sur plot (intracellular label/cell-surface label over time).
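
The rate constant k(i) is the slope of the In/Sur plot over the early time points. A minimal sketch of that linear fit by ordinary least squares (the time points and ratios are illustrative):

```python
def internalization_rate(times, in_over_sur):
    """Initial internalization rate constant k_i as the slope of the
    In/Sur plot (intracellular / surface label vs. time), fitted by
    ordinary least squares."""
    n = len(times)
    mt = sum(times) / n
    my = sum(in_over_sur) / n
    sxy = sum((t - mt) * (y - my) for t, y in zip(times, in_over_sur))
    sxx = sum((t - mt) ** 2 for t in times)
    return sxy / sxx

# Illustrative early time points (min) and In/Sur ratios:
k_i = internalization_rate([0, 2, 4, 6], [0.0, 0.3, 0.6, 0.9])  # per minute
```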

  10. Spectrophotometric method for quantification of soil microbial ...

    African Journals Online (AJOL)

    This comparison was performed by two soil sample tests: (i) areas of grain crops with conventional management versus notill farming; and (ii) areas with distinct ... It was found that molecular absorption spectrophotometry was an efficient tool for the determination of soil microbial biomass carbon, allowing replacement of the ...

  11. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations.

    Science.gov (United States)

    Higgs, Richard E; Butler, Jon P; Han, Bomie; Knierman, Michael D

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference.
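
Post-processing peptide-level MS1 quantities into a protein-level estimate can be as simple as a robust summary of peptide fold changes. A minimal sketch of one common choice, the median log-ratio (illustrative only; the paper evaluates several such roll-up methods):

```python
import math
from statistics import median

def protein_log2_ratio(peptide_ratios):
    """Protein-level fold change (log2) summarised from peptide-level
    ratios; the median resists the occasional interfered peptide."""
    return median(math.log2(r) for r in peptide_ratios)

# Three peptides from one protein, each roughly 2-fold up:
log2fc = protein_log2_ratio([2.0, 2.1, 1.9])
```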

  12. Technology Credit Scoring Based on a Quantification Method

    Directory of Open Access Journals (Sweden)

    Yonghan Ju

    2017-06-01

    Full Text Available Credit scoring models are usually formulated by fitting the probability of loan default as a function of individual evaluation attributes. Typically, these attributes are measured using a Likert-type scale, but are treated as interval scale explanatory variables to predict loan defaults. Existing models also do not distinguish between types of default, although they vary: default by an insolvent company and default by an insolvent debtor. This practice can bias the results. In this paper, we applied Quantification Method II, a categorical version of canonical correlation analysis, to determine the relationship between two sets of categorical variables: a set of default types and a set of evaluation attributes. We distinguished between two types of loan default patterns based on quantification scores. In the first set of quantification scores, we found knowledge management, new technology development, and venture registration as important predictors of default from non-default status. Based on the second quantification score, we found that the technology and profitability factors influence loan defaults due to an insolvent company. Finally, we proposed a credit-risk rating model based on the quantification score.

  13. GMO quantification: valuable experience and insights for the future.

    Science.gov (United States)

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing for quantitative purposes as well, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
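
Digital PCR, mentioned above as a newer statistical strategy, quantifies without a standard curve by counting negative partitions and applying a Poisson correction. A minimal sketch of that calculation (partition counts and volume are illustrative):

```python
import math

def copies_per_partition(n_negative, n_total):
    """Mean target copies per partition from the fraction of negative
    partitions: lambda = -ln(N_neg / N_total)."""
    return -math.log(n_negative / n_total)

def copies_per_microliter(n_negative, n_total, partition_volume_ul):
    """Target concentration in the reaction, in copies per microliter."""
    return copies_per_partition(n_negative, n_total) / partition_volume_ul

# 5000 of 20000 partitions negative, 0.85 nL (0.00085 uL) partitions:
lam = copies_per_partition(5000, 20000)
conc = copies_per_microliter(5000, 20000, 0.00085)
```

The Poisson correction accounts for partitions that received more than one copy, which is why simply counting positive partitions would underestimate the concentration.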

  14. A survey of mangiferin and hydroxycinnamic acid ester accumulation in coffee (Coffea) leaves: biological implications and uses.

    Science.gov (United States)

    Campa, Claudine; Mondolot, Laurence; Rakotondravao, Arsene; Bidel, Luc P R; Gargadennec, Annick; Couturon, Emmanuel; La Fisca, Philippe; Rakotomalala, Jean-Jacques; Jay-Allemand, Christian; Davis, Aaron P

    2012-08-01

    The phenolic composition of Coffea leaves has barely been studied, and therefore this study conducts the first detailed survey, focusing on mangiferin and hydroxycinnamic acid esters (HCEs). Using HPLC, including a new technique allowing quantification of feruloylquinic acid together with mangiferin, and histochemical methods, mangiferin content and tissue localization were compared in leaves and fruits of C. pseudozanguebariae, C. arabica and C. canephora. The HCE and mangiferin content of leaves was evaluated for 23 species native to Africa or Madagascar. Using various statistical methods, data were assessed in relation to distribution, ecology, phylogeny and use. Seven of the 23 species accumulated mangiferin in their leaves. Mangiferin leaf-accumulating species also contain mangiferin in the fruits, but only in the outer (sporophytic) parts. In both leaves and fruit, mangiferin accumulation decreases with ageing. A relationship between mangiferin accumulation and UV levels is posited, owing to localization with photosynthetic tissues, and systematic distribution in high altitude clades and species with high altitude representatives. Analyses of mangiferin and HCE content showed that there are significant differences between species, and that samples can be grouped into species, with few exceptions. These data also provide independent support for various Coffea lineages, as proposed by molecular phylogenetic analyses. Sampling of the hybrids C. arabica and C. heterocalyx cf. indicates that mangiferin and HCE accumulation may be under independent parental influence. This survey of the phenolic composition in Coffea leaves shows that mangiferin and HCE accumulation corresponds to lineage recognition and species delimitation, respectively. Knowledge of the spectrum of phenolic accumulation within species and populations could be of considerable significance for adaptation to specific environments. 

  15. Quantification of carbon nanomaterials in vivo.

    Science.gov (United States)

    Wang, Haifang; Yang, Sheng-Tao; Cao, Aoneng; Liu, Yuanfang

    2013-03-19

    In this Account, we review the in vivo quantification methods of carbon NMs, focusing on isotopic labeling and tracing methods, and summarize the related labeling, purification, bio-sampling and detection of carbon NMs. We also address the advantages, applicable situations and limits of various labeling and tracing methods and propose guidelines for choosing suitable labeling methods. A collective analysis of the ADME information on various carbon NMs in vivo would provide general principles for understanding the fate of carbon NMs, the effects of chemical functionalization and aggregation of carbon NMs on their ADME/T in vivo, and their implications for nanotoxicology and biosafety evaluations.

  16. Radical reconciliation: The TRC should have allowed Zacchaeus to testify?

    Directory of Open Access Journals (Sweden)

    Tshepo Lephakga

    2016-02-01

    This article seeks to point out that the inclusion of a theological term, 'reconciliation' (at the request of F.W. de Klerk on behalf of the National Party), in what was supposed to be the 'Truth Commission' (Boesak & DeYoung 2012; Stevens, Franchi & Swart 2006) was for the purpose of taming the work of this commission and using reconciliation merely to reach some political accommodation which did not address the critical questions of justice, equality and dignity that are prominent in the biblical understanding of reconciliation (Boesak 2008; Boesak & DeYoung 2012:1; Lephakga 2015; Terreblanche 2002). However, it is important to point out that the problem was not the theological word 'reconciliation', but the understanding and interpretation of it in South Africa. This is because previously in South Africa the Bible was made a servant to ideology (Lephakga 2012, 2013; Moodie 1975; Serfontein 1982) and thus domesticated for the purposes of subjection and control (Boesak & DeYoung 2012). As such, this article contends that the call for the inclusion of 'reconciliation' within the 'truth commission' was not to allow reconciliation to confront the country with the demands of the gospel but to blunt the process of radical change (Boesak & DeYoung 2012). Therefore, this article will point out that the shortcomings of the South African Truth and Reconciliation Commission (TRC) need to be understood against the following events, which occurred between 1989 and 1995: (1) the fall of the Soviet Union (Cronin 1994:2-6); (2) the National Party's (NP) and South African business sector's interest in negotiations with the African National Congress (ANC) (Cronin 1994:2-6; Mkhondo 1993:3-43; Terreblanche 2002:51-124); (3) the elite compromise (Terreblanche 2002:51-124); and the sudden passing of the Promotion of National Unity and Reconciliation Act, No. 34 of 1995 (TRC, Vol. 1998). This paper will use

  17. Total Survey Error for Longitudinal Surveys

    NARCIS (Netherlands)

    Lynn, Peter; Lugtig, P.J.

    2016-01-01

    This article describes the application of the total survey error paradigm to longitudinal surveys. Several aspects of survey error, and of the interactions between different types of error, are distinct in the longitudinal survey context. Furthermore, error trade-off decisions in survey design and

  18. Renewable Electricity Benefits Quantification Methodology: A Request for Technical Assistance from the California Public Utilities Commission

    Energy Technology Data Exchange (ETDEWEB)

    Mosey, G.; Vimmerstedt, L.

    2009-07-01

    The California Public Utilities Commission (CPUC) requested assistance in identifying methodological alternatives for quantifying the benefits of renewable electricity. The context is the CPUC's analysis of a 33% renewable portfolio standard (RPS) in California--one element of California's Climate Change Scoping Plan. The information would be used to support development of an analytic plan to augment the cost analysis of this RPS (which recently was completed). NREL has responded to this request by developing a high-level survey of renewable electricity effects, quantification alternatives, and considerations for selection of analytic methods. This report addresses economic effects and health and environmental effects, and provides an overview of related analytic tools. Economic effects include jobs, earnings, gross state product, and electricity rate and fuel price hedging. Health and environmental effects include air quality and related public-health effects, solid and hazardous wastes, and effects on water resources.

  19. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: validation of the quantitative multimolecular approach by radiocarbon analysis.

    Science.gov (United States)

    Jeanneau, Laurent; Faure, Pierre

    2010-09-01

    The quantitative multimolecular approach (QMA), based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) the comparison between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence of the Fensch and Moselle Rivers, in a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
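
    The PMC reported above can be converted into modern/fossil carbon fractions with a two-end-member mixing model. The sketch below is illustrative only: it assumes a modern end-member of ~100 PMC and (14)C-free fossil organic matter (0 PMC), not the authors' calibration.

    ```python
    def fossil_fraction(pmc_sample, pmc_modern=100.0):
        """Two-end-member mixing: fossil organic matter is 14C-free (0 PMC),
        modern biomass carries pmc_modern. Returns the fossil carbon fraction."""
        if not 0.0 <= pmc_sample <= pmc_modern:
            raise ValueError("PMC outside the mixing range")
        return 1.0 - pmc_sample / pmc_modern

    # A sediment measuring 40 PMC would contain ~60% fossil carbon.
    ```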

  20. Eating and drinking in labor: should it be allowed?

    Science.gov (United States)

    Maharaj, Dushyant

    2009-09-01

    Eating and drinking in labor is a controversial subject, with practice varying widely among practitioners, within facilities, and around the world. The risk of aspiration pneumonitis and anesthesia-related deaths at cesarean section has resulted in adherence to the historical practice of starving women in labor. Studies have shown that the risk of this anesthetic-related complication is low. Practice is governed by birth attendants' fear of bearing full responsibility should a patient inhale gastric contents after they have given in to demands for liberal fluid and food regimes during labor. While the bulk of evidence supports fluid intake in labor, there are insufficient published studies to draw conclusions about the relationship between fasting times and the risk of pulmonary aspiration during labor. Whether allowing food and fluid throughout labor is beneficial or harmful can only be determined by further research. A computerized search was done of MEDLINE, PubMed, SCOPUS and CINAHL, as well as of historical articles, texts, articles from indexed journals, and references cited in published works.

  1. Genomic HEXploring allows landscaping of novel potential splicing regulatory elements.

    Science.gov (United States)

    Erkelenz, Steffen; Theiss, Stephan; Otte, Marianne; Widera, Marek; Peter, Jan Otto; Schaal, Heiner

    2014-01-01

    Effective splice site selection is critically controlled by flanking splicing regulatory elements (SREs) that can enhance or repress splice site use. Although several computational algorithms currently identify a multitude of potential SRE motifs, their predictive power with respect to mutation effects is limited. Following a RESCUE-type approach, we defined a hexamer-based 'HEXplorer score' as average Z-score of all six hexamers overlapping with a given nucleotide in an arbitrary genomic sequence. Plotted along genomic regions, HEXplorer score profiles varied slowly in the vicinity of splice sites. They reflected the respective splice enhancing and silencing properties of splice site neighborhoods beyond the identification of single dedicated SRE motifs. In particular, HEXplorer score differences between mutant and reference sequences faithfully represented exonic mutation effects on splice site usage. Using the HIV-1 pre-mRNA as a model system highly dependent on SREs, we found an excellent correlation in 29 mutations between splicing activity and HEXplorer score. We successfully predicted and confirmed five novel SREs and optimized mutations inactivating a known silencer. The HEXplorer score allowed landscaping of splicing regulatory regions, provided a quantitative measure of mutation effects on splice enhancing and silencing properties and permitted calculation of the mutationally most effective nucleotide. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
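
    The HEXplorer score is defined above as the average Z-score of all six hexamers overlapping a given nucleotide. A minimal sketch of that definition (the Z-score table `hex_z` is a hypothetical stand-in for the published RESCUE-type hexamer weights):

    ```python
    def hexplorer_profile(seq, hex_z):
        """Per-nucleotide score: mean Z-score of the (up to six) hexamers
        overlapping each position. hex_z maps hexamer -> Z-score; unknown
        hexamers default to 0."""
        n = len(seq)
        profile = []
        for i in range(n):
            zs = []
            # hexamers starting at positions i-5 .. i cover nucleotide i
            for start in range(max(0, i - 5), min(i, n - 6) + 1):
                zs.append(hex_z.get(seq[start:start + 6], 0.0))
            profile.append(sum(zs) / len(zs) if zs else 0.0)
        return profile
    ```

    Mutation effects would then be read off as the difference between the profiles of mutant and reference sequences.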

  2. Credit financing for deteriorating imperfect quality items with allowable shortages

    Directory of Open Access Journals (Sweden)

    Aditi Khanna

    2016-01-01

    Full Text Available The advent of new technologies, systems and applications in the manufacturing sector has no doubt lightened our workload, yet the chance causes of variation in a production system cannot be eliminated completely. Every produced/ordered lot may have some fraction of defectives, which may vary from process to process. The situation is even more susceptible when the items are deteriorating in nature. However, the defective items can be separated from the good-quality lot through a careful inspection process. Thus, a screening process is obligatory in today's technology-driven industry, whose motto is customer satisfaction. Moreover, in order to survive in current global markets, credit financing has proven to be a very influential promotional tool to attract new customers and a good inducement policy for retailers. Keeping this scenario in mind, the present paper investigates an inventory model for a retailer dealing with imperfect-quality deteriorating items under permissible delay in payments. Shortages are allowed and fully backlogged. The model jointly optimizes the order quantity and shortages by maximizing the expected total profit. A mathematical model is developed to depict this scenario. Results have been validated with the help of a numerical example, and a comprehensive sensitivity analysis is also presented.
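
    The classical starting point for such models is the EOQ with planned backorders. The closed form below is only the textbook baseline, not the paper's model (which additionally handles deterioration, defectives and credit periods); it shows how the order quantity and maximum shortage trade holding cost against backorder cost:

    ```python
    import math

    def eoq_with_backorders(D, K, h, p):
        """Classical EOQ with planned shortages, fully backlogged.
        D: annual demand, K: fixed order cost, h: holding cost per unit-year,
        p: backorder (shortage) cost per unit-year."""
        Q = math.sqrt(2 * D * K / h * (h + p) / p)   # optimal order quantity
        S = Q * h / (h + p)                          # optimal maximum backorder
        return Q, S

    Q, S = eoq_with_backorders(D=1200, K=100, h=5, p=20)
    ```

    As the backorder cost p grows, S shrinks toward zero and Q approaches the no-shortage EOQ sqrt(2DK/h).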

  3. Different dispersal abilities allow reef fish to coexist.

    Science.gov (United States)

    Bode, Michael; Bode, Lance; Armsworth, Paul R

    2011-09-27

    The coexistence of multiple species on a smaller number of limiting resources is an enduring ecological paradox. The mechanisms that maintain such biodiversity are of great interest to ecology and of central importance to conservation. We describe and prove a unique and robust mechanism for coexistence: Species that differ only in their dispersal abilities can coexist, if habitat patches are distributed at irregular distances. This mechanism is straightforward and ecologically intuitive, but can nevertheless create complex coexistence patterns that are robust to substantial environmental stochasticity. The Great Barrier Reef (GBR) is noted for its diversity of reef fish species and its complex arrangement of reef habitat. We demonstrate that this mechanism can allow fish species with different pelagic larval durations to stably coexist in the GBR. Further, coexisting species on the GBR often dominate different subregions, defined primarily by cross-shelf position. Interspecific differences in dispersal ability generate similar coexistence patterns when dispersal is influenced by larval behavior and variable oceanographic conditions. Many marine and terrestrial ecosystems are characterized by patchy habitat distributions and contain coexisting species that have different dispersal abilities. This coexistence mechanism is therefore likely to have ecological relevance beyond reef fish.

  4. Stereological quantification of mast cells in human synovium

    DEFF Research Database (Denmark)

    Damsgaard, T E; Sørensen, Flemming Brandt; Herlin, T

    1999-01-01

    Mast cells participate in both the acute allergic reaction and chronic inflammatory diseases. Earlier studies have revealed divergent results regarding the quantification of mast cells in the human synovium. The aim of the present study was therefore to quantify these cells in the human synovium using stereological techniques. Different methods of staining and quantification have previously been used for mast cell quantification in human synovium. Stereological techniques provide precise and unbiased information on the number of cell profiles in two-dimensional tissue sections of, in this case, human synovium. In 10 patients suffering from osteoarthritis a median of 3.6 mast cells/mm2 synovial membrane was found. The total number of cells (synoviocytes, fibroblasts, lymphocytes, leukocytes) present was 395.9 cells/mm2 (median). The mast cells constituted 0.8% of all the cell profiles.

  5. Superlattice band structure: New and simple energy quantification condition

    Energy Technology Data Exchange (ETDEWEB)

    Maiz, F., E-mail: fethimaiz@gmail.com [University of Cartage, Nabeul Engineering Preparatory Institute, Merazka, 8000 Nabeul (Tunisia); King Khalid University, Faculty of Science, Physics Department, P.O. Box 9004, Abha 61413 (Saudi Arabia)

    2014-10-01

    Assuming an approximated effective mass and using Bastard's boundary conditions, a simple method is used to calculate the subband structure of periodic semiconducting heterostructures. Our method consists of deriving and solving the energy quantification condition (EQC), a simple real equation composed of trigonometric and hyperbolic functions that requires no programming effort or sophisticated machine to solve. For heterostructures of fewer than ten wells, we have derived and simplified the energy quantification conditions. The subband is built point by point; each point represents an energy level. Our simple energy quantification condition is used to calculate the subband structure of GaAs/Ga{sub 0.5}Al{sub 0.5}As heterostructures and to build their subbands point by point for 4 and 20 wells. Our findings show good agreement with previously published results.
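
    An energy quantification condition of this trigonometric/hyperbolic type can be solved by a simple numerical scan. The sketch below uses the textbook Kronig–Penney condition for energies below the barrier as a stand-in for the authors' EQC (which additionally incorporates Bastard's effective-mass boundary conditions); energies are allowed wherever |f(E)| ≤ 1, and the well/barrier parameters are illustrative GaAs-like values.

    ```python
    import math

    HBAR = 1.054571817e-34   # J*s
    ME   = 9.1093837015e-31  # kg
    EV   = 1.602176634e-19   # J

    def kp_dispersion(E_eV, V0_eV=0.4, w=5e-9, b=2e-9, m=0.067 * ME):
        """Kronig-Penney f(E) for E < V0; |f| <= 1 marks an allowed miniband.
        w: well width, b: barrier width, m: effective mass."""
        E, V0 = E_eV * EV, V0_eV * EV
        k = math.sqrt(2 * m * E) / HBAR            # wavevector in the well
        kap = math.sqrt(2 * m * (V0 - E)) / HBAR   # decay constant in the barrier
        return (math.cos(k * w) * math.cosh(kap * b)
                + (kap**2 - k**2) / (2 * k * kap)
                * math.sin(k * w) * math.sinh(kap * b))

    # Build the subband "point by point": scan E and keep allowed energies.
    allowed = [E for E in (i * 1e-3 for i in range(1, 400))
               if abs(kp_dispersion(E)) <= 1.0]
    ```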

  6. Iron overload in the liver diagnostic and quantification

    Energy Technology Data Exchange (ETDEWEB)

    Alustiza, Jose M. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)]. E-mail: jmalustiza@osatek.es; Castiella, Agustin [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Juan, Maria D. de [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Emparanza, Jose I. [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Artetxe, Jose [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain); Uranga, Maite [Osatek SA, P Dr. Beguiristain 109, 20014, San Sebastian, Guipuzcoa (Spain)

    2007-03-15

    Hereditary Hemochromatosis is the most frequent modality of iron overload. Since 1996 genetic tests have facilitated significantly the non-invasive diagnosis of the disease. There are however many cases of negative genetic tests that require confirmation by hepatic iron quantification which is traditionally performed by hepatic biopsy. There are many studies that have demonstrated the possibility of performing hepatic iron quantification with Magnetic Resonance. However, a consensus has not been reached yet regarding the technique or the possibility to reproduce the same method of calculus in different machines. This article reviews the state of the art of the question and delineates possible future lines to standardise this non-invasive method of hepatic iron quantification.

  7. Radiation dose determines the method for quantification of DNA double strand breaks

    Energy Technology Data Exchange (ETDEWEB)

    Bulat, Tanja; Keta, Olitija; Korićanac, Lela; Žakula, Jelena; Petrović, Ivan; Ristić-Fira, Aleksandra [University of Belgrade, Vinča Institute of Nuclear Sciences, Belgrade (Serbia); Todorović, Danijela, E-mail: dtodorovic@medf.kg.ac.rs [University of Kragujevac, Faculty of Medical Sciences, Kragujevac (Serbia)

    2016-03-15

    Ionizing radiation induces DNA double strand breaks (DSBs) that trigger phosphorylation of the histone protein H2AX (γH2AX). Immunofluorescent staining visualizes the formation of γH2AX foci, allowing their quantification. This method, as opposed to Western blot assay and flow cytometry, provides more accurate analysis by showing the exact position and intensity of the fluorescent signal in each single cell. In practice, however, there are problems in the quantification of γH2AX. This paper addresses two issues: which technique should be applied for a given radiation dose, and how to analyze fluorescent microscopy images obtained with different microscopes. HTB140 melanoma cells were exposed to γ-rays in the dose range from 1 to 16 Gy. Radiation effects at the DNA level were analyzed at different time intervals after irradiation by Western blot analysis and immunofluorescence microscopy. Immunochemically stained cells were visualized with two types of microscopes: an AxioVision microscope (Zeiss, Germany) equipped with ApoTome software, and an AxioImagerA1 microscope (Zeiss, Germany). The results show that the level of γH2AX is time and dose dependent. Immunofluorescence microscopy provided better detection of DSBs for lower irradiation doses, while Western blot analysis was more reliable for higher irradiation doses. The AxioVision microscope with ApoTome software was more suitable for the detection of γH2AX foci. (author)
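
    Counting γH2AX foci in a thresholded fluorescence image reduces to labeling connected bright regions. A minimal, library-free sketch of that counting step (illustrative only; real pipelines such as the ApoTome workflows above use calibrated thresholds and per-nucleus segmentation):

    ```python
    def count_foci(img, threshold):
        """Count 4-connected bright regions (foci) in a 2D intensity grid."""
        h, w = len(img), len(img[0])
        seen = [[False] * w for _ in range(h)]
        foci = 0
        for y in range(h):
            for x in range(w):
                if img[y][x] >= threshold and not seen[y][x]:
                    foci += 1
                    stack = [(y, x)]          # flood-fill this focus
                    seen[y][x] = True
                    while stack:
                        cy, cx = stack.pop()
                        for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                       (cy, cx + 1), (cy, cx - 1)):
                            if (0 <= ny < h and 0 <= nx < w
                                    and img[ny][nx] >= threshold
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
        return foci
    ```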

  8. McKay agar enables routine quantification of the 'Streptococcus milleri' group in cystic fibrosis patients.

    Science.gov (United States)

    Sibley, Christopher D; Grinwis, Margot E; Field, Tyler R; Parkins, Michael D; Norgaard, Jens C; Gregson, Daniel B; Rabin, Harvey R; Surette, Michael G

    2010-05-01

    The 'Streptococcus milleri' group (SMG) has recently been recognized as a contributor to bronchopulmonary disease in cystic fibrosis (CF). Routine detection and quantification is limited by current CF microbiology protocols. McKay agar was developed previously for the semi-selective isolation of this group. Here, McKay agar was validated against a panel of clinical SMG isolates, which revealed improved SMG recovery compared with Columbia blood agar. The effectiveness of this medium was evaluated by appending it to the standard CF sputum microbiology protocols in a clinical laboratory for a 6-month period. All unique colony types were isolated and identified by 16S rRNA gene sequencing. Whilst a wide variety of organisms were isolated, members of the SMG were the most prevalent bacteria cultured, and McKay agar allowed routine quantification of the SMG from 10(3) to >10(8) c.f.u. ml(-1) directly from sputum. All members of the SMG were detected [Streptococcus anginosus (40.7 %), Streptococcus intermedius (34.3 %) and Streptococcus constellatus (25 %)] with an overall prevalence rate of 40.6 % in our adult CF population. Without exception, samples where SMG isolates were cultured at 10(7) c.f.u. ml(-1) or greater were associated with pulmonary exacerbations. This study demonstrates that McKay agar can be used routinely to quantify the SMG from complex clinical samples.
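
    Colony counts on a plate are converted to c.f.u. per ml of sputum by correcting for the serial dilution and the volume plated; a minimal sketch of that arithmetic (illustrative, not the clinical laboratory's protocol):

    ```python
    def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
        """Back-calculate c.f.u. per ml of the original sample.
        dilution_factor: total fold-dilution of the plated aliquot (e.g. 1e4)."""
        return colonies * dilution_factor / volume_plated_ml

    # 52 colonies from 0.1 ml of a 10^4-fold dilution -> 5.2e6 c.f.u./ml
    load = cfu_per_ml(52, 1e4, 0.1)
    ```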

  9. Quantification of renal Na-K-ATPase activity by image analysing system.

    Science.gov (United States)

    Laborde, K; Bussieres, L; De Smet, A; Dechaux, M; Sachs, C

    1990-01-01

    The localisation of renal Na-K-ATPase activity along the rat nephron by a cytochemical method, and its quantification by an image analysis system, are described in this paper. Frozen kidney sections were exposed to a trapping agent, the lead ammoniac-citrate-acetate complex (LACA), and to all the substrates necessary to the enzyme activity. The absorbance of the histochemical reaction product (precipitated in situ), proportional to the enzymatic activity, was then measured through the analysis of the grey levels of the transmitted image of the kidney section. This method was both sufficiently sensitive and technically simple to permit measurements of the enzyme in large numbers of tubules and to determine its activity in each region of the nephron. The Na-K-ATPase activity has been determined in the proximal convoluted tubule (PCT), the medullary thick ascending limb of the Henle's loop (mTAL), and the distal convoluted tubules (DCT) of the rat nephron. The Na-K-ATPase distribution shows an activity per millimeter tubule length higher in the DCT than in the mTAL and the PCT: 1,406 +/- 33, 823 +/- 64, and 350 +/- 71 pmoles Pi/tubule mm/h, respectively. In conclusion, the described method allows the segmental quantification of Na-K-ATPase activity at a cellular level and offers a precise approach to the analysis of this enzyme along the length of nephrons.
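
    The cytochemical readout described above maps transmitted grey levels to absorbance via the Beer-Lambert relation A = log10(I0/I). A sketch of the per-region measurement (the grey values are hypothetical, not the authors' calibration):

    ```python
    import math

    def mean_absorbance(grey_levels, i0=255.0):
        """Mean optical absorbance of a tubule region from transmitted
        grey levels, A = log10(I0 / I); darker pixels (more reaction
        product, i.e. more enzyme activity) give higher absorbance."""
        return sum(math.log10(i0 / g) for g in grey_levels) / len(grey_levels)

    a_dct = mean_absorbance([60, 70, 65])   # distal convoluted tubule region
    a_pct = mean_absorbance([180, 190])     # proximal tubule region
    ```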

  10. Immunonephelometric quantification of specific urinary proteins versus a simple electrophoretic method for characterizing proteinuria.

    Science.gov (United States)

    Wolff, Fleur; Willems, Dominique

    2008-04-01

    The quantification of urinary proteins of different molecular sizes is useful in characterizing a proteinuria. We assessed the performance of an electrophoretic system, the Hydragel Urine Profile, which allows firstly the identification of proteinuria and secondly the qualitative detection of monoclonal free light chains (FLC). Initially, the proteinuria was characterized in 127 pathological urines by quantifying albumin, α1-microglobulin and immunoglobulin G with an immunonephelometric technique, and the results were compared with the protein pattern obtained by the electrophoretic method. Secondly, we assessed the sensitivity and specificity of this electrophoretic test for the detection and characterization of Bence Jones proteins. FLC were analyzed quantitatively by an immunonephelometric assay and qualitatively by the electrophoretic test in 150 urines. The agreement between the two methods was good, with 89% homology in characterizing the proteinuria. For detecting a monoclonal FLC, the electrophoretic method demonstrated lower sensitivity but higher specificity compared with the immunoassay. The Urine Profile kit is a reliable assay that can be used as a screening test to differentiate the type of proteinuria.
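
    Assessing the electrophoretic test against the immunonephelometric FLC assay amounts to a 2x2 comparison; a sketch of the sensitivity/specificity computation (the counts are hypothetical, not the study's data):

    ```python
    def sens_spec(tp, fp, fn, tn):
        """Sensitivity and specificity of a qualitative test against a
        reference method (here: electrophoresis vs. the FLC immunoassay)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Hypothetical 2x2 counts for 150 urines (not the study's data):
    sensitivity, specificity = sens_spec(tp=24, fp=2, fn=6, tn=118)
    ```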

  11. Doppler vortography: a color Doppler approach for quantification of the intraventricular blood flow vortices

    Science.gov (United States)

    Mehregan, Forough; Tournoux, François; Muth, Stéphan; Pibarot, Philippe; Rieu, Régis; Cloutier, Guy; Garcia, Damien

    2013-01-01

    We propose a new approach for quantification of intracardiac vorticity – Doppler vortography – based on conventional color Doppler images. Doppler vortography relies on the centrosymmetric properties of the vortices. Such properties induce particular symmetries in the Doppler flow data which can be exploited to describe the vortices quantitatively. For this purpose, a kernel filter was developed to derive a parameter –the blood vortex signature (BVS) – that allows detecting the main intracardiac vortices and estimating their core vorticities. The reliability of Doppler vortography was assessed in mock Doppler fields issued from simulations and in vitro data. Doppler vortography was also tested in patients and compared with vector flow mapping by echocardiography. Strong correlations were obtained between the Doppler vortography-derived and the ground-truth vorticities (in silico: r2 = 0.98, in vitro: r2 = 0.86, in vivo: r2 = 0.89). Our results demonstrated that Doppler vortography is a potentially promising echocardiographic tool for quantification of vortex flow in the left ventricle. PMID:24210865
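
    The blood vortex signature exploits the fact that, near a vortex core, the Doppler velocity field is approximately antisymmetric under point reflection about the core (v(r) ≈ -v(-r)). The toy kernel below illustrates that idea only; it is not the published vortography filter, and the synthetic field is a single velocity component of a solid-body vortex with a Gaussian envelope.

    ```python
    import numpy as np

    def bvs_map(doppler, half=4):
        """Anticorrelation of each local window with its 180-degree rotation;
        values near 1 mark centrosymmetric (vortex-like) Doppler patterns."""
        h, w = doppler.shape
        out = np.zeros((h, w))
        for y in range(half, h - half):
            for x in range(half, w - half):
                win = doppler[y - half:y + half + 1, x - half:x + half + 1]
                out[y, x] = (-np.mean(win * np.rot90(win, 2))
                             / (np.mean(win**2) + 1e-12))
        return out

    # Synthetic Doppler field, antisymmetric about the core at (20, 20)
    yy, xx = np.mgrid[0:41, 0:41]
    doppler = (xx - 20.0) * np.exp(-((xx - 20.0)**2 + (yy - 20.0)**2) / 100.0)
    core = np.unravel_index(np.argmax(bvs_map(doppler)), doppler.shape)
    ```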

  12. Discrimination of Solanaceae taxa and quantification of scopolamine and hyoscyamine by ATR-FTIR spectroscopy.

    Science.gov (United States)

    Naumann, Annette; Kurtze, Lukas; Krähmer, Andrea; Hagels, Hansjoerg; Schulz, Hartwig

    2014-10-01

    Plant species of the Solanaceae family (nightshades) contain pharmacologically active anticholinergic tropane alkaloids, e.g., scopolamine and hyoscyamine. Tropane alkaloids are of special interest, either as active principles or as starting materials for semisynthetic production of other substances. For genetic evaluation, domestication, cultivation, harvest and post-harvest treatments, quantification of the individual active principles is necessary to monitor industrial processes and the resulting finished products. Up to now, frequently applied methods for quantification have been based on high performance liquid chromatography and gas chromatography, optionally combined with mass spectrometry. However, alternative analytical methods have the potential to partly replace the established standard methods. In this context, attenuated total reflection-Fourier transform infrared (ATR-FTIR) spectroscopy enabled chemotaxonomical classification of the Solanaceae Atropa belladonna, Datura stramonium, Hyoscyamus niger, Solanum dulcamara, and Duboisia in combination with cluster analysis. Discrimination of genotypes within species was also achieved to some extent. The most characteristic scopolamine bands could be identified in ATR-FTIR spectra of Solanaceae leaves, which allows a fast characterisation of plants with high scopolamine content. Applying a partial least squares algorithm, very good calibration statistics were obtained for the prediction of the scopolamine content (residual prediction deviation = 7.67), and moderate prediction quality could be achieved for the hyoscyamine content (residual prediction deviation = 2.48). Georg Thieme Verlag KG Stuttgart · New York.
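
    The residual prediction deviation (RPD) quoted above is the ratio of the reference values' standard deviation to the prediction error of the calibration; a sketch of the computation (the values are hypothetical, and RMSEP is used for the error term):

    ```python
    import math

    def rpd(reference, predicted):
        """Residual prediction deviation: SD(reference) / RMSEP.
        Roughly, RPD > ~3 indicates a good calibration, ~2-3 moderate."""
        n = len(reference)
        mean_ref = sum(reference) / n
        sd = math.sqrt(sum((r - mean_ref) ** 2 for r in reference) / (n - 1))
        rmsep = math.sqrt(sum((r - p) ** 2
                              for r, p in zip(reference, predicted)) / n)
        return sd / rmsep
    ```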

  13. Characterization and quantification of oxides generated by anodization on titanium for implantation purposes

    Science.gov (United States)

    Aloia Games, L.; Pastore, J.; Bouchet, A.; Ballarre, J.

    2011-12-01

    The use of titanium as an implant material is widely known in the surgery field. The formation of a natural or artificial compact and protective oxide is a convenient tool for metal protection and a good way to generate phosphate deposits that enhance biocompatibility and bone fixation with the existing tissue. The present work aims to superficially modify commercially pure titanium sheets used in orthopedics and odontology, using a potentiostatic anodization process with an ammonium phosphate and ammonium fluoride solution as the electrolyte. The objective is to generate phosphorus-doped titanium oxides on the surface to promote bioactivity. The characterization and quantification of the generated deposits is presented as a starting point for the future application of these materials. The applied characterization methods are X-ray diffraction and micro-Raman spectroscopy for evaluating the chemical and phase composition of the modified surface, and PDI image analysis techniques that allow the segmentation of SEM images and the measurement and quantification of the oxides generated by the anodization process. Polished samples anodized at 30 V show a thick phosphate-rich layer covering almost the entire surface, together with randomly placed spherical titanium oxide crystals (covering more than 20% of the surface area).
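
    Quantifying oxide coverage from a segmented SEM image reduces to counting pixels above a threshold. The sketch below uses Otsu's method on an 8-bit image as a generic segmentation stand-in, not the authors' PDI pipeline:

    ```python
    import numpy as np

    def otsu_threshold(img):
        """Otsu's method: the threshold maximizing between-class variance."""
        hist = np.bincount(img.ravel(), minlength=256).astype(float)
        p = hist / hist.sum()
        best_t, best_var = 0, -1.0
        for t in range(1, 256):
            w0, w1 = p[:t].sum(), p[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            mu0 = (np.arange(t) * p[:t]).sum() / w0
            mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
            var = w0 * w1 * (mu0 - mu1) ** 2
            if var > best_var:
                best_t, best_var = t, var
        return best_t

    def coverage_percent(img):
        """Percentage of image area at or above the Otsu threshold."""
        return 100.0 * np.mean(img >= otsu_threshold(img))
    ```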

  14. Doppler vortography: a color Doppler approach to quantification of intraventricular blood flow vortices.

    Science.gov (United States)

    Mehregan, Forough; Tournoux, François; Muth, Stéphan; Pibarot, Philippe; Rieu, Régis; Cloutier, Guy; Garcia, Damien

    2014-01-01

    We propose a new approach to quantification of intracardiac vorticity based on conventional color Doppler images -Doppler vortography. Doppler vortography relies on the centrosymmetric properties of the vortices. Such properties induce particular symmetries in the Doppler flow data that can be exploited to describe the vortices quantitatively. For this purpose, a kernel filter was developed to derive a parameter, the blood vortex signature (BVS), that allows detection of the main intracardiac vortices and estimation of their core vorticities. The reliability of Doppler vortography was assessed in mock Doppler fields issued from simulations and in vitro data. Doppler vortography was also tested in patients and compared with vector flow mapping by echocardiography. Strong correlations were obtained between Doppler vortography-derived and ground-truth vorticities (in silico: r2 = 0.98, in vitro: r2 = 0.86, in vivo: r2 = 0.89). Our results indicate that Doppler vortography is a potentially promising echocardiographic tool for quantification of vortex flow in the left ventricle. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  15. Molecular Approaches for High Throughput Detection and Quantification of Genetically Modified Crops: A Review

    Directory of Open Access Journals (Sweden)

    Ibrahim B. Salisu

    2017-10-01

    Full Text Available As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for the transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins from GMOs and their products. Conventional methods such as the polymerase chain reaction (PCR) and the enzyme-linked immunosorbent assay (ELISA) are routinely employed for DNA- and protein-based quantification, respectively. Although these techniques (PCR and ELISA) are convenient and productive, there is a need for more advanced technologies that allow high-throughput detection and quantification of GM events, as the production of increasingly complex GMOs grows day by day. Recent approaches such as microarrays, capillary gel electrophoresis, digital PCR and next-generation sequencing are therefore more promising due to their accuracy and precise detection of transgenic content. The present article is a brief comparative study of these detection techniques on the basis of their advent, feasibility, accuracy and cost-effectiveness. However, detection of a specific event, contamination by different events, and determination of fusion as well as stacked gene proteins remain critical issues for these emerging technologies to address in the future.
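
    For the real-time PCR route mentioned above, event-specific GM content is commonly expressed relative to a certified reference material (CRM) via a delta-delta-Cq calculation. The sketch below is a generic illustration only (function and parameter names are hypothetical, and a common amplification efficiency of 2 per cycle is assumed):

    ```python
    def gm_content_percent(cq_event, cq_ref, cq_event_crm, cq_ref_crm,
                           crm_percent, efficiency=2.0):
        """Delta-delta-Cq estimate of GM content against a CRM.
        cq_event / cq_ref: quantification cycles of the event-specific and
        taxon-specific reference assays; assumes equal amplification
        efficiency for both assays."""
        d_sample = cq_event - cq_ref      # delta-Cq of the unknown sample
        d_crm = cq_event_crm - cq_ref_crm # delta-Cq of the reference material
        return crm_percent * efficiency ** (d_crm - d_sample)
    ```

    One extra cycle of delay in the event assay relative to the CRM halves the estimated GM content.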

  16. Noninvasive in vivo detection and quantification of Demodex mites by confocal laser scanning microscopy.

    Science.gov (United States)

    Sattler, E C; Maier, T; Hoffmann, V S; Hegyi, J; Ruzicka, T; Berking, C

    2012-11-01

    In many Demodex-associated skin diseases Demodex mites are present in abundance and seem to be at least partially pathogenic. So far, all diagnostic approaches such as scraping or the standardized superficial skin biopsy are (semi-)invasive and may cause discomfort to the patient. The aim was to determine whether confocal laser scanning microscopy (CLSM), a noninvasive method for the visualization of superficial skin layers, is able to detect and quantify D. folliculorum in the facial skin of patients with rosacea. Twenty-five patients (34-72 years of age) with facial rosacea and 25 age- and sex-matched normal controls were examined by CLSM. Mosaics of 8 × 8 mm and 5 × 5 mm were created by scanning horizontal layers of lesional skin, and quantification of mites per follicle and per area as well as follicles per area was performed. In all patients D. folliculorum could be detected by CLSM, presenting as roundish or elongated cone-shaped structures. CLSM allowed the quantification of Demodex mites and revealed significant differences between patients and controls, demonstrating that Demodex mites can be detected and quantified noninvasively in the facial skin of patients with rosacea. © 2012 The Authors. BJD © 2012 British Association of Dermatologists.

  17. Radiation dose determines the method for quantification of DNA double strand breaks

    Directory of Open Access Journals (Sweden)

    TANJA BULAT

    2016-03-01

    Full Text Available ABSTRACT Ionizing radiation induces DNA double strand breaks (DSBs) that trigger phosphorylation of the histone protein H2AX (γH2AX). Immunofluorescent staining visualizes the formation of γH2AX foci, allowing their quantification. This method, as opposed to Western blot assay and flow cytometry, provides more accurate analysis by showing the exact position and intensity of the fluorescent signal in each single cell. In practice, however, there are problems in the quantification of γH2AX. This paper addresses two issues: which technique should be applied for a given radiation dose, and how to analyze fluorescent microscopy images obtained with different microscopes. HTB140 melanoma cells were exposed to γ-rays in the dose range from 1 to 16 Gy. Radiation effects at the DNA level were analyzed at different time intervals after irradiation by Western blot analysis and immunofluorescence microscopy. Immunochemically stained cells were visualized with two types of microscopes: an AxioVision microscope (Zeiss, Germany) equipped with ApoTome software, and an AxioImagerA1 microscope (Zeiss, Germany). The results show that the level of γH2AX is time and dose dependent. Immunofluorescence microscopy provided better detection of DSBs for lower irradiation doses, while Western blot analysis was more reliable for higher irradiation doses. The AxioVision microscope with ApoTome software was more suitable for the detection of γH2AX foci.

  18. Methane-oxygen electrochemical coupling in an ionic liquid: a robust sensor for simultaneous quantification.

    Science.gov (United States)

    Wang, Zhe; Guo, Min; Baker, Gary A; Stetter, Joseph R; Lin, Lu; Mason, Andrew J; Zeng, Xiangqun

    2014-10-21

    Current sensor devices for the detection of methane or natural gas emission are either expensive and have high power requirements or fail to provide a rapid response. This report describes an electrochemical methane sensor utilizing a non-volatile and conductive pyrrolidinium-based ionic liquid (IL) electrolyte and an innovative internal standard method for methane and oxygen dual-gas detection with high sensitivity, selectivity, and stability. At a platinum electrode in bis(trifluoromethylsulfonyl)imide (NTf2)-based ILs, methane is electro-oxidized to produce CO2 and water when an oxygen reduction process is included. The in situ generated CO2 arising from methane oxidation was shown to provide an excellent internal standard for quantification of the electrochemical oxygen sensor signal. The simultaneous quantification of both methane and oxygen in real time strengthens the reliability of the measurements by cross-validation of two ambient gases occurring within a single sample matrix and allows for the elimination of several types of random and systematic errors in the detection. We have also validated this IL-based methane sensor employing both conventional solid macroelectrodes and flexible microfabricated electrodes using single- and double-potential step chronoamperometry.
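
    The internal-standard idea described above, using the CO2 generated in situ by methane oxidation to normalize the sensor signal, amounts to a ratio calibration: drift and matrix effects common to both signals cancel in the quotient. A minimal illustrative sketch (the names and numbers are hypothetical, not the authors' calibration):

    ```python
    def calibrate(c_cal, i_ch4_cal, i_co2_cal):
        """Calibration constant from one known methane concentration and the
        corresponding methane/CO2 signal pair."""
        return c_cal * i_co2_cal / i_ch4_cal

    def methane_ppm(i_ch4, i_co2, k_cal):
        """Internal-standard quantification: concentration from the ratio of
        the methane oxidation signal to the in situ CO2 signal."""
        return k_cal * i_ch4 / i_co2

    k = calibrate(c_cal=1000.0, i_ch4_cal=2.0, i_co2_cal=1.0)
    ```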

  19. A reduction approach to improve the quantification of linked fault trees through binary decision diagrams

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Llano, Cristina, E-mail: cristina.ibanez@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain); Rauzy, Antoine, E-mail: Antoine.RAUZY@3ds.co [Dassault Systemes, 10 rue Marcel Dassault CS 40501, 78946 Velizy Villacoublay, Cedex (France); Melendez, Enrique, E-mail: ema@csn.e [Consejo de Seguridad Nuclear (CSN), C/Justo Dorado 11, 28040 Madrid (Spain); Nieto, Francisco, E-mail: nieto@iit.upcomillas.e [Instituto de Investigacion Tecnologica (IIT), Escuela Tecnica Superior de Ingenieria ICAI, Universidad Pontificia Comillas, C/Santa Cruz de Marcenado 26, 28015 Madrid (Spain)

    2010-12-15

    Over the last two decades binary decision diagrams have been applied successfully to improve Boolean reliability models. In contrast to the classical approach based on the computation of minimal cutsets (MCS), the BDD approach involves no approximation in the quantification of the model and correctly handles negative logic. However, when models are sufficiently large and complex, as for example those coming from the probabilistic safety assessment (PSA) studies of the nuclear industry, it becomes infeasible to compute the BDD within a reasonable amount of time and computer memory. Therefore, some simplification or reduction of the full model has to be considered to make the BDD technology applicable to the assessment of such models in practice. This paper proposes a reduction process that uses the information provided by the set of the most relevant minimal cutsets of the model to perform the reduction directly on it. This allows controlling the degree of reduction and therefore the impact of such simplification on the final quantification results. The reduction is integrated in an incremental procedure that is compatible with the dynamic generation of the event trees and is therefore adaptable to the recent dynamic developments and extensions of the PSA studies. The proposed method has been applied to a real case study, and the results obtained confirm that the reduction makes the BDD computation feasible while maintaining accuracy.
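
    The selection criterion "most relevant minimal cutsets" can be sketched as ranking cutsets by probability and keeping the smallest prefix that covers a chosen fraction of the total rare-event sum. This is an illustrative sketch of such a relevance cut, assuming independent basic events; it is not the paper's BDD construction or its incremental procedure, and the event probabilities are invented:

    ```python
    def cutset_probability(cutset, p):
        """Probability of a minimal cutset, assuming independent basic events."""
        prob = 1.0
        for event in cutset:
            prob *= p[event]
        return prob

    def reduce_by_relevance(cutsets, p, keep_fraction=0.99):
        """Keep the most probable cutsets whose summed probability reaches
        `keep_fraction` of the total (rare-event approximation)."""
        ranked = sorted(cutsets, key=lambda cs: cutset_probability(cs, p), reverse=True)
        total = sum(cutset_probability(cs, p) for cs in cutsets)
        kept, acc = [], 0.0
        for cs in ranked:
            kept.append(cs)
            acc += cutset_probability(cs, p)
            if acc >= keep_fraction * total:
                break
        return kept

    p = {"A": 1e-3, "B": 1e-3, "C": 1e-2, "D": 1e-6}
    cutsets = [{"A", "B"}, {"C"}, {"A", "D"}]
    kept = reduce_by_relevance(cutsets, p)
    print(kept)  # the single-event cutset {"C"} alone covers >99% of the risk
    ```

    The reduced cutset list defines a smaller Boolean model whose BDD is cheap to build, while `keep_fraction` bounds the probability mass discarded by the simplification.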

  20. Molecular Approaches for High Throughput Detection and Quantification of Genetically Modified Crops: A Review

    Science.gov (United States)

    Salisu, Ibrahim B.; Shahid, Ahmad A.; Yaqoob, Amina; Ali, Qurban; Bajwa, Kamran S.; Rao, Abdul Q.; Husnain, Tayyab

    2017-01-01

    As genetically modified (GM) crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins from GMOs and their products. Conventional methods such as the polymerase chain reaction (PCR) and the enzyme-linked immunosorbent assay (ELISA) are routinely employed for DNA- and protein-based quantification, respectively. Although these techniques are convenient and productive, more advanced technologies allowing high throughput detection and quantification of GM events are needed as increasingly complex GMOs are produced. Recent approaches such as microarrays, capillary gel electrophoresis, digital PCR, and next generation sequencing are therefore more promising due to their accuracy and precise detection of transgenic content. The present article is a brief comparative study of these detection techniques on the basis of their advent, feasibility, accuracy, and cost effectiveness. Nevertheless, these emerging technologies still face challenges: event-specific detection, contamination between different events, and determination of fusion and stacked-gene proteins remain critical issues to be addressed in the future. PMID:29085378
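
    DNA-based GM quantification by real-time PCR typically compares an event-specific target against a taxon-specific reference gene. As a rough illustration of that arithmetic (not a method from this review), the sketch below estimates GM content from threshold cycles under the simplifying assumptions of equal, ideal amplification efficiency and single-copy targets; the cycle values are invented:

    ```python
    def gm_content_percent(ct_event, ct_taxon, efficiency=2.0):
        """Estimate GM content (%) from real-time PCR threshold cycles.

        Assumes the event-specific and taxon-specific assays amplify
        with the same efficiency (ideally 2.0, i.e. doubling per cycle)
        and target single-copy sequences, so the copy-number ratio is
        efficiency ** (ct_taxon - ct_event).
        """
        return 100.0 * efficiency ** (ct_taxon - ct_event)

    # An event target crossing threshold ~3.32 cycles after the taxon
    # target implies roughly a 10-fold lower copy number, i.e. ~10% GM.
    print(round(gm_content_percent(ct_event=28.32, ct_taxon=25.0), 1))
    ```

    In practice, standard curves for both targets replace the ideal-efficiency assumption, and digital PCR sidesteps it entirely by counting positive partitions, which is one reason the newer approaches quantify low transgenic contents more precisely.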