WorldWideScience

Sample records for quantitative microplate-based respirometry

  1. Adaptation of Microplate-based Respirometry for Hippocampal Slices and Analysis of Respiratory Capacity

    Science.gov (United States)

    Schuh, Rosemary A.; Clerc, Pascaline; Hwang, Hyehyun; Mehrabian, Zara; Bittman, Kevin; Chen, Hegang; Polster, Brian M.

    2011-01-01

    Multiple neurodegenerative disorders are associated with altered mitochondrial bioenergetics. Although mitochondrial O2 consumption is frequently measured in isolated mitochondria, isolated synaptic nerve terminals (synaptosomes), or cultured cells, the absence of mature brain circuitry is a remaining limitation. Here we describe the development of a method that adapts the Seahorse Extracellular Flux Analyzer (XF24) for the microplate-based measurement of hippocampal slice O2 consumption. As a first evaluation of the technique, we compared whole slice bioenergetics to previous measurements made with synaptosomes or cultured neurons. We found that mitochondrial respiratory capacity and O2 consumption coupled to ATP synthesis could be estimated in cultured or acute hippocampal slices with preserved neural architecture. Mouse organotypic hippocampal slices oxidizing glucose displayed mitochondrial O2 consumption that was well-coupled, as determined by the sensitivity to the ATP synthase inhibitor oligomycin. However, stimulation of respiration by uncoupler was modest (<120% of basal respiration) compared to previous measurements in cells or synaptosomes, although enhanced slightly (to ~150% of basal respiration) by the acute addition of the mitochondrial complex I-linked substrate pyruvate. These findings suggest a high basal utilization of respiratory capacity in slices and a limitation of glucose-derived substrate for maximal respiration. The improved throughput of microplate-based hippocampal respirometry over traditional O2 electrode-based methods is conducive to neuroprotective drug screening. When coupled with cell type-specific pharmacology or genetic manipulations, the ability to efficiently measure O2 consumption from whole slices should advance our understanding of mitochondrial roles in physiology and neuropathology. PMID:21520220
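The stress-test arithmetic behind figures like "<120% of basal respiration" can be sketched numerically: oligomycin-sensitive respiration estimates ATP-linked O2 consumption, and uncoupler-stimulated respiration estimates maximal capacity. This is an illustrative calculation only, not the authors' analysis code; the function name and OCR values are invented.

```python
def stress_test_params(basal, post_oligomycin, post_uncoupler):
    """Derive standard respiratory parameters from mean oxygen consumption
    rates (OCR, arbitrary units) before/after sequential drug injections."""
    atp_linked = basal - post_oligomycin      # oligomycin-sensitive respiration
    proton_leak = post_oligomycin             # oligomycin-insensitive respiration
    spare_capacity = post_uncoupler - basal   # uncoupler-stimulated reserve
    pct_of_basal = 100.0 * post_uncoupler / basal
    return {
        "atp_linked": atp_linked,
        "proton_leak": proton_leak,
        "spare_capacity": spare_capacity,
        "uncoupled_pct_of_basal": pct_of_basal,
    }

# A slice whose uncoupled respiration reaches only ~115% of basal (as reported
# here for glucose-oxidizing slices) has very little spare capacity:
params = stress_test_params(basal=100.0, post_oligomycin=30.0, post_uncoupler=115.0)
```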

  2. Respirometry techniques and activated sludge models

    NARCIS (Netherlands)

    Benes, O.; Spanjers, H.; Holba, M.

    2002-01-01

    This paper aims to explain results of respirometry experiments using Activated Sludge Model No. 1. In cases of insufficient fit of ASM No. 1, further modifications to the model were carried out and the so-called "Enzymatic model" was developed. The best-fit method was used to determine the effect of

  3. Comparative cytotoxic and genotoxic potential of 13 drinking water disinfection by-products using a microplate-based cytotoxicity assay and a developed SOS/umu assay.

    Science.gov (United States)

    Zhang, Shao-Hui; Miao, Dong-Yue; Tan, Li; Liu, Ai-Lin; Lu, Wen-Qing

    2016-01-01

    The implications of disinfection by-products (DBPs) present in drinking water are of public health concern because of their potential mutagenic, carcinogenic and other toxic effects on humans. In this study, we selected 13 main DBPs found in drinking water to quantitatively analyse their cytotoxicity and genotoxicity using a microplate-based cytotoxicity assay and a developed SOS/umu assay in Salmonella typhimurium TA1535/pSK1002. With the developed SOS/umu test, eight DBPs: 3-chloro-4-(dichloromethyl)-5-hydroxy-2(5H)-furanone (MX), dibromoacetonitrile (DBN), iodoacetic acid (IA), bromochloroacetonitrile (BCN), bromoacetic acid (BA), trichloroacetonitrile (TCN), dibromoacetic acid (DBA) and dichloroacetic acid (DCA) were significantly genotoxic to S. typhimurium. Three DBPs: chloroacetic acid (CA), trichloroacetic acid (TCA) and dichloroacetonitrile (DCN) were weakly genotoxic, whereas the remaining DBPs: chloroacetonitrile (CN) and chloral hydrate (CH) were negative. The rank order in decreasing genotoxicity was as follows: MX > DBN > IA > BCN > BA > TCN > DBA > DCA > CA, TCA, DCN > CN, CH. MX was approximately 370 000 times more genotoxic than DCA. In the microplate-based cytotoxicity assay, cytotoxic potencies of the 13 DBPs were compared and ranked in decreasing order as follows: MX > IA > DBN > BCN > BA > TCN > DCN > CA > DCA > DBA > CN > TCA > CH. MX was approximately 19 200 times more cytotoxic than CH. A statistically significant correlation was found between cytotoxicity and genotoxicity of the 13 DBPs in S. typhimurium. Results suggest that the microplate-based cytotoxicity assay and the developed SOS/umu assay are feasible tools for analysing the cytotoxicity and genotoxicity of DBPs, particularly for comparing their toxic intensities quantitatively. © The Author 2015. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved.

  4. Microplate-based high throughput screening procedure for the isolation of lipid-rich marine microalgae

    Directory of Open Access Journals (Sweden)

    Pereira Hugo

    2011-12-01

    We describe a new selection method based on BODIPY (4,4-difluoro-1,3,5,7-tetramethyl-4-bora-3a,4a-diaza-s-indacene) staining, fluorescence-activated cell sorting (FACS) and microplate-based isolation of lipid-rich microalgae from an environmental sample. Our results show that direct sorting onto solid medium upon FACS can save about 3 weeks during the scale-up process as compared with the growth of the same cultures in liquid medium. This approach enabled us to isolate a biodiverse collection of several axenic and unialgal cultures of different phyla.

  5. Stability measurements of compost through electrolytic respirometry

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez-Arias, V.; Fernandez, F. J.; Rodriguez, L.; Villasenor, J.

    2009-07-01

    An experimental technique for compost stability measurements based on electrolytic respirometry was optimized and subsequently applied to a composting process. Anaerobically digested sewage sludge mixed with reed was composted during 90 days in a pilot-scale rotary drum with forced aeration. Periodic solid samples were taken, and a previously optimized respirometric procedure was applied to them in order to measure the oxygen consumption. The respirometric experiments were made directly with a few grams of solid sample, at optimum moisture and 37 °C, during 96 h. (Author)

  6. Stability measurements of compost through electrolytic respirometry

    International Nuclear Information System (INIS)

    Sanchez-Arias, V.; Fernandez, F. J.; Rodriguez, L.; Villasenor, J.

    2009-01-01

    An experimental technique for compost stability measurements based on electrolytic respirometry was optimized and subsequently applied to a composting process. Anaerobically digested sewage sludge mixed with reed was composted during 90 days in a pilot-scale rotary drum with forced aeration. Periodic solid samples were taken, and a previously optimized respirometric procedure was applied to them in order to measure the oxygen consumption. The respirometric experiments were made directly with a few grams of solid sample, at optimum moisture and 37 °C, during 96 h. (Author)

  7. Robust Microplate-Based Methods for Culturing and in Vivo Phenotypic Screening of Chlamydomonas reinhardtii

    Directory of Open Access Journals (Sweden)

    Timothy C. Haire

    2018-03-01

    Chlamydomonas reinhardtii (Cr), a unicellular alga, is routinely utilized to study photosynthetic biochemistry, ciliary motility, and cellular reproduction. Its minimal culture requirements, unicellular morphology, and ease of transformation have made it a popular model system. Despite its relatively slow doubling time compared with many bacteria, it is an ideal eukaryotic system for microplate-based studies utilizing absorbance assays, fluorescence assays, or both. Such microplate assays are powerful tools for researchers in the areas of toxicology, pharmacology, chemical genetics, biotechnology, and more. However, while microplate-based assays are valuable tools for screening biological systems, these methodologies can significantly alter the conditions in which the organisms are cultured and their subsequent physiology or morphology. Herein we describe a novel method for the microplate culture and in vivo phenotypic analysis of growth, viability, and photosynthetic pigments of C. reinhardtii. We evaluated the utility of our assay by screening silver nanoparticles for their effects on growth and viability. These methods are amenable to a wide assortment of studies and present a significant advancement in the methodologies available for research involving this model organism.

  8. Analysis of regional brain mitochondrial bioenergetics and susceptibility to mitochondrial inhibition utilizing a microplate based system

    Science.gov (United States)

    Sauerbeck, Andrew; Pandya, Jignesh; Singh, Indrapal; Bittman, Kevin; Readnower, Ryan; Bing, Guoying; Sullivan, Patrick

    2012-01-01

    The analysis of mitochondrial bioenergetic function typically has required 50–100 μg of protein per sample and at least 15 min per run when utilizing a Clark-type oxygen electrode. In the present work we describe a method utilizing the Seahorse Biosciences XF24 Flux Analyzer for measuring mitochondrial oxygen consumption simultaneously from multiple samples and utilizing only 5 μg of protein per sample. Utilizing this method we have investigated whether regionally based differences exist in mitochondria isolated from the cortex, striatum, hippocampus, and cerebellum. Analysis of basal mitochondrial bioenergetics revealed that minimal differences exist between the cortex, striatum, and hippocampus. However, the cerebellum exhibited significantly slower basal rates of Complex I and Complex II dependent oxygen consumption (p < 0.05). Mitochondrial inhibitors affected enzyme activity proportionally across all samples tested and only small differences existed in the effect of inhibitors on oxygen consumption. Investigation of the effect of rotenone administration on Complex I dependent oxygen consumption revealed that exposure to 10 pM rotenone led to a clear time dependent decrease in oxygen consumption beginning 12 min after administration (p < 0.05). These studies show that the utilization of this microplate based method for analysis of mitochondrial bioenergetics is effective at quantifying oxygen consumption simultaneously from multiple samples. Additionally, these studies indicate that minimal regional differences exist in mitochondria isolated from the cortex, striatum, or hippocampus. Furthermore, utilization of the mitochondrial inhibitors suggests that previous work indicating regionally specific deficits following systemic mitochondrial toxin exposure may not be the result of differences in the individual mitochondria from the affected regions. PMID:21402103

  9. Some errors in respirometry of aquatic breathers: How to avoid and correct for them

    DEFF Research Database (Denmark)

    STEFFENSEN, JF

    1989-01-01

    Respirometry in closed and flow-through systems is described with the objective of pointing out problems and sources of errors involved and how to correct for them. Both closed respirometry, applied to resting and active animals, and intermittent-flow respirometry are described. In addition, flow...

  10. Design and setup of intermittent-flow respirometry system for aquatic organisms

    DEFF Research Database (Denmark)

    Svendsen, Morten Bo Søndergaard; Bushnell, P.G.; Steffensen, John Fleng

    2016-01-01

    Intermittent-flow respirometry is an experimental protocol for measuring oxygen consumption in aquatic organisms that utilizes the best features of closed (stop-flow) and flow-through respirometry while eliminating (or at least reducing) some of their inherent problems. By interspersing short...... and software further reduces error by allowing many measurements to be made over long periods thereby minimizing animal stress due to acclimation issues. This paper describes some of the fundamental principles that need to be considered when designing and carrying out automated intermittent-flow respirometry...
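The core measurement such a system automates can be sketched as follows: during each closed (measurement) phase, the oxygen consumption rate is the slope of the O2 decline, scaled by the effective respirometer volume and normalized to animal mass. This is a minimal illustration under idealized assumptions (linear decline, no background respiration); all names and values are invented, not from the paper.

```python
def mo2(times_h, o2_mg_per_l, chamber_vol_l, animal_vol_l, mass_kg):
    """Oxygen consumption (mg O2 per kg per hour) from a linear fit of
    dissolved [O2] versus time during one closed phase."""
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_o2 = sum(o2_mg_per_l) / n
    # Ordinary least-squares slope of [O2] vs time (mg/L per hour):
    slope = sum((t - mean_t) * (o - mean_o2) for t, o in zip(times_h, o2_mg_per_l)) \
            / sum((t - mean_t) ** 2 for t in times_h)
    effective_vol = chamber_vol_l - animal_vol_l  # water volume holding the O2
    return -slope * effective_vol / mass_kg

# A 100 g fish in a 2 L chamber depleting O2 at 0.5 mg/L per hour:
rate = mo2([0.0, 0.25, 0.5], [8.0, 7.875, 7.75],
           chamber_vol_l=2.0, animal_vol_l=0.1, mass_kg=0.1)
```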

  11. Direct calorimetry identifies deficiencies in respirometry for the determination of resting metabolic rate in C57Bl/6 and FVB mice

    Science.gov (United States)

    Burnett, Colin M. L.

    2013-01-01

    Substantial research efforts have been aimed at identifying novel targets to increase resting metabolic rate (RMR) as an adjunct approach to the treatment of obesity. Respirometry (one form of “indirect calorimetry”) is unquestionably the dominant technique used in the obesity research field to assess RMR in vivo, although this method relies upon a lengthy list of assumptions that are likely to be violated in pharmacologically or genetically manipulated animals. A “total” calorimeter, including a gradient layer direct calorimeter coupled to a conventional respirometer, was used to test the accuracy of respirometric-based estimations of RMR in laboratory mice (Mus musculus Linnaeus) of the C57Bl/6 and FVB background strains. Using this combined calorimeter, we determined that respirometry underestimates RMR of untreated 9- to 12-wk-old male mice by ∼10–12%. Quantitative and qualitative differences resulted between methods for untreated C57Bl/6 and FVB mice, C57Bl/6 mice treated with ketamine-xylazine anesthesia, and FVB mice with genetic deletion of the angiotensin II type 2 receptor. We conclude that respirometric methods underestimate RMR in mice in a magnitude that is similar to or greater than the desired RMR effects of novel therapeutics. Sole reliance upon respirometry to assess RMR in mice may lead to false quantitative and qualitative conclusions regarding the effects of novel interventions. Increased use of direct calorimetry for the assessment of RMR and confirmation of respirometry results and the reexamination of previously discarded potential obesity therapeutics are warranted. PMID:23964071
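One standard way respirometry converts gas exchange into an energy-expenditure estimate (not necessarily the exact equation used in this study) is the abbreviated Weir equation; its assumptions about substrate oxidation are among those the authors note can be violated in manipulated animals. The gas-exchange values below are illustrative.

```python
def weir_kcal_per_day(vo2_l_per_min, vco2_l_per_min):
    """Abbreviated Weir equation (protein oxidation ignored): energy
    expenditure from O2 consumption and CO2 production in L/min."""
    kcal_per_min = 3.941 * vo2_l_per_min + 1.106 * vco2_l_per_min
    return kcal_per_min * 60 * 24

# Mouse-scale example: VO2 of 1.5 mL/min (0.0015 L/min) at RER = 0.85:
ee = weir_kcal_per_day(0.0015, 0.0015 * 0.85)
```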

  12. Evaluation of Biostimulation (Nutrients) in hydrocarbons contaminated soils by respirometry

    International Nuclear Information System (INIS)

    Garcia, Erika; Roldan, Fabio; Garzon, Laura

    2011-01-01

    The biostimulation process was evaluated in a hydrocarbon-contaminated soil by respirometry after amendment with an inorganic compound fertilizer (ICF) (N:P:K 28:12:7) and simple inorganic salts (SIS) (NH4NO3 and K2HPO4). The soil was contaminated with oily sludge (40,000 mg TPH/kg dw). The oxygen uptake was measured using two respirometers (HACH 2173b and OXITOP PF 600) during thirteen days (n=3). Two treatments (ICF and SIS) and three controls (abiotic, reference substance, and without nutrients) were evaluated during the study. Physicochemical (pH, nutrients, and TPH) and microbiological analyses (heterotrophic and hydrocarbon-utilizing microorganisms) were obtained at the beginning and at the end of each assay. The highest respiration rates were recorded in the SIS treatment and the without-nutrients control: 802.28 and 850.72 mg O2 (kg dw)-1 d-1 in the HACH, and 936.65 and 502.05 mg O2 (kg dw)-1 d-1 in the OXITOP, respectively. These data indicate that the amendment of nutrients stimulated microbial metabolism. ICF had lower respiration rates (188.18 and 139.87 mg O2 (kg dw)-1 d-1 in the HACH and OXITOP, respectively) as well as lower counts; this could be attributed to ammonia toxicity.

  13. Cutaneous respirometry by dynamic measurement of mitochondrial oxygen tension for monitoring mitochondrial function in vivo.

    Science.gov (United States)

    Harms, Floor A; Voorbeijtel, Wilhelmina J; Bodmer, Sander I A; Raat, Nicolaas J H; Mik, Egbert G

    2013-09-01

    Progress in diagnosis and treatment of mitochondrial dysfunction in chronic and acute disease could greatly benefit from techniques for monitoring of mitochondrial function in vivo. In this study we demonstrate the feasibility of in vivo respirometry in skin. Mitochondrial oxygen measurements by means of oxygen-dependent delayed fluorescence of protoporphyrin IX are shown to provide a robust basis for measurement of local oxygen disappearance rate (ODR). The fundamental principles behind the technology are described, together with an analysis method for retrieval of respirometry data. The feasibility and reproducibility of this clinically useful approach are demonstrated in a series of rats. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Identification and quantification of nitrogen nutrient deficiency in the activated sludge process using respirometry

    NARCIS (Netherlands)

    Ning, Z.; Patry, G.G.; Spanjers, H.

    2000-01-01

    Experimental protocols to identify and quantify nitrogen nutrient deficiency in the activated sludge process were developed and tested using respirometry. Respirometric experiments showed that when a nitrogen nutrient deficient sludge is exposed to ammonia nitrogen, the oxygen uptake rate (OUR) of

  15. Cutaneous respirometry as novel technique to monitor mitochondrial function: A feasibility study in healthy volunteers

    NARCIS (Netherlands)

    F.A. Harms (Floor A.); R.J. Stolker (Robert); E.G. Mik (Egbert)

    2016-01-01

    Background: The protoporphyrin IX-triplet state lifetime technique (PpIX-TSLT) is proposed as a potential clinical non-invasive tool to monitor mitochondrial function. This technique has been evaluated in several animal studies. Mitochondrial respirometry allows measurement in vivo of

  16. Comparison of doubly labeled water with respirometry at low- and high-activity levels

    International Nuclear Information System (INIS)

    Westerterp, K.R.; Brouns, F.; Saris, W.H.; ten Hoor, F.

    1988-01-01

    In previous studies the doubly labeled water method for measuring energy expenditure in free-living humans has been validated against respirometry under sedentary conditions. In the present investigation, energy expenditure is measured simultaneously with doubly labeled water and respirometry at low- and high-activity levels. Over 6 days, five subjects were measured doing mainly sedentary activities like desk work; their average daily metabolic rate was 1.40 +/- 0.09 (SD) times sleeping metabolic rate. Four subjects were measured twice over 3.5 days, including 2 days with heavy bicycle ergometer work, resulting in an average daily metabolic rate of 2.61 +/- 0.25 (SD) times sleeping metabolic rate. At the low-activity level, energy expenditures from the doubly labeled water method were on the average 1.4 +/- 3.9% (SD) larger than those from respirometry. At the high-activity level, the doubly labeled water method yielded values that were 1.0 +/- 7.0% (SD) lower than those from respirometry. Results demonstrate the utility of the doubly labeled water method for the determination of energy expenditure in the range of activity levels in daily life
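The principle being validated above can be sketched with the simplified Lifson relation (ignoring the isotope-fractionation and pool-size corrections used in practice): 18O leaves the body as both water and CO2 while 2H leaves only as water, so CO2 production is proportional to the difference in their elimination rates. All names and numbers below are invented for illustration.

```python
def rco2_mol_per_day(body_water_mol, k_oxygen_per_day, k_hydrogen_per_day):
    """CO2 production rate (mol/day) from the body-water pool size and the
    elimination rate constants of the two isotopic labels."""
    return (body_water_mol / 2.0) * (k_oxygen_per_day - k_hydrogen_per_day)

# ~40 L of body water is roughly 2220 mol; plausible elimination constants:
rco2 = rco2_mol_per_day(2220.0, 0.12, 0.10)
```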

  17. Noninvasive diagnostics of mitochondrial disorders in isolated lymphocytes with high resolution respirometry

    Czech Academy of Sciences Publication Activity Database

    Pecina, Petr; Houšťková, H.; Mráček, Tomáš; Pecinová, Alena; Nůsková, Hana; Tesařová, M.; Hansíková, H.; Janota, J.; Zeman, J.; Houštěk, Josef

    2014-01-01

    Vol. 2, Dec (2014), pp. 62-71, ISSN 2214-6474. R&D Projects: GA MZd(CZ) NT12370; GA ČR(CZ) GAP303/11/0970; GA ČR(CZ) GB14-36804G. Institutional support: RVO:67985823. Keywords: lymphocytes * respirometry * oxidative phosphorylation * mitochondrial diseases * diagnostics. Subject RIV: FG - Pediatrics

  18. Comparability of slack water and Lagrangian flow respirometry methods for community metabolic measurements.

    Directory of Open Access Journals (Sweden)

    Emily C Shaw

    Coral reef calcification is predicted to decline as a result of ocean acidification and other anthropogenic stressors. The majority of studies predicting declines based on in situ relationships between environmental parameters and net community calcification rate have been location-specific, preventing accurate predictions for coral reefs globally. In this study, net community calcification and production were measured on a coral reef flat at One Tree Island, Great Barrier Reef, using Lagrangian flow respirometry and slack water methods. Net community calcification, daytime net photosynthesis and nighttime respiration were higher under the flow respirometry method, likely due to increased water flow relative to the slack water method. The two methods also varied in the degrees to which they were influenced by potential measurement uncertainties. The difference in the results from these two commonly used methods implies that some of the location-specific differences in coral reef community metabolism may be due to differences in measurement methods.

  19. Microplate-Based Evaluation of the Sugar Yield from Giant Reed, Giant Miscanthus and Switchgrass after Mild Chemical Pre-Treatments and Hydrolysis with Tailored Trichoderma Enzymatic Blends.

    Science.gov (United States)

    Cianchetta, Stefano; Bregoli, Luca; Galletti, Stefania

    2017-11-01

    Giant reed, miscanthus, and switchgrass are considered prominent lignocellulosic feedstocks to obtain fermentable sugars for biofuel production. The bioconversion into sugars requires a delignifying pre-treatment step followed by hydrolysis with cellulase and other accessory enzymes like xylanase, especially in the case of alkali pre-treatments, which retain the hemicellulose fraction. Blends richer in accessory enzymes than commercial mix can be obtained growing fungi on feedstock-based substrates, thus ten selected Trichoderma isolates, including the hypercellulolytic strain Trichoderma reesei Rut-C30, were grown on giant reed, miscanthus, or switchgrass-based substrates. The produced enzymes were used to saccharify the corresponding feedstocks, compared to a commercial enzymatic mix (6 FPU/g). Feedstocks were acid (H2SO4 0.2-2%, w/v) or alkali (NaOH 0.02-0.2%, w/v) pre-treated. A microplate-based approach was chosen for most of the experimental steps due to the large number of samples. The highest bioconversion was generally obtained with Trichoderma harzianum Or4/99 enzymes (78, 89, and 94% final sugar yields at 48 h for giant reed, miscanthus, and switchgrass, respectively), with significant increases compared to the commercial mix, especially with alkaline pre-treatments. The differences in bioconversion yields were only partially caused by xylanases (maximum R² = 0.5), indicating a role for other accessory enzymes.
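Percent sugar yields like those quoted above are conventionally computed against the theoretical sugar obtainable from the feedstock's carbohydrate fraction. A minimal sketch, assuming a glucan-only feedstock and the standard 162 → 180 g/mol mass gain on hydrolysis; the function and values are illustrative, not from the paper.

```python
def sugar_yield_pct(sugar_released_g, biomass_g, carbohydrate_fraction,
                    hydration=1.111):
    """Percent of theoretical sugar released. The 1.111 factor is the mass
    gain when anhydroglucose units (162 g/mol) hydrolyze to glucose (180 g/mol)."""
    theoretical_g = biomass_g * carbohydrate_fraction * hydration
    return 100.0 * sugar_released_g / theoretical_g

# e.g. 0.50 g glucose released from 1 g of biomass that is 60% glucan:
yield_pct = sugar_yield_pct(0.50, 1.0, 0.60)
```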

  20. Coupling HPLC-SPE-NMR with a microplate-based high-resolution antioxidant assay for efficient analysis of antioxidants in food--validation and proof-of-concept study with caper buds.

    Science.gov (United States)

    Wiese, Stefanie; Wubshet, Sileshi G; Nielsen, John; Staerk, Dan

    2013-12-15

    This work describes the coupling of a microplate-based antioxidant assay with a hyphenated system consisting of high-performance liquid chromatography-solid-phase extraction-nuclear magnetic resonance spectroscopy, i.e., HPLC-SPE-NMR/high-resolution antioxidant assay, for the analysis of complex food extracts. The applicability of the microplate-based antioxidant assay for high-resolution screening of common food phenolics as well as parameters related to their trapping efficiency, elution behavior, and recovery on/from SPE cartridges are described. It was found that the microplate-based high-resolution antioxidant assay is an attractive and easy implementable alternative to direct on-line screening methods. Furthermore, it was shown that Resin SH and Resin GP SPE material are superior to RP C18HD for trapping of phenolic compounds. Proof-of-concept study was performed with caper bud extract, revealing the most important antioxidants to be quercetin, kaempferol, rutin, kaempferol-3-O-β-rutinoside and N(1),N(5),N(10)-triphenylpropenoyl spermidine amides. Targeted isolation of the latter, and comprehensive NMR experiments showed them to be N(1),N(10)-di-(E)-caffeoyl-N(5)-p-(E)-coumaroyl spermidine, N(1)-(E)-caffeoyl-N(5),N(10)-di-p-(E)-coumaroyl spermidine, N(10)-(E)-caffeoyl-N(1),N(5)-di-p-(E)-coumaroyl spermidine, and N(1),N(5),N(10)-tri-p-(E)-coumaroyl spermidine amides. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Further analysis of open-respirometry systems: an a-compartmental mechanistic approach

    Directory of Open Access Journals (Sweden)

    Chaui-Berlinck J.G.

    2000-01-01

    A system is said to be "instantaneous" when for a given constant input an equilibrium output is obtained after a while. In the meantime, the output is changing from its initial value towards the equilibrium one. This is the transient period of the system and transients are important features of open-respirometry systems. During transients, one cannot compute the input amplitude directly from the output. The existing models (e.g., first or second order dynamics cannot account for many of the features observed in real open-respirometry systems, such as time lag. Also, these models do not explain what should be expected when a system is speeded up or slowed down. The purpose of the present study was to develop a mechanistic approach to the dynamics of open-respirometry systems, employing basic thermodynamic concepts. It is demonstrated that all the main relevant features of the output dynamics are due to and can be adequately explained by a distribution of apparent velocities within the set of molecules travelling along the system. The importance of the rate at which the molecules leave the sensor is explored for the first time. The study approaches the difference in calibrating a system with a continuous input and with a "unit impulse": the former truly reveals the dynamics of the system while the latter represents the first derivative (in time of the former and, thus, cannot adequately be employed in the apparent time-constant determination. Also, we demonstrate why the apparent order of the output changes with volume or flow.
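The classical first-order model that this paper argues is insufficient still illustrates the transient problem: for a step input x, the measured output approaches equilibrium as y(t) = x(1 − exp(−t/τ)), and the so-called "instantaneous" correction recovers the input from the output and its derivative, x(t) = y(t) + τ·dy/dt. The sketch below uses invented values and a finite-difference derivative.

```python
import math

def first_order_output(x, t, tau):
    """Washed-out response of a first-order system to a step input x."""
    return x * (1.0 - math.exp(-t / tau))

def instantaneous_correction(y_now, y_prev, dt, tau):
    """Recover the input from two successive output samples: y + tau*dy/dt."""
    return y_now + tau * (y_now - y_prev) / dt

tau = 30.0                                  # washout time constant, seconds
y1 = first_order_output(1.0, 60.0, tau)     # output 60 s after the step
y2 = first_order_output(1.0, 61.0, tau)     # one second later
x_est = instantaneous_correction(y2, y1, dt=1.0, tau=tau)
# x_est closely approximates the true step input of 1.0
```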

  2. Development of an operational tool for biomonitoring using constant pressure respirometry

    International Nuclear Information System (INIS)

    Zachritz, W.H. II; Morrow, J.

    1992-01-01

    The United States Environmental Protection Agency (US EPA) policy statement (FR 49, 9016, 9 March 1984) for the development of water quality based permit limitations includes toxicity testing of effluents as an important part of a water quality approach to controlling toxics (Pickering, 1988). To assure that state waters are free of toxics, both chemical and biological methods were recommended for assessing effluent quality. The US EPA validated bioassay procedures for toxicity testing of wastewater discharges use three distinctive organism groups: vertebrates, invertebrates, and algae (Weber, 1989). The specific species for these three groups are the fathead minnow, Pimephales promelas; the water flea, Ceriodaphnia dubia; and the green alga, Selenastrum capricornutum, respectively. Definitive testing estimates the concentration at which a certain percentage of organisms exhibit a certain response. The definitive test exposes several replicate groups of organisms to a series of effluent concentrations for a predetermined time period. The overall goal of this study is to develop an acceptable protocol for operational biomonitoring based on constant pressure respirometry for LANL. The specific objectives include: development of an appropriate toxicity testing protocol based on constant pressure respirometry for whole effluent toxicity testing, and evaluation of the protocol based on factors such as sensitivity, response time, cost of analysis, and simplicity of operation.

  3. Reliability of maximal mitochondrial oxidative phosphorylation in permeabilized fibers from the vastus lateralis employing high-resolution respirometry

    DEFF Research Database (Denmark)

    Cardinale, Daniele A; Gejl, Kasper D; Ørtenblad, Niels

    2018-01-01

    The purpose was to assess the impact of various factors on methodological errors associated with measurement of maximal oxidative phosphorylation (OXPHOS) in human skeletal muscle determined by high-resolution respirometry in saponin-permeabilized fibers. Biopsies were collected from 25 men...

  4. High resolution respirometry analysis of polyethylenimine-mediated mitochondrial energy crisis and cellular stress

    DEFF Research Database (Denmark)

    Hall, Arnaldur; Larsen, Anna Karina; Parhamifar, Ladan

    2013-01-01

    and spectrophotometry analysis of cytochrome c oxidase activity we were able to identify complex IV (cytochrome c oxidase) as a likely specific site of PEI mediated inhibition within the electron transport system. Unraveling the mechanisms of PEI-mediated mitochondrial energy crisis is central for combinatorial design...... of PEI-mediated plasma membrane damage and subsequent ATP leakage to the extracellular medium. Studies with freshly isolated mouse liver mitochondria corroborated with bioenergetic findings and demonstrated parallel polycation concentration- and time-dependent changes in state 2 and state 4o oxygen flux...... as well as lowered ADP phosphorylation (state 3) and mitochondrial ATP synthesis. Polycation-mediated reduction of electron transport system activity was further demonstrated in 'broken mitochondria' (freeze-thawed mitochondrial preparations). Moreover, by using both high-resolution respirometry...

  5. Mitochondrial Stress Tests Using Seahorse Respirometry on Intact Dictyostelium discoideum Cells.

    Science.gov (United States)

    Lay, Sui; Sanislav, Oana; Annesley, Sarah J; Fisher, Paul R

    2016-01-01

    Mitochondria not only play a critical and central role in providing metabolic energy to the cell but are also integral to the other cellular processes such as modulation of various signaling pathways. These pathways affect many aspects of cell physiology, including cell movement, growth, division, differentiation, and death. Mitochondrial dysfunction which affects mitochondrial bioenergetics and causes oxidative phosphorylation defects can thus lead to altered cellular physiology and manifest in disease. The assessment of the mitochondrial bioenergetics can thus provide valuable insights into the physiological state, and the alterations to the state of the cells. Here, we describe a method to successfully use the Seahorse XF(e)24 Extracellular Flux Analyzer to assess the mitochondrial respirometry of the cellular slime mold Dictyostelium discoideum.

  6. Impact of electrochemical treatment of soil washing solution on PAH degradation efficiency and soil respirometry

    International Nuclear Information System (INIS)

    Mousset, Emmanuel; Huguenot, David; Hullebusch, Eric D. van; Oturan, Nihal; Guibaud, Gilles; Esposito, Giovanni; Oturan, Mehmet A.

    2016-01-01

    The remediation of a genuinely PAH-contaminated soil was performed, for the first time, through a new and complete investigation, including PAH extraction followed by advanced oxidation treatment of the washing solution and its recirculation, and an analysis of the impact of the PAH extraction on soil respirometry. The study has been performed on the remediation of genuine PAH-contaminated soil, in the following three steps: (i) PAH extraction with soil washing (SW) techniques, (ii) PAH degradation with an electro-Fenton (EF) process, and (iii) recirculation of the partially oxidized effluent for another SW cycle. The following criteria were monitored during the successive washing cycles: PAH extraction efficiency, PAH oxidation rates and yields, extracting agent recovery, soil microbial activity, and pH of soil. Two representative extracting agents were compared: hydroxypropyl-beta-cyclodextrin (HPCD) and a non-ionic surfactant, Tween® 80. Six PAH with different numbers of rings were monitored: acenaphthene (ACE), phenanthrene (PHE), fluoranthene (FLA), pyrene (PYR), benzo(a)pyrene (BaP), and benzo(g,h,i)perylene (BghiP). Tween® 80 showed much better PAH extraction efficiency (after several SW cycles) than HPCD, regardless of the number of washing cycles. Based on successive SW experiments, a new mathematical relation taking into account the soil/water partition coefficient (Kd*) was established, and could predict the amount of each PAH extracted by the surfactant with a good correlation with experimental results (R² > 0.975). More HPCD was recovered (89%) than Tween® 80 (79%), while the monitored pollutants were completely degraded (>99%) after 4 h and 8 h, respectively. Even after being washed with partially oxidized solutions, the Tween® 80 solutions extracted significantly more PAH than HPCD and promoted better soil microbial activity, with higher oxygen consumption rates. Moreover, neither the oxidation by-products nor the acidic media (p

  7. Assessment of mitochondrial functions in Daphnia pulex clones using high-resolution respirometry.

    Science.gov (United States)

    Kake-Guena, Sandrine A; Touisse, Kamal; Vergilino, Roland; Dufresne, France; Blier, Pierre U; Lemieux, Hélène

    2015-06-01

    The objectives of our study were to adapt a method to measure mitochondrial function in intact mitochondria from the small crustacean Daphnia pulex and to validate if this method was sensitive enough to characterize mitochondrial metabolism in clones of the pulex complex differing in ploidy levels, mitochondrial DNA haplotypes, and geographic origins. Daphnia clones belonging to the Daphnia pulex complex represent a powerful model to delineate the link between mitochondrial DNA evolution and mitochondrial phenotypes, as single genotypes with divergent mtDNA can be grown under various experimental conditions. Our study included two diploid clones from temperate environments and two triploid clones from subarctic environments. The whole animal permeabilization and measurement of respiration with high-resolution respirometry enabled the measurement of the functional capacity of specific mitochondrial complexes in four clones. When expressing the activity as ratios, our method detected significant interclonal variations. In the triploid subarctic clone from Kuujjurapik, a higher proportion of the maximal physiological oxidative phosphorylation (OXPHOS) capacity of mitochondria was supported by complex II, and a lower proportion by complex I. The triploid subarctic clone from Churchill (Manitoba) showed the lowest proportion of the maximal OXPHOS supported by complex II. Additional studies are required to determine if these differences in mitochondrial functions are related to differences in mitochondrial haplotypes or ploidy level and if they might be associated with fitness divergences and therefore selective value. © 2015 Wiley Periodicals, Inc.

  8. The “Flexi-Chamber”: A Novel Cost-Effective In Situ Respirometry Chamber for Coral Physiological Measurements

    Science.gov (United States)

    Camp, Emma F.; Krause, Sophie-Louise; Santos, Lourianne M. F.; Naumann, Malik S.; Kikuchi, Ruy K. P.; Smith, David J.; Wild, Christian; Suggett, David J.

    2015-01-01

    Coral reefs are threatened worldwide, with environmental stressors increasingly affecting the ability of reef-building corals to sustain growth from calcification (G), photosynthesis (P) and respiration (R). These processes support the foundation of coral reefs by directly influencing biogeochemical nutrient cycles and complex ecological interactions and therefore represent key knowledge required for effective reef management. However, metabolic rates are not trivial to quantify and typically rely on the use of cumbersome in situ respirometry chambers and/or the need to remove material and examine ex situ, thereby fundamentally limiting the scale, resolution and possibly the accuracy of the rate data. Here we describe a novel low-cost in situ respirometry bag that mitigates many constraints of traditional glass and plexi-glass incubation chambers. We subsequently demonstrate the effectiveness of our novel “Flexi-Chamber” approach via two case studies: 1) the Flexi-Chamber provides values of P, R and G for the reef-building coral Siderastrea cf. stellata collected from reefs close to Salvador, Brazil, which were statistically similar to values collected from a traditional glass respirometry vessel; and 2) wide-scale application of obtaining P, R and G rates for different species across different habitats to obtain inter- and intra-species differences. Our novel cost-effective design allows us to increase sampling scale of metabolic rate measurements in situ without the need for destructive sampling and thus significantly expands on existing research potential, not only for corals as we have demonstrated here, but also other important benthic groups. PMID:26448294

  9. Impact of electrochemical treatment of soil washing solution on PAH degradation efficiency and soil respirometry.

    Science.gov (United States)

    Mousset, Emmanuel; Huguenot, David; van Hullebusch, Eric D; Oturan, Nihal; Guibaud, Gilles; Esposito, Giovanni; Oturan, Mehmet A

    2016-04-01

    The remediation of a genuinely PAH-contaminated soil was performed, for the first time, through a new and complete investigation, including PAH extraction followed by advanced oxidation treatment of the washing solution and its recirculation, and an analysis of the impact of the PAH extraction on soil respirometry. The study has been performed on the remediation of genuine PAH-contaminated soil, in the following three steps: (i) PAH extraction with soil washing (SW) techniques, (ii) PAH degradation with an electro-Fenton (EF) process, and (iii) recirculation of the partially oxidized effluent for another SW cycle. The following criteria were monitored during the successive washing cycles: PAH extraction efficiency, PAH oxidation rates and yields, extracting agent recovery, soil microbial activity, and pH of soil. Two representative extracting agents were compared: hydroxypropyl-beta-cyclodextrin (HPCD) and a non-ionic surfactant, Tween® 80. Six PAH with different numbers of rings were monitored: acenaphthene (ACE), phenanthrene (PHE), fluoranthene (FLA), pyrene (PYR), benzo(a)pyrene (BaP), and benzo(g,h,i)perylene (BghiP). Tween® 80 showed much better PAH extraction efficiency (after several SW cycles) than HPCD, regardless of the number of washing cycles. Based on successive SW experiments, a new mathematical relation taking into account the soil/water partition coefficient (Kd*) was established, and could predict the amount of each PAH extracted by the surfactant with a good correlation with experimental results (R² > 0.975). More HPCD was recovered (89%) than Tween® 80 (79%), while the monitored pollutants were completely degraded (>99%) after 4 h and 8 h, respectively. Even after being washed with partially oxidized solutions, the Tween® 80 solutions extracted significantly more PAH than HPCD and promoted better soil microbial activity, with higher oxygen consumption rates. Moreover, neither the oxidation by-products nor the acidic media (p
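    The abstract does not reproduce the Kd* relation itself, so as an illustration only, the sketch below shows how a soil/water partition coefficient maps to a per-cycle extraction fraction under a simple equilibrium mass balance. The function names and the mass-balance form are assumptions, not the paper's model:

```python
def fraction_extracted_per_cycle(kd_star, soil_mass_kg, water_volume_l):
    """Fraction of a PAH mobilised into the washing solution in one
    soil-washing cycle. At equilibrium C_soil = Kd* x C_water, so the
    mass fraction in the water phase is V_w / (V_w + Kd* x m_soil)."""
    return water_volume_l / (water_volume_l + kd_star * soil_mass_kg)


def remaining_after_cycles(kd_star, soil_mass_kg, water_volume_l, n_cycles):
    """PAH fraction still sorbed to the soil after n successive washes."""
    f = fraction_extracted_per_cycle(kd_star, soil_mass_kg, water_volume_l)
    return (1.0 - f) ** n_cycles
```

    Under this toy model, strongly sorbing (high-Kd*) PAHs such as BaP would need more washing cycles than weakly sorbing compounds such as ACE, consistent with the successive-SW design described above.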

  10. Cutaneous Respirometry as Novel Technique to Monitor Mitochondrial Function: A Feasibility Study in Healthy Volunteers

    Science.gov (United States)

    Stolker, Robert Jan; Mik, Egbert

    2016-01-01

    Background The protoporphyrin IX-triplet state lifetime technique (PpIX-TSLT) is proposed as a potential clinical non-invasive tool to monitor mitochondrial function. This technique has been evaluated in several animal studies. Mitochondrial respirometry allows measurement in vivo of mitochondrial oxygen tension (mitoPO2) and mitochondrial oxygen consumption (mitoVO2) in skin. This study describes the first use of a clinical prototype in skin of humans. Methods The clinical prototype was tested in 30 healthy volunteers. A self-adhesive patch containing 2 mg 5-aminolevulinic acid (ALA) was applied on the skin of the anterior chest wall (sternal) for induction of mitochondrial protoporphyrin IX and was protected from light for 5 h. MitoPO2 was measured by means of oxygen-dependent delayed fluorescence of protoporphyrin IX. MitoVO2 was determined by dynamic mitoPO2 measurements on the primed skin, while locally blocking oxygen supply by applying local pressure with the measurement probe. MitoPO2 was recorded before and during a 60-s period of compression of the microcirculation, at an interval of 1 Hz. Oxygen consumption (i.e. the local oxygen disappearance rate) was calculated from the decay of the mitoPO2 slope. Results Oxygen-dependent delayed fluorescence measurements were successfully performed in the skin of 27 volunteers. The average value (± SD) of mitoPO2 was 44 ± 17 mmHg and mean mitoVO2 values were 5.8 ± 2.3 and 6.1 ± 1.6 mmHg s-1 at a skin temperature of 34°C and 40°C, respectively. No major discomfort during measurement and no long-term dermatological abnormalities were reported in a survey performed 1 month after measurements. Conclusion These results show that the clinical prototype allows measurement of mitochondrial oxygenation and oxygen consumption in humans. The development of this clinically applicable device offers opportunities for further evaluation of the technique in humans and the start of first clinical studies. PMID:27455073
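    The mitoVO2 derivation above (the oxygen disappearance rate taken from the decay of the mitoPO2 signal sampled at 1 Hz during compression) can be sketched as a least-squares slope over the first seconds of the decay. This is a minimal illustration; the fit window and method are assumptions, not the prototype's exact analysis:

```python
def oxygen_disappearance_rate(mito_po2, dt=1.0, window=8):
    """Estimate mitoVO2 (mmHg/s) as the negated initial slope of the
    mitoPO2 decay recorded every `dt` seconds after local pressure
    blocks the oxygen supply. Ordinary least squares over the first
    `window` samples."""
    n = min(window, len(mito_po2))
    t = [i * dt for i in range(n)]
    y = mito_po2[:n]
    t_mean = sum(t) / n
    y_mean = sum(y) / n
    slope = (sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y))
             / sum((ti - t_mean) ** 2 for ti in t))
    return -slope  # positive when mitoPO2 falls
```

    For a trace starting near the reported mean of 44 mmHg and falling linearly, the function recovers the decay rate directly in mmHg per second.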

  11. Measuring maximum and standard metabolic rates using intermittent-flow respirometry: a student laboratory investigation of aerobic metabolic scope and environmental hypoxia in aquatic breathers.

    Science.gov (United States)

    Rosewarne, P J; Wilson, J M; Svendsen, J C

    2016-01-01

    Metabolic rate is one of the most widely measured physiological traits in animals and may be influenced by both endogenous (e.g. body mass) and exogenous factors (e.g. oxygen availability and temperature). Standard metabolic rate (SMR) and maximum metabolic rate (MMR) are two fundamental physiological variables providing the floor and ceiling in aerobic energy metabolism. The total amount of energy available between these two variables constitutes the aerobic metabolic scope (AMS). A laboratory exercise aimed at an undergraduate level physiology class, which details the appropriate data acquisition methods and calculations to measure oxygen consumption rates in rainbow trout Oncorhynchus mykiss, is presented here. Specifically, the teaching exercise employs intermittent flow respirometry to measure SMR and MMR, derives AMS from the measurements and demonstrates how AMS is affected by environmental oxygen. Students' results typically reveal a decline in AMS in response to environmental hypoxia. The same techniques can be applied to investigate the influence of other key factors on metabolic rate (e.g. temperature and body mass). Discussion of the results develops students' understanding of the mechanisms underlying these fundamental physiological traits and the influence of exogenous factors. More generally, the teaching exercise outlines essential laboratory concepts in addition to metabolic rate calculations, data acquisition and unit conversions that enhance competency in quantitative analysis and reasoning. Finally, the described procedures are generally applicable to other fish species or aquatic breathers such as crustaceans (e.g. crayfish) and provide an alternative to using higher (or more derived) animals to investigate questions related to metabolic physiology. © 2016 The Fisheries Society of the British Isles.
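    The core calculations of such an exercise, converting the oxygen decline of a closed measurement phase into a mass-specific oxygen consumption rate and then deriving aerobic scope, can be sketched as follows. The function signatures and units are illustrative conventions, not the article's worksheet:

```python
def mo2(o2_slope_mg_per_l_h, respirometer_volume_l, animal_volume_l, mass_kg):
    """Mass-specific oxygen consumption rate (mg O2 kg^-1 h^-1) from the
    oxygen decline slope of one closed phase of intermittent-flow
    respirometry; the animal's body volume is subtracted from the
    respirometer volume to give the effective water volume."""
    effective_volume_l = respirometer_volume_l - animal_volume_l
    return abs(o2_slope_mg_per_l_h) * effective_volume_l / mass_kg


def aerobic_metabolic_scope(smr, mmr):
    """Absolute aerobic scope: the aerobic energy budget between the
    metabolic floor (SMR) and ceiling (MMR). Environmental hypoxia
    typically lowers MMR and therefore shrinks this scope."""
    return mmr - smr
```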

  12. Assessing honeybee and wasp thermoregulation and energetics-New insights by combination of flow-through respirometry with infrared thermography

    Energy Technology Data Exchange (ETDEWEB)

    Stabentheiner, Anton, E-mail: anton.stabentheiner@uni-graz.at [Institut fuer Zoologie, Karl-Franzens-Universitaet Graz, Universitaetsplatz 2, A-8010 Graz (Austria); Kovac, Helmut, E-mail: he.kovac@uni-graz.at [Institut fuer Zoologie, Karl-Franzens-Universitaet Graz, Universitaetsplatz 2, A-8010 Graz (Austria); Hetz, Stefan K. [Department of Animal Physiology/Systems Neurobiology and Neural Computation, Philippstrasse 13-Leonor Michaelis Haus, Humboldt-Universitaet zu Berlin, 10115 Berlin (Germany); Kaefer, Helmut; Stabentheiner, Gabriel [Institut fuer Zoologie, Karl-Franzens-Universitaet Graz, Universitaetsplatz 2, A-8010 Graz (Austria)

    2012-04-20

    Highlights: ► We demonstrate the benefits of a combined use of infrared thermography with respiratory measurements in insect ecophysiological research. ► Infrared thermography enables repeated investigation of behaviour and thermoregulation without behavioural impairment. ► Comparison with respirometry brings new insights into the mechanisms of energetic optimisation of bee and wasp foraging. ► Combination of methods improves interpretation of respiratory traces in determinations of insect critical thermal limits. - Abstract: Endothermic insects like honeybees and some wasps have to cope with an enormous heat loss during foraging because of their small body size in comparison to endotherms like mammals and birds. The enormous costs of thermoregulation call for optimisation. Honeybees and wasps differ in their critical thermal maximum, which enables the bees to kill the wasps by heat. We here demonstrate the benefits of a combined use of body temperature measurement with infrared thermography, and respiratory measurements of energy turnover (O2 consumption or CO2 production via flow-through respirometry) to answer questions of insect ecophysiological research, and we describe calibrations to receive accurate results. To assess the question of what foraging honeybees optimise, their body temperature was compared with their energy turnover. Honeybees foraging from an artificial flower with unlimited sucrose flow increased body surface temperature and energy turnover with profitability of foraging (sucrose content of the food; 0.5 or 1.5 mol/L). Costs of thermoregulation, however, were rather independent of ambient temperature (13-30 °C). External heat gain by solar radiation was used to increase body temperature. This optimised foraging energetics by increasing suction speed. In determinations of insect respiratory critical thermal limits, the combined use of respiratory measurements and thermography made possible a more conclusive interpretation of respiratory traces.

  13. Utilisation of respirometry to assess organic matter reduction of effluents from the Camaçari industrial complex (BA, Brazil)

    Directory of Open Access Journals (Sweden)

    Carla A. Oliveira

    2007-03-01

    The treatment efficiency of industrial effluents, after biological treatment by activated sludge in aeration tanks (AT), was assessed through respirometry tests at Cetrel's wastewater treatment plant (WTP). Samples of the equalised effluent (EE), prior to treatment, and of the treated effluent (TE), after treatment, were analysed. Twenty batches of bioassays were carried out for the aeration tanks (AT-2, AT-3 and AT-4). Each batch consisted of a basic test containing only the mixed liquor, a basic test with peptone added, a test using EE and a test using TE. The data showed that there was no statistically significant difference (p>0.05) in the respiration activity between the aeration tanks. Regarding the specific oxygen uptake rate, there was a mean reduction of 70.8% between the tests performed with EE and TE. The results demonstrated that respirometry tests could successfully assess the efficiency of the activated sludge process and, therefore, be adopted as a tool for monitoring the effluents of the Camaçari industrial complex WTP.
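    The 70.8% figure above is a relative reduction in the specific oxygen uptake rate (oxygen uptake normalised to biomass) between EE and TE tests. The arithmetic can be sketched as below; the numeric inputs used for checking are illustrative, not the study's measurements:

```python
def specific_our(our_mg_l_h, biomass_g_l):
    """Specific oxygen uptake rate (mg O2 per g biomass per h):
    volumetric oxygen uptake rate divided by the mixed-liquor
    biomass concentration."""
    return our_mg_l_h / biomass_g_l


def percent_reduction(rate_ee, rate_te):
    """Relative reduction (%) between the rates measured with equalised
    effluent (EE) and treated effluent (TE)."""
    return 100.0 * (rate_ee - rate_te) / rate_ee
```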

  14. Assessing honeybee and wasp thermoregulation and energetics-New insights by combination of flow-through respirometry with infrared thermography.

    Science.gov (United States)

    Stabentheiner, Anton; Kovac, Helmut; Hetz, Stefan K; Käfer, Helmut; Stabentheiner, Gabriel

    2012-04-20

    Endothermic insects like honeybees and some wasps have to cope with an enormous heat loss during foraging because of their small body size in comparison to endotherms like mammals and birds. The enormous costs of thermoregulation call for optimisation. Honeybees and wasps differ in their critical thermal maximum, which enables the bees to kill the wasps by heat. We here demonstrate the benefits of a combined use of body temperature measurement with infrared thermography, and respiratory measurements of energy turnover (O2 consumption or CO2 production via flow-through respirometry) to answer questions of insect ecophysiological research, and we describe calibrations to receive accurate results. To assess the question of what foraging honeybees optimise, their body temperature was compared with their energy turnover. Honeybees foraging from an artificial flower with unlimited sucrose flow increased body surface temperature and energy turnover with profitability of foraging (sucrose content of the food; 0.5 or 1.5 mol/L). Costs of thermoregulation, however, were rather independent of ambient temperature (13-30 °C). External heat gain by solar radiation was used to increase body temperature. This optimised foraging energetics by increasing suction speed. In determinations of insect respiratory critical thermal limits, the combined use of respiratory measurements and thermography made possible a more conclusive interpretation of respiratory traces.
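    Converting a flow-through respirometry CO2 trace into metabolic power, as needed for the energy-turnover comparisons above, can be sketched as follows. For sucrose foragers the respiratory quotient is close to 1; the 21.1 J per ml CO2 caloric equivalent for carbohydrate catabolism is a commonly used value assumed here, not one taken from the paper:

```python
def power_from_co2(vco2_ml_per_h, joules_per_ml_co2=21.1):
    """Metabolic power (mW) from a CO2 production rate (ml/h), assuming
    pure carbohydrate catabolism (RQ ~ 1). 1 mW = 3.6 J/h, so the
    hourly energy turnover is divided by 3.6."""
    joules_per_hour = vco2_ml_per_h * joules_per_ml_co2
    return joules_per_hour / 3.6
```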

  15. Respirometry in activated sludge

    NARCIS (Netherlands)

    Spanjers, H.

    1993-01-01

    The purpose of the study was (1) to develop a respiration meter capable of continuously measuring, using different procedures, the oxygen uptake rate of activated sludge and (2) to expand knowledge about respiration related characteristics of wastewater and activated sludge.


  16. High-resolution respirometry of fine-needle muscle biopsies in pre-manifest Huntington's disease expansion mutation carriers shows normal mitochondrial respiratory function.

    Directory of Open Access Journals (Sweden)

    Eva Buck

    Alterations in mitochondrial respiration are an important hallmark of Huntington's disease (HD), one of the most common monogenetic causes of neurodegeneration. The ubiquitous expression of the disease-causing mutant huntingtin gene raises the prospect that mitochondrial respiratory deficits can be detected in skeletal muscle. While this tissue is readily accessible in humans, transgenic animal models offer the opportunity to cross-validate findings and allow for comparisons across organs, including the brain. The integrated respiratory chain function of the human vastus lateralis muscle was measured by high-resolution respirometry (HRR) in freshly taken fine-needle biopsies from seven pre-manifest HD expansion mutation carriers and nine controls. The respiratory parameters were unaffected. For comparison, skeletal muscle isolated from HD knock-in mice (HdhQ111) as well as a broader spectrum of tissues including cortex, liver and heart muscle were examined by HRR. Significant changes of mitochondrial respiration in the HdhQ knock-in mouse model were restricted to the liver and the cortex. Mitochondrial mass, as quantified by mitochondrial DNA copy number and citrate synthase activity, was stable in murine HD-model tissue compared to control. mRNA levels of key enzymes were determined to characterize mitochondrial metabolic pathways in HdhQ mice. We demonstrated the feasibility of performing high-resolution respirometry measurements on small human HD muscle biopsies. Furthermore, we conclude that alterations in respiratory parameters of pre-manifest human muscle biopsies are rather limited and mirrored by a similar absence of marked alterations in HdhQ skeletal muscle. In contrast, the HdhQ111 murine cortex and liver did show respiratory alterations, highlighting the tissue-specific nature of mutant huntingtin effects on respiration.

  17. Monitoring and controlling the biological purification process in a waste water treatment plant using a respirometry analyser; Vigilancia y control del proceso de la depuracion biologica en una EDAR por medio de un analizador de respirometria

    Energy Technology Data Exchange (ETDEWEB)

    Serrano, J. E.

    2004-07-01

    In biological wastewater treatment, we have to take into account that activated sludge is a living, respiring process, and a lack of bioactivity information might cause serious confusion about control criteria for the biological reactor. For this reason, obtaining bioactivity information in a timely manner through respiration analysis would be a real breakthrough for better process control. Identifying respiration rates and calculating their derived parameters are therefore the guiding principles of respirometry, and these can be considered the most sensitive variables on the basis of which activated sludge process theory can be validated. (Author)

  18. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  19. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
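    The watts-per-organism framing can be made concrete with a toy classifier; the factor separating "extreme" environments (where supply barely exceeds demand) from "plush" ones is an illustrative assumption, not a value from the paper:

```python
def classify_habitability(power_supply_w, power_demand_w, plush_factor=10.0):
    """Classify an environment by comparing power supply and demand per
    organism (watts per organism). `plush_factor` (supply >= 10x demand
    counts as plush) is a hypothetical threshold for illustration."""
    if power_supply_w < power_demand_w:
        return "uninhabitable"
    if power_supply_w < plush_factor * power_demand_w:
        return "extreme"
    return "plush"
```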

  20. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  1. Finding Biomass Degrading Enzymes Through an Activity-Correlated Quantitative Proteomics Platform (ACPP)

    Science.gov (United States)

    Ma, Hongyan; Delafield, Daniel G.; Wang, Zhe; You, Jianlan; Wu, Si

    2017-04-01

    The microbial secretome, known as a pool of biomass (i.e., plant-based materials) degrading enzymes, can be utilized to discover industrial enzyme candidates for biofuel production. Proteomics approaches have been applied to discover novel enzyme candidates by comparing protein expression profiles with enzyme activity of the whole secretome under different growth conditions. However, confident "active" enzyme assignment requires an activity measurement for each individual enzyme candidate, which has remained an unmet need. To address this challenge, we have developed an Activity-Correlated Quantitative Proteomics Platform (ACPP) that systematically correlates protein-level enzymatic activity patterns and protein elution profiles using a label-free quantitative proteomics approach. The ACPP optimized a high performance anion exchange separation for efficiently fractionating complex protein samples while preserving enzymatic activities. The enzymatic activity patterns detected in sequential fractions using microplate-based assays were cross-correlated with protein elution profiles using a customized pattern-matching algorithm with a correlation R-score. The ACPP has been successfully applied to the identification of two types of "active" biomass-degrading enzymes (i.e., starch hydrolysis enzymes and cellulose hydrolysis enzymes) from the Aspergillus niger secretome in a multiplexed fashion. By determining the protein elution profiles of 156 proteins in the A. niger secretome, we confidently identified 1,4-α-glucosidase as the major "active" starch hydrolysis enzyme (R = 0.96) and endoglucanase as the major "active" cellulose hydrolysis enzyme (R = 0.97). The results demonstrated that the ACPP facilitates the discovery of bioactive enzymes from complex protein samples in a high-throughput, multiplexed, and untargeted fashion.
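    At its core, the R-score above measures how well a fraction-wise enzymatic activity pattern tracks a protein's elution profile. A minimal Pearson-correlation sketch follows; the published ACPP pattern-matching algorithm may differ in detail:

```python
def r_score(activity_pattern, elution_profile):
    """Pearson correlation between an enzymatic activity pattern and a
    protein elution profile across sequential chromatographic fractions;
    a high R flags the protein as a candidate 'active' enzyme."""
    n = len(activity_pattern)
    mean_a = sum(activity_pattern) / n
    mean_e = sum(elution_profile) / n
    cov = sum((a - mean_a) * (e - mean_e)
              for a, e in zip(activity_pattern, elution_profile))
    sd_a = sum((a - mean_a) ** 2 for a in activity_pattern) ** 0.5
    sd_e = sum((e - mean_e) ** 2 for e in elution_profile) ** 0.5
    return cov / (sd_a * sd_e)
```

    A protein whose elution profile rises and falls in step with the measured activity scores near 1, mirroring the R = 0.96-0.97 assignments reported above.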

  2. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department have begun.

  3. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiencies, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used for calculation of an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2 D%) and patients with lymphedema (2.05 ± 2.5 D%) was highly significant using praefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)

  4. Quantitative Thermochronology

    Science.gov (United States)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach built around a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate them to Earth processes; the book supplies essential background material to aid understanding and use of thermochronological data, provides a thorough treatise on numerical modeling of heat transport in the Earth's crust, and is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.

  5. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  6. Microplate-based method for high-throughput screening of microalgae growth potential

    DEFF Research Database (Denmark)

    Van Wagenen, Jonathan; Holdt, Susan Løvstad; De Francisci, Davide

    2014-01-01

    Microalgae cultivation conditions in microplates will differ from large-scale photobioreactors in crucial parameters such as light profile, mixing and gas transfer. Hence volumetric productivity (Pv) measurements made in microplates cannot be directly scaled up. Here we demonstrate that it is pos...

  7. Investigation of Mitochondrial Dysfunction by Sequential Microplate-Based Respiration Measurements from Intact and Permeabilized Neurons

    Science.gov (United States)

    Clerc, Pascaline; Polster, Brian M.

    2012-01-01

    Mitochondrial dysfunction is a component of many neurodegenerative conditions. Measurement of oxygen consumption from intact neurons enables evaluation of mitochondrial bioenergetics under conditions that are more physiologically realistic compared to isolated mitochondria. However, mechanistic analysis of mitochondrial function in cells is complicated by changing energy demands and lack of substrate control. Here we describe a technique for sequentially measuring respiration from intact and saponin-permeabilized cortical neurons on single microplates. This technique allows control of substrates to individual electron transport chain complexes following permeabilization, as well as side-by-side comparisons to intact cells. To illustrate the utility of the technique, we demonstrate that inhibition of respiration by the drug KB-R7943 in intact neurons is relieved by delivery of the complex II substrate succinate, but not by complex I substrates, via acute saponin permeabilization. In contrast, methyl succinate, a putative cell permeable complex II substrate, failed to rescue respiration in intact neurons and was a poor complex II substrate in permeabilized cells. Sequential measurements of intact and permeabilized cell respiration should be particularly useful for evaluating indirect mitochondrial toxicity due to drugs or cellular signaling events which cannot be readily studied using isolated mitochondria. PMID:22496810
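    Rates in such experiments are typically reported relative to the intact-cell basal rate. A minimal sketch of that bookkeeping follows; the rescue threshold is an illustrative assumption, not a criterion from the paper:

```python
def percent_of_basal(rate, basal_rate):
    """Express an oxygen consumption rate as a percentage of the basal
    (intact, unstimulated) rate."""
    return 100.0 * rate / basal_rate


def substrate_rescues(substrate_rate, basal_rate, threshold_percent=80.0):
    """Hypothetical criterion: a substrate delivered after saponin
    permeabilization 'rescues' inhibited respiration if it restores the
    rate to at least `threshold_percent` of the intact-cell basal rate
    (cf. succinate relieving KB-R7943 inhibition in the study)."""
    return percent_of_basal(substrate_rate, basal_rate) >= threshold_percent
```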

  8. Inline chemical process analysis in micro-plants based on thermoelectric flow and impedimetric sensors

    International Nuclear Information System (INIS)

    Jacobs, T; Kutzner, C; Hauptmann, P; Kropp, M; Lang, W; Brokmann, G; Steinke, A; Kienle, A

    2010-01-01

    In micro-plants, as used in chemical micro-process engineering, integrated inline analytics is regarded as an important factor for the development and optimization of chemical processes. Up to now, there has been a lack of sensitive, robust and low-priced micro-sensors for monitoring mixing and chemical conversion in micro-fluidic channels. In this paper a novel sensor system combining an impedimetric sensor and a novel pressure-stable thermoelectric flow sensor for monitoring chemical reactions in micro-plants is presented. The CMOS-technology-based impedimetric sensor mainly consists of two capacitively coupled interdigital electrodes on a silicon chip. The thermoelectric flow sensor consists of a heater in between two thermopiles on a perforated membrane. The pulsed and constant current feeds of the heater were analyzed. Both sensors enable the analysis of chemical conversion by means of changes in the thermal and electrical properties of the liquid. The homogeneously catalyzed synthesis of n-butyl acetate was studied as a chemical model system. Experimental results revealed that in an overpressure regime, relative changes of less than 1% in the thermal and electrical properties can be detected. Furthermore, the transition from one to two liquid phases, accompanied by the change in slug flow conditions, could be reproducibly detected.

  9. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  10. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  11. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  12. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which cover the basic concepts of analytical chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, an introduction to volumetric analysis, an outline of acid-base titration with experimental examples, chelate titration, an introduction to oxidation-reduction titration with titration curves and diazotization titration, precipitation titration, electrometric titration and quantitative analysis.

  13. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b, which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative...... equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...
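The indexed equality a =ε b described in the abstract behaves like a distance. As a hedged reconstruction from the abstract alone (not the paper's full axiom system), the core rules of such a quantitative equational theory can be written as:

```latex
\begin{align*}
&\text{(Refl)}   && \vdash a =_{0} a\\
&\text{(Symm)}   && a =_{\varepsilon} b \vdash b =_{\varepsilon} a\\
&\text{(Triang)} && a =_{\varepsilon} b,\; b =_{\varepsilon'} c \vdash a =_{\varepsilon+\varepsilon'} c\\
&\text{(Max)}    && a =_{\varepsilon} b \vdash a =_{\varepsilon'} b
  \quad\text{for } \varepsilon' \ge \varepsilon
\end{align*}
```

Read this way, the indices satisfy exactly the reflexivity, symmetry and triangle-inequality properties of a pseudometric on terms, which is why the free algebras of such theories yield metric structures.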

  14. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  15. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  16. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

    Quantitative Secondary Electron Detection (QSED) using an array of solid-state-device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples (FIG. 3). Methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid state detectors. The array senses the secondary electrons with a plurality of solid state detectors and counts them with a time-to-digital converter circuit operated in counter mode.

  17. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. With its huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modifications of proteins and metabolic and enzymatic methods of isotope labeling.

  18. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation of the language often employed in the communication of climate model output: a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. Third, a general approach for evaluating the relevance of quantitative climate model output.

  19. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of ECB quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  20. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
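The extrapolation idea can be sketched numerically. In the high-temperature Debye limit the recoil-free fraction is f = exp(−k²⟨x²⟩) with ⟨x²⟩ roughly proportional to T, so fitting ln(area) against temperature and taking the T = 0 intercept estimates the absorption area at ⟨x²⟩ = 0. The data and decay constants below are synthetic illustrations, not values from the paper:

```python
import numpy as np

def area_at_zero_msd(temps, areas):
    """Extrapolate ln(area) linearly in T to T = 0.

    Assumes the high-temperature limit, where ln(area) is approximately
    linear in T; the T = 0 intercept then estimates the area at <x^2> = 0.
    """
    slope, intercept = np.polyfit(temps, np.log(areas), 1)
    return np.exp(intercept)

# Hypothetical data for two sites with different mean-square displacements
T = np.array([80.0, 150.0, 220.0, 295.0])        # measurement temperatures (K)
site_a = 10.0 * np.exp(-0.002 * T)               # synthetic areas, true T=0 area = 10
site_b = 10.0 * np.exp(-0.004 * T)               # "softer" site, same true area

# At <x^2> = 0 the two sites should give equal areas despite different f(T)
ratio = area_at_zero_msd(T, site_a) / area_at_zero_msd(T, site_b)
```

On the synthetic data the extrapolated areas agree even though the raw areas at any finite temperature differ, which is exactly the site-comparison problem the abstract describes.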

  1. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  2. Nonlinear analysis of 0-3 polarized PLZT microplate based on the new modified couple stress theory

    Science.gov (United States)

    Wang, Liming; Zheng, Shijie

    2018-02-01

    In this study, based on the new modified couple stress theory, the size-dependent model for nonlinear bending analysis of a pure 0-3 polarized PLZT plate is developed for the first time. The equilibrium equations are derived from a variational formulation based on the potential energy principle and the new modified couple stress theory. The Galerkin method is adopted to derive the nonlinear algebraic equations from governing differential equations. And then the nonlinear algebraic equations are solved by using Newton-Raphson method. After simplification, the new model includes only a material length scale parameter. In addition, numerical examples are carried out to study the effect of material length scale parameter on the nonlinear bending of a simply supported pure 0-3 polarized PLZT plate subjected to light illumination and uniform distributed load. The results indicate the new model is able to capture the size effect and geometric nonlinearity.
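The solution strategy in this abstract (Galerkin discretization of the governing equations, then Newton-Raphson iteration on the resulting nonlinear algebraic system) can be sketched generically. The cubic single-mode residual below is a hypothetical stand-in for a hardening plate response, not the paper's actual equations:

```python
import numpy as np

def newton_raphson(residual, jacobian, a0, tol=1e-10, max_iter=50):
    """Solve residual(a) = 0 for the vector of Galerkin coefficients a."""
    a = a0.copy()
    for _ in range(max_iter):
        r = residual(a)
        if np.linalg.norm(r) < tol:
            break
        # Newton step: solve J * da = r, then update a <- a - da
        a -= np.linalg.solve(jacobian(a), r)
    return a

# Hypothetical one-mode system K*a + c*a^3 = f (geometric hardening)
K, c, f = 4.0, 0.5, 10.0
residual = lambda a: np.array([K * a[0] + c * a[0] ** 3 - f])
jacobian = lambda a: np.array([[K + 3.0 * c * a[0] ** 2]])

a = newton_raphson(residual, jacobian, np.array([1.0]))
```

With more Galerkin modes the same loop applies unchanged; only the residual vector and Jacobian matrix grow.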

  3. Analytical determination of size-dependent natural frequencies of fully clamped rectangular microplates based on the modified couple stress theory

    Energy Technology Data Exchange (ETDEWEB)

    Askari, Amir R.; Tahani, Masoud [Ferdowsi University of Mashhad, Mashhad (Iran, Islamic Republic of)

    2015-05-15

    This paper presents an analytical and size-dependent model for vibrational analysis of fully clamped rectangular microplates. Modified couple stress theory (MCST) and the Kirchhoff plate model are considered, and Hamilton's principle is employed to derive the size dependent equation of motion that accounts for the effect of residual stresses. The natural frequencies of the microplate are extracted analytically by extended Kantorovich method. The present findings are validated with the available results in the literature, and an excellent agreement is observed between them. In addition, a parametric study is conducted to demonstrate the significant effects of couple stress components on the natural frequencies of fully clamped microplates. The ratio of MCST natural frequencies to those obtained with classical theory depends only on the Poisson's ratio of the plate and is independent of the aspect ratio of the plate for cases with no residual stresses.

  4. Analytical determination of size-dependent natural frequencies of fully clamped rectangular microplates based on the modified couple stress theory

    International Nuclear Information System (INIS)

    Askari, Amir R.; Tahani, Masoud

    2015-01-01

    This paper presents an analytical and size-dependent model for vibrational analysis of fully clamped rectangular microplates. Modified couple stress theory (MCST) and the Kirchhoff plate model are considered, and Hamilton's principle is employed to derive the size dependent equation of motion that accounts for the effect of residual stresses. The natural frequencies of the microplate are extracted analytically by extended Kantorovich method. The present findings are validated with the available results in the literature, and an excellent agreement is observed between them. In addition, a parametric study is conducted to demonstrate the significant effects of couple stress components on the natural frequencies of fully clamped microplates. The ratio of MCST natural frequencies to those obtained with classical theory depends only on the Poisson's ratio of the plate and is independent of the aspect ratio of the plate for cases with no residual stresses.

  5. Immunocapture and microplate-based activity and quantity measurement of pyruvate dehydrogenase in human peripheral blood mononuclear cells.

    Science.gov (United States)

    Liu, Xiaowen; Pervez, Hira; Andersen, Lars W; Uber, Amy; Montissol, Sophia; Patel, Parth; Donnino, Michael W

    2015-01-01

    Pyruvate dehydrogenase (PDH) activity is altered in many human disorders. Current methods require tissue samples and yield inconsistent results. We describe a modified method for measuring PDH activity from isolated human peripheral blood mononuclear cells (PBMCs). RESULTS/METHODOLOGY: We found that PDH activity and quantity can be successfully measured in human PBMCs. Freeze-thaw cycles cannot efficiently disrupt the mitochondrial membrane. Processing time of up to 20 h does not affect PDH activity with proteinase inhibitor addition and a detergent concentration of 3.3% showed maximum yield. Sample protein concentration is correlated to PDH activity and quantity in human PBMCs from healthy subjects. Measuring PDH activity from PBMCs is a novel, easy and less invasive way to further understand the role of PDH in human disease.

  6. Immunocapture and microplate-based activity and quantity measurement of pyruvate dehydrogenase in human peripheral blood mononuclear cells

    Science.gov (United States)

    Liu, Xiaowen; Pervez, Hira; Andersen, Lars W; Uber, Amy; Montissol, Sophia; Patel, Parth; Donnino, Michael W

    2015-01-01

    Background Pyruvate dehydrogenase (PDH) activity is altered in many human disorders. Current methods require tissue samples and yield inconsistent results. We describe a modified method for measuring PDH activity from isolated human peripheral blood mononuclear cells (PBMCs). Results/Methodology We found that PDH activity and quantity can be successfully measured in human PBMCs. Freeze-thaw cycles cannot efficiently disrupt the mitochondrial membrane. Processing time of up to 20 h does not affect PDH activity with proteinase inhibitor addition and a detergent concentration of 3.3% showed maximum yield. Sample protein concentration is correlated to PDH activity and quantity in human PBMCs from healthy subjects. Conclusion Measuring PDH activity from PBMCs is a novel, easy and less invasive way to further understand the role of PDH in human disease. PMID:25826140

  7. Sensitive detection of oversulfated chondroitin sulfate in heparin sodium or crude heparin with a colorimetric microplate based assay.

    Science.gov (United States)

    Sommers, Cynthia D; Mans, Daniel J; Mecker, Laura C; Keire, David A

    2011-05-01

    In this work we describe a 96-well microplate assay for oversulfated chondroitin sulfate A (OSCS) in heparin, based on a water-soluble cationic polythiophene polymer (3-(2-(N-(N'-methylimidazole))ethoxy)-4-methylthiophene (LPTP)) and heparinase digestion of heparin. The assay takes advantage of several unique properties of heparin, OSCS, and LPTP, including OSCS inhibition of heparinase I and II activity, the molecular weight dependence of heparin-LPTP spectral shifts, and the distinct association of heparin fragments and OSCS to LPTP. These factors combine to enable detection of the presence of 0.003% w/w spiked OSCS in 10 μg of heparin sodium active pharmaceutical ingredient (API) using a plate reader and with visual detection to 0.1% levels. The same detection limit for OSCS was observed in the presence of 10% levels of dermatan sulfate (DS) or chondroitin sulfate A (CSA) impurities. In addition, we surveyed a selection of crude heparin samples received by the agency in 2008 and 2009 to determine average and extreme DS, CSA, and galactosamine weight percent levels. In the presence of these impurities and the variable heparin content in the crude heparin samples, spiked OSCS was reliably detected to the 0.1% w/w level using a plate reader. Finally, authentically OSCS contaminated heparin sodium API and crude samples were distinguished visually by color from control samples using the LPTP/heparinase test.

  8. Dry respirometric index assessment using open and close respirometry

    International Nuclear Information System (INIS)

    Andreottola, G.; Dallago, L.; Ragazzi, M.

    2001-01-01

    The aim of this work is to compare two different approaches for the evaluation of the respirometric index of dry biodegradable solid matter. The two tools used were built at the University of Trento. The AIR-A respirometer (Open Respirometric Index Analyser) used in the experimentation is a pilot device consisting of a small steel reactor kept at constant temperature. The oxygen analyser and the temperature probe are directly connected to a PC through an acquisition device which makes it possible to follow the evolution of the system. A balance between the amount of volatile solids, the free volume and the decrease in oxygen allows the respiration rate to be assessed. The AIR-C respirometer (Close Respirometric Index Analyser) consists of a reactor in which the internal temperature and the total pressure are constantly measured. A basic solution inside the reactor guarantees the absorption of CO₂, making it possible to associate an oxygen consumption with the measured total pressure. The reactor is built to provide a wide exchange surface between compost and atmosphere, limiting the thickness of the compost layer into which oxygen must diffuse. A correction for the effect induced by temperature allows the respirometric index to be calculated; this index is related to microbial activity only during a portion of the total duration of the test. The analysis of the generated data and the introduction of some technical modifications have made it possible to obtain the same respirometric index values with the two different approaches [it
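The closed-respirometer measurement described above reduces to a simple gas-law calculation: with CO₂ fully absorbed by the basic solution, the total pressure drop equals the O₂ partial-pressure drop, so moles consumed follow Δn = ΔP·V/(R·T). A minimal sketch, with all run parameters hypothetical:

```python
R = 8.314  # J/(mol*K), universal gas constant

def oxygen_uptake_mg(delta_p_pa, headspace_m3, temp_k):
    """Milligrams of O2 consumed, from the pressure drop in a closed vessel.

    Assumes CO2 is fully absorbed by the basic solution, so the total
    pressure drop equals the O2 partial-pressure drop (ideal gas law).
    """
    mol = delta_p_pa * headspace_m3 / (R * temp_k)
    return mol * 32.0e3  # molar mass of O2 = 32 g/mol

def respirometric_index(delta_p_pa, headspace_m3, temp_k, vs_kg, hours):
    """Respirometric index in mg O2 per kg volatile solids per hour."""
    return oxygen_uptake_mg(delta_p_pa, headspace_m3, temp_k) / (vs_kg * hours)

# Hypothetical run: 2 kPa drop, 1 L headspace, 30 degC, 50 g VS, 4 h
ri = respirometric_index(2000.0, 1.0e-3, 303.15, 0.050, 4.0)
```

The temperature correction mentioned in the abstract would enter here as an adjustment of `delta_p_pa` for the thermally induced part of the pressure change before applying the gas law.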

  9. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  10. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with 99mTc pyrophosphate and 99mTc methylene diphosphonate in the years between 1977 and 1979. Course control examinations were carried out in 12 patients. The collective of patients presented with primary skeletal tumours, metastases, inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the ''region of interest'' technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as far as diagnosis, course control, and differential diagnosis are concerned. Quantitative skeletal scintiscanning by means of the ''region of interest'' technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG) [de

  11. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O`Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100 g/min; right = 25.6 ± 7.0 μmol/100 g/min) was slightly reduced compared to the ipsilateral hemispherical rates (left = 30.4 ± 6.8 μmol/100 g/min; right = 29.5 ± 7.2 μmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right to left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  12. Quantitative FDG in depression

    International Nuclear Information System (INIS)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D.

    1998-01-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 μmol/100 g/min; right = 25.6 ± 7.0 μmol/100 g/min) was slightly reduced compared to the ipsilateral hemispherical rates (left = 30.4 ± 6.8 μmol/100 g/min; right = 29.5 ± 7.2 μmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right to left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  13. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
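QuaSSE itself is a likelihood method evaluated on phylogenies, but the generating model it assumes (a birth-death process whose speciation and extinction rates depend on a diffusing trait) can be illustrated with a forward simulation. The rate functions and parameters below are hypothetical, chosen only to show the mechanics:

```python
import random

def simulate_lineages(steps, dt, sigma, lam, mu, x0=0.0, seed=1):
    """Forward-simulate a trait-dependent birth-death process.

    Each lineage carries a trait x evolving by Brownian motion
    (variance sigma^2 * dt per step); per step it goes extinct with
    probability mu(x)*dt and speciates with probability lam(x)*dt.
    Returns the trait values of the surviving lineages.
    """
    rng = random.Random(seed)
    lineages = [x0]
    for _ in range(steps):
        nxt = []
        for x in lineages:
            if rng.random() < mu(x) * dt:
                continue  # extinction: lineage is dropped
            x += rng.gauss(0.0, sigma * dt ** 0.5)  # trait diffusion
            nxt.append(x)
            if rng.random() < lam(x) * dt:
                nxt.append(x)  # speciation: daughter inherits the trait
        lineages = nxt
        if not lineages:
            break
    return lineages

# Speciation rate increases with the trait; extinction is constant.
tips = simulate_lineages(steps=200, dt=0.05, sigma=0.5,
                         lam=lambda x: 0.4 + 0.2 * max(x, 0.0),
                         mu=lambda x: 0.1)
```

Inference methods such as QuaSSE run this logic in reverse: given the tree shape and the tip trait values, they ask which rate functions lam(x) and mu(x) make the observed data most likely.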

  14. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields

  15. Quantitative cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Thelen, M.; Dueber, C.; Wolff, P.; Erbel, R.; Hoffmann, T.

    1985-06-01

    The scope and limitations of quantitative cardiac CT have been evaluated in a series of experimental and clinical studies. The left ventricular muscle mass was estimated by computed tomography in 19 dogs (using volumetric methods, measurements in two axes and planes, and a reference volume). There was good correlation with anatomical findings. The end-diastolic volume of the left ventricle was estimated in 22 patients with cardiomyopathies; using angiography as a reference, CT led to systematic underestimation. It is also shown that ECG-triggered magnetic resonance tomography results in improved visualisation and may be expected to improve measurements of cardiac morphology.

  16. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  17. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR-3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce this imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, these data should be used actively to increase the efficiency of operations and, ultimately, to reduce trips in power plants. A great deal of information is lost, however, if the analytical tools available for data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. Continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making.

  18. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability or normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are tumor volume, histopathologic differentiation, and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)

  19. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system, applied to the volumetric percentage of perlite in nodular cast irons, to porosity and average grain size in high-density sintered UO2 pellets, and to the grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with the corresponding ones from direct counting processes: counting of systematic points (grid) to measure volume fractions, and the intersection method, using a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron was dictated by the small difference in optical reflectivity between graphite and perlite. Porosity evaluation of sintered UO2 pellets is also analyzed. [pt]

  20. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound by at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface, resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, and their relative distances can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
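
    The conversion from a measured inter-probe distance to a genomic distance is a single multiplication by the ~2.3 kb/µm stretching factor; a minimal sketch (function name and measurement are illustrative):

```python
# Sketch: converting distances measured between hybridization signals on
# homogeneously stretched DNA fibers into kilobase pairs, using the ~2.3 kb/um
# stretching factor reported for QDFM.

STRETCH_KB_PER_UM = 2.3

def distance_to_kb(distance_um):
    return distance_um * STRETCH_KB_PER_UM

# e.g. two probe signals measured 15.4 um apart on a fiber
print(f"{distance_to_kb(15.4):.1f} kb")
```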

  1. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real-time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  2. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
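
    The core idea of assigning point-value failure frequencies to semi-quantitative likelihood bands can be sketched in a few lines; the band names, calibrated rates, and consequence figure below are all hypothetical placeholders, not values from the paper:

```python
# Sketch: calibrating semi-quantitative likelihood scores to failure
# frequencies. Each index band is assigned a point-value frequency anchored
# to a peer-system failure-rate distribution (all values illustrative).

CALIBRATION = {          # score band -> failures per km-year (hypothetical)
    "very low": 1e-5,
    "low":      1e-4,
    "medium":   1e-3,
    "high":     1e-2,
}

def segment_risk(score_band, consequence_cost):
    """Risk = expected failure frequency x consequence (cost per failure)."""
    return CALIBRATION[score_band] * consequence_cost

print(f"{segment_risk('medium', 5e6):.0f} $/km-year")
```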

  3. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single photon emitters. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  4. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. If you wish to perfectly take up the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  5. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problem facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer and the radioactivity contribution from blood volume within ROI, and the estimation of the nondisplaceable ligand concentration, is also reviewed briefly

  6. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  7. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  8. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  9. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  10. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  11. Enterococcus faecalis infection causes inflammation, intracellular oxphos-independent ROS production, and DNA damage in human gastric cancer cells.

    Directory of Open Access Journals (Sweden)

    Jesper A B Strickertsson

    Full Text Available BACKGROUND: Achlorhydria caused by e.g. atrophic gastritis allows for bacterial overgrowth, which induces chronic inflammation and damage to the mucosal cells of infected individuals, driving gastric malignancies and cancer. Enterococcus faecalis (E. faecalis) can colonize achlorhydric stomachs, and we therefore wanted to study the impact of E. faecalis infection on the inflammatory response, reactive oxygen species (ROS) formation, mitochondrial respiration, and mitochondrial genetic stability in gastric mucosal cells. METHODS: To separate the changes induced by bacteria from those of the inflammatory cells, we established an in vitro E. faecalis infection model system using the gastric carcinoma cell line MKN74. Total ROS and superoxide were measured by fluorescence microscopy. Cellular oxygen consumption was characterized non-invasively using XF24 microplate-based respirometry. Gene expression was examined by microarray, and response pathways were identified by Gene Set Analysis (GSA). Selected gene transcripts were verified by quantitative real-time polymerase chain reaction (qRT-PCR). Mitochondrial mutations were determined by sequencing. RESULTS: Infection of MKN74 cells with E. faecalis induced intracellular ROS production through a pathway independent of oxidative phosphorylation (oxphos). Furthermore, E. faecalis infection induced mitochondrial DNA instability. Following infection, genes coding for inflammatory response proteins were transcriptionally up-regulated while DNA damage repair and cell cycle control genes were down-regulated. Cell growth slowed down when cells were infected with viable E. faecalis and responded in a dose-dependent manner to E. faecalis lysate. CONCLUSIONS: Infection by E. faecalis induced an oxphos-independent intracellular ROS response and damaged the mitochondrial genome in gastric cell culture. Finally, the bacteria induced an NF-κB inflammatory response as well as impaired DNA damage response and cell cycle control gene expression.

  12. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  13. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  14. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [3H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against 3H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [14C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparisons of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)
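
    The densitometric readout is typically converted to receptor density via a standard curve from co-exposed calibration standards; a minimal linear-interpolation sketch (the standard values and function name are hypothetical, not the paper's data):

```python
# Sketch: converting film optical density (OD) to bound radioligand via a
# calibration standard curve, using piecewise-linear interpolation.

# (optical density, fmol/mg tissue) pairs from calibration standards (made up)
STANDARDS = [(0.05, 10.0), (0.12, 40.0), (0.25, 100.0), (0.48, 220.0)]

def od_to_fmol(od):
    pts = sorted(STANDARDS)
    if od <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if od <= x1:
            # linear interpolation between neighboring standards
            return y0 + (y1 - y0) * (od - x0) / (x1 - x0)
    return pts[-1][1]  # clamp above the highest standard

print(f"{od_to_fmol(0.185):.1f} fmol/mg")
```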

  15. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives, solar in particular, that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative, especially with nonscience students or the general public, is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.
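
    The ~10,000x claim is itself a good back-of-the-envelope exercise: compare the solar power intercepted by Earth's cross-sectional disk with a rough figure for global primary energy consumption. The consumption figure below is an assumed round number:

```python
import math

# Back-of-the-envelope check: solar power intercepted by Earth versus
# humankind's primary energy consumption rate (round figures).
SOLAR_CONSTANT = 1361    # W/m^2 at top of atmosphere
EARTH_RADIUS = 6.371e6   # m
HUMAN_POWER = 18e12      # W, roughly 18 TW global primary consumption (assumed)

solar_input = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2  # intercepted disk
ratio = solar_input / HUMAN_POWER
print(f"Solar input ~ {solar_input:.2e} W, ~{ratio:,.0f}x human use")
```

    With these round numbers the ratio comes out near 10,000, consistent with the claim in the abstract.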

  16. Quantitative nature of overexpression experiments

    Science.gov (United States)

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  17. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    proteins involved in the Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide...

  18. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  19. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  20. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  1. Quantity in Modern Icelandic

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    Full Text Available The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of Modern Icelandic. From a phonological point of view it seems that nothing new can be expected, the theoretical possibilities having been practically exhausted, as we recalled in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is undoubtedly the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is nevertheless still premature to speak of true quantitative zones, since neither their limits nor their geographical extent are yet known.

  2. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  3. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  4. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  5. Quantitative (real-time) PCR

    International Nuclear Information System (INIS)

    Denman, S.E.; McSweeney, C.S.

    2005-01-01

    Many nucleic acid-based probe and PCR assays have been developed for the detection and tracking of specific microbes within the rumen ecosystem. Conventional PCR assays detect PCR products at the end stage of each PCR reaction, where exponential amplification is no longer being achieved. This approach can result in different end product (amplicon) quantities being generated. In contrast, using quantitative, or real-time, PCR, quantification of the amplicon is performed not at the end of the reaction, but rather during exponential amplification, where theoretically each cycle will result in a doubling of the product being created. For real-time PCR, the cycle at which fluorescence is deemed to be detectable above the background during the exponential phase is termed the cycle threshold (Ct). The Ct values obtained are then used for quantitation, which will be discussed later
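
    Relative quantification from Ct values is commonly done with the 2^-ΔΔCt method, which follows directly from the doubling-per-cycle assumption described above. A minimal sketch (the function and example Ct values are illustrative, not from this chapter):

```python
# Sketch: relative quantification from real-time PCR Ct values by the
# standard 2^-ddCt method, assuming ~100% amplification efficiency
# (i.e. a doubling of product per cycle).

def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                  # normalize to control condition
    return 2 ** (-dd_ct)

# Target gene crosses threshold 2 cycles earlier (relative to the reference
# gene) in treated cells than in controls:
print(fold_change(22.0, 18.0, 24.0, 18.0))  # -> 4.0
```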

  6. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    Full Text Available This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.
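
    Of the techniques listed, FRET has a particularly simple quantitative readout: efficiency from donor quenching, E = 1 - F_DA/F_D. A minimal sketch (intensities are illustrative placeholders):

```python
# Sketch: FRET efficiency from donor fluorescence quenching,
# E = 1 - F_DA / F_D, where F_DA is the donor intensity in the presence of
# the acceptor and F_D the donor intensity alone.

def fret_efficiency(f_donor_acceptor, f_donor_alone):
    return 1.0 - f_donor_acceptor / f_donor_alone

# e.g. donor intensity drops from 1000 to 620 counts when the acceptor is present
print(f"E = {fret_efficiency(620.0, 1000.0):.2f}")
```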

  7. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  8. Qualitative discussion of quantitative radiography

    International Nuclear Information System (INIS)

    Berger, H.; Motz, J.W.

    1975-01-01

    Since radiography yields an image that can be easily related to the tested object, it is superior to many nondestructive testing techniques in revealing the size, shape, and location of certain types of discontinuities. The discussion is limited to a description of the radiographic process, examination of some of the quantitative aspects of radiography, and an outline of some of the new ideas emerging in radiography. The advantages of monoenergetic x-ray radiography and neutron radiography are noted

  9. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a nondestructive testing method that furnishes quantitative information, permitting the detection and accurate localization of defects, the measurement of internal dimensions, and the measurement and mapping of the density distribution. The technique is highly versatile, imposing no restrictions on the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system are illustrated by tomographic images. (V.R.B.)

  10. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    Based on a coupling-cavity chain equivalent-circuit model, the author derives an equation relating the coupler frequency deviation Δf and the coupling coefficient β, rather than only giving the adjustment direction in the coupler-matching process. According to this equation, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  11. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

A new method for the quantitative evaluation of teaching processes is elaborated. Based on score data, the method permits evaluation of teaching efficiency within one group of students and comparison of teaching efficiency across two or more groups. Heterogeneity, stability, and total variability indices, both for a single group and for comparisons between groups, are used as the basic characteristics of teaching efficiency. The method is easy to use and permits ranking of teaching review results which...

  12. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The size of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  13. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

So far, quantitative analyses of the oxygen consumption rate, blood flow distribution, glucose metabolic rate, and so on have been carried out with positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous system receptors, observation of many such receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. Quantitative measurement with an in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The kinetic model of a mapping tracer for central nervous system receptors is discussed. Quantitative analysis using a steady-state kinetic model, the measurement of dopamine receptors by the reference method, the measurement of D2 receptors using 11C-raclopride by the direct method, and the possibility of measuring dynamic bioreactions are reported. (K.I.)

  14. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. When compared to normal databases, the uptake values can be calibrated on an individual basis, which increases the reproducibility of the analysis substantially. Based on developments over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches, as well as the consequences of the most recent hardware developments, commercial analysis packages, and a broader description of the left ventricle, are discussed. (orig.)

  15. Quantitative Trait Loci in Inbred Lines

    NARCIS (Netherlands)

    Jansen, R.C.

    2001-01-01

    Quantitative traits result from the influence of multiple genes (quantitative trait loci) and environmental factors. Detecting and mapping the individual genes underlying such 'complex' traits is a difficult task. Fortunately, populations obtained from crosses between inbred lines are relatively

  16. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  17. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  18. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs such as:Comparative approaches (graph similarity or distance)Graph measures to characterize graphs quantitat

  19. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix summarizes the basic Jaffe method, as well as a modified, automated version. Also described is a high-performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 John Wiley & Sons, Inc.

  20. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  1. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  2. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality.

  3. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

In a radiograph, the value of each pixel is related to the material thickness crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface, and volume. Assuming a locally linear detector response and using a radiograph of a reference object, a quantitative thickness map of the object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process introduces several biases that prevent the quantitative information from being obtained: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In the first section, we propose a complete model of the radiographic image formation process taking these biases into account. In the second section, we present an inversion scheme of this model for a single-material object, which enables the thickness map of the object crossed by the x-rays to be obtained. (author)

  4. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system where temperature distribution of the body’s surface can contribute to the diagnosis and follow up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach in thermographic image processing is attempted based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in healthy, regarding spinal problems, subjects.

  5. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for study of proteins and identification of diseases, reveals more comprehensive and accurate information of an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication process to have low-cost and small-size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  6. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.

  7. A simple microplate-based method for the determination of α-amylase activity using the glucose assay kit (GOD method).

    Science.gov (United States)

    Visvanathan, Rizliya; Jayathilake, Chathuni; Liyanage, Ruvini

    2016-11-15

For the first time, a reliable, simple, rapid and high-throughput analytical method for the detection and quantification of α-amylase inhibitory activity using the glucose assay kit was developed. The new method facilitates rapid screening of a large number of samples, reduces labor, time and reagents and is also suitable for kinetic studies. This method is based on the reaction of maltose with glucose oxidase (GOD) and the development of a red quinone. The test is done in microtitre plates with a total volume of 260 μL and an assay time of 40 min including the pre-incubation steps. The new method is tested for linearity, sensitivity, precision, reproducibility and applicability, and is also compared with the most commonly used 3,5-dinitrosalicylic acid (DNSA) method for determining α-amylase activity. Copyright © 2016 Elsevier Ltd. All rights reserved.
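The abstract above does not reproduce the inhibition arithmetic; as a hedged illustration, the percent-inhibition formula conventionally used with microplate enzyme-inhibition assays of this kind can be sketched as follows (the function name and absorbance values are hypothetical, not taken from the paper):

```python
# Hypothetical sketch: percent inhibition as commonly computed in
# microplate enzyme-inhibition assays. Absorbance values would be
# GOD-method readouts; the formula is the conventional one, assumed
# here rather than quoted from the abstract.
def percent_inhibition(abs_control: float, abs_sample: float,
                       abs_control_blank: float = 0.0,
                       abs_sample_blank: float = 0.0) -> float:
    """Inhibition (%) = 100 * (1 - (sample - sample blank) /
                                   (control - control blank))."""
    control = abs_control - abs_control_blank
    sample = abs_sample - abs_sample_blank
    if control <= 0:
        raise ValueError("control signal must be positive")
    return 100.0 * (1.0 - sample / control)

# Example: control absorbance 0.80, sample absorbance 0.20
print(percent_inhibition(0.80, 0.20))  # 75.0
```

Blank corrections matter in practice because the GOD chromogen develops some background color even without enzyme activity.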

  8. FIPRONIL (ICON) EFFECTS ON COPEPOD SURVIVAL, DEVELOPMENT, AND REPRODUCTIVE SUCCESS IN A MICROPLATE-BASED LIFE-CYCLE SCREENING ASSAY. (R827397)

    Science.gov (United States)

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  9. Quantitative evaluation of dermatological antiseptics.

    Science.gov (United States)

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.
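The EN 1276 pass criterion cited above reduces to simple arithmetic on viable counts; a minimal sketch, with illustrative colony counts that are not from the study:

```python
import math

# Sketch of the log-reduction calculation behind the EN 1276 pass
# criterion described above. The CFU counts are illustrative only.
def log10_reduction(cfu_initial: float, cfu_surviving: float) -> float:
    """Log10 reduction in viable count after antiseptic exposure."""
    return math.log10(cfu_initial / cfu_surviving)

initial = 1.5e7        # CFU/mL before exposure
after_5_min = 1.2e2    # CFU/mL surviving a 5-minute exposure
reduction = log10_reduction(initial, after_5_min)
passes_en1276 = reduction >= 5.0   # standard requires >= 5 log10 within 5 min
print(round(reduction, 2), passes_en1276)  # 5.1 True
```

On this scale, the failure of aqueous eosin and potassium permanganate means their surviving counts stayed within roughly a factor of ten of the control even after an hour.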

  10. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
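The observed-to-liability conversion at the heart of the papers discussed above can be sketched with the classical threshold-model transformation (written here in its simplest unascertained form; the cited papers develop more detailed expressions, so treat this as an assumption-laden illustration):

```python
from statistics import NormalDist

# Hedged sketch of the classical observed-scale -> liability-scale
# heritability transformation for a binary disease trait, in the
# spirit of the James (1971) / Reich, James and Morris (1972)
# framework cited above. The exact expressions in those papers
# differ in detail; this is the simplest textbook form.
def h2_liability(h2_observed: float, prevalence: float) -> float:
    """h2_liability = h2_observed * K(1-K) / z**2, where K is the
    population prevalence and z the standard-normal density at the
    liability threshold t = inv_cdf(1 - K)."""
    nd = NormalDist()
    t = nd.inv_cdf(1.0 - prevalence)   # liability threshold
    z = nd.pdf(t)                      # normal density at threshold
    return h2_observed * prevalence * (1.0 - prevalence) / z**2

# Example: 1% prevalence, observed-scale heritability 0.05
print(round(h2_liability(0.05, 0.01), 3))  # ~0.70
```

The example shows why the transformation matters: a modest variance explained on the 0/1 scale corresponds to a much larger fraction of variance on the underlying liability scale when the disease is rare.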

  11. Quantitative Activities for Introductory Astronomy

    Science.gov (United States)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune"; "The Temperature on Titan"; "Rocks in the Early Solar System"; "Comets Hitting Planets"; "Ages of Meteorites"; "How Flat are Saturn's Rings?"; "Tides of the Sun and Moon on the Earth"; "The Gliese 581 Solar System"; "Buckets in the Rain"; "How Hot, Bright and Big is Betelgeuse?"; "Bombs and the Sun"; "What Forms Stars?"; "Lifetimes of Cars and Stars"; "The Mass of the Milky Way"; "How Old is the Universe?"; and "Is The Universe Speeding up or Slowing Down?"
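The activities themselves are not reproduced in the abstract; as an illustration of the kind of calculation involved, a "Mass of Neptune"-style exercise can be worked with Kepler's third law applied to Triton's orbit (rounded textbook orbital values assumed here, not values from the activity):

```python
import math

# Illustrative worked example in the spirit of the activities listed
# above: recover the mass of Neptune from the orbit of its moon
# Triton via Kepler's third law, M = 4*pi^2*a^3 / (G*T^2).
# Orbital figures are rounded textbook values.
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
a = 3.548e8              # Triton's semi-major axis, m
T = 5.877 * 86400        # Triton's orbital period, s

mass_neptune = 4 * math.pi**2 * a**3 / (G * T**2)
print(f"{mass_neptune:.2e} kg")   # ~1.0e26 kg
```

The same proportionality reasoning (period and orbital radius fix the central mass) underlies several of the other listed exercises, such as "The Mass of the Milky Way."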

  12. Quantitative patterns in drone wars

    Science.gov (United States)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.

  13. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  14. Quantitative variation in natural populations

    International Nuclear Information System (INIS)

    Parsons, P.A.

    1975-01-01

Quantitative variation is considered in natural populations using Drosophila as the example. A knowledge of such variation enables its rapid exploitation in directional selection experiments, as shown for scutellar chaeta number. Where evidence has been obtained, genetic architectures are in qualitative agreement with Mather's concept of balance for traits under stabilizing selection. Additive genetic control is found for acute environmental stresses, but not for less acute stresses, as shown by exposure to 60Co γ-rays. D. simulans probably has a narrower ecological niche than its sibling species D. melanogaster, associated with lower genetic heterogeneity. One specific environmental stress to which D. simulans is sensitive in nature is ethyl alcohol, as shown by winery data. (U.S.)

  15. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to: demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  16. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference "Risk Management Reloaded" and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  17. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from "Alexandru Ioan Cuza" University of Iaşi (Romania), "G. d'Annunzio" University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four sections as follows. The first section deals with recent trends in social decisions; specifically, it aims to understand the driving forces behind social decisions. The second section focuses on the social and public sphere and is oriented toward recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  18. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  19. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods that measure flow, such as perfusion techniques, require the introduction of a tube assembly into the gastrointestinal tract, with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux.

  20. Quantitative organ visualization using SPECT

    International Nuclear Information System (INIS)

    Kircos, L.T.; Carey, J.E. Jr.; Keyes, J.W. Jr.

    1987-01-01

    Quantitative organ visualization (QOV) was performed using single photon emission computed tomography (SPECT). Organ size was calculated from serial, contiguous ECT images taken through the organ of interest, with image boundaries determined using a maximum directional gradient edge-finding technique. Organ activity was calculated using ECT counts bounded by the directional gradient, imaging system efficiency, and imaging time. The technique used to perform QOV was evaluated using phantom studies, in vivo canine liver, spleen, bladder, and kidney studies, and in vivo human bladder studies. It was demonstrated that absolute organ activity and organ size could be determined with this system, with total imaging time restricted to less than 45 min, to an accuracy of about +/- 10%, provided the minimum dimensions of the organ are greater than the FWHM of the imaging system and the total radioactivity within the organ of interest exceeds 15 nCi/cc for dog-sized torsos. In addition, effective half-lives of approximately 1.5 hr or greater could be determined
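The activity arithmetic in the record above (organ counts bounded by the gradient edge, divided by system efficiency and imaging time, plus volume from contiguous voxels) can be sketched as below; the function names, efficiency figure, and voxel counts are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch of the QOV arithmetic: organ volume from segmented voxels,
# absolute activity from ECT counts, system efficiency, and imaging time.

def organ_volume_ml(n_voxels: int, voxel_size_mm: float) -> float:
    """Volume = voxels inside the gradient-defined boundary x single-voxel volume."""
    voxel_ml = (voxel_size_mm ** 3) / 1000.0  # 1 mL = 1000 mm^3
    return n_voxels * voxel_ml

def organ_activity_uci(total_counts: float, efficiency_cps_per_uci: float,
                       imaging_time_s: float) -> float:
    """Activity = counts / (system efficiency x imaging time)."""
    return total_counts / (efficiency_cps_per_uci * imaging_time_s)

vol = organ_volume_ml(n_voxels=12_000, voxel_size_mm=4.0)  # about 768 mL
act = organ_activity_uci(total_counts=2.7e6,
                         efficiency_cps_per_uci=100.0,
                         imaging_time_s=2700.0)            # 10 microcuries
```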

  1. Quantitative isotopes miction cystoureterography (QIMCU)

    International Nuclear Information System (INIS)

    Szy, D.A.G.; Stroetges, M.W.; Funke-Voelkers, R.

    1982-01-01

    A simple method for the quantitative evaluation of vesicoureteral reflux was developed. It allows the determination of a) the volume of reflux and b) the volume of the bladder at each point in time during the examination. QIMCU gives insight into the dynamics of reflux, the reflux volume, and the actual bladder volume. Clinical application in 37 patients with 53 insufficient ureteral orifices (i.e. reflux) showed that the onset of reflux occurred in 60% of cases as early as the first five minutes of the examination, but later in the remaining 40%. Maximal reflux was found during the first five minutes in only 26%. The reflux volume exceeded 3.5 ml in more than 50% of cases. The international grading corresponds with the reflux volume determined by this method. Radionuclide cystoureterography can be used in children as well as in adults. Because the radiation exposure is low, the method can be recommended both for the initial examination and for follow-up studies. (Author)

  2. A quantitative philology of introspection

    Directory of Open Access Journals (Sweden)

    Carlos Diuk

    2012-09-01

    Full Text Available The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the "Axial Age", saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy - which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single "arrow of time" in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the 20th century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology, and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus.
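As a minimal illustration of vocabulary-based semantic scoring of the kind such quantitative philology relies on (this is not the authors' actual similarity measure, and the seed word list is invented for the example):

```python
# Score a text's proximity to an introspection-related vocabulary using cosine
# similarity over word counts. Seed terms are an illustrative assumption.
import math
from collections import Counter

INTROSPECTION_TERMS = {"think", "feel", "believe", "remember", "doubt"}

def cosine(u: Counter, v: Counter) -> float:
    dot = sum(u[w] * v[w] for w in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def introspection_score(text: str) -> float:
    words = Counter(text.lower().split())
    seed = Counter({w: 1 for w in INTROSPECTION_TERMS})
    return cosine(words, seed)

# Texts rich in introspective vocabulary score higher than texts without it.
```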

  3. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as-low-as-reasonably-achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, five generally apply to most programs: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole-body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs
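Indices (1), (3), and (4) reduce to simple aggregations over individual dose records; a sketch with invented records (worker id, job classification, annual whole-body dose in mSv):

```python
# Hedged sketch of three ALARA indices from per-worker dose records.
from collections import defaultdict
from statistics import mean

records = [  # (worker id, job classification, annual whole-body dose, mSv) - illustrative
    ("w1", "operator", 1.2), ("w2", "operator", 0.8),
    ("w3", "maintenance", 3.0), ("w4", "maintenance", 2.0),
]

mide = mean(dose for _, _, dose in records)       # index (1): mean individual dose equivalent
cumulative = sum(dose for _, _, dose in records)  # index (3): cumulative dose equivalent

by_job = defaultdict(list)                        # index (4): MIDE by job classification
for _, job, dose in records:
    by_job[job].append(dose)
mide_by_job = {job: mean(doses) for job, doses in by_job.items()}
```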

  4. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    The forming of cognitive schemes of plant anatomy concepts is performed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with the quantitative literacy test using the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy with a test according to Marzano, and the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  5. Quantitative Ultrasound Measurements at the Heel

    DEFF Research Database (Denmark)

    Daugschies, M.; Brixen, K.; Hermann, P.

    2015-01-01

    Calcaneal quantitative ultrasound can be used to predict osteoporotic fracture risk, but its ability to monitor therapy is unclear possibly because of its limited precision. We developed a quantitative ultrasound device (foot ultrasound scanner) that measures the speed of sound at the heel...... with the foot ultrasound scanner reduced precision errors by half (p quantitative ultrasound measurements is feasible. (E-mail: m.daugschies@rad.uni-kiel.de) (C) 2015 World Federation for Ultrasound in Medicine & Biology....

  6. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction Research in the area of health has traditionally been dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other ways to approach knowledge. Aim To discuss the complementarity of qualitative and quantitative research methods in the generation of knowledge. Contents The purpose of quantitative research is to measure the magnitude of an event,...

  7. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In the last years the autoradiography has been developed to a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. Influences of irradiation quality, of backscattering in sample and detector materials, and of sensitivity and fading of the detectors are considered. Furthermore, questions of quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distribution in radioactive foil samples. (author)

  8. Quantitative PET of liver functions.

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance K_met (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. The standard uptake value (SUV) from a static liver 18F-FDGal PET/CT scan can replace K_met and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K_1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET.
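For context, the SUV referred to above is, in standard PET practice, the tissue activity concentration normalized by injected dose per body weight; a minimal sketch with illustrative numbers (assuming 1 g of tissue is roughly 1 mL so the ratio is dimensionless):

```python
def suv(tissue_kbq_per_ml: float, injected_dose_mbq: float, body_weight_kg: float) -> float:
    """SUV = C_tissue / (injected dose / body weight), with units matched so the
    ratio is dimensionless under the 1 g ~ 1 mL tissue-density assumption."""
    dose_kbq = injected_dose_mbq * 1000.0
    weight_g = body_weight_kg * 1000.0
    return tissue_kbq_per_ml / (dose_kbq / weight_g)

suv(tissue_kbq_per_ml=10.0, injected_dose_mbq=200.0, body_weight_kg=80.0)  # 4.0
```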

  9. Quantitative PET of liver functions

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance K_met (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. The standard uptake value (SUV) from a static liver 18F-FDGal PET/CT scan can replace K_met and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K_1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET. PMID:29755841

  10. Validating quantitative precipitation forecast for the Flood ...

    Indian Academy of Sciences (India)

    In order to issue an accurate warning for flood, a better or appropriate quantitative forecasting of precipitation is required. In view of this, the present study intends to validate the quantitative precipitation forecast (QPF) issued during the southwest monsoon season for six river catchments (basins) under the flood meteorological ...

  11. 78 FR 64202 - Quantitative Messaging Research

    Science.gov (United States)

    2013-10-28

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast- track OMB approval... comments. Please submit your comments using only one method and identify that it is for the ``Quantitative...

  12. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales are essential. Quantitative remote sensing plays an important role in this respect. The

  13. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can easily secure traceability to the SI unit system, and discussions about its accuracy have also started. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual analysis cases in quantitative NMR. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it is possible to obtain precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method can easily prevent contamination of samples and allows samples to be recovered, there are many reported cases related to the quantitative analysis of biologically related samples and highly scarce natural products whose NMR spectra are complicated. In precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussions are also progressing on how to utilize this analytical method as an official method in various countries around the world. In Japan, this method is listed in the Pharmacopoeia and the Japanese Standards for Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analysis method that can ensure traceability to the SI unit system. (A.O.)
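The internal-reference quantitation described above follows the standard qNMR relation, in which analyte purity comes from the integral ratio of analyte and reference signals corrected for proton counts, molar masses, weighed masses, and reference purity; the numbers below are illustrative, not from the review:

```python
def qnmr_purity(I_a, I_r, N_a, N_r, M_a, M_r, m_a, m_r, P_r):
    """Standard qNMR internal-reference relation:
    P_a = (I_a/I_r) * (N_r/N_a) * (M_a/M_r) * (m_r/m_a) * P_r
    I: signal integral, N: protons giving the signal, M: molar mass (g/mol),
    m: weighed mass (mg), P: purity (mass fraction)."""
    return (I_a / I_r) * (N_r / N_a) * (M_a / M_r) * (m_r / m_a) * P_r

# With equal molar masses and weighed masses for analyte and reference, the
# purity tracks the integral ratio times the reference purity:
purity = qnmr_purity(I_a=0.98, I_r=1.00, N_a=2, N_r=2,
                     M_a=180.16, M_r=180.16, m_a=10.0, m_r=10.0, P_r=0.999)  # ~0.979
```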

  14. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

    The quantitative imaging of a phase object using 16 keV x rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  15. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers' dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  16. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  17. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative...... geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships......) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each...

  18. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  19. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron-absorbing honeycomb collimator. By setting the neutron-absorbing honeycomb collimator between the object and the imaging system, neutrons scattered in the object were absorbed by the honeycomb material and eliminated before reaching the imaging system, while neutrons transmitted through the object without interaction could reach the imaging system. The image formed by purely transmitted neutrons gives quantitative information. Two honeycombs were prepared, with coatings of boron nitride and gadolinium oxide, and evaluated for the quantitative application. The relation between the neutron total cross section and the attenuation coefficient confirmed that they were in fairly good agreement. Application to quantitative computed tomography was also successfully conducted. The new neutron radiography method using the neutron-absorbing honeycomb collimator to eliminate scattered neutrons remarkably improved the quantitativeness of neutron radiography and computed tomography. (author)

  20. Induced mutations for quantitative traits in rice

    International Nuclear Information System (INIS)

    Chakrabarti, B.N.

    1974-01-01

    The characteristics and frequency of micro-mutations induced in quantitative traits by radiation treatment, and the extent of heterozygotic effects of different recessive chlorophyll-mutant genes on quantitative traits, are presented. Mutagenic treatments increased the variance for quantitative traits in all cases, although the magnitude of the increase varied depending on the treatment and the selection procedure adopted. The overall superiority of the chlorophyll-mutant heterozygotes over the corresponding wild homozygotes, noted in two consecutive seasons, was not observed when these were grown at a high level of nitrogen fertiliser. (author)

  1. Quantitative determination of uranium by SIMS

    International Nuclear Information System (INIS)

    Kuruc, J.; Harvan, D.; Galanda, D.; Matel, L.; Aranyosiova, M.; Velic, D.

    2008-01-01

    The paper presents results of quantitative measurements of uranium-238 by secondary ion mass spectrometry (SIMS), using alpha spectrometry as a complementary technique. Samples with specific activity of uranium-238 were prepared by electrodeposition from an aqueous solution of UO2(NO3)2·6H2O. We tried to apply SIMS to quantitative analysis, to search for a correlation between the intensity obtained from SIMS and the activity of uranium-238 as a function of the sample's weight, and to assess the possibility of using SIMS in the quantitative analysis of environmental samples. The obtained results and correlations, as well as the results of measurements of two real samples, are presented in this paper. (authors)

  2. Quantitative traits in wheat (Triticum aestivum L

    African Journals Online (AJOL)

    MSS

    2012-11-13

    Nov 13, 2012 ... Of the quantitative traits in wheat, spike length, number of spikes per m2, grain mass per spike, number ... design with four liming variants along with three replications, in which the experimental field .... The sampling was done.

  3. Quantitative Fundus Autofluorescence in Recessive Stargardt Disease

    OpenAIRE

    Burke, Tomas R.; Duncker, Tobias; Woods, Russell L.; Greenberg, Jonathan P.; Zernant, Jana; Tsang, Stephen H.; Smith, R. Theodore; Allikmets, Rando; Sparrow, Janet R.; Delori, François C.

    2014-01-01

    Quantitative fundus autofluorescence (qAF) is significantly increased in Stargardt disease, consistent with previous reports of increased RPE lipofuscin. QAF will help to establish genotype-phenotype correlations and may serve as an outcome measure in clinical trials.

  4. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  5. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. A Quantitative Technique for Beginning Microscopists.

    Science.gov (United States)

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)
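A classic stereological estimator of the kind the article describes is point counting, where the fraction of grid points landing on a structure estimates its area fraction (P_P approximates A_A); the leaf-section numbers below are invented for illustration:

```python
def area_fraction(hits: int, total_points: int) -> float:
    """Point-count estimator: P_P = grid points falling on the phase / total grid points."""
    return hits / total_points

# e.g. 37 of 100 grid points fall on palisade mesophyll in a leaf cross-section:
area_fraction(37, 100)  # 0.37
```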

  7. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  8. Quantitative data extraction from transmission electron micrographs

    International Nuclear Information System (INIS)

    Sprague, J.A.

    1982-01-01

    The discussion will cover an overview of quantitative TEM, the digital image analysis process, coherent optical processing, and finally a summary of the author's views on potentially useful advances in TEM image processing

  9. Quantitative Ability as Correlates of Students' Academic ...

    African Journals Online (AJOL)

    Nekky Umera

    The introduction of quantitative topics into the secondary school economics curriculum has ... since the quality of education at any level is highly dependent on the quality and dedication of ..... Ibadan: Constellations Books 466-481. Anderson ...

  10. Laboratory technique for quantitative thermal emissivity ...

    Indian Academy of Sciences (India)

    Emission of radiation from a sample occurs due to thermal vibration of its .... Quantitative thermal emissivity measurements of geological samples. 393. Figure 1. ...... tral mixture modeling: A new analysis of rock and soil types at the Viking ...

  11. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)
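A single-point external calibration consistent with such an experiment can be sketched as follows (the article's exact calibration scheme is not given here, so the proportional-response assumption and the peak areas are invented):

```python
def ethanol_percent(area_sample: float, area_std: float, conc_std_vol_pct: float) -> float:
    """Assume peak area is proportional to concentration; scale the standard's
    known volume percent by the sample/standard peak-area ratio."""
    return conc_std_vol_pct * area_sample / area_std

ethanol_percent(area_sample=1240.0, area_std=1000.0, conc_std_vol_pct=10.0)  # 12.4 vol%
```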

  12. Qualitative vs. quantitative atopic dermatitis criteria

    DEFF Research Database (Denmark)

    Andersen, R M; Thyssen, J P; Maibach, H I

    2016-01-01

    This review summarizes historical aspects, clinical expression and pathophysiology leading to coining of the terms atopy and atopic dermatitis, current diagnostic criteria and further explore the possibility of developing quantitative diagnostic criteria of atopic dermatitis (AD) based on the imp...

  13. Strategies for quantitation of phosphoproteomic data

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Thingholm, Tine Engberg

    2010-01-01

    Recent developments in phosphoproteomic sample-preparation techniques and sensitive mass spectrometry instrumentation have led to large-scale identifications of phosphoproteins and phosphorylation sites from highly complex samples. This has facilitated the implementation of different quantitation...

  14. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  15. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  16. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between...... climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer‐reviewed articles that examined relationships...

  17. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  18. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a lengthy process when chemical analysis techniques are used. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable to boron, only the neutron radiography technique, which exploits the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de
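    The attenuation principle this technique relies on can be sketched as a simple Beer-Lambert calculation: transmitted neutron intensity falls exponentially with the areal density of the absorber. This is an illustrative sketch only; the cross-section constant (an approximate thermal value for B-10) and the intensity ratio are assumptions, not values from the record.

    ```python
    import math

    # Illustrative sketch (assumed values): quantitative transmission analysis via
    # Beer-Lambert attenuation, I = I0 * exp(-n * sigma), solved for the areal
    # density n of the absorbing element.

    SIGMA_B10 = 3840e-24  # approximate thermal-neutron absorption cross section of B-10, cm^2

    def boron_areal_density(i_transmitted, i_incident, sigma=SIGMA_B10):
        """Areal density (atoms/cm^2) of the absorber from a transmission measurement."""
        transmission = i_transmitted / i_incident
        return -math.log(transmission) / sigma

    # assumed intensities: half the incident beam is transmitted
    n_boron = boron_areal_density(i_transmitted=0.5, i_incident=1.0)
    ```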

  19. Quantitative autoradiography of semiconductor base material

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1983-01-01

    Autoradiographic methods for the quantitative determination of elements of interest in semiconductor technology, and of their distribution in silicon, are described. Whereas the local concentration and distribution of phosphorus have been determined with the aid of silver halide films, neutron-induced autoradiography has been applied in the case of boron. Silicon disks containing diffused phosphorus or implanted or diffused boron have been used as standard samples. Different possibilities for the quantitative evaluation of autoradiograms are considered and compared

  20. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, as well as mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  1. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.
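    As a minimal illustration of one QI metric mentioned above, tumor volume doubling time can be derived from two volume measurements under an assumed exponential growth model; the function name and the example volumes are invented for illustration.

    ```python
    import math

    # Hypothetical sketch of a simple quantitative imaging metric: tumor volume
    # doubling time from two measurements, assuming exponential growth between them.

    def doubling_time_days(v1_ml, v2_ml, interval_days):
        """Volume doubling time implied by growth from v1_ml to v2_ml over interval_days."""
        return interval_days * math.log(2.0) / math.log(v2_ml / v1_ml)

    # e.g. a lesion growing from 10 ml to 14 ml over 90 days (invented numbers)
    dt = doubling_time_days(10.0, 14.0, 90.0)
    ```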

  2. Respirometry increases cortisol levels in rainbow trout Oncorhynchus mykiss: implications for measurements of metabolic rate

    DEFF Research Database (Denmark)

    Murray, L.; Rennie, M. D.; Svendsen, Jon Christian

    2017-01-01

    This study aimed to assess the extent to which chasing, handling and confining Oncorhynchus mykiss to a small respirometer chamber during respirometric experiments is stressful and affects metabolic measurements. The study observed increased cortisol levels in animals tested using a chase protocol...

  3. A MODEL OF OXYGEN CONDITIONS IN A SCHOOL OF FISH BASED ON EXPERIMENTAL RESPIROMETRY

    DEFF Research Database (Denmark)

    Steffensen, John Fleng

    2010-01-01

    with a logging YSI CTD from a fixed position on the net cage at a depth of one meter. The data did not show lower oxygen levels in the vicinity of the fish compared to a position on the edge of the pen. In the near future we hope to be able to verify the model by measuring oxygen levels in large schools of Atlantic herring with a ROV instrumented with cameras and a logging YSI CTD as well as an acoustic Oxyguard oxygen transmitter.

  4. USE OF HYDROGEN RESPIROMETRY TO DETERMINE METAL TOXICITY TO SULFATE REDUCING BACTERIA

    Science.gov (United States)

    Acid mine drainage (AMD), an acidic metal-bearing wastewater poses a severe pollution problem attributed to post-mining activities. The metals (metal sulfates) encountered in AMD and considered of concern for risk assessment are: arsenic, cadmium, aluminum, manganese, iron, zinc ...

  5. Respirometry-based on-line model parameter estimation at a full-scale WWTP

    NARCIS (Netherlands)

    Spanjers, H.; Patry, G.G.; Keesman, K.J.

    2002-01-01

    This paper describes part of a project to develop a systematic approach to knowledge extraction from on-line respirometric measurements in support of wastewater treatment plant control and operation. The paper deals with the following issues: (1) test of the implementation of an automatic set-up

  6. Use of Respirometry To Determine the Effect of Nutrient Enhancement on JP-8 Biodegradability.

    Science.gov (United States)

    1995-11-27

    present, environmental conditions, and certainly the indigenous microbial communities present (Atlas and Bartha, 1993). In general, biodegradation can be... Richard Bartha. Microbial Ecology: Fundamentals and Applications. Redwood City, CA: The Benjamin Cummings Publishing Company, Inc, 1993. Autry... products caused by microorganisms or their enzymes (Atlas and Bartha, 1993). It is greatly influenced by the nature and amount of the target compound

  7. Activated sludge respirometry to assess solar detoxification of a metal finishing effluent

    Energy Technology Data Exchange (ETDEWEB)

    Santos-Juanes, L.; Amat, A.M. [Departamento de Ingenieria Textil y Papelera, Escuela Politecnica Superior de Alcoy, Universidad Politecnica de Valencia, Plaza Ferrandiz y Carbonell s/n, E-03801 Alcoy (Spain); Arques, A. [Departamento de Ingenieria Textil y Papelera, Escuela Politecnica Superior de Alcoy, Universidad Politecnica de Valencia, Plaza Ferrandiz y Carbonell s/n, E-03801 Alcoy (Spain)], E-mail: aarques@txp.upv.es; Bernabeu, A.; Silvestre, M.; Vicente, R. [Departamento de Ingenieria Textil y Papelera, Escuela Politecnica Superior de Alcoy, Universidad Politecnica de Valencia, Plaza Ferrandiz y Carbonell s/n, E-03801 Alcoy (Spain); Ano, E. [Departamento de Gestion e Innovacion, Area de producto y desarrollo sostenible, Asociacion de Investigacion de la Industria del Juguete, Conexas y Afines (AIJU), Avda. de la industria, 23, 03440 Ibi (Spain)], E-mail: m.ambiente@aiju.info

    2008-05-30

    Inhibition of the respiration of activated sludge has been tested as a convenient method to estimate the toxicity of aqueous solutions containing copper and cyanide, such as metal finishing effluents; according to this method, an EC{sub 50} of 0.5 mg/l was determined for CN{sup -} and 3.0 mg/l for copper. Solar detoxification of cyanide-containing solutions was studied using TiO{sub 2}, but this process was unfavourable because of the inhibitory role played by the copper ions present in real effluents on the oxidation of cyanide. On the other hand, the oxidative effect of hydrogen peroxide was greatly enhanced by Cu{sup 2+} and solar irradiation: complete elimination of free and complexed cyanide could be accomplished, together with precipitation of copper, in experiments carried out at pilot plant scale with real metal finishing effluents. Under these conditions, total detoxification was achieved according to respirometric measurements, although some remaining toxicity was detected by the more sensitive Vibrio fischeri luminescence assay.
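    EC50 values like those quoted above could, in principle, be read off respirometric data by interpolating percent inhibition of respiration against toxicant concentration; the sketch below shows one such interpolation on a log-concentration axis. All rates and concentrations are invented, and the helper `ec50` is hypothetical, not code from the study.

    ```python
    import math

    # Hedged sketch: respiration rates at several toxicant concentrations are
    # converted to percent inhibition relative to an uninhibited control, then the
    # 50% point is interpolated between the two bracketing concentrations on a
    # log scale. The paper reports EC50 results, not these underlying rates.

    def ec50(concentrations, respiration_rates, control_rate):
        """Interpolate the concentration causing 50% respiration inhibition."""
        inhibition = [100.0 * (1.0 - r / control_rate) for r in respiration_rates]
        pairs = list(zip(concentrations, inhibition))
        for (c0, i0), (c1, i1) in zip(pairs, pairs[1:]):
            if i0 <= 50.0 <= i1:  # bracket the 50% crossing
                frac = (50.0 - i0) / (i1 - i0)
                return 10.0 ** (math.log10(c0) + frac * (math.log10(c1) - math.log10(c0)))
        raise ValueError("50% inhibition not bracketed by the data")

    # invented dose-response data: concentrations in mg/l, rates in mg O2/(l*h)
    estimate = ec50([0.1, 0.3, 1.0, 3.0], [38.0, 30.0, 14.0, 4.0], control_rate=40.0)
    ```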

  8. Activated sludge respirometry to assess solar detoxification of a metal finishing effluent

    International Nuclear Information System (INIS)

    Santos-Juanes, L.; Amat, A.M.; Arques, A.; Bernabeu, A.; Silvestre, M.; Vicente, R.; Ano, E.

    2008-01-01

    Inhibition of the respiration of activated sludge has been tested as a convenient method to estimate the toxicity of aqueous solutions containing copper and cyanide, such as metal finishing effluents; according to this method, an EC50 of 0.5 mg/l was determined for CN- and 3.0 mg/l for copper. Solar detoxification of cyanide-containing solutions was studied using TiO2, but this process was unfavourable because of the inhibitory role played by the copper ions present in real effluents on the oxidation of cyanide. On the other hand, the oxidative effect of hydrogen peroxide was greatly enhanced by Cu2+ and solar irradiation: complete elimination of free and complexed cyanide could be accomplished, together with precipitation of copper, in experiments carried out at pilot plant scale with real metal finishing effluents. Under these conditions, total detoxification was achieved according to respirometric measurements, although some remaining toxicity was detected by the more sensitive Vibrio fischeri luminescence assay

  9. Open-circuit respirometry: real-time, laboratory-based systems.

    Science.gov (United States)

    Ward, Susan A

    2018-05-04

    This review explores the conceptual and technological factors integral to the development of laboratory-based, automated real-time open-circuit mixing-chamber and breath-by-breath (B × B) gas-exchange systems, together with considerations of assumptions and limitations. Advances in sensor technology, signal analysis, and digital computation led to the emergence of these technologies in the mid-20th century, at a time when investigators were beginning to recognise the interpretational advantages of nonsteady-state physiological-system interrogation in understanding the aetiology of exercise (in)tolerance in health, sport, and disease. Key milestones include the 'Auchincloss' description of an off-line system to estimate alveolar O2 uptake B × B during exercise. This was followed by the first descriptions of real-time automated O2 uptake and CO2 output B × B measurement by Beaver and colleagues and by Linnarsson and Lindborg, and mixing-chamber measurement by Wilmore and colleagues. Challenges to both approaches soon emerged: e.g., the influence of mixing-chamber washout kinetics on mixed-expired gas concentration determination, and B × B alignment of gas-concentration signals with respired flow. The challenging algorithmic and technical refinements required for gas-exchange estimation at the alveolar level have also been extensively explored. In conclusion, while the technology (both hardware and software) underpinning real-time automated gas-exchange measurement has progressively advanced, there are still concerns regarding accuracy, especially under the challenging conditions of changing metabolic rate.
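    The B × B principle the review describes can be sketched as a per-breath integral of expired flow times the inspired-expired O2 fraction difference. This is a deliberately simplified sketch: the signal-alignment and Haldane-type corrections the review discusses are omitted, and all sample values are invented.

    ```python
    # Simplified breath-by-breath O2 uptake: rectangle-rule integration of
    # flow * (FiO2 - FeO2) over time-aligned samples of one expiration.
    # Real systems must first align gas-concentration signals with respired flow.

    def breath_vo2(flow_l_per_s, fio2, feo2, dt_s):
        """Approximate O2 uptake (litres) for one breath from sampled signals."""
        return sum(f * (fio2 - fe) * dt_s for f, fe in zip(flow_l_per_s, feo2))

    # one expired breath sampled at 100 Hz (invented numbers)
    flow = [0.5] * 100   # constant 0.5 l/s for 1 s -> 0.5 l expired
    feo2 = [0.16] * 100  # expired O2 fraction
    vo2 = breath_vo2(flow, fio2=0.2093, feo2=feo2, dt_s=0.01)
    ```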

  10. Bioremediation of petroleum contaminated soil at CFS Alert - Laboratory scale respirometry experiment

    International Nuclear Information System (INIS)

    Haidar, S.; Bennett, J.; Jarrett, P.; Biggar, K.

    1998-01-01

    The feasibility of 'biopiling' was tested at Canadian Forces Station 'Alert', located in the high Arctic where the feasibility of bioremediation is yet to be proven. Laboratory respirometer experiments were conducted at 11 degrees C that examined the behaviour of indigenous microorganisms. Experiments were also carried out at one contaminated site. Various soil properties were analyzed, as well as total petroleum hydrocarbons. Results showed that the respirometer system functioned properly in monitoring the behaviour of microorganisms, that indigenous microorganisms were active at 11 degrees C, and that they functioned at a constant rate of oxygen consumption. These results suggest that biopiling may be feasible under the conditions existing at CFS 'Alert'. 12 refs., 5 tabs., 8 figs

  11. Enterococcus faecalis Infection Causes Inflammation, Intracellular Oxphos-Independent ROS Production, and DNA Damage in Human Gastric Cancer Cells

    DEFF Research Database (Denmark)

    Strickertsson, Jesper A. B; Desler, Claus; Martin-Bertelsen, Tomas

    2013-01-01

    Background: Achlorhydria caused by e.g. atrophic gastritis allows for bacterial overgrowth, which induces chronic inflammation and damage to the mucosal cells of infected individuals, driving gastric malignancies and cancer. Enterococcus faecalis (E. faecalis) can colonize achlorhydric stomachs, and we therefore wanted to study the impact of E. faecalis infection on inflammatory response, reactive oxygen species (ROS) formation, mitochondrial respiration, and mitochondrial genetic stability in gastric mucosal cells. Methods: To separate the changes induced by bacteria from those of the inflammatory cells, we established an in vitro E. faecalis infection model system using the gastric carcinoma cell line MKN74. Total ROS and superoxide were measured by fluorescence microscopy. Cellular oxygen consumption was characterized non-invasively using XF24 microplate-based respirometry. Gene expression...

  12. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, they are considered to be of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, as well as studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed by quantitative methods. The tendency to combine qualitative and quantitative methods as complementary has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  13. Quantitative Appearance Inspection for Film Coated Tablets.

    Science.gov (United States)

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  14. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes. More details...

  15. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question of whether psychology is ready to enter the "hard science club" like biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to the current methods.

  16. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  17. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  18. Quantitative whole body scintigraphy - a simplified approach

    International Nuclear Information System (INIS)

    Marienhagen, J.; Maenner, P.; Bock, E.; Schoenberger, J.; Eilles, C.

    1996-01-01

    In this paper we present investigations of a simplified method of quantitative whole-body scintigraphy using a dual-head LFOV gamma camera and a calibration algorithm without the need for additional attenuation or scatter correction. Validation of this approach on an anthropomorphic phantom as well as in patient studies showed high accuracy in the quantification of whole-body activity (102.8% and 97.72%, respectively); by contrast, organ activities were recovered with errors of up to 12%. The described method can be easily performed using commercially available software packages and is recommendable especially for quantitative whole-body scintigraphy in a clinical setting. (orig.) [de

  19. Aspects of quantitative secondary ion mass spectrometry

    International Nuclear Information System (INIS)

    Grauer, R.

    1982-05-01

    Parameters which have an influence on the formation of secondary ions by ion bombardment of a solid matrix are discussed. Quantitative SIMS-analysis with the help of calibration standards necessitates a stringent control of these parameters. This is particularly valid for the oxygen partial pressure which for metal analysis has to be maintained constant also under ultra high vacuum. The performance of the theoretical LTE-model (Local Thermal Equilibrium) using internal standards will be compared with the analysis with the help of external standards. The LTE-model does not satisfy the requirements for quantitative analysis. (Auth.)

  20. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to wrong interpretations of soil quality and soil functioning when contrasting sites are compared with each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant.
For the reproducibility study, a group of 9 soil scientists and 7
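    The validation step described in this record (relating quantitative visual observations to standardized laboratory measurements and checking the correlation) can be sketched with a plain Pearson correlation; the scores and dry weights below are invented, not the study's data.

    ```python
    # Sketch of validating a visual field observation against a laboratory
    # measurement via the Pearson correlation coefficient (pure-Python version).

    def pearson_r(x, y):
        """Pearson correlation coefficient of two equal-length samples."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / (sxx * syy) ** 0.5

    # invented data: visual root scores vs. measured root dry weight (g)
    visual_root_score = [2, 3, 5, 4, 6, 7, 5, 8]
    root_dry_weight_g = [0.9, 1.1, 2.0, 1.6, 2.4, 2.9, 2.1, 3.3]
    r = pearson_r(visual_root_score, root_dry_weight_g)
    ```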

  1. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include: reliability; probability of detection (ultrasonics and eddy currents); weldments; closure effects in fatigue cracks; technology transfer; ultrasonic scattering theory; acoustic emission; ultrasonic scattering, reliability and penetrating radiation; metal matrix composites; ultrasonic scattering from near-surface flaws; and ultrasonic multiple scattering

  2. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibanez, Noelia; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp...

  3. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes

  4. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
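    The measurement principle in this abstract can be sketched directly: each field estimate is a measured voltage difference between an antenna pair divided by the known separation of that pair. The antenna positions and voltages below are invented for illustration.

    ```python
    # Sketch of the patent's quantitative E-field estimate: for each adjacent
    # antenna pair along one axis, E ~= (V1 - V0) / (x1 - x0), in V/m.

    def field_estimates(voltages_v, positions_m):
        """E-field estimates (V/m) for each adjacent antenna pair along one axis."""
        samples = list(zip(voltages_v, positions_m))
        return [(v1 - v0) / (x1 - x0)
                for (v0, x0), (v1, x1) in zip(samples, samples[1:])]

    # four antennas spaced 0.05 m apart in a roughly uniform field (invented data)
    e = field_estimates([0.00, 0.12, 0.25, 0.37], [0.00, 0.05, 0.10, 0.15])
    ```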

  5. Quantitative Mapping of Large Area Graphene Conductance

    DEFF Research Database (Denmark)

    Buron, Jonas Christian Due; Petersen, Dirch Hjorth; Bøggild, Peter

    2012-01-01

    We present quantitative mapping of large area graphene conductance by terahertz time-domain spectroscopy and micro four point probe. We observe a clear correlation between the techniques and identify the observed systematic differences to be directly related to imperfections of the graphene sheet...

  6. The Sampling Issues in Quantitative Research

    Science.gov (United States)

    Delice, Ali

    2010-01-01

    A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematics Education Discipline in 10…

  7. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  8. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  9. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed

  10. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp...

  11. Proteomic approaches for quantitative cancer cell signaling

    DEFF Research Database (Denmark)

    Voellmy, Franziska

    studies in an effort to contribute to the study of signaling dynamics in cancer systems. This thesis is divided into two parts. Part I begins with a brief introduction in the use of omics in systems cancer research with a focus on mass spectrometry as a means to quantitatively measure protein...

  12. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  13. Quantitative multiplex detection of pathogen biomarkers

    Energy Technology Data Exchange (ETDEWEB)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  14. Quantitative angiography after directional coronary atherectomy

    NARCIS (Netherlands)

    P.W.J.C. Serruys (Patrick); V.A.W.M. Umans (Victor); B.H. Strauss (Bradley); R-J. van Suylen (Robert-Jan); M.J.B.M. van den Brand (Marcel); H. Suryapranata (Harry); P.J. de Feyter (Pim); J.R.T.C. Roelandt (Jos)

    1991-01-01

    OBJECTIVE: To assess by quantitative analysis the immediate angiographic results of directional coronary atherectomy. To compare the effects of successful atherectomy with those of successful balloon dilatation in a series of patients with matched lesions. DESIGN: Case series.

  15. Deforestation since independence: A quantitative assessment of ...

    African Journals Online (AJOL)

    Deforestation since independence: A quantitative assessment of four decades of land-cover change in Malawi. ... pressure and demographic factors are important predictors of deforestation rate within our study area. Keywords: afforestation, Africa, deforestation, drivers, land-use change, reforestation, rural, urban ...

  16. Quantitative SPECT reconstruction of iodine-123 data

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1991-01-01

    Many clinical and research studies in nuclear medicine require quantitation of iodine-123 (123I) distribution for the determination of kinetics or localization. The objective of this study was to implement several reconstruction methods designed for single-photon emission computed tomography (SPECT) using 123I and to evaluate their performance in terms of quantitative accuracy, image artifacts, and noise. The methods consisted of four attenuation and scatter compensation schemes incorporated into both the filtered backprojection/Chang (FBP) and maximum likelihood-expectation maximization (ML-EM) reconstruction algorithms. The methods were evaluated on data acquired from a phantom containing a hot sphere of 123I activity in a lower-level background 123I distribution and nonuniform density media. For both reconstruction algorithms, nonuniform attenuation compensation combined with either scatter subtraction or Metz filtering produced images that were quantitatively accurate to within 15% of the true value. The ML-EM algorithm demonstrated quantitative accuracy comparable to FBP and smaller relative noise magnitude for all compensation schemes

  17. Values in Qualitative and Quantitative Research

    Science.gov (United States)

    Duffy, Maureen; Chenail, Ronald J.

    2008-01-01

    The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…

  18. 78 FR 52166 - Quantitative Messaging Research

    Science.gov (United States)

    2013-08-22

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...

  19. Quantitative grading of store separation trajectories

    CSIR Research Space (South Africa)

    Jamison, Kevin A

    2017-09-01

    This paper describes the development of an automated analysis process and software that can run a multitude of separation scenarios. A key enabler for this software is the development of a quantitative grading algorithm that scores the outcome of each release...

  20. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  1. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  2. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied for quantitative analysis, through X-ray diffraction patterns, of mixtures of pigments and extenders frequently used in the paint industry. The results obtained have shown the usefulness of these methods, but their accuracy still needs improvement. (Author) [pt

  3. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  4. Quantitative multiplex detection of pathogen biomarkers

    Science.gov (United States)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  5. A quantitative lubricant test for deep drawing

    DEFF Research Database (Denmark)

    Olsson, David Dam; Bay, Niels; Andreasen, Jan L.

    2010-01-01

    A tribological test for deep drawing has been developed by which the performance of lubricants may be evaluated quantitatively measuring the maximum backstroke force on the punch owing to friction between tool and workpiece surface. The forming force is found not to give useful information...

  6. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  7. Quantitative muscle ultrasonography in amyotrophic lateral sclerosis.

    NARCIS (Netherlands)

    Arts, I.M.P.; Rooij, F.G. van; Overeem, S.; Pillen, S.; Janssen, H.M.; Schelhaas, H.J.; Zwarts, M.J.

    2008-01-01

    In this study, we examined whether quantitative muscle ultrasonography can detect structural muscle changes in early-stage amyotrophic lateral sclerosis (ALS). Bilateral transverse scans were made of five muscles or muscle groups (sternocleidomastoid, biceps brachii/brachialis, forearm flexor group,

  8. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Pieters, W.; Arnold, F.; Stoelinga, M.I.A.

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Therefore, penetration testing has thus far been used as a qualitative research method. To enable quantitative approaches to security risk management,

  9. QUANTITATIVE EXTRACTION OF MEIOFAUNA: A COMPARISON ...

    African Journals Online (AJOL)

    and A G DE WET. Department of Mathematical Statistics, University of Port Elizabeth. Accepted: May 1978. ABSTRACT. Two methods for the quantitative extraction of meiofauna from natural sandy sediments were investigated and compared: Cobb's decanting and sieving technique and the Oostenbrink elutriator. Both.

  10. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  11. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described how a quantitative measure for the robustness of a given neutron transport theory code for coarse network calculations can be obtained. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  12. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
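    The quantitation step in this family of methods reduces to a simple relation: under the standard random-breakage (Poisson) assumption, the lesion frequency is the difference of the reciprocal number-average lengths of treated and control DNA. A minimal sketch follows; the function name and the example length values are illustrative, not taken from the record.

    ```python
    def lesion_frequency(ln_treated_kb, ln_control_kb):
        """Estimate lesions per kb as the difference of reciprocal
        number-average lengths (random-breakage / Poisson assumption)."""
        return 1.0 / ln_treated_kb - 1.0 / ln_control_kb

    # Hypothetical example: number-average length drops from 48.5 kb
    # (untreated control) to 9.7 kb after treatment.
    phi = lesion_frequency(9.7, 48.5)   # lesions per kb
    per_mb = phi * 1000.0               # ~82.5 lesions per Mb
    ```

    Because only molecular-length medians/averages enter the formula, the same calculation applies whether lengths come from sedimentation or from gel migration calibrated against size standards.
    
    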

  13. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  14. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  15. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  16. Quantitative and Qualitative Extensions of Event Structures

    NARCIS (Netherlands)

    Katoen, Joost P.

    1996-01-01

    An important application of formal methods is the specification, design, and analysis of functional aspects of (distributed) systems. Recently the study of quantitative aspects of such systems based on formal methods has come into focus. Several extensions of formal methods where the occurrence of

  17. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior
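    In the conjugate binomial case the power-prior idea has a closed form: historical successes and failures enter a Beta posterior discounted by the power parameter a0 in [0, 1]. The sketch below assumes a Beta(alpha, beta) initial prior; all names and numbers are illustrative and not drawn from the thesis.

    ```python
    def power_prior_posterior(x, n, x0, n0, a0, alpha=1.0, beta=1.0):
        """Beta posterior parameters for a binomial proportion when
        historical data (x0 successes out of n0) are discounted by the
        power parameter a0. Conjugate case: the power prior
        Beta(alpha + a0*x0, beta + a0*(n0 - x0)) is then updated with
        the current data (x successes out of n)."""
        a_post = alpha + a0 * x0 + x
        b_post = beta + a0 * (n0 - x0) + (n - x)
        return a_post, b_post

    # a0 = 0 ignores the historical study; a0 = 1 pools it fully.
    params_half = power_prior_posterior(12, 40, 30, 100, a0=0.5)
    ```

    The choice of a0 (fixed, or given its own prior as in the joint power prior) is exactly the difficulty the abstract alludes to.
    
    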

  18. Quantitative Penetration Testing with Item Response Theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle Ida Antoinette

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  19. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  20. Engaging Business Students in Quantitative Skills Development

    Science.gov (United States)

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  1. Leaderless Covert Networks : A Quantitative Approach

    NARCIS (Netherlands)

    Husslage, B.G.M.; Lindelauf, R.; Hamers, H.J.M.

    2012-01-01

    Lindelauf et al. (2009a) introduced a quantitative approach to investigate optimal structures of covert networks. This approach used an objective function which is based on the secrecy versus information trade-off these organizations face. Sageman (2008) hypothesized that covert networks

  2. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m²) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA, and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) had statistically significant differences with regard to quantitative parenchymal measures, including lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2). Quantitative measurements were obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE, indicating the potential for detecting and following renal disease at an earlier stage than conventional qualitative imaging techniques.
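    The test-retest metric reported above (mean ± SD absolute percent difference between two visits) can be computed as sketched below. The record does not state the reference denominator, so the mean of the two visits is assumed here; the ADC values are hypothetical.

    ```python
    import numpy as np

    def abs_percent_difference(visit1, visit2):
        """Per-subject absolute percent difference between two repeated
        measurements, using the mean of the two visits as the reference
        (one common convention; an assumption here, not from the paper)."""
        v1 = np.asarray(visit1, dtype=float)
        v2 = np.asarray(visit2, dtype=float)
        return 100.0 * np.abs(v1 - v2) / ((v1 + v2) / 2.0)

    # Hypothetical ADC values (x10^-3 mm^2/s) for three subjects, two visits.
    adc_visit1 = np.array([2.10, 2.25, 1.98])
    adc_visit2 = np.array([2.02, 2.31, 2.10])
    apd = abs_percent_difference(adc_visit1, adc_visit2)
    summary = (apd.mean(), apd.std())   # mean ± SD, as reported in the record
    ```

    Reporting the mean and SD of this quantity across subjects yields numbers directly comparable to the 4.5 ± 3.6% style figures quoted above.
    
    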

  3. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  4. Bringing quality and meaning to quantitative data - Bringing quantitative evidence to qualitative observation

    DEFF Research Database (Denmark)

    Karpatschof, Benny

    2007-01-01

    Based on the author's methodological theory defining the distinctive properties of quantitative and qualitative method the article demonstrates the possibilities and advantages of combining the two types of investigation in the same research project. The project being an effect study...

  5. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope using the transport of intensity equation method based on a smartphone. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; then red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that can in future be adopted in remote healthcare and medical diagnosis.
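    The transport-of-intensity reconstruction such a device relies on can be sketched as a regularized spectral Poisson solve. This assumes paraxial propagation and near-uniform intensity i0, so that laplacian(phi) = -(k/i0) * dI/dz; the function name, regularization scheme and parameter values below are illustrative, not the authors' implementation.

    ```python
    import numpy as np

    def tie_phase(i_minus, i_plus, dz, wavelength, pixel, i0=None, eps=1e-6):
        """Recover phase from two defocused images via the transport of
        intensity equation under near-uniform intensity i0:
            laplacian(phi) = -(k / i0) * dI/dz
        inverted spectrally, with a small Tikhonov term eps to tame the
        singular zero-frequency mode."""
        k = 2.0 * np.pi / wavelength
        didz = (i_plus - i_minus) / (2.0 * dz)        # central z-derivative
        if i0 is None:
            i0 = 0.5 * (i_plus + i_minus).mean()
        ny, nx = didz.shape
        fx = np.fft.fftfreq(nx, d=pixel)
        fy = np.fft.fftfreq(ny, d=pixel)
        k2 = (2.0 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
        rhs_hat = np.fft.fft2(-(k / i0) * didz)
        phi_hat = -rhs_hat / (k2 + eps)               # inverse Laplacian
        return np.real(np.fft.ifft2(phi_hat))
    ```

    In practice the two images come from the manual refocusing step described in the abstract; the recovered phase is defined up to an additive constant (the regularized DC mode).
    
    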

  6. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interest in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  7. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in the situation where qualitative and quantitative measures are applied. Game theory was referred to investigate the relation of detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, Analytical Hierarchy Process. Then, it became possible to estimate detection probability under integrated safeguards which had equivalent deterrence capability for detection probability under conventional safeguards. In addition the distribution of inspection effort for qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful to reconsider the detection probabilities under integrated safeguards. (author)

  8. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  9. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Credit institutions' supervision by state authorities is mostly assimilated with systemic risk prevention. At present, the mission is oriented toward analyzing the risk profile of the credit institutions, and the mechanisms and existing systems as management tools providing bank rulers with the proper instruments to avoid and control specific bank risks. Rating systems are sophisticated measurement instruments capable of assuring the above objectives, such as success in banking risk management. Management quality is one of the most important elements in the set of variables used in the rating process for credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can induce subjectivism and heterogeneity in the rating. The problem can be solved by using, complementarily, quantitative techniques such as DEA (Data Envelopment Analysis).

  10. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain period of time after exercise to counts immediately after exercise. This value is necessary for evaluating redistribution to the ischemic areas in serial imaging, to correct for Tl-201 washout from the myocardium under the assumption that the washout is constant across the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and the maximum uptake. The redistribution factor is the ratio of the redistribution in the region of interest in serial imaging after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
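    The three indices defined in this abstract are plain count ratios, so they can be sketched directly; the function names and count values below are hypothetical, chosen only to mirror the definitions given.

    ```python
    def washout_factor(counts_delayed, counts_initial):
        """Ratio of counts at a delayed time point after exercise to counts
        immediately after exercise (assumed uniform across the myocardium)."""
        return counts_delayed / counts_initial

    def vitality_index(roi_uptake, max_uptake):
        """Tl-201 uptake in a region of interest relative to the maximum uptake."""
        return roi_uptake / max_uptake

    def redistribution_factor(roi_delayed, roi_initial):
        """Redistribution in a region of interest on delayed imaging relative
        to the image immediately after exercise."""
        return roi_delayed / roi_initial

    # Hypothetical counts: global washout 80/100, ROI uptake 55 vs. max 100,
    # ROI counts rising from 50 to 60 on delayed imaging.
    w = washout_factor(80, 100)
    v = vitality_index(55, 100)
    r = redistribution_factor(60, 50)
    ```

    Dividing the ROI redistribution factor by the global washout factor is how the stated uniform-washout assumption corrects serial images for tracer loss.
    
    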

  11. Nanostructured surfaces investigated by quantitative morphological studies

    International Nuclear Information System (INIS)

    Perani, Martina; Carapezzi, Stefania; Mutta, Geeta Rani; Cavalcoli, Daniela

    2016-01-01

    The morphology of different surfaces has been investigated by atomic force microscopy and quantitatively analyzed in this paper. Two different tools have been employed to this end: analysis of the height–height correlation function and determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiO_xN_y, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure can be of general use. Our results show that the quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from the comparison of the lateral correlation length and the grain size. (paper)
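    A common estimator for the height–height correlation function used in such AFM analyses averages squared height differences along the fast-scan direction: H(r) = <[h(x + r) - h(x)]^2>. The sketch below uses this 1-D row-averaged convention (an assumption; the paper may use a radial average), with the lateral correlation length read off where H(r) saturates near 2*sigma^2.

    ```python
    import numpy as np

    def hhcf(height, max_lag=None):
        """One-dimensional height-height correlation function
        H(r) = <[h(x + r) - h(x)]^2>, averaged over all rows of an
        AFM height map; r is in pixels along the fast-scan axis."""
        h = np.asarray(height, dtype=float)
        if max_lag is None:
            max_lag = h.shape[1] // 2
        return np.array([np.mean((h[:, r:] - h[:, :-r]) ** 2)
                         for r in range(1, max_lag + 1)])
    ```

    For a rough surface H(r) rises from ~0 at small r and saturates at 2*sigma^2 for r much larger than the lateral correlation length, which is how that length is extracted and compared with the mean grain size.
    
    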

  12. Quantitative phosphoproteomics to characterize signaling networks

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Blagoev, Blagoy

    2012-01-01

    Reversible protein phosphorylation is involved in the regulation of most, if not all, major cellular processes via dynamic signal transduction pathways. During the last decade quantitative phosphoproteomics have evolved from a highly specialized area to a powerful and versatile platform for analyzing protein phosphorylation at a system-wide scale, and have become the intuitive strategy for comprehensive characterization of signaling networks. Contemporary phosphoproteomics use highly optimized procedures for sample preparation, mass spectrometry and data analysis algorithms to identify and quantify thousands of phosphorylations, thus providing extensive overviews of the cellular signaling networks. As a result of these developments quantitative phosphoproteomics have been applied to study processes as diverse as immunology, stem cell biology and DNA damage. Here we review the developments...

  13. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1979-01-01

    Although the overall aim of radiobiology is to understand the biological effects of radiation, it also has the implied practical purpose of developing rational measures for the control of radiation exposure in man. The emphasis in this presentation is to show that the enormous effort expended over the years to develop quantitative dose-effect relationships in biochemical and cellular systems, animals, and human beings now seems to be paying off. The pieces appear to be falling into place, and a framework is evolving to utilize these data. Specifically, quantitative risk assessments will be discussed in terms of the cellular, animal, and human data on which they are based; their use in the development of radiation protection standards; and their present and potential impact and meaning in relation to the quantity dose equivalent and its special unit, the rem

  14. Quantitative sputter profiling at surfaces and interfaces

    International Nuclear Information System (INIS)

    Kirschner, J.; Etzkorn, H.W.

    1981-01-01

    The key problem in quantitative sputter profiling, that of a sliding depth scale, has been solved by combined Auger/X-ray microanalysis. By means of this technique, and for the model system Ge/Si (amorphous), the following questions are treated quantitatively: the shape of the sputter profiles when sputtering through an interface and the origin of their asymmetry; the precise location of the interface plane on the depth profile; broadening effects due to the limited depth of information and their correction; the origin and amount of bombardment-induced broadening for different primary ions and energies; the depth dependence of the broadening; and basic limits to depth resolution. Comparisons are made to recent theoretical calculations based on recoil mixing in the collision cascade and very good agreement is found

  15. Quantitative image processing in fluid mechanics

    Science.gov (United States)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  16. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique for analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have been used only to qualitatively map whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task, and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how exactly elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  17. Quantitative indicators of fruit and vegetable consumption

    OpenAIRE

    Dagmar Kozelová; Dana Országhová; Milan Fiľa; Zuzana Čmiková

    2015-01-01

    Quantitative market research is often based on surveys and questionnaires that examine the behavior of customers in the observed areas. Before the purchasing process, consumers consider where they will buy fruit and vegetables, what kind to choose and in what quantity. Consumers' behavior is affected by factors such as regional gastronomic traditions, price, product appearance, aroma, place of buying, own experience and knowledge, taste preferences as well as specific heal...

  18. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  19. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of recent developments in the quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared [fr

  20. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both with and without blood mixed in. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  1. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  2. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  3. Quantitative maps of groundwater resources in Africa

    International Nuclear Information System (INIS)

    MacDonald, A M; Bonsor, H C; Dochartaigh, B É Ó; Taylor, R G

    2012-01-01

    In Africa, groundwater is the major source of drinking water and its use for irrigation is forecast to increase substantially to combat growing food insecurity. Despite this, there is little quantitative information on groundwater resources in Africa, and groundwater storage is consequently omitted from assessments of freshwater availability. Here we present the first quantitative continent-wide maps of aquifer storage and potential borehole yields in Africa based on an extensive review of available maps, publications and data. We estimate total groundwater storage in Africa to be 0.66 million km³ (0.36–1.75 million km³). Not all of this groundwater storage is available for abstraction, but the estimated volume is more than 100 times estimates of annual renewable freshwater resources in Africa. Groundwater resources are unevenly distributed: the largest groundwater volumes are found in the large sedimentary aquifers in the North African countries Libya, Algeria, Egypt and Sudan. Nevertheless, for many African countries appropriately sited and constructed boreholes can support handpump abstraction (yields of 0.1–0.3 l s⁻¹), and contain sufficient storage to sustain abstraction through inter-annual variations in recharge. The maps show further that the potential for higher yielding boreholes (>5 l s⁻¹) is much more limited. Therefore, strategies for increasing irrigation or supplying water to rapidly urbanizing cities that are predicated on the widespread drilling of high yielding boreholes are likely to be unsuccessful. As groundwater is the largest and most widely distributed store of freshwater in Africa, the quantitative maps are intended to lead to more realistic assessments of water security and water stress, and to promote a more quantitative approach to mapping of groundwater resources at national and regional level. (letter)

  4. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  5. Radioimmunoassay to quantitatively measure cell surface immunoglobulins

    International Nuclear Information System (INIS)

    Krishman, E.C.; Jewell, W.R.

    1975-01-01

    A radioimmunoassay technique developed to quantitatively measure immunoglobulins on the surface of cells is described. The amount of immunoglobulins found on different tumor cells varied from 200 to 1140 ng/10⁶ cells. Determination of immunoglobulins on peripheral lymphocytes obtained from different cancer patients varied between 340 and 1040 ng/10⁶ cells. Cultured tumor cells, on the other hand, were found to contain negligible quantities of human IgG [pt

  6. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few untreated hairs with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, quantitative values of the concentration of zinc were derived; then the concentrations of other elements were obtained by regarding zinc as an internal standard. As a result, the sulphur concentrations for 40 samples agree well with the average value for a typical Japanese person and also with each other within 20%, confirming the validity of the present method. Accuracy was also confirmed by comparing the results with those obtained by the usual internal standard method. For the purpose of surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both the energy loss of the projectile and the self-absorption of X-rays become negligible. As a result, consistent concentration values for many elements were obtained by the standard-free method

  7. Quantitative evaluation of dysphagia using scintigraphy

    International Nuclear Information System (INIS)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae

    1998-01-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration, we studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium of three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively, and 7 quantitative parameters were calculated from scintigraphy. According to the videofluoroscopic findings, patients were divided into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration. Videofluoroscopy revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration was reduced in all 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE also improved after the position change. Aspiration and laryngeal penetration occurred more frequently with thin liquid swallowing than with thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed, and the chance of aspiration could be decreased by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes

  8. Quantitative evaluation of dysphagia using scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae [College of Medicine, Dankook Univ., Cheonnon (Korea, Republic of)

    1998-08-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration, we studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium of three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively, and 7 quantitative parameters were calculated from scintigraphy. According to the videofluoroscopic findings, patients were divided into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration. Videofluoroscopy revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration was reduced in all 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE also improved after the position change. Aspiration and laryngeal penetration occurred more frequently with thin liquid swallowing than with thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed, and the chance of aspiration could be decreased by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes.

  9. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  10. Rational quantitative safety goals: a summary

    International Nuclear Information System (INIS)

    Unwin, S.D.; Hayns, M.R.

    1984-08-01

    We introduce the notion of a Rational Quantitative Safety Goal. Such a goal reflects the imprecision and vagueness inherent in any reasonable notion of adequate safety and permits such vagueness to be incorporated into the formal regulatory decision-making process. A quantitative goal of the form 'the parameter x, characterizing the safety level of the nuclear plant, shall not exceed the value x₀', for example, is of a non-rational nature in that it invokes a strict binary logic in which the parameter space underlying x is cut sharply into two portions: that containing those values of x that comply with the goal and that containing those that do not. Here, we utilize an alternative form of logic which, in accordance with any intuitively reasonable notion of safety, permits a smooth transition of a safety-determining parameter between the adequately safe and inadequately safe domains. Fuzzy set theory provides a suitable mathematical basis for the formulation of rational quantitative safety goals. The decision-making process proposed here is compatible with current risk assessment techniques and produces results in a transparent and useful format. Our methodology is illustrated with reference to the NUS Corporation risk assessment of the Limerick Generating Station

  11. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
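
    One of the simpler approaches in this family, total-sum (constant-sum) normalization, scales each sample so that its summed feature intensity is the same across samples. The sketch below is illustrative only; the function name and toy intensity matrix are assumptions, not taken from the review.

    ```python
    import numpy as np

    def total_sum_normalize(intensity_matrix):
        """Scale each sample (row) so its summed feature intensity equals the
        mean total intensity across samples, removing total-amount variation."""
        totals = intensity_matrix.sum(axis=1, keepdims=True)
        return intensity_matrix * totals.mean() / totals

    # Two samples with the same relative composition but different total amounts:
    X = np.array([[10.0, 30.0, 60.0],
                  [20.0, 60.0, 120.0]])
    Xn = total_sum_normalize(X)
    # After normalization both rows are identical, so remaining differences
    # between samples would reflect composition rather than total amount.
    ```

    The trade-off, as with any constant-sum method, is that normalized features become compositional: a genuine increase in one abundant metabolite depresses the apparent levels of all others.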

  12. Quantitative trait loci and metabolic pathways

    Science.gov (United States)

    McMullen, M. D.; Byrne, P. F.; Snook, M. E.; Wiseman, B. R.; Lee, E. A.; Widstrom, N. W.; Coe, E. H.

    1998-01-01

    The interpretation of quantitative trait locus (QTL) studies is limited by the lack of information on metabolic pathways leading to most economic traits. Inferences about the roles of the underlying genes with a pathway or the nature of their interaction with other loci are generally not possible. An exception is resistance to the corn earworm Helicoverpa zea (Boddie) in maize (Zea mays L.) because of maysin, a C-glycosyl flavone synthesized in silks via a branch of the well characterized flavonoid pathway. Our results using flavone synthesis as a model QTL system indicate: (i) the importance of regulatory loci as QTLs, (ii) the importance of interconnecting biochemical pathways on product levels, (iii) evidence for “channeling” of intermediates, allowing independent synthesis of related compounds, (iv) the utility of QTL analysis in clarifying the role of specific genes in a biochemical pathway, and (v) identification of a previously unknown locus on chromosome 9S affecting flavone level. A greater understanding of the genetic basis of maysin synthesis and associated corn earworm resistance should lead to improved breeding strategies. More broadly, the insights gained in relating a defined genetic and biochemical pathway affecting a quantitative trait should enhance interpretation of the biological basis of variation for other quantitative traits. PMID:9482823

  13. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and further investigate strategies for English learning. In this paper, quantitative English learning strategies based on a word network and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for unlearned words and the dynamic updating of the network are studied and analyzed. The results suggest that quantitative strategies significantly improve learning efficiency while maintaining effectiveness; in particular, the optimized-weight-first strategy and the segmented strategies outperform the others. The results provide opportunities for researchers and practitioners to reconsider the way English is taught and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
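
    The core idea of combining word frequency with network topology can be sketched as a greedy ordering: each unlearned word's weight is its frequency boosted by the number of already-learned neighbors, and the highest-weight word is learned next. The tiny graph and the weighting rule below are illustrative assumptions, not the exact formula from the paper.

    ```python
    # Toy word network: frequencies and co-occurrence edges (both invented).
    freq = {"take": 90, "off": 80, "takeoff": 5, "cat": 40}
    edges = {("take", "off"), ("take", "takeoff"), ("off", "takeoff")}

    def neighbors(w):
        return {b for a, b in edges if a == w} | {a for a, b in edges if b == w}

    learned = set()
    order = []
    while len(order) < len(freq):
        # Dynamic weight: base frequency scaled up by already-learned neighbors,
        # so words connected to known vocabulary are pulled forward.
        def weight(w):
            return freq[w] * (1 + len(neighbors(w) & learned))
        nxt = max((w for w in freq if w not in learned), key=weight)
        learned.add(nxt)
        order.append(nxt)
    ```

    With these numbers the strategy learns "take" first (highest raw frequency), then "off" (frequency doubled by its learned neighbor), illustrating how topology reorders the purely frequency-based sequence.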

  14. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Background: Increased interest in health care cost containment is focusing attention on the reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software, which has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings: This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients; analytical approaches for estimating the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness; and patient-specific spreadsheets for tracking target populations and evaluating the impact of interventions. Conclusions: The study demonstrated that quantitative tools, including definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets, could contribute to the improvement of patient outcomes in hospitals.

  15. Some exercises in quantitative NMR imaging

    International Nuclear Information System (INIS)

    Bakker, C.J.G.

    1985-01-01

    The articles represented in this thesis result from a series of investigations that evaluate the potential of NMR imaging as a quantitative research tool. In the first article the possible use of the proton spin-lattice relaxation time T₁ in tissue characterization, tumor recognition and monitoring tissue response to radiotherapy is explored. The next article addresses the question whether water proton spin-lattice relaxation curves of biological tissues are adequately described by a single time constant T₁, and analyzes the implications of multi-exponentiality for quantitative NMR imaging. In the third article the use of NMR imaging as a quantitative research tool is discussed on the basis of phantom experiments. The fourth article describes a method which enables unambiguous retrieval of sign information in a set of magnetic resonance images of the inversion recovery type. The next article shows how this method can be adapted to allow accurate calculation of T₁ pictures on a pixel-by-pixel basis. The sixth article, finally, describes a simulation procedure which enables a straightforward determination of NMR imaging pulse sequence parameters for optimal tissue contrast. (orig.)

  16. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. The technique was developed using MATLAB (Version 6.1) based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of fibroglandular tissue area divided by the total breast area in the mammogram. The technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I–V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r² = 0.84, p < 0.05), and 71.3% of the subjective classifications were reproduced by the computer technique. We have developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of large datasets of mammograms to predict breast cancer risk based on density. Furthermore, it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
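
    The density measure defined in the abstract reduces, once segmentation has produced binary masks, to a pixel count: fibroglandular area as a percentage of total breast area. A minimal sketch of that final step, assuming the masks already exist (the function name and toy masks are illustrative, and the hard part, segmentation, is omitted):

    ```python
    import numpy as np

    def percent_density(breast_mask, fibroglandular_mask):
        """Breast density as defined in the abstract: fibroglandular tissue
        area divided by total breast area, expressed as a percentage."""
        breast_area = np.count_nonzero(breast_mask)
        dense_area = np.count_nonzero(fibroglandular_mask & breast_mask)
        return 100.0 * dense_area / breast_area

    # Toy 4x4 segmentation: 12 breast pixels, 3 of them fibroglandular.
    breast = np.array([[1, 1, 1, 1],
                       [1, 1, 1, 1],
                       [1, 1, 1, 1],
                       [0, 0, 0, 0]], dtype=bool)
    dense = np.array([[1, 1, 1, 0],
                      [0, 0, 0, 0],
                      [0, 0, 0, 0],
                      [0, 0, 0, 0]], dtype=bool)
    pd = percent_density(breast, dense)  # 3 of 12 pixels -> 25.0
    ```

    Intersecting the fibroglandular mask with the breast mask guards against segmentation artifacts counting dense pixels outside the breast region.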

  17. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides the elements that comprise them. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  18. Quantitative fluorescence microscopy and image deconvolution.

    Science.gov (United States)

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches: deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used

  19. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancers (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  20. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enable this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that the automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce the accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  1. Quantitative Imaging in Cancer Evolution and Ecology

    Science.gov (United States)

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral

  2. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.
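
The diffraction-limit finding above can be sanity-checked with textbook optics. In the sketch below, the 550 nm wavelength, 0.25 numerical aperture and 1 mm field of view are illustrative assumptions (not parameters reported by the authors); it simply shows how a Rayleigh resolution limit translates into a Nyquist pixel budget:

```python
import math

def rayleigh_limit_um(wavelength_um, na):
    """Rayleigh criterion: smallest resolvable separation, in micrometres."""
    return 0.61 * wavelength_um / na

def pixels_needed(field_of_view_um, resolution_um):
    """Pixels per side for Nyquist sampling (2 pixels per resolved element)."""
    return math.ceil(2 * field_of_view_um / resolution_um)

# Illustrative values: green light, a modest 0.25-NA objective,
# and a 1 mm field of view imaged onto the phone sensor.
r = rayleigh_limit_um(0.55, 0.25)   # ~1.34 um
n = pixels_needed(1000.0, r)
print(f"resolution ~{r:.2f} um -> at least {n} x {n} pixels "
      f"(~{n * n / 1e6:.1f} MP) to sample the full field")
```

Under these toy numbers a few-megapixel sensor already samples the field; tighter optics or larger fields push the requirement toward the pixel counts discussed above.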

  3. Quantitative safety goals for the regulatory process

    International Nuclear Information System (INIS)

    Joksimovic, V.; O'Donnell, L.F.

    1981-01-01

    The paper offers a brief summary of the current regulatory background in the USA, emphasizing nuclear, related to the establishment of quantitative safety goals as a way to respond to the key issue of 'how safe is safe enough'. General Atomic has taken a leading role in advocating the use of probabilistic risk assessment techniques in the regulatory process. This has led to understanding of the importance of quantitative safety goals. The approach developed by GA is discussed in the paper. It is centred around the definition of quantitative safety regions, termed: design basis, safety margin (or design capability) and safety research. The design basis region is bounded by a frequency of 10⁻⁴/reactor-year and consequences of no identifiable public injury. 10⁻⁴/reactor-year is associated with the total projected lifetime of a commercial US nuclear power programme. Events which have a 50% chance of happening are included in the design basis region. In the safety margin region, which extends below the design basis region, protection is provided against some events whose probability of not happening during the expected course of the US nuclear power programme is within the range of 50 to 90%. Setting the lower mean frequency of this region at 10⁻⁵/reactor-year is equivalent to offering 90% assurance that an accident of given severity will not happen. Rare events with a mean frequency below 10⁻⁵ can still be predicted to occur. However, accidents predicted to have a probability of less than 10⁻⁶ are 99% certain not to happen at all, and are thus not anticipated to affect public health and safety. The area between 10⁻⁵ and 10⁻⁶ defines the frequency portion of the safety research region. Safety goals associated with individual risk to a maximally exposed member of the public, general societal risk and property risk are proposed in the paper
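
The assurance figures quoted (90% at 10⁻⁵/reactor-year, 99% below 10⁻⁶) follow from Poisson arithmetic if one assumes a total programme exposure of roughly 10⁴ reactor-years, the lifetime figure cited above; this sketch reproduces that arithmetic:

```python
import math

def prob_no_event(frequency_per_ry, exposure_ry):
    """Poisson probability of zero events over the given exposure."""
    return math.exp(-frequency_per_ry * exposure_ry)

EXPOSURE = 1e4  # assumed total US programme exposure, reactor-years

# 1e-5 /reactor-year -> ~90% assurance the event never occurs
print(f"{prob_no_event(1e-5, EXPOSURE):.0%}")  # prints "90%"
# 1e-6 /reactor-year -> ~99% assurance
print(f"{prob_no_event(1e-6, EXPOSURE):.0%}")  # prints "99%"
```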

  4. Quantitative imaging of turbulent and reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Paul, P.H. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

    Quantitative digital imaging, using planar laser light scattering techniques is being developed for the analysis of turbulent and reacting flows. Quantitative image data, implying both a direct relation to flowfield variables as well as sufficient signal and spatial dynamic range, can be readily processed to yield two-dimensional distributions of flowfield scalars and in turn two-dimensional images of gradients and turbulence scales. Much of the development of imaging techniques to date has concentrated on understanding the requisite molecular spectroscopy and collision dynamics to be able to determine how flowfield variable information is encoded into the measured signal. From this standpoint the image is seen as a collection of single point measurements. The present effort aims at realizing necessary improvements in signal and spatial dynamic range, signal-to-noise ratio and spatial resolution in the imaging system as well as developing excitation/detection strategies which provide for a quantitative measure of particular flowfield scalars. The standard camera used for the study is an intensified CCD array operated in a conventional video format. The design of the system was based on detailed modeling of signal and image transfer properties of fast UV imaging lenses, image intensifiers and CCD detector arrays. While this system is suitable for direct scalar imaging, derived quantities (e.g. temperature or velocity images) require an exceptionally wide dynamic range imaging detector. To apply these diagnostics to reacting flows also requires a very fast shuttered camera. The authors have developed and successfully tested a new type of gated low-light level detector. This system relies on fast switching of proximity focused image-diode which is direct fiber-optic coupled to a cooled CCD array. Tests on this new detector show significant improvements in detection limit, dynamic range and spatial resolution as compared to microchannel plate intensified arrays.

  5. Quantitative microanalysis with a nuclear microprobe

    International Nuclear Information System (INIS)

    Themner, Klas.

    1989-01-01

    The analytical techniques of particle induced X-ray emission (PIXE) and Rutherford backscattering (RBS), together with the nuclear microprobe, form a very powerful tool for performing quantitative microanalysis of biological material. Calibration of the X-ray detection system in the microprobe set-up has been performed and the accuracy of the quantitative procedure using RBS for determination of the areal mass density was investigated. The accuracy of the analysis can be affected by alteration of the elemental concentrations during irradiation due to the radiation damage induced by the very intense beams of ionizing radiation. Loss of matrix elements from freeze-dried tissue sections and polymer films has been studied during proton and photon irradiation and the effect on the accuracy discussed. Scanning the beam over an area of the target, with e.g. 32x32 pixels, in order to produce an elemental map yields a great deal of information, and for accurate quantification a fast algorithm using descriptions of the different spectral contributions is needed. The production of continuum X-rays by 2.55 MeV protons has been studied and absolute cross-sections for the bremsstrahlung production from thin carbon and some polymer films determined. For the determination of the bremsstrahlung background, knowledge of the amounts of the matrix elements is important, and a fast program for the evaluation of spectra of proton back- and forward scattering from biological samples has been developed. Quantitative microanalysis with the nuclear microprobe has been performed on brain tissue from rats subjected to different pathological conditions. Increases in calcium levels and decreases in potassium levels for animals subjected to cerebral ischaemia and for animals suffering from epileptic seizures were observed coincidentally with or, in some cases, before visible signs of cell necrosis. (author)

  6. Quantitative transmission electron microscopy at atomic resolution

    International Nuclear Information System (INIS)

    Allen, L J; D'Alfonso, A J; Forbes, B D; Findlay, S D; LeBeau, J M; Stemmer, S

    2012-01-01

    In scanning transmission electron microscopy (STEM) it is possible to operate the microscope in bright-field mode under conditions which, by the quantum mechanical principle of reciprocity, are equivalent to those in conventional transmission electron microscopy (CTEM). The results of such an experiment will be presented which are in excellent quantitative agreement with theory for specimens up to 25 nm thick. This is at variance with the large contrast mismatch (typically between two and five) noted in equivalent CTEM experiments. The implications of this will be discussed.

  7. Quantitative spectrographic determination of zirconium minerals

    International Nuclear Information System (INIS)

    Rocal Adell, M.; Alvarez Gonzalez, F.; Fernandez Cellini, R.

    1958-01-01

    The method described in the following report permits the quantitative determination of zirconium in minerals and rocks over a 0.02-100% ZrO₂ concentration range. Excitation is carried out by a 10 ampere direct-current arc between carbon electrodes, with the sample placed in a crater of 2 mm depth. For low concentrations, the sample is diluted with an equal weight of carbon powder and with 1/25 of its weight of Co₃O₄ (internal standard). The lines Zr 2571.4, Co 2585.3 and Co 2587.2 are used. (Author) 6 refs
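
The internal-standard method described above reads concentration off a working curve of the analyte-to-standard line-intensity ratio. The sketch below is a generic illustration with invented calibration data, not the authors' actual curve for Zr/Co:

```python
import math

def fit_working_curve(concs, ratios):
    """Least-squares fit of log10(ratio) = a * log10(conc) + b."""
    xs = [math.log10(c) for c in concs]
    ys = [math.log10(r) for r in ratios]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

def concentration(ratio, a, b):
    """Invert the working curve for an unknown sample."""
    return 10 ** ((math.log10(ratio) - b) / a)

# Hypothetical standards: % ZrO2 vs analyte/internal-standard intensity ratio
concs = [0.1, 1.0, 10.0]
ratios = [0.05, 0.5, 5.0]        # a perfectly linear log-log example
a, b = fit_working_curve(concs, ratios)
print(concentration(2.5, a, b))  # -> 5.0 (% ZrO2)
```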

  8. Quantitative angiography methods for bifurcation lesions

    DEFF Research Database (Denmark)

    Collet, Carlos; Onuma, Yoshinobu; Cavalcante, Rafael

    2017-01-01

    Bifurcation lesions represent one of the most challenging lesion subsets in interventional cardiology. The European Bifurcation Club (EBC) is an academic consortium whose goal has been to assess and recommend the appropriate strategies to manage bifurcation lesions. The quantitative coronary angiography (QCA) methods for the evaluation of bifurcation lesions have been subject to extensive research. Single-vessel QCA has been shown to be inaccurate for the assessment of bifurcation lesion dimensions. For this reason, dedicated bifurcation software has been developed and validated. These software...

  9. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market." (Professor Behrouz Far, University of Calgary) "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful." (Professor Larry Bernstein, Stevens Institute of Technology) A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability

  10. MR Fingerprinting for Rapid Quantitative Abdominal Imaging.

    Science.gov (United States)

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas

    2016-04-01

    To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue
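
The significance claims can be approximately reproduced from the summary statistics alone. The sketch below computes a Welch t statistic for the T1 comparison, taking n = 20 lesions and n = 8 healthy subjects from the text; this is an illustrative recomputation, not the authors' exact test:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic for two groups given mean, SD and sample size."""
    return (m1 - m2) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

# T1 in metastatic adenocarcinoma (1673 +/- 331 msec, n = 20 lesions)
# vs hepatic parenchyma in healthy volunteers (745 +/- 65 msec, n = 8)
t = welch_t(1673, 331, 20, 745, 65, 8)
print(round(t, 1))  # a very large t, consistent with P < .0001
```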

  11. Chinese legal texts – Quantitative Description

    Directory of Open Access Journals (Sweden)

    Ľuboš GAJDOŠ

    2017-06-01

    Full Text Available The aim of the paper is to provide a quantitative description of legal Chinese. This study adopts the approach of corpus-based analysis and shows basic statistical parameters of legal texts in Chinese, namely sentence length, the proportions of parts of speech, etc. The research is conducted on the Chinese monolingual corpus Hanku. The paper also discusses issues of statistical data processing from various corpora, e.g. tokenisation and part-of-speech tagging, and their relevance to the study of register variation.
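
Parameters of the kind reported (mean sentence length, part-of-speech proportions) reduce to simple counting once the corpus is tokenised and tagged. A toy sketch over invented (token, tag) pairs, not data from the Hanku corpus:

```python
from collections import Counter

def corpus_stats(tagged_sentences):
    """Mean sentence length in tokens, plus part-of-speech proportions."""
    lengths = [len(s) for s in tagged_sentences]
    tags = Counter(tag for s in tagged_sentences for _, tag in s)
    total = sum(tags.values())
    props = {t: c / total for t, c in tags.items()}
    return sum(lengths) / len(lengths), props

# Toy tagged corpus: each sentence is a list of (token, POS) pairs
corpus = [
    [("法律", "NN"), ("保护", "VV"), ("权利", "NN")],
    [("法院", "NN"), ("判决", "VV")],
]
mean_len, props = corpus_stats(corpus)
print(mean_len, props["NN"])  # 2.5 0.6
```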

  12. Enhancing quantitative approaches for assessing community resilience

    Science.gov (United States)

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  13. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ′ precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  14. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1978-01-01

    The bases for developing quantitative assessment of exposure risks in the human being, and the several problems that accompany the assessment and introduction of the risk of exposure to high and low LET radiation into radiation protection, will be evaluated. The extension of the pioneering radiation protection philosophies to the control of other hazardous agents that cannot be eliminated from the environment will be discussed, as will the serious misunderstandings and misuse of concepts and facts that have inevitably surrounded the application to one agent alone, of the protection philosophy that must in time be applied to a broad spectrum of potentially hazardous agents. (orig.) [de

  15. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    "Quantitative Methods for Management and Economics" is specially prepared for MBA students in India and all over the world. It starts from the basics, such that even a beginner without much mathematical sophistication can grasp the ideas, and then moves on to more complex and professional problems. Thus, both ordinary students and "above average", i.e. "bright and sincere", students would benefit equally from this book. Since most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  16. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  17. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...
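
Why the conventional (cube-on-cube) mode of epitaxy is strained for Cu on Au can be seen from the lattice misfit, using standard room-temperature lattice parameters (Cu 3.615 Å, Au 4.078 Å); these are textbook values, not measurements from the study:

```python
def misfit(a_film, a_substrate):
    """Relative lattice misfit f = (a_sub - a_film) / a_film."""
    return (a_substrate - a_film) / a_film

f = misfit(3.615, 4.078)   # Cu film on an Au substrate
print(f"{f:.1%}")          # ~12.8%, far above typical coherent-epitaxy misfits
```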

  18. A Quantitative Scale of Oxophilicity and Thiophilicity

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2016-01-01

    Oxophilicity and thiophilicity are widely used concepts with no quantitative definition. In this paper, a simple, generic scale is developed that solves issues with reference states and system dependencies and captures empirically known tendencies toward oxygen. This enables a detailed analysis... Ionic bonding is stronger to metals of low electronegativity. Left-side d-block elements with low effective nuclear charges and electronegativities are thus highly oxophilic, as are the f-block elements, not because of their hardness, which is normal, but as a result of their small ionization energies...

  19. Path to development of quantitative safety goals

    International Nuclear Information System (INIS)

    Joksimovic, V.; Houghton, W.J.

    1980-04-01

    There is a growing interest in defining numerical safety goals for nuclear power plants, as exemplified by an ACRS recommendation. This paper proposes a lower frequency limit of approximately 10⁻⁴/reactor-year for design basis events. Below this frequency, down to a small frequency such as 10⁻⁵/reactor-year, safety margin can be provided by, say, site emergency plans. Accident sequences below 10⁻⁵ should not impact public safety, but it is prudent that safety research programmes examine sequences with significant consequences. Once tentatively agreed upon, quantitative safety goals together with associated implementation tools would be factored into regulatory and design processes

  20. Experimental Studies of quantitative evaluation using HPLC

    Directory of Open Access Journals (Sweden)

    Ki Rok Kwon

    2005-06-01

    Full Text Available Methods: This study was conducted to carry out quantitative evaluation using HPLC; content analysis was done using HPLC. Results: According to the HPLC analysis, each BVA-1 needle contained approximately 0.36 ㎍ of melittin, and each BVA-2 needle approximately 0.54 ㎍. But the volume of coating was so minute that slight differences exist between individual needles. Conclusion: The above results indicate that bee venom acupuncture can complement the shortcomings of syringe usage as a part of Oriental medicine treatment, but extensive research should be done for further verification.

  1. Quantitative Assessment of the IT Agile Transformation

    Directory of Open Access Journals (Sweden)

    Orłowski Cezary

    2017-03-01

    Full Text Available The aim of this paper is to present the quantitative perspective of the agile transformation processes in IT organisations. The phenomenon of agile transformation becomes a complex challenge for an IT organisation since it has not been analysed in detail so far. There is no research on the readiness of IT organisations to realise agile transformation processes. Such processes also prove to have uncontrolled character. Therefore, to minimise the risk of failure referring to the realisation of transformation processes, it is necessary to monitor them. It is also necessary to identify and analyse such processes to ensure their continuous character.

  2. Quantitative Communication Research: Review, Trends, and Critique

    Directory of Open Access Journals (Sweden)

    Timothy R. Levine

    2013-01-01

    Full Text Available Trends in quantitative communication research are reviewed. A content analysis of 48 articles reporting original communication research published in 1988-1991 and 2008-2011 is reported. Survey research and self-report measurement remain common approaches to research. Null hypothesis significance testing remains the dominant approach to statistical analysis. Reporting the shapes of distributions, estimates of statistical power, and confidence intervals remain uncommon. Trends over time include the increased popularity of health communication and computer mediated communication as topics of research, and increased attention to mediator and moderator variables. The implications of these practices for scientific progress are critically discussed, and suggestions for the future are provided.

  3. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  4. Developing quantitative tools for measuring aspects of prisonization

    DEFF Research Database (Denmark)

    Kjær Minke, Linda

    2013-01-01

    The article describes and discusses the preparation and completion of a quantitative study among prison officers and prisoners.

  5. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material Quantitative Proteomics and Data Analysis Course. 4 - 5 April 2016, Queen Hotel, Chester, UK Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  6. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  7. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance, with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76, and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
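
The per-study measures pooled above derive from 2x2 confusion counts. A minimal sketch, with counts invented so the ratios match the territory-level values quoted:

```python
def diagnostic_measures(tp, fp, tn, fn):
    """Sensitivity and specificity from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical per-territory counts from a single study
sens, spec = diagnostic_measures(tp=82, fp=17, tn=83, fn=18)
print(round(sens, 2), round(spec, 2))  # 0.82 0.83
```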

  8. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, a sample's contents and their concentrations can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration. This consists of calibrating a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to produce a graph, from which it is possible to determine the density of a dark spectral line as a function of the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation and production of results are done by means of a computer (PC) and a computer program. The signal-conditioning circuits have the function of delivering TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible. Data calculation is done using a computer program
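
Step 1 above amounts to a least-squares fit of the emulsion's characteristic curve, which is then inverted to convert measured densities to intensities. A minimal sketch with invented, perfectly linear calibration data:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (emulsion calibration)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

# Step 1: emulsion calibration, density D vs log10(relative intensity)
log_I = [0.0, 0.5, 1.0, 1.5]
D = [0.2, 0.7, 1.2, 1.7]              # invented, perfectly linear data
gamma, d0 = fit_line(log_I, D)

# Steps 2-3 would map the recovered intensity onto a working curve;
# here we only invert the characteristic curve for one measured density.
measured_D = 0.95
log_intensity = (measured_D - d0) / gamma
print(round(gamma, 2), round(log_intensity, 2))  # 1.0 0.75
```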

  9. Fundamental quantitative security in quantum key generation

    International Nuclear Information System (INIS)

    Yuen, Horace P.

    2010-01-01

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
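
The statistical distance mentioned above has a direct operational reading: the total-variation distance δ between the key distribution and uniform bounds the attacker's single-guess success probability by 1/N + δ. A toy sketch over an invented 4-key distribution:

```python
def tv_distance(p, q):
    """Total variation (statistical) distance between two distributions."""
    return 0.5 * sum(abs(p[k] - q[k]) for k in p)

def max_guess_prob_bound(delta, n_keys):
    """Upper bound on single-shot guessing probability:
    max_k P(k) <= 1/N + delta, since delta = sup_A |P(A) - U(A)|."""
    return 1.0 / n_keys + delta

# Toy 4-key example: a slightly biased key distribution vs uniform
p = {0: 0.40, 1: 0.20, 2: 0.20, 3: 0.20}
u = {k: 0.25 for k in p}
delta = tv_distance(p, u)
print(delta, max_guess_prob_bound(delta, 4))  # 0.15 and 0.4 (bound is tight here)
```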

  10. Quantitative image fusion in infrared radiometry

    Science.gov (United States)

    Romm, Iliya; Cukurel, Beni

    2018-05-01

    Towards high-accuracy infrared radiance estimates, measurement practices and processing techniques aimed to achieve quantitative image fusion using a set of multi-exposure images of a static scene are reviewed. The conventional non-uniformity correction technique is extended, as the original is incompatible with quantitative fusion. Recognizing the inherent limitations of even the extended non-uniformity correction, an alternative measurement methodology, which relies on estimates of the detector bias using self-calibration, is developed. Combining data from multi-exposure images, two novel image fusion techniques that ultimately provide high tonal fidelity of a photoquantity are considered: ‘subtract-then-fuse’, which conducts image subtraction in the camera output domain and partially negates the bias frame contribution common to both the dark and scene frames; and ‘fuse-then-subtract’, which reconstructs the bias frame explicitly and conducts image fusion independently for the dark and the scene frames, followed by subtraction in the photoquantity domain. The performances of the different techniques are evaluated for various synthetic and experimental data, identifying the factors contributing to potential degradation of the image quality. The findings reflect the superiority of the ‘fuse-then-subtract’ approach, conducting image fusion via per-pixel nonlinear weighted least squares optimization.
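
    The weighted-fusion idea can be sketched under a simple linear-sensor assumption (bias already subtracted); the exposure times, noise level, and the t-squared weighting are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

# Toy sketch of multi-exposure fusion: estimate a per-pixel photoquantity
# from images taken at different integration times, assuming a linear,
# bias-subtracted sensor response v = t * q + noise. Weights proportional
# to t**2 correspond to weighted least squares with exposure-independent
# noise. Illustration only.

t = np.array([1.0, 2.0, 4.0])                    # integration times (a.u.)
q_true = 3.0                                     # "true" photoquantity
rng = np.random.default_rng(0)
v = t * q_true + rng.normal(0.0, 0.05, size=3)   # simulated exposures

w = t**2                                         # WLS weights
q_hat = np.sum(w * (v / t)) / np.sum(w)          # fused estimate
```

Saturated or underexposed pixels would in practice be given zero weight before fusing.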

  11. QUANTITATIVE INDICATORS OF THE SECURITIZATION OF ASSETS

    Directory of Open Access Journals (Sweden)

    Denis VOSTRICOV

    2018-02-01

    Full Text Available Securitization is instrumental in increasing return on capital through the withdrawal from the balance sheet of lending activities, accompanied by an off-balance flow of fee income, which is less capital-intensive. The purpose of this paper is to analyze the quantitative indicators characterizing the securitization of assets. For drafting this article, the method of analysis, the synthesis method, the logic and dialectic method, the normative method, the study of statistical sampling and time series of expert evaluations (Standard and Poor's), personal observations, and monographic studies have been used. The main difference between the securitization of assets and traditional ways of financing is related to the achievement of a number of secondary goals in attracting financial resources, which can play a significant role in choosing to favour the securitization of assets over other types of financing. In particular, it gives the possibility to write off the assets from the balance sheet along with the relevant obligations under the securities, and to expand the range of potential investors, accompanied by the reduction of credit risk, interest-rate risk and liquidity risk, as well as to improve the management quality of assets, liabilities and risks. All of these secondary effects are achieved by the isolation of selected assets from the total credit risk of the enterprise raising its funds, which forms the pivotal relevance and significance of asset securitization. The article contains demonstrations of quantitative and qualitative indicators characterizing the securitization of assets.

  12. Quantitating cellular immune responses to cancer vaccines.

    Science.gov (United States)

    Lyerly, H Kim

    2003-06-01

    While the future of immunotherapy in the treatment of cancer is promising, it is difficult to compare the various approaches because monitoring assays have not been standardized in approach or technique. Common assays for measuring the immune response need to be established so that these assays can one day serve as surrogate markers for clinical response. Assays that accurately detect and quantitate T-cell-mediated, antigen-specific immune responses are particularly desired. However, to date, increases in the number of cytotoxic T cells through immunization have not been correlated with clinical tumor regression. Ideally, then, a T-cell assay not only needs to be sensitive, specific, reliable, reproducible, simple, and quick to perform, it must also demonstrate close correlation with clinical outcome. Assays currently used to measure T-cell response are delayed-type hypersensitivity testing, flow cytometry using peptide major histocompatibility complex tetramers, lymphoproliferation assay, enzyme-linked immunosorbent assay, enzyme-linked immunospot assay, cytokine flow cytometry, direct cytotoxicity assay, measurement of cytokine mRNA by quantitative reverse transcriptase polymerase chain reaction, and limiting dilution analysis. The purpose of this review is to describe the attributes of each test and compare their advantages and disadvantages.

  13. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet these regulatory requirements, software used in the nuclear safety field is assured through development, validation, safety analysis, and quality assurance activities throughout the entire process life cycle, from the planning phase to the installation phase. A variety of activities, such as quality assurance activities, are also required to improve software quality, but there are limits to how much improvement they can ensure. Therefore, efforts continue to calculate software reliability as a quantitative evaluation rather than a qualitative one. In this paper, we propose a quantitative calculation method for software used for a specific operation of a digital controller in a nuclear power plant (NPP). After injecting random faults into the internal space of a developed controller and calculating the ability of diagnostic software to detect the injected faults, we can evaluate the software reliability of a digital controller in an NPP. Our method differs from the traditional one: it calculates the fault detection coverage after injecting faults into the software memory space, rather than assessing activities across the life-cycle process. We attempt differentiation by creating a new definition of the fault, imitating software faults using the hardware, and assigning consideration and weights to the injected faults.
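
    The coverage calculation at the heart of the method reduces to a simple ratio; the fault counts below are hypothetical:

```python
# Minimal sketch of the fault-detection-coverage idea described above:
# inject faults, run the diagnostic, and compute coverage as the fraction
# of injected faults that the diagnostic flags. Numbers are invented.

injected_faults = 200                 # faults injected into memory space
detected_faults = 188                 # faults flagged by diagnostic software

coverage = detected_faults / injected_faults   # fault detection coverage
undetected = 1.0 - coverage                    # residual undetected fraction
```

In the paper's scheme, different injected faults would additionally carry weights reflecting their assumed likelihood and severity.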

  14. Quantitative stratification of diffuse parenchymal lung diseases.

    Directory of Open Access Journals (Sweden)

    Sushravya Raghunath

    Full Text Available Diffuse parenchymal lung diseases (DPLDs are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients.

  15. Quantitative Stratification of Diffuse Parenchymal Lung Diseases

    Science.gov (United States)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Maldonado, Fabien; Peikert, Tobias; Moua, Teng; Ryu, Jay H.; Bartholmai, Brian J.; Robb, Richard A.

    2014-01-01

    Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients. PMID:24676019
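
    The unsupervised grouping step can be caricatured with a tiny k-means loop on synthetic two-feature "radiologic" vectors; the features, group centers, and cluster count are invented, and the paper's actual pipeline is far richer:

```python
import numpy as np

# Toy sketch of unsupervised stratification: group patients by radiologic
# feature vectors with a minimal k-means loop. Features are synthetic
# (e.g. fraction reticular vs. fraction ground-glass -- hypothetical names).

rng = np.random.default_rng(1)
group_a = rng.normal([0.1, 0.6], 0.05, size=(20, 2))
group_b = rng.normal([0.5, 0.1], 0.05, size=(20, 2))
X = np.vstack([group_a, group_b])              # 40 "patients", 2 features

centroids = X[[0, -1]].copy()                  # crude initialization
for _ in range(10):
    # assign each patient to the nearest centroid
    dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dist.argmin(axis=1)
    # update centroids as cluster means
    for k in range(2):
        centroids[k] = X[labels == k].mean(axis=0)
```

The resulting self-organized groups would then be tested for correlation with surrogate endpoints such as lung function.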

  16. The Quantitative Nature of Autistic Social Impairment

    Science.gov (United States)

    Constantino, John N.

    2011-01-01

    Autism, like intellectual disability, represents the severe end of a continuous distribution of developmental impairments that occur in nature, that are highly inherited, and that are orthogonally related to other parameters of development. A paradigm shift in understanding the core social abnormality of autism as a quantitative trait rather than as a categorically-defined condition has key implications for diagnostic classification, the measurement of change over time, the search for underlying genetic and neurobiologic mechanisms, and public health efforts to identify and support affected children. Here a recent body of research in genetics and epidemiology is presented to examine a dimensional reconceptualization of autistic social impairment—as manifested in clinical autistic syndromes, the broader autism phenotype, and normal variation in the general population. It illustrates how traditional categorical approaches to diagnosis may lead to misclassification of subjects (especially girls and mildly affected boys in multiple-incidence autism families), which can be particularly damaging to biological studies, and proposes continued efforts to derive a standardized quantitative system by which to characterize this family of conditions. PMID:21289537

  17. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed: specifically, the determination of moisture in liquid N/sub 2/O/sub 4/ used as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases. Attention is paid to accuracy. The sweep rate and RF level, among other factors, must be set to optimal conditions to eliminate errors, particularly when computation is done by machine. A higher sweep rate is preferable for the S/N ratio, but it should be limited to about 30 Hz/s. The relative error in the measurement of peak area is generally 1%, but when signals from dilute samples are integrated repeatedly, the error becomes smaller by about an order of magnitude. If impurities are treated carefully, the water content in N/sub 2/O/sub 4/ can be determined with an accuracy of about 0.002%. Comparison of peak heights is as accurate as comparison of areas when the uniformity of the magnetic field and T/sub 2/ are not in question. When the chemical shift varies with content, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T/sub 1/ with 90 deg pulses.
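
    Quantitation by comparing integrated peak areas, as reviewed above, reduces to normalizing each area by the number of contributing nuclei; the areas and proton counts below are hypothetical:

```python
# Hedged illustration of NMR quantitation by peak-area comparison: the
# molar ratio of two species follows from integrated areas normalized by
# the number of contributing nuclei. Values are invented.

area_analyte = 150.0      # integrated area of the analyte signal
n_analyte = 3             # protons contributing to that signal
area_reference = 100.0    # integrated area of the internal-standard signal
n_reference = 2           # protons contributing to the reference signal

molar_ratio = (area_analyte / n_analyte) / (area_reference / n_reference)
```

With a reference of known concentration, the analyte concentration follows from this ratio directly.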

  18. Immune chromatography: a quantitative radioimmunological assay

    International Nuclear Information System (INIS)

    Davis, J.W.; Demetriades, M.; Bowen, J.M.

    1984-01-01

    Immune chromatography, a radioimmunological binding assay, employs paper chromatography to separate immune complexes from free antigen and antibodies. During chromatography, free antigen and antibodies become distributed throughout the paper, while immune complexes remain near the bottoms of the strips. The chromatographic differences can be made quantitative by using either iodinated antigens or antibodies. Under these conditions, nanogram quantities of antigen can be detected, as can antibodies in sera diluted several thousand-fold. The immune chromatography assay can also be performed as an indirect assay, since the paper strips are cut from nitrocellulose paper. In this case the immune components are absorbed by the paper during chromatography, and antigen is then detected with an iodinated second antibody. The indirect immune chromatography assay is particularly useful for identifying different sera that react with the same antigen: reaction with the first serum before chromatography reduces the amount of antigen available to the second serum following chromatography. In addition to characterizing the immune chromatography procedure, we discuss the possible applications of chromatography assays for the quantitation of other types of molecular binding interactions. (Auth.)

  19. Quantitative fluorescence nanoscopy for cancer biomedicine

    Science.gov (United States)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in large part due to our incomplete understanding of how key processes including oncogenesis and drug response are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  20. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis, owing to a shortage of process data resulting from limited samples, varied process types and non-standardized processes. Aiming to predict and control the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis workflow for the ship construction process together with a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulas for the fuzzy process reliability of the ordinal connection model, the series connection model and the mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gearbox installation, which demonstrates the applicability and effectiveness of the method. The analysis results can serve as a useful reference for setting key quality inspection points and optimizing key processes.
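
    The connection-model aggregation can be illustrated in miniature; the probabilistic product and the fuzzy minimum shown here are common conventions for a series chain and are not claimed to be the paper's exact FOQC formulation (the process reliabilities are hypothetical):

```python
# Illustrative sketch of series connection-model aggregation: a
# probabilistic reading multiplies step reliabilities, while a common
# fuzzy reading takes the minimum over the chain. Values are invented.

process_reliability = [0.98, 0.95, 0.99]   # reliabilities of process steps

series_product = 1.0
for r in process_reliability:
    series_product *= r                    # probabilistic series model

series_fuzzy = min(process_reliability)    # fuzzy (min) series model
```

Either reading makes the weakest process step the natural target for a key quality inspection point.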

  1. Quantitative diagnosis of skeletons with demineralizing osteopathy

    International Nuclear Information System (INIS)

    Banzer, D.

    1979-01-01

    The quantitative diagnosis of bone diseases must be assessed according to the accuracy of the applied method, the expense in apparatus, personnel and financial resources, and the comparability of results. Nuclide absorptiometry, and in the future perhaps computed tomography, represent the most accurate methods for determining the mineral content of bone; because of their cost, their application remains the prerogative of clinics. Morphometry provides quantitative information, in particular for follow-up, and enables an objective judgement of visual findings; it requires little expenditure and should be combined with microradioscopy. Direct comparability of the findings of different working groups is easiest in morphometry, depends on the equipment in computed tomography, and is still hardly possible in nuclide absorptiometry. For fundamental physical reasons, it will hardly be possible to produce a low-cost, fast and easy-to-handle instrument for determining the mineral salt concentration of bone. Instead, the trend is rather towards more expensive equipment, e.g. CT instruments; the universal use of these instruments, however, will help to promote quantitative diagnosis. (orig.) [de

  2. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan
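
    The toxic-equivalence step can be sketched as a one-line scaling; the relative potency and concentration below are placeholders, not measured values for iprodione:

```python
# Sketch of the toxic-equivalence (TEQ) idea: scale an untested chemical's
# concentration by its potency relative to the reference chemical so the
# existing dose-response model can be reused. Values are hypothetical.

relative_potency = 0.01    # potency relative to fadrozole (invented)
concentration = 500.0      # exposure concentration of the untested
                           # inhibitor, ug/L (invented)

teq = concentration * relative_potency   # fadrozole-equivalent concentration
```

The equivalent concentration is then fed into the fadrozole-calibrated qAOP in place of a chemical-specific model.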

  3. Quantitative fluorescence angiography for neurosurgical interventions.

    Science.gov (United States)

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--an established method to visualize blood flow in brain vessels--enhanced by a perfusion-quantifying software tool. For this purpose, the fluorescent dye indocyanine green is given intravenously, and after excitation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system, and comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters, which were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool that avoids complex additional measurement technology.
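
    Perfusion parameters of the kind derived from the video data can be illustrated on a toy time-intensity curve; the curve shape and the two parameters shown (time-to-peak, maximum inflow slope) are generic examples, not the software's actual outputs:

```python
import numpy as np

# Hypothetical sketch: derive simple perfusion parameters from a
# fluorescence time-intensity curve, standing in for the per-vessel
# analysis the software performs on video data.

t = np.linspace(0.0, 10.0, 101)                # time, seconds
intensity = t**2 * np.exp(-t)                  # toy bolus-passage curve

time_to_peak = t[np.argmax(intensity)]         # time of peak fluorescence
max_slope = np.max(np.diff(intensity) / np.diff(t))   # steepest inflow
```

In practice such parameters would be extracted per region of interest and compared against the reference system.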

  4. Developments in quantitative electron probe microanalysis

    International Nuclear Information System (INIS)

    Tixier, R.

    1977-01-01

    The range of validity of the correction formulae used in the analysis of massive specimens is studied. The method used is original: we show that an invariance property of corrected intensity ratios for standards can be exploited, and this invariance provides a test of the self-consistency of the theory. The theoretical and experimental conditions required for quantitative electron probe microanalysis of thin transmission electron microscope specimens are examined, and the correction formulae for atomic number, absorption and fluorescence effects are calculated. Several examples of experimental results are given, relating to the quantitative analysis of intermetallic precipitates and carbides in steels. Advances in the application of electron probe instruments arising from the use of computers, and the present development of fully automated instruments, are reviewed. The statistics needed for measurements of X-ray count data are studied: estimation procedures and tests are developed, and these methods are used to perform statistical checks of electron probe microanalysis measurements and to reject rogue values. An estimator of the confidence interval of the apparent concentration is derived, and formulae are obtained to optimize the counting time in order to achieve the best precision in a minimum amount of time [fr
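
    The counting statistics mentioned at the end follow Poisson behavior, where the standard deviation of N counts is sqrt(N); a minimal sketch with illustrative numbers:

```python
import math

# Counting statistics for X-ray intensity measurements: for N counts the
# standard deviation is sqrt(N), so an approximate 95% confidence interval
# on the count is N +/- 1.96*sqrt(N). Numbers are illustrative.

counts = 10000
sigma = math.sqrt(counts)                 # 100 counts
ci_low = counts - 1.96 * sigma
ci_high = counts + 1.96 * sigma
relative_precision = sigma / counts       # 1% for 10^4 counts
```

The 1/sqrt(N) behavior is what makes counting time the lever for precision, motivating the time-optimization formulae mentioned above.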

  5. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round-robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples with several different weight percentages and their own characteristic features. Neutron diffraction has been considered superior to complementary methods such as X-ray or synchrotron diffraction, yet it is accepted as highly reliable only under limited conditions or for limited classes of samples. Neutron diffraction is especially capable for oxides, owing to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. Through this study, we hope not only to perform an instrument performance test on our HRPD but also to improve our ability to analyze neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
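
    One standard route from refined scale factors to weight fractions in diffraction QPA is the Hill-Howard relation, w_i = S_i(ZMV)_i / sum_j S_j(ZMV)_j; whether the authors used exactly this form is not stated here, and the phases and values below are hypothetical:

```python
# Weight fractions from Rietveld scale factors via the Hill-Howard
# relation -- a standard route to quantitative phase analysis from a
# refined diffraction pattern. All numbers are invented.

phases = {
    # phase: (refined scale factor S, Z*M*V product, arbitrary units)
    "corundum": (1.20, 3.5),
    "fluorite": (0.80, 2.0),
    "zincite":  (0.50, 1.5),
}

total = sum(s * zmv for s, zmv in phases.values())
weight_fractions = {name: s * zmv / total for name, (s, zmv) in phases.items()}
```

By construction the fractions sum to one, which is a useful self-check on any QPA refinement.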

  6. Quantitative evaluations of male pattern baldness.

    Science.gov (United States)

    Tsuji, Y; Ishino, A; Hanzawa, N; Uzuka, M; Okazaki, K; Adachi, K; Imamura, S

    1994-07-01

    Several methods for the evaluation of hair growth have been reported; however, none of them is satisfactory for evaluating the efficacy of hair-growth agents in unbiased double-blind studies. In the present paper, we describe quantitative methods for evaluating hair growth by measuring the anagen ratio and hair diameters in 56 Japanese subjects aged 23-56, followed for 3 years. The average anagen ratio decreased by 3.8% in 3 years. The average hair diameter showed a statistically significant decrease each year, totalling 3.4 microns. Subjects were sorted according to their anagen ratio into 4 groups, each showing a different distribution pattern of hair diameters: the higher anagen-ratio groups have a high-frequency peak at thicker hair diameters, and the lower anagen-ratio groups at thinner diameters. Over the 3 years, the number of thicker hairs decreased and the high-frequency peak shifted toward thinner diameters. These methods are useful for evaluating both the progression of male pattern baldness and the effects of hair-growth agents in unbiased, quantitative, double-blind studies.

  7. Technological innovation in neurosurgery: a quantitative study.

    Science.gov (United States)

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-07-01

    Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for patents filed between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters, and patent and publication growth curves were generated for these clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.

  8. Quantitative assessment of growth plate activity

    International Nuclear Information System (INIS)

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone-seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer-acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies

  9. Quantitative tomographic measurements of opaque multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN; O' HERN,TIMOTHY J.; CECCIO,STEVEN L.

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
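
    A common way to relate conductivity to phase volume fraction, of the kind the authors report modifying, is Maxwell's relation for a nonconducting dispersed phase in a conducting liquid; the form below is the textbook version, not the paper's modified formula, and the conductivities are hypothetical:

```python
# Maxwell's relation for a nonconducting dispersed phase (gas or solid)
# in a conducting liquid: a standard route from EIT conductivity data to
# a phase volume fraction. Textbook form; values are invented.

def maxwell_volume_fraction(sigma_mix, sigma_liquid):
    """Dispersed-phase volume fraction from mixture and liquid conductivity."""
    return 2.0 * (sigma_liquid - sigma_mix) / (2.0 * sigma_liquid + sigma_mix)

alpha = maxwell_volume_fraction(sigma_mix=0.85, sigma_liquid=1.0)
```

A pure liquid (sigma_mix equal to sigma_liquid) correctly yields a volume fraction of zero.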

  10. Allometric trajectories and "stress": a quantitative approach

    Directory of Open Access Journals (Sweden)

    Tommaso Anfodillo

    2016-11-01

    Full Text Available The term stress is an important but vague term in plant biology. We show situations in which thinking in terms of stress is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform poorly. For instance, too little leaf area (e.g. due to herbivory or disease) per unit of active stem mass would be expected to lead to low carbon income per respiratory cost and thus to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to stress, without need for recourse to this term. Our approach contrasts with traditional approaches for studying stress, e.g. revealing that small stressed plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as stress, plasticity, adaptation, and acclimation.

  11. Allometric Trajectories and "Stress": A Quantitative Approach.

    Science.gov (United States)

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform poorly. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to lead to low carbon income per respiratory cost and thus to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.
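A minimal numerical sketch of the idea in the two records above, using synthetic (not published) leaf-area and sapwood-area data: the residual from a fitted log-log scaling line quantifies each individual's distance from the common allometric trajectory:

```python
import numpy as np

# Hypothetical leaf-area vs sapwood-area data on a common scaling line
# with multiplicative noise (units and coefficients are illustrative).
rng = np.random.default_rng(0)
sapwood = np.linspace(1.0, 50.0, 40)                            # cm^2
leaf = 500.0 * sapwood**1.0 * np.exp(rng.normal(0.0, 0.05, 40))  # cm^2

# Fit the allometric relationship leaf = c * sapwood^b in log-log space.
X, Y = np.log(sapwood), np.log(leaf)
slope, intercept = np.polyfit(X, Y, 1)

# Residual = log-distance of each individual from the common scaling line.
# Strongly negative residuals flag "too little leaf area per unit sapwood"
# (e.g. herbivory or disease) without invoking the word "stress".
residuals = Y - (slope * X + intercept)
outliers = residuals < -2.0 * residuals.std()
```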

  12. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  13. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
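The EPA model itself is probabilistic and fitted to data; the following is only a simplified deterministic stand-in showing how a rank-ordered ingredient list can be turned into weight-fraction estimates. The 1/rank prior is an assumption for illustration, not the ExpoCast method:

```python
def weight_fraction_estimates(n_ingredients, unreported_total=0.0):
    """Simplified stand-in for a rank-based weight-fraction model:
    ingredients listed in descending order of mass are assigned fractions
    proportional to 1/rank (a Zipf-like prior), scaled so the reported
    ingredients sum to one minus the unreported mass fraction."""
    weights = [1.0 / rank for rank in range(1, n_ingredients + 1)]
    scale = (1.0 - unreported_total) / sum(weights)
    return [w * scale for w in weights]

# A 4-ingredient list with 10% of product mass in unreported ingredients:
fracs = weight_fraction_estimates(4, unreported_total=0.10)
# fractions are descending and sum to 0.90
```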

  14. Another Curriculum Requirement? Quantitative Reasoning in Economics: Some First Steps

    Science.gov (United States)

    O'Neill, Patrick B.; Flynn, David T.

    2013-01-01

    In this paper, we describe first steps toward focusing on quantitative reasoning in an intermediate microeconomic theory course. We find student attitudes toward quantitative aspects of economics improve over the duration of the course (as we would hope). Perhaps more importantly, student attitude toward quantitative reasoning improves, in…

  15. Quantitative Literacy Courses as a Space for Fusing Literacies

    Science.gov (United States)

    Tunstall, Samuel Luke; Matz, Rebecca L.; Craig, Jeffrey C.

    2016-01-01

    In this article, we examine how students in a general education quantitative literacy course reason with public issues when unprompted to use quantitative reasoning. Michigan State University, like many institutions, not only has a quantitative literacy requirement for all undergraduates but also offers two courses specifically for meeting the…

  16. Videodensitometric quantitative angiography after coronary balloon angioplasty, compared to edge-detection quantitative angiography and intracoronary ultrasound imaging

    NARCIS (Netherlands)

    Peters, R. J.; Kok, W. E.; Pasterkamp, G.; von Birgelen, C.; Prins, M. [=Martin H.; Serruys, P. W.

    2000-01-01

    AIMS: To assess the value of videodensitometric quantification of the coronary lumen after angioplasty by comparison to two other techniques of coronary artery lumen quantification. METHODS AND RESULTS: Videodensitometric quantitative angiography, edge detection quantitative angiography and 30 MHz

  17. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
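As a worked instance of one of the simplest bias analyses of the kind discussed above, a back-calculation for exposure misclassification under assumed sensitivity and specificity (the numbers here are illustrative, not from the paper):

```python
def correct_exposure_misclassification(a, b, se, sp):
    """Simple quantitative bias analysis for exposure misclassification:
    back-calculate the true number of exposed subjects A among n = a + b,
    given the observed exposed count a and assumed sensitivity (se) and
    specificity (sp) of exposure classification.  The bias model is
        a = A * se + (n - A) * (1 - sp),
    solved for A."""
    n = a + b
    return (a - n * (1.0 - sp)) / (se + sp - 1.0)

# 120 of 300 cases classified exposed; assumed se = 0.90, sp = 0.95
# are the bias parameters one would vary in a sensitivity analysis.
true_exposed = correct_exposure_misclassification(120, 180, se=0.90, sp=0.95)
```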

  18. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter having become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid advances in biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.

  19. Real time quantitative amplification detection on a microarray: towards high multiplex quantitative PCR.

    NARCIS (Netherlands)

    Pierik, A.; Boamfa, M.; van Zelst, M.; Clout, D.; Stapert, H.; Dijksman, Johan Frederik; Broer, D.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that

  20. Real time quantitative amplification detection on a microarray : towards high multiplex quantitative PCR

    NARCIS (Netherlands)

    Pierik, Anke; Boamfa, M.; Zelst, van M.; Clout, D.; Stapert, H.R.; Dijksman, J.F.; Broer, D.J.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that

  1. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  2. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    International Nuclear Information System (INIS)

    Lasaygues, Philippe; Ouedraogo, Edgard; Lefebvre, Jean-Pierre; Gindre, Marcel; Talmant, Marilyne; Laugier, Pascal

    2005-01-01

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and cortical thickness measurements. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography that we developed using correction algorithms of the wavepaths and compensation procedures are presented. Also shown are the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol

  3. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: Describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations with deviation values greater than 1 mm were identified. In addition, when the method developed was compared with the one previously studied, the data obtained were observed to be very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
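The per-image quantity being measured can be sketched as a simple distance computation between the two isocenters projected onto one portal film; the 1 mm action level follows the abstract, while the coordinates are invented:

```python
import math

def isocenter_deviation(field_center, ball_center):
    """Sketch of a quantitative Winston-Lutz evaluation on one portal
    image: the 2D distance, in mm, between the radiation field centre
    and the centre of the ball bearing marking the mechanical isocenter."""
    dx = field_center[0] - ball_center[0]
    dy = field_center[1] - ball_center[1]
    return math.hypot(dx, dy)

# Deviations above 1 mm would be flagged, as in the study.
dev = isocenter_deviation(field_center=(0.6, 0.9), ball_center=(0.0, 0.0))
flagged = dev > 1.0
```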

  4. Quantitative recurrence for free semigroup actions

    Science.gov (United States)

    Carvalho, Maria; Rodrigues, Fagner B.; Varandas, Paulo

    2018-03-01

    We consider finitely generated free semigroup actions on a compact metric space and obtain quantitative information on Poincaré recurrence, average first return time and hitting frequency for the random orbits induced by the semigroup action. Besides, we relate the recurrence to balls with the rates of expansion of the semigroup generators and the topological entropy of the semigroup action. Finally, we establish a partial variational principle and prove an ergodic optimization for this kind of dynamical action. MC has been financially supported by CMUP (UID/MAT/00144/2013), which is funded by FCT (Portugal) with national (MEC) and European structural funds (FEDER) under the partnership agreement PT2020. FR and PV were partially supported by BREUDS. PV has also benefited from a fellowship awarded by CNPq-Brazil and is grateful to the Faculty of Sciences of the University of Porto for the excellent research conditions.

  5. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Full Text Available Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful in quantitatively and reproducibly recording various parameters. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily using a relatively simple diagram to deduce prognosis.

  6. Quantitative radiation monitors for containment and surveillance

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1983-01-01

    Quantitative radiation monitors make it possible to differentiate between shielded and unshielded nuclear materials. The hardness of the gamma-ray spectrum is the attribute that characterizes bare or shielded material. Separate high- and low-energy gamma-ray regions are obtained from a single-channel analyzer through its window and discriminator outputs. The monitor counts both outputs and computes a ratio of the high- and low-energy region counts whenever an alarm occurs. The ratio clearly differentiates between shielded and unshielded nuclear material so that the net alarm count may be identified with a small quantity of unshielded material or a large quantity of shielded material. Knowledge of the diverted quantity helps determine whether an inventory should be called to identify the loss
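The alarm logic described above might be sketched as follows; the hardness-ratio definition and the decision threshold are illustrative assumptions, not the monitor's actual calibration:

```python
def classify_alarm(high_counts, low_counts, hardness_threshold=0.5):
    """Sketch of the shielded/unshielded decision: the single-channel
    analyzer's discriminator output gives high-energy counts and its
    window output gives low-energy counts.  Shielding preferentially
    absorbs low-energy gamma rays, so a hard spectrum (large ratio)
    points to shielded material."""
    ratio = high_counts / (high_counts + low_counts)
    label = "shielded" if ratio > hardness_threshold else "unshielded"
    return label, ratio

# Hard spectrum: mostly high-energy counts survive the shielding.
label, r = classify_alarm(high_counts=800, low_counts=200)
```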

  7. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein phosphorylation is known to play essential roles in regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim to intensively characterize the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  8. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  9. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  10. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  11. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  12. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region of interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation

  13. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In an advanced MCR, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it has become necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation will be achieved by calculating the failure probability of human performance related to the cognitive activities

  14. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation network is proposed, which could reflect the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, the efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the method proposed in this paper can portray the real operation situation of the transportation network as well as the effects of main factors on network efficiency. We also find that the network efficiency and the importance values of the network components both are functions of demands and network structure in the transportation network.
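The paper redefines transportation network efficiency in its own terms; as a related, widely used baseline, the Latora-Marchiori global efficiency (the average of inverse shortest-path travel costs over all node pairs) can be computed like this:

```python
from itertools import permutations

def global_efficiency(nodes, dist):
    """Latora-Marchiori-style global efficiency: the mean of 1/cost over
    all ordered node pairs, where dist[(i, j)] is the precomputed
    shortest travel cost between nodes i and j."""
    pairs = list(permutations(nodes, 2))
    return sum(1.0 / dist[p] for p in pairs) / len(pairs)

# Toy 3-node network with symmetric travel costs (units arbitrary).
d = {(1, 2): 1.0, (2, 1): 1.0,
     (1, 3): 2.0, (3, 1): 2.0,
     (2, 3): 1.0, (3, 2): 1.0}
eff = global_efficiency([1, 2, 3], d)  # higher = cheaper average travel
```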

  15. Quantitative Accelerated Life Testing of MEMS Accelerometers.

    Science.gov (United States)

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-11-20

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as application-driven stress, because the tilt movement is a natural environment for devices used for automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the "worst case" being smaller than 10⁻⁷ h⁻¹.
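A failure-rate claim from a life test with no observed failures is typically stated as a one-sided confidence bound. A sketch of that standard chi-square calculation follows; the device-hours and confidence level are invented, not the paper's:

```python
import math

def failure_rate_upper_bound(device_hours, confidence=0.60):
    """One-sided upper confidence bound on a constant failure rate from a
    zero-failure life test.  The chi-square bound
        lambda <= chi2(C, 2) / (2 * T)
    reduces, for zero failures, to -ln(1 - C) / T, so no chi-square
    table is needed."""
    return -math.log(1.0 - confidence) / device_hours

# 20 devices x 500 equivalent hours with no failures:
lam = failure_rate_upper_bound(20 * 500)  # upper bound in failures/hour
```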

  16. Nonparametric functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yang, Jie; Wu, Rongling; Casella, George

    2009-03-01

    Functional mapping is a useful tool for mapping quantitative trait loci (QTL) that control dynamic traits. It incorporates mathematical aspects of biological processes into the mixture model-based likelihood setting for QTL mapping, thus increasing the power of QTL detection and the precision of parameter estimation. However, in many situations there is no obvious functional form and, in such cases, this strategy will not be optimal. Here we propose to use nonparametric function estimation, typically implemented with B-splines, to estimate the underlying functional form of phenotypic trajectories, and then construct a nonparametric test to find evidence of existing QTL. Using the representation of a nonparametric regression as a mixed model, the final test statistic is a likelihood ratio test. We consider two types of genetic maps: dense maps and general maps, and the power of nonparametric functional mapping is investigated through simulation studies and demonstrated by examples.

  17. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37 °C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  18. Quantitative MFM on superconducting thin films

    Energy Technology Data Exchange (ETDEWEB)

    Stopfel, Henry; Vock, Silvia; Shapoval, Tetyana; Neu, Volker; Wolff, Ulrike; Haindl, Silvia; Engelmann, Jan; Schaefer, Rudolf; Holzapfel, Bernhard; Schultz, Ludwig [IFW Dresden, Institute for Metallic Material (Germany); Inosov, Dmytro S. [Max Planck Institute for Solid State Research, Stuttgart (Germany)

    2012-07-01

    Quantitative interpretation of magnetic force microscopy (MFM) data is a challenge, because the measured signal is a convolution between the magnetization of the tip and the stray field emanated by the sample. It was established theoretically that the field distribution just above the surface of the superconductor can be well approximated by the stray field of a magnetic monopole. The description of the MFM tip, however, needs a second approximation. The temperature-dependent vortex-distribution images on a NbN thin film were fitted using two different tip models. Firstly, the magnetic tip was assumed to be a monopole that leads to the simple monopole-monopole model for the tip-sample interaction force. Performing a 2D fitting of the data with this model, we extracted λ, Δ and the vortex pinning force. Secondly, a geometrical model was applied to calculate the tip-transfer-function of the MFM tip using the numerical BEM method.
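The monopole-monopole model mentioned above reduces to an inverse-square force law in the tip-sample distance. A sketch with synthetic data shows how its parameters can be recovered by a log-log fit; the A and z0 values are arbitrary, not measured:

```python
import numpy as np

def monopole_force(z, A, z0):
    """Monopole-monopole approximation for the MFM tip-vortex force:
    both the tip and the vortex stray field are treated as point
    monopoles, giving F(z) = A / (z + z0)^2, where z0 absorbs the
    effective monopole position inside the tip."""
    return A / (z + z0) ** 2

# Synthetic, noise-free force-distance curve (SI units, arbitrary values).
z = np.linspace(10e-9, 100e-9, 50)        # tip-sample distance (m)
f = monopole_force(z, A=5e-28, z0=20e-9)  # force (N)

# With z0 known, log F vs log(z + z0) is linear with slope -2 and
# intercept log A; a least-squares fit recovers A.
slope_fit, log_A = np.polyfit(np.log(z + 20e-9), np.log(f), 1)
A_fit = np.exp(log_A)
```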

  19. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges, largely, on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework--applicable to any infant gaze experiment--and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  20. Safety culture management and quantitative indicator evaluation

    International Nuclear Information System (INIS)

    Mandula, J.

    2002-01-01

    This report discusses the relationship between safety culture and the evaluation of quantitative indicators. It shows how systematic use of generally shared operational safety indicators may contribute to the formation and reinforcement of safety culture characteristics in routine plant operation. The report also briefly describes the system of operational safety indicators used at the Dukovany plant. It is a PC database application enabling effective work with the indicators and providing all users with an efficient tool for making synoptic overviews of indicator values in their links and hierarchical structure. Using color coding, the system allows quick indicator evaluation against predefined limits, considering indicator value trends. The system, which has resulted from several years of development, was completely established at the plant during the years 2001 and 2002. (author)

  1. Geomorphology: now a more quantitative science

    International Nuclear Information System (INIS)

    Lal, D.

    1995-01-01

    Geomorphology, one of the oldest branches of planetary science, is now growing into a quantitative field with the development of a nuclear method capable of providing numeric time controls on a great variety of superficial processes. The method complements conventional dating methods, e.g. 40K/40Ar and 87Rb/87Sr, by providing information on geomorphic processes, e.g. the dwell times of rocks on the earth's surface under strict geometrical constraints, rates of physical and chemical weathering in the past, the chronology of events associated with glaciation, etc. This article attempts to discuss the new possibilities that now exist for studying a wide range of geomorphic processes, with examples of some specific isotopic changes that allow one to model glacial chronology and the evolutionary histories of alluvial fans and sand dunes. (author). 9 refs., 3 figs., 4 tabs
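
    The buildup-and-decay relation behind such numeric time controls can be sketched as follows. This is a generic illustration of in-situ cosmogenic surface-exposure dating, not code from the article; it assumes a simple no-erosion model, and the production rate and half-life are hypothetical 10Be-like values.

```python
import math

def exposure_age(N, P, half_life):
    """Apparent exposure age (yr) from a cosmogenic nuclide inventory.

    N: nuclide concentration (atoms/g), P: production rate (atoms/g/yr),
    half_life: radionuclide half-life (yr).  Assumes no erosion and
    secular buildup N(t) = (P/lam) * (1 - exp(-lam*t)), inverted for t.
    """
    lam = math.log(2) / half_life
    ratio = N * lam / P
    if ratio >= 1.0:
        raise ValueError("concentration at or above secular equilibrium")
    return -math.log(1.0 - ratio) / lam

# Hypothetical 10Be-like example: P = 5 atoms/g/yr, t1/2 = 1.39 Myr
P, t_half = 5.0, 1.39e6
lam = math.log(2) / t_half
t_true = 20000.0
N = (P / lam) * (1 - math.exp(-lam * t_true))  # synthesize a concentration
print(round(exposure_age(N, P, t_half)))  # recovers ~20000 yr
```

    The same inversion underlies dwell-time estimates for boulders and alluvial surfaces; real applications must also correct for erosion, shielding and inheritance.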

  2. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal for investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed as a risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
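
    A minimal sketch of a risk-adjusted net-present-value calculation in this spirit. All costs, probabilities and the discount rate below are hypothetical illustrations, not figures from the case study.

```python
def risk_adjusted_npv(investment, annual_risk_reduction, rate, years):
    """NPV of a fire-safety investment where the annual 'benefit' is the
    expected monetary loss avoided (fire probability x avoided consequence),
    discounted over the planning horizon."""
    pv_benefits = sum(annual_risk_reduction / (1 + rate) ** t
                      for t in range(1, years + 1))
    return pv_benefits - investment

# Hypothetical case: a sprinkler system costing 100 000 reduces expected
# annual fire loss by 9 000 (e.g. 0.03/yr fire prob. x 300 000 avoided loss)
npv = risk_adjusted_npv(100_000, 9_000, 0.05, 20)
print(round(npv))
```

    A positive value indicates the risk reduction alone justifies the investment at the chosen discount rate; a full Bayesian treatment would integrate over uncertainty in the loss distribution.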

  3. Quantitative measurement of the cerebral blood flow

    International Nuclear Information System (INIS)

    Houdart, R.; Mamo, H.; Meric, P.; Seylaz, J.

    1976-01-01

    The value of cerebral blood flow (CBF) measurement is outlined, its limits are defined and some future prospects discussed. The xenon-133 brain clearance study is at present the most accurate quantitative method to evaluate the CBF in different regions of the brain simultaneously. The method and the progress it has led to in the physiological, physiopathological and therapeutic fields are described. The major disadvantage of the method is shown to be the need to puncture the internal carotid artery for each measurement. Prospects are discussed concerning methods derived from the same general principle but using a simpler, non-traumatic way to introduce the radiotracer, either by inhalation or intravenously [fr

  4. Quantitative Susceptibility Mapping in Parkinson's Disease.

    Science.gov (United States)

    Langkammer, Christian; Pirpamer, Lukas; Seiler, Stephan; Deistung, Andreas; Schweser, Ferdinand; Franthal, Sebastian; Homayoon, Nina; Katschnig-Winter, Petra; Koegl-Wallner, Mariella; Pendl, Tamara; Stoegerer, Eva Maria; Wenzel, Karoline; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen Rainer; Schmidt, Reinhold; Schwingenschuh, Petra

    2016-01-01

    Quantitative susceptibility mapping (QSM) and R2* relaxation rate mapping have demonstrated increased iron deposition in the substantia nigra of patients with idiopathic Parkinson's disease (PD). However, findings in other subcortical deep gray matter nuclei are conflicting, and the sensitivity of QSM and R2* to morphological changes, as well as their relation to clinical measures of disease severity, has so far been investigated only sparsely. The local ethics committee approved this study and all subjects gave written informed consent. 66 patients with idiopathic Parkinson's disease and 58 control subjects underwent quantitative MRI at 3T. Susceptibility and R2* maps were reconstructed from a spoiled multi-echo 3D gradient echo sequence. Mean susceptibilities and R2* rates were measured in subcortical deep gray matter nuclei and compared between patients with PD and controls, as well as related to clinical variables. Compared to control subjects, patients with PD had increased R2* values in the substantia nigra. QSM also showed higher susceptibilities in patients with PD in the substantia nigra, nucleus ruber, thalamus, and globus pallidus. Magnetic susceptibility of several of these structures was correlated with the levodopa-equivalent daily dose (LEDD) and clinical markers of motor and non-motor disease severity (total MDS-UPDRS, MDS-UPDRS-I and II). Disease severity as assessed by the Hoehn & Yahr scale was correlated with magnetic susceptibility in the substantia nigra. The established finding of higher R2* rates in the substantia nigra was extended by QSM, which showed superior sensitivity for PD-related tissue changes in nigrostriatal dopaminergic pathways. QSM additionally reflected the levodopa dosage and disease severity. These results suggest a more widespread pathologic involvement and QSM as a novel means for its investigation, more sensitive than current MRI techniques.

  5. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single vessel disease, and 100% in double and triple vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between coronary angiography and the stress polar map was fair for the LAD (kappa=0.70) and RCA (kappa=0.41) lesions, whereas it was poor for the LCX lesions (kappa=0.32). There were significant correlations between the MIS and SDE in the LAD (rs=0.56, p=0.0027) and RCA territories (rs=0.60, p=0.0094). No significant correlation was found in the LCX territory. When all vascular territories were combined, there was a significant correlation between the MIS and SDE (rs=0.42, p=0.0116). In conclusion, quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.
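
    The kappa statistics quoted above measure chance-corrected agreement. A minimal computation from a hypothetical 2x2 agreement table; the counts are illustrative, not the study's data.

```python
def cohens_kappa(table):
    """Cohen's kappa for an n x n agreement table (rows: method A,
    columns: method B).  kappa = (p_o - p_e) / (1 - p_e), where p_o is
    observed agreement and p_e is agreement expected by chance."""
    total = sum(sum(row) for row in table)
    p_o = sum(table[i][i] for i in range(len(table))) / total
    p_e = sum(sum(table[i]) * sum(row[i] for row in table)
              for i in range(len(table))) / total ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical angiography-vs-polar-map table for one vessel territory:
#           map+  map-
# angio+  [[20,    5],
# angio-   [ 4,   12]]
print(round(cohens_kappa([[20, 5], [4, 12]]), 2))  # ~0.54: moderate agreement
```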

  6. Reproducibility of quantitative planar thallium-201 scintigraphy: quantitative criteria for reversibility of myocardial perfusion defects

    International Nuclear Information System (INIS)

    Sigal, S.L.; Soufer, R.; Fetterman, R.C.; Mattera, J.A.; Wackers, F.J.

    1991-01-01

    Fifty-two paired stress/delayed planar 201Tl studies (27 exercise studies, 25 dipyridamole studies) were processed twice by seven technologists to assess inter- and intraobserver variability. Reproducibility was inversely related to the size of 201Tl perfusion abnormalities. Intraobserver variability was not different between exercise and dipyridamole studies for lesions of similar size. Based upon intraobserver variability, objective quantitative criteria for reversibility of perfusion abnormalities were defined. These objective criteria were tested prospectively in a separate group of 35 201Tl studies and compared with the subjective interpretation of quantitative circumferential profiles. Overall, exact agreement existed in 78% of images (kappa statistic k = 0.66). We conclude that quantification of planar 201Tl scans is highly reproducible, with acceptable inter- and intraobserver variability. Objective criteria for lesion reversibility correlated well with analysis by experienced observers

  7. Evolutionary Quantitative Genomics of Populus trichocarpa.

    Directory of Open Access Journals (Sweden)

    Ilga Porth

    Full Text Available Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST-FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods, and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show
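
    The QST-FST comparison at the heart of the analysis can be sketched as follows. The variance components and the neutral FST baseline below are hypothetical numbers, not estimates from the study.

```python
def q_st(var_among, var_within_additive):
    """Q_ST = s2_B / (s2_B + 2*s2_W), where s2_B is the among-population
    genetic variance for a trait and s2_W the additive genetic variance
    within populations (narrow-sense when s2_W comes from marker-inferred
    relatedness rather than pedigrees)."""
    return var_among / (var_among + 2 * var_within_additive)

# Hypothetical trait: among-population variance 0.8, within-population
# additive variance 1.0; hypothetical neutral-marker F_ST baseline 0.12
qst = q_st(0.8, 1.0)
fst = 0.12
print(round(qst, 3), qst > fst)  # Q_ST exceeding F_ST suggests divergent selection
```

    In practice the comparison is made against the full distribution of marker FST values, not a single point estimate, to test significance.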

  8. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways.

  9. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  10. Progress in quantitative GPR development at CNDE

    Energy Technology Data Exchange (ETDEWEB)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott [Center for Nondestructive Evaluation, Iowa State University, 1915 Scholl Road, Ames, IA 50011-3042 (United States)

    2014-02-18

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.
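
    One ingredient of such a quantitative, amplitude-based treatment is the reflection strength at a dielectric contrast. The sketch below is not taken from the CNDE work; it evaluates the standard normal-incidence reflection coefficient for lossless, non-magnetic media, with hypothetical permittivity values.

```python
import math

def reflection_coefficient(eps1, eps2):
    """Normal-incidence amplitude reflection coefficient between two
    lossless, non-magnetic media with relative permittivities eps1
    (incident side) and eps2: R = (sqrt(eps1)-sqrt(eps2))/(sqrt(eps1)+sqrt(eps2))."""
    n1, n2 = math.sqrt(eps1), math.sqrt(eps2)
    return (n1 - n2) / (n1 + n2)

# Dry sand (eps ~ 4) over an air-filled void (eps = 1):
print(round(reflection_coefficient(4.0, 1.0), 3))   # 0.333: strong positive return
# Dry sand over wet clay (eps ~ 25): large, phase-inverted return
print(round(reflection_coefficient(4.0, 25.0), 3))  # -0.429
```

    Relating measured absolute amplitudes back to such coefficients is what requires the calibrated soil-property measurements described in thrust (2).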

  11. Quantitative Ultrasound in the assessment of Osteoporosis

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Terlizzi, Francesca de

    2009-01-01

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increased interest among clinicians, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of the ability to monitor therapies has been reported; the usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices; and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, both in the clinical and in the experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for use in clinical practice.

  12. Quantitation of vitamin K in human milk

    International Nuclear Information System (INIS)

    Canfield, L.M.; Hopkinson, J.M.; Lima, A.F.; Martin, G.S.; Sugimoto, K.; Burr, J.; Clark, L.; McGee, D.L.

    1990-01-01

    A quantitative method was developed for the assay of vitamin K in human colostrum and milk. The procedure combines preparative and analytical chromatography on silica gel in a nitrogen atmosphere followed by reversed phase high performance liquid chromatography (HPLC). Two HPLC steps were used: gradient separation with ultraviolet (UV) detection followed by isocratic separation detected electrochemically. Due to co-migrating impurities, UV detection alone is insufficient for identification of vitamin K. Exogenous vitamin K was shown to equilibrate with endogenous vitamin K in the samples. A statistical method was incorporated to control for experimental variability. Vitamin K1 was analyzed in 16 pooled milk samples from 7 donors and in individual samples from 15 donors at 1 month post-partum. Vitamin K1 was present at 2.94 +/- 1.94 and 3.15 +/- 2.87 ng/mL in pools and in individuals, respectively. Menaquinones, the bacterial form of the vitamin, were not detected. The significance of experimental variation to studies of vitamin K in individuals is discussed

  13. Quantitative theory of driven nonlinear brain dynamics.

    Science.gov (United States)

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Quantitative risk assessment of drinking water contaminants

    International Nuclear Information System (INIS)

    Cothern, C.R.; Coniglio, W.A.; Marcus, W.L.

    1986-01-01

    The development of criteria and standards for the regulation of drinking water contaminants involves a variety of processes, one of which is risk estimation. This estimation process, called quantitative risk assessment, involves combining data on the occurrence of the contaminant in drinking water and its toxicity. The human exposure to a contaminant can be estimated from occurrence data. Usually the toxicity or number of health effects per concentration level is estimated from animal bioassay studies using the multistage model. For comparison, other models will be used including the Weibull, probit, logit and quadratic ones. Because exposure and toxicity data are generally incomplete, assumptions need to be made and this generally results in a wide range of certainty in the estimates. This range can be as wide as four to six orders of magnitude in the case of the volatile organic compounds in drinking water and a factor of four to five for estimation of risk due to radionuclides in drinking water. As examples of the differences encountered in risk assessment of drinking water contaminants, discussions are presented on benzene, lead, radon and alachlor. The lifetime population risk estimates for these contaminants are, respectively, in the ranges of: <1 - 3000, <1 - 8000, 2000-40,000 and <1 - 80. 11 references, 1 figure, 1 table
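
    A minimal sketch of the model comparison described above, using the standard multistage and Weibull dose-response forms. The potency parameters and dose are hypothetical illustrations, not values from the report.

```python
import math

def multistage_excess_risk(d, q1, q2=0.0):
    """Excess lifetime risk under a two-stage multistage model:
    P(d) = 1 - exp(-(q1*d + q2*d^2)), background removed.  At low dose
    this is approximately linear, ~ q1*d."""
    return 1.0 - math.exp(-(q1 * d + q2 * d * d))

def weibull_excess_risk(d, b, k):
    """Weibull dose-response model: P(d) = 1 - exp(-b * d**k)."""
    return 1.0 - math.exp(-b * d ** k)

# Hypothetical potency q1 = 1e-3 per (ug/L) at a drinking-water dose:
d = 0.05  # ug/L
print(f"{multistage_excess_risk(d, 1e-3):.2e}")  # ~5e-05, i.e. ~ q1*d at low dose
# A Weibull model with shape k > 1 extrapolates to far lower low-dose risk:
print(f"{weibull_excess_risk(d, 1e-3, 2):.2e}")
```

    The orders-of-magnitude spread between such extrapolations at low dose is one source of the wide uncertainty ranges quoted in the abstract.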

  17. Quantitative phenotyping via deep barcode sequencing.

    Science.gov (United States)

    Smith, Andrew M; Heisler, Lawrence E; Mellor, Joseph; Kaper, Fiona; Thompson, Michael J; Chee, Mark; Roth, Frederick P; Giaever, Guri; Nislow, Corey

    2009-10-01

    Next-generation DNA sequencing technologies have revolutionized diverse genomics applications, including de novo genome sequencing, SNP detection, chromatin immunoprecipitation, and transcriptome analysis. Here we apply deep sequencing to genome-scale fitness profiling to evaluate yeast strain collections in parallel. This method, Barcode analysis by Sequencing, or "Bar-seq," outperforms the current benchmark barcode microarray assay in terms of both dynamic range and throughput. When applied to a complex chemogenomic assay, Bar-seq quantitatively identifies drug targets, with performance superior to the benchmark microarray assay. We also show that Bar-seq is well-suited for a multiplex format. We completely re-sequenced and re-annotated the yeast deletion collection using deep sequencing, found that approximately 20% of the barcodes and common priming sequences varied from expectation, and used this revised list of barcode sequences to improve data quality. Together, this new assay and analysis routine provide a deep-sequencing-based toolkit for identifying gene-environment interactions on a genome-wide scale.
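
    The core counting step of such a barcode-sequencing fitness assay can be sketched as follows. The strain tags, read lists and log-ratio score below are illustrative assumptions, not the authors' pipeline.

```python
import math
from collections import Counter

def barseq_fitness(reads_control, reads_treated, barcodes, pseudo=1.0):
    """Per-strain fitness score from barcode read counts: log2 of the
    normalized treated/control abundance ratio (with a pseudocount to
    avoid division by zero).  A strongly negative score flags a strain
    depleted under treatment, i.e. a candidate drug-sensitive deletion."""
    c, t = Counter(reads_control), Counter(reads_treated)
    n_c, n_t = sum(c.values()), sum(t.values())
    scores = {}
    for bc in barcodes:
        f_c = (c[bc] + pseudo) / n_c
        f_t = (t[bc] + pseudo) / n_t
        scores[bc] = math.log2(f_t / f_c)
    return scores

# Toy data: strain 'TAG2' drops out under drug treatment
control = ["TAG1"] * 100 + ["TAG2"] * 100 + ["TAG3"] * 100
treated = ["TAG1"] * 120 + ["TAG2"] * 5 + ["TAG3"] * 115
s = barseq_fitness(control, treated, ["TAG1", "TAG2", "TAG3"])
print(min(s, key=s.get))  # 'TAG2' is the most depleted strain
```

    In the actual assay, barcodes are first extracted from reads by matching the common priming sequences, which is why re-annotating divergent barcodes improved data quality.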

  18. Qualitative and quantitative descriptions of glenohumeral motion.

    Science.gov (United States)

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as in predicting outcomes of alterations to joints in orthopaedic practice and research. Contemporary efforts in modelling have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to contain the computation of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions of inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of the intended application is required. Clinical applicability of a model requires both descriptive and predictive output potentials, and as such a high level of validation is required. Without sufficient appreciation of the clinical intention of the arthrokinematic foundation of a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and the classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.
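
    For the orientation part, the helical (screw) axis description mentioned above reduces to extracting an axis and angle from a rotation matrix. A minimal sketch of that generic axis-angle computation, not a full glenohumeral model.

```python
import math

def helical_axis(R):
    """Rotation axis (unit vector) and angle from a 3x3 rotation matrix,
    i.e. the orientation part of the finite helical (screw) axis.
    Valid away from the 0 and 180 degree singularities."""
    tr = R[0][0] + R[1][1] + R[2][2]
    angle = math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))
    s = 2.0 * math.sin(angle)
    axis = ((R[2][1] - R[1][2]) / s,
            (R[0][2] - R[2][0]) / s,
            (R[1][0] - R[0][1]) / s)
    return axis, angle

# 90 degrees of rotation about the z-axis (e.g. a planar "abduction"):
Rz = [[0.0, -1.0, 0.0],
      [1.0,  0.0, 0.0],
      [0.0,  0.0, 1.0]]
axis, angle = helical_axis(Rz)
print(axis, round(math.degrees(angle)))  # (0.0, 0.0, 1.0) 90
```

    Unlike an Euler decomposition, the axis-angle (helical) description is sequence-independent, which is precisely why the two conventions can disagree with each other and with anatomical terminology.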

  19. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of these deflections. It has been shown experimentally that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it measures the degree of recrystallization more accurately. The method also requires considerably less time and expense than the conventional method

  20. Limits of qualitative detection and quantitative determination

    International Nuclear Information System (INIS)

    Curie, L.A.

    1976-01-01

    The existence of a series of disagreeing and mutually limiting definitions of the detection limit leads to a reinvestigation of the problems of signal detection and signal processing in analytical and nuclear chemistry. Three cut-off levels were fixed: L_C, the net signal level (sensitivity of the equipment) above which an observed signal can be reliably recognized as 'detected'; L_D, the 'true' net signal level from which one can a priori expect detection; and L_Q, the level at which the measuring accuracy is sufficient for quantitative determination. Exact defining equations as well as a series of working formulae are given for the general analytical case and for the measurement of radioactivity. Since radioactive counting is assumed to follow the Poisson distribution, the treatment allows precise limits to be derived for short-lived and long-lived radionuclides, with or without interference. The fundamentals are illustrated by simple examples from spectrophotometry and radioactivity measurement, and by a more complicated example from activation analysis in which one must choose between alternative nuclear reactions. (orig./LH) [de
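
    Currie's widely cited working expressions for these three levels (paired-blank case, alpha = beta = 0.05, Poisson counting, well-known blank) can be evaluated as follows; the blank-count value is a hypothetical example.

```python
import math

def currie_limits(blank_counts):
    """Currie's working expressions for the three levels described above,
    all in net counts: L_C (critical level), L_D (detection limit) and
    L_Q (determination limit, 10% relative standard deviation).
    Assumes a 'paired blank' with alpha = beta = 0.05 and Poisson counting."""
    b = blank_counts
    l_c = 2.33 * math.sqrt(b)
    l_d = 2.71 + 4.65 * math.sqrt(b)
    l_q = 50.0 * (1.0 + math.sqrt(1.0 + b / 12.5))
    return l_c, l_d, l_q

l_c, l_d, l_q = currie_limits(100.0)  # e.g. 100 background counts
print(round(l_c, 1), round(l_d, 1), round(l_q, 1))
```

    Note the ordering L_C < L_D < L_Q: a signal can be "detected" (above L_C) long before it is large enough to be quantified with useful precision.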

  1. Precision of different quantitative ultrasound densitometers

    International Nuclear Information System (INIS)

    Pocock, N.A.; Harris, N.D.; Griffiths, M.R.

    1998-01-01

    Full text: Quantitative ultrasound (QUS) of the calcaneus, which measures speed of sound (SOS) and broadband ultrasound attenuation (BUA), is predictive of the risk of osteoporotic fracture. However, the utility of QUS for predicting fracture risk or for monitoring treatment efficacy depends on its precision and reliability. Published results and manufacturers' data vary significantly owing to differences in statistical methodology. We assessed the precision of the current models of the Lunar Achilles and McCue Cuba QUS densitometers, the most commonly used QUS machines in Australia. Twenty-seven subjects had duplicate QUS measurements performed on the same day on both machines. These data were used to calculate the within-pair standard deviation (SD), the coefficient of variation (CV) and the standardised coefficient of variation (sCV), which is corrected for the dynamic range. In addition, the coefficient of reliability (R) was calculated as an index of reliability that is independent of the population mean value and of the dynamic range of the measurements; R ranges from 0 (no reliability) to 1 (perfect reliability). The results indicate that the precision of QUS depends on the dynamic range and on the instrument. Furthermore, they suggest that while QUS is a useful predictor of fracture risk, at present it has limited clinical value in monitoring short-term age-related bone loss of 1-2% per year.
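
    The precision statistics named above can be computed from duplicate measurements roughly as follows. The sCV normalisation and the exact form of R are assumptions here, since the abstract does not give its formulae:

```python
import statistics as stats

def precision_metrics(pairs, dynamic_range):
    """Short-term precision from duplicate measurements.

    pairs: list of (first, second) same-day measurements, one per subject.
    dynamic_range: biological range of the measurement (e.g. the spread
    between young-normal and osteoporotic values), used to standardise
    the CV. R = 1 - SD_within^2 / SD_total^2 is one common definition of
    the reliability coefficient (an assumption; the paper's exact
    formula is not stated in the abstract).
    """
    n = len(pairs)
    diffs = [a - b for a, b in pairs]
    sd_within = (sum(d * d for d in diffs) / (2 * n)) ** 0.5
    all_vals = [v for p in pairs for v in p]
    mean = stats.mean(all_vals)
    sd_total = stats.stdev(all_vals)
    cv = 100.0 * sd_within / mean                 # % coefficient of variation
    scv = 100.0 * sd_within / dynamic_range       # standardised CV (%)
    r = 1.0 - (sd_within ** 2) / (sd_total ** 2)  # reliability coefficient
    return cv, scv, r
```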

  2. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    scanner for pulsating flow mimicking the femoral artery from a CompuFlow 1000 pump (Shelley Medical). Data were used in four estimators based on directional transverse oscillation for velocity, flow angle, volume flow, and turbulence estimation and their respective precisions. An adaptive lag scheme gave...... the ability to estimate a large velocity range, or alternatively measure at two sites to find e.g. stenosis degree in a vessel. The mean angle at the vessel center was estimated to 90.9◦±8.2◦ indicating a laminar flow from a turbulence index being close to zero (0.1 ±0.1). Volume flow was 1.29 ±0.26 mL/stroke...... (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently...

  3. Quantitative NDE of Composite Structures at NASA

    Science.gov (United States)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

    The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over a long time. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase sensitive ultrasonic methods, infrared thermography, and x-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper focuses on current research activities related to large area NDE for rapidly characterizing aerospace composites.

  4. Neuropathic pain: is quantitative sensory testing helpful?

    Science.gov (United States)

    Krumova, Elena K; Geber, Christian; Westermann, Andrea; Maier, Christoph

    2012-08-01

    Neuropathic pain arises as a consequence of a lesion or disease affecting the somatosensory system and is characterised by a combination of positive and negative sensory symptoms. Quantitative sensory testing (QST) examines sensory perception after the application of different mechanical and thermal stimuli of controlled intensity, and the function of both large (A-beta) and small (A-delta and C) nerve fibres, including the corresponding central pathways. QST can be used to determine detection and pain thresholds and stimulus-response curves, and can thus detect both negative and positive sensory signs, the latter not being assessed by other methods. Like all other psychophysical tests, QST requires standardised examination, instructions and data evaluation to obtain valid and reliable results. Since normative data are available, QST can also contribute to the individual diagnosis of neuropathy, especially in the case of isolated small-fibre neuropathy, in contrast to conventional electrophysiology, which assesses only large myelinated fibres. For example, the detection of early stages of subclinical neuropathy in symptomatic or asymptomatic patients with diabetes mellitus can help to optimise treatment and identify the diabetic foot at risk of ulceration. QST assesses the individual's sensory profile and can thus be valuable in evaluating the underlying pain mechanisms, which occur with different frequencies even within the same neuropathic pain syndrome. Furthermore, assessing the exact sensory phenotype by QST might be useful in the future to identify responders to certain treatments according to the underlying pain mechanisms.

  5. Quantitative topographic differentiation of the neonatal EEG.

    Science.gov (United States)

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. 21 healthy, full-term newborns were examined polygraphically during sleep (EEG with 8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of the derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal largely account for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method holds promise for its application in clinical practice.

  6. Marketing communications: Qualitative and quantitative paradigm

    Directory of Open Access Journals (Sweden)

    Uzelac Nikola

    2005-01-01

    This paper focuses on key issues in the choice of the basic language of communication of marketing as a practical and academic field. Marketing managers generally prefer a descriptive mode of expression, but they should make much more use of the advantages of the language of numbers. By doing so, they will improve the decision-making process and the communication with finance and top management. In this regard, models offered by the academic community could be helpful. This especially pertains to those positive or normative verbal approaches and models in which mathematical and statistical solutions have been embedded, as well as to those which emphasize financial criteria in decision-making. Concerning the process of creation and verification of scientific knowledge, the choice between the languages of words and numbers is part of a much wider dimension, because it is inseparable from the decision on the basic research orientation. The quantitative paradigm is more appropriate for testing hypotheses, while the qualitative paradigm makes a greater contribution to generating them. The competition factor could become the key driver of changes through which the existing "parallel worlds" of the main paradigms would be integrated, for the sake of advancing disciplinary knowledge.

  7. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments of the integrity of test objects with the required coverage or accuracy. In such situations one often resorts to multi-modal testing, in which complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these are discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of eddy current, GMR and thermography measurements on a reference metallic specimen with built-in grooves is presented. Results show that fusion is able to outperform the best single sensor in detection specificity while retaining the same level of sensitivity.

  8. Individual patient dosimetry using quantitative SPECT imaging

    International Nuclear Information System (INIS)

    Gonzalez, J.; Oliva, J.; Baum, R.; Fisher, S.

    2002-01-01

    An approach is described to provide individual patient dosimetry for routine clinical use. Accurate quantitative SPECT imaging was achieved using appropriate methods. The volume of interest (VOI) was defined semi-automatically using a fixed threshold value obtained from phantom studies. The calibration factor to convert voxel counts from SPECT images into activity values was determined from a calibrated point source using the same threshold value as in the phantom studies. For the selected radionuclide, the dose within and outside a sphere of voxel dimensions at different distances was computed from dose point-kernels to obtain a discrete absorbed-dose kernel representation around a volume source with uniform activity distribution. The spatial activity distribution from SPECT imaging was convolved with this kernel representation using the discrete Fourier transform method to yield the three-dimensional absorbed dose rate distribution. The accuracy of the dose rate calculation was validated with software phantoms. The absorbed dose was determined by integration of the dose rate distribution for each volume of interest. Parameters for treatment optimization, such as dose rate volume histograms and dose rate statistics, are provided. A patient example is used to illustrate the dosimetric calculations.
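
    The kernel-convolution step described above can be sketched with NumPy's FFT; the grids, units and kernel values below are placeholders, not the study's calibrated data:

```python
import numpy as np

def dose_rate_distribution(activity, kernel):
    """Convolve a 3-D activity map (e.g. from quantitative SPECT) with a
    discrete dose point-kernel via the discrete Fourier transform.
    Both arrays must share the same voxel grid; the kernel is assumed
    centred in its array, so it is ifftshift-ed to put its origin at
    index 0 before transforming. Circular convolution is implied, so in
    practice the grid should be padded beyond the patient volume.
    """
    k = np.fft.ifftshift(kernel)  # move the kernel centre to index (0,0,0)
    return np.fft.ifftn(np.fft.fftn(activity) * np.fft.fftn(k)).real
```

    A point source of unit activity then reproduces the kernel itself, which is a convenient sanity check for the implementation.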

  9. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered a non-ideal gas for many years; D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF; (2) absorbance at 3877 cm⁻¹ as a function of increasing HF partial pressure, with total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm⁻¹ for constant HF partial pressure, with total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can therefore be analyzed quantitatively via infrared methods.
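
    Within the ideal-gas range (partial pressures up to 35 mm HgA), the quantitative analysis amounts to a Beer's-law calibration of absorbance against partial pressure, then converting to mole percent via the total pressure. The calibration numbers below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical calibration: absorbance at 3877 cm^-1 versus HF partial
# pressure (mm HgA) in the ideal-gas regime (< 35 mm HgA).
p_cal = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
a_cal = np.array([0.051, 0.102, 0.149, 0.201, 0.252, 0.299])

# Beer's-law line A = slope * p + intercept
slope, intercept = np.polyfit(p_cal, a_cal, 1)

def mole_percent_hf(absorbance, total_pressure_mm):
    """Invert the calibration to get the HF partial pressure, then
    mole % = 100 * p_HF / p_total."""
    p_hf = (absorbance - intercept) / slope
    return 100.0 * p_hf / total_pressure_mm
```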

  10. Hepatic iron overload: Quantitative MR imaging

    International Nuclear Information System (INIS)

    Gomori, J.M.; Horev, G.; Tamary, H.; Zandback, J.; Kornreich, L.; Zaizov, R.; Freud, E.; Krief, O.; Ben-Meir, J.; Rotem, H.

    1991-01-01

    Iron deposits demonstrate characteristically shortened T2 relaxation times. Several previously published studies reported poor correlation between in vivo hepatic 1/T2 measurements made with midfield magnetic resonance (MR) units and the hepatic iron content of iron-overloaded patients. In this study, the authors assessed the use of in vivo 1/T2 measurements obtained by MR imaging at 0.5 T, using short echo times (13.4 and 30 msec) and single-echo sequences, as well as computed tomographic (CT) attenuation, as measures of liver iron concentration in 10 severely iron-overloaded patients with beta-thalassemia major. The iron concentrations in surgical wedge biopsy samples of the liver, which varied between 3 and 9 mg/g of wet weight (normal, ≤ 0.5 mg/g), correlated well (r = .93, P ≤ .0001) with the preoperative in vivo hepatic 1/T2 measurements. The CT attenuation did not correlate with liver iron concentration. Quantitative MR imaging is a readily available noninvasive method for the assessment of hepatic iron concentration in iron-overloaded patients, reducing the need for needle biopsies of the liver.
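
    A two-point 1/T2 estimate from the two echo times quoted above can be sketched as follows, assuming mono-exponential decay S(TE) = S0·exp(−TE/T2). This is a simplification for illustration, not the study's exact fitting procedure:

```python
import math

def r2_from_two_echoes(s1, s2, te1_ms=13.4, te2_ms=30.0):
    """Estimate the transverse relaxation rate 1/T2 (s^-1) from two
    single-echo signal intensities s1 and s2 acquired at echo times
    te1_ms < te2_ms, assuming mono-exponential decay. Echo-time
    defaults match the 13.4 and 30 msec quoted in the record.
    """
    dte_s = (te2_ms - te1_ms) / 1000.0  # echo-time difference in seconds
    return math.log(s1 / s2) / dte_s
```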

  11. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the operation of a gas pipeline can affect individual risk and the general safety of the public. If the individual or societal risks are considered intolerable compared with international standards, mitigation measures are recommended until the risk associated with the operation reaches levels compatible with best practices in the industry. A quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This methodology centres on defining the frequencies of occurrence of events according to a database representative of each case under study. In addition, it establishes the consequences according to the considerations of each area and the different possibilities of interference with the gas pipeline under study. For each interference, a typical curve of ignition probability is developed as a function of the distance to the pipe. (author)
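
    The core of such a calculation, the frequency of each interference scenario multiplied by an ignition probability that falls off with distance from the pipe, can be sketched as below. The exponential curve and all numbers are illustrative placeholders, not the study's fitted curves:

```python
import math

def ignition_probability(distance_m, scale_m=50.0):
    """Hypothetical ignition-probability curve decaying with distance
    from the pipe (a placeholder for the per-interference curves the
    methodology derives)."""
    return math.exp(-distance_m / scale_m)

def individual_risk(interference_freqs, distance_m):
    """Individual risk per year at a given distance: sum over
    interference scenarios of (occurrence frequency per year) x
    (conditional ignition probability). A deliberately simplified
    sketch of a QRA summation."""
    return sum(freq * ignition_probability(distance_m)
               for freq in interference_freqs)
```

    Evaluating this sum over a range of distances yields the risk-versus-distance profile that is compared against tolerability criteria.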

  12. A Quantitative Index of Forest Structural Sustainability

    Directory of Open Access Journals (Sweden)

    Jonathan A. Cale

    2014-07-01

    Forest health is a complex concept including many ecosystem functions, interactions and values. We develop a quantitative system, applicable to many forest types, to assess tree mortality with respect to stable forest structure and composition. We quantify the impacts of observed tree mortality on structure by comparison to baseline mortality, and then develop a system that distinguishes between structurally stable and unstable forests. An empirical multivariate index of structural sustainability and a threshold value (70.6), derived from datasets for 22 non-tropical tree species, differentiated structurally sustainable from unsustainable diameter distributions. Twelve of the 22 species populations were sustainable, with a mean score of 33.2 (median = 27.6). Ten species populations were unsustainable, with a mean score of 142.6 (median = 130.1). Among the unsustainable populations, those of Fagus grandifolia, Pinus lambertiana, P. ponderosa and Nothofagus solandri were attributable to known disturbances, whereas the unsustainability of the Abies balsamea, Acer rubrum, Calocedrus decurrens, Picea engelmannii, P. rubens and Prunus serotina populations was not. This approach provides the ecological framework for rational management decisions using routine inventory data to objectively: determine the scope and direction of change in structure and composition, assess excessive or insufficient mortality, compare disturbance impacts in time and space, and prioritize management needs and the allocation of scarce resources.

  13. Quantitative Ultrasound in the assessment of Osteoporosis

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, Giuseppe [Department of Radiology, University of Foggia, Viale L. Pinto, 71100 Foggia (Italy); Department of Radiology, Scientific Institute Hospital, San Giovanni Rotondo (Italy)], E-mail: g.guglielmi@unifg.it; Terlizzi, Francesca de [IGEA srl, Via Parmenide 10/A 41012 Carpi, MO (Italy)], E-mail: f.deterlizzi@igeamedical.com

    2009-09-15

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and its lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increased clinical interest, QUS has been applied to several fields of investigation of bone: various pathologies of bone metabolism, paediatrics, neonatology, genetics and other areas. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of its ability to monitor therapies has been reported; its usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices; and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, in both the clinical and the experimental settings. The advantages and limitations of the present technique are outlined, together with suggestions for its use in clinical practice.

  14. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiating-event frequencies and conditional probabilities in the risk model. Such data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone of the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the two approaches, as applied to real-case expert judgement data, are discussed, as is the role of degree-of-belief type probability in risk decision making.
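
    Under the log-normal assumption, a simple classical-style aggregation of expert point estimates is a weighted average in log space, i.e. a weighted geometric mean. This is a generic sketch of that idea, not the report's exact classical or Bayesian models:

```python
import math

def pool_lognormal(medians, weights=None):
    """Combine expert point estimates of an event frequency under a
    log-normal degree-of-belief model: average in log space with equal
    (or calibration-based) weights that sum to one, which is a weighted
    geometric mean of the estimates.
    """
    n = len(medians)
    if weights is None:
        weights = [1.0 / n] * n  # equal weighting by default
    log_pooled = sum(w * math.log(m) for w, m in zip(weights, medians))
    return math.exp(log_pooled)
```

    For two equally weighted experts estimating 1e-3 and 1e-5 per year, the pooled frequency is their geometric mean, 1e-4 per year, which is far less dominated by the larger estimate than an arithmetic mean would be.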

  16. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Revision of orthopedic surgeries is often expensive and involves a higher risk of complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, assessment of the damage to the liner due to in vivo exposure is very important. Failures are often due to excessive polyethylene wear. Glenoid liners are complex and hemispherical in shape, which presents challenges when assessing damage. The analysis of glenoid liners retrieved from revision surgery may therefore lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed, and multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners retrieved from revision surgery; two had an engineering background and two did not. Associated human-factor mechanisms are reported. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt and Lombardi methods, and the quantitative assessments made by the several observers were analyzed. A new composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by the four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way, and both had engineering backgrounds.

  17. Quantitative magnetotail characteristics of different magnetospheric states

    Directory of Open Access Journals (Sweden)

    M. A. Shukhtina

    2004-03-01

    Quantitative relationships allowing one to compute the lobe magnetic field, flaring angle and tail radius, and to evaluate the magnetic flux, based on solar wind/IMF parameters and spacecraft position, are obtained for the middle magnetotail, X = (−15, −35) RE, using 3.5 years of simultaneous Geotail and Wind spacecraft observations. For the first time this was done separately for different states of the magnetotail, including the substorm onset (SO) epoch, steady magnetospheric convection (SMC) and quiet periods (Q). In the explored distance range the magnetotail parameters appeared to be similar (within the error bar) for the Q and SMC states, whereas at SO their values are considerably larger. In particular, the tail radius is larger by 1-3 RE at substorm onset than during the Q and SMC states, for which the radius is close to previous magnetopause model values. The calculated lobe magnetic flux at substorm onset is ~1 GWb, exceeding that in the Q (SMC) states by ~50%. The model magnetic flux values at substorm onset and SMC show little dependence on the solar wind dynamic pressure and distance in the tail, so the magnetic flux value can serve as an important discriminator of the state of the middle magnetotail. Key words: Magnetospheric physics (solar wind-magnetosphere interactions, magnetotail, storms and substorms)

  19. Quantitative assessment of integrated phrenic nerve activity.

    Science.gov (United States)

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches to the airport location problem are presented: optimizing the choice of site, and selecting the location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task; the latter involves ranking the possible variants. Given their different methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one which currently has practical application. Results: Based on real-life examples, the authors present a multi-stage procedure which makes it possible to solve the problem of airport location. Conclusions: Based on the overview of the literature on the subject, the authors point to three types of approach to the issue of airport location which could enable further development of currently applied methods.
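
    The 'selection from a predefined set' approach can be illustrated by a simple weighted-score ranking over normalised criteria. The criteria names, weights and site data below are invented for illustration; real studies use formal multi-criteria methods and stakeholder-derived weights:

```python
def rank_sites(sites, weights):
    """Rank candidate airport sites from a predefined set by a weighted
    score over normalised criteria in [0, 1] (higher = better).
    A generic multi-criteria sketch, not a specific published method.
    """
    def score(criteria):
        return sum(weights[k] * criteria[k] for k in weights)
    return sorted(sites, key=lambda s: score(s["criteria"]), reverse=True)

# Illustrative candidate sites with normalised criterion scores.
candidates = [
    {"name": "Site A",
     "criteria": {"accessibility": 0.9, "cost": 0.4, "environment": 0.6}},
    {"name": "Site B",
     "criteria": {"accessibility": 0.6, "cost": 0.8, "environment": 0.7}},
]
weights = {"accessibility": 0.5, "cost": 0.3, "environment": 0.2}
ranked = rank_sites(candidates, weights)
```

    The optimization-based alternative would instead encode the siting problem as a mathematical program (e.g. a facility-location model) and solve for the site rather than rank a given shortlist.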

  1. Quantitative biological measurement in Transmission Electron Tomography

    International Nuclear Information System (INIS)

    Mantell, Judith M; Verkade, Paul; Arkill, Kenton P

    2012-01-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood, and anisotropic area shrinkage has largely been ignored. The intention of this study is to explore the shrinkage of a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials used in electron tomography. In brief: using a low-dose philosophy on the section, a focus area was used prior to a separate virgin study area for a series of known exposures on a tilted sample. The shrinkage was determined by measurements on the gold beads from both sides of the section, as determined by a confirmatory tomogram. It was found that the shrinkage in area (to approximately 90-95% of the original) and in thickness (to approximately 65% of the original at most) agreed with previous authors, but that almost all the shrinkage occurred in the first minute, and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven, the end result was consistent. It was observed, in general, that thinner samples showed a greater percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required, then the protocol described should be used for all areas studied.

  2. Quantitative biological measurement in Transmission Electron Tomography

    Science.gov (United States)

    Mantell, Judith M.; Verkade, Paul; Arkill, Kenton P.

    2012-07-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood. In addition, anisotropic area shrinkage has largely been ignored. The intention of this study is to explore the shrinkage on a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials used in electron tomography. In brief: using a low-dose philosophy on the section, a focus area was used prior to a separate virgin study area for a series of known exposures on a tilted sample. The shrinkage was determined by measurements on the gold beads from both sides of the section as determined by a confirmatory tomogram. It was found that the shrinkage in area (approximately 90-95% of the original) and the thickness (approximately 65% of the original at most) agreed with previous authors, but that almost all the shrinkage was in the first minute, and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven the end result was consistent. It was observed, in general, that thinner samples showed more percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required then the protocol described should be used for all areas studied.

  3. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed

  4. Quantitative myocardial perfusion by O-15-water PET

    DEFF Research Database (Denmark)

    Thomassen, Anders; Petersen, Henrik; Johansen, Allan

    2015-01-01

    AIMS: Reporting of quantitative myocardial blood flow (MBF) is typically performed in standard coronary territories. However, coronary anatomy and myocardial vascular territories vary among individuals, and a coronary artery may erroneously be deemed stenosed or not if territorial demarcation...... disease (CAD). METHODS AND RESULTS: Forty-four patients with suspected CAD were included prospectively and underwent coronary CT-angiography and quantitative MBF assessment with O-15-water PET followed by invasive, quantitative coronary angiography, which served as reference. MBF was calculated...

  5. Quantitative valuation of platform technology based intangibles companies

    OpenAIRE

    Achleitner, Ann-Kristin; Nathusius, Eva; Schraml, Stephanie

    2007-01-01

    In the course of raising external equity, e.g. from venture capitalists, a quantitative valuation is usually required for entrepreneurial ventures. This paper examines the challenges of quantitatively valuing platform technology based entrepreneurial ventures. The distinct characteristics of such companies pose specific requirements on the applicability of quantitative valuation methods. The entrepreneur can choose from a wide range of potential commercialization strategies to pursue in the c...

  6. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
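The core APEX correction described above (observed spectral counts divided by each protein's expected detectability Oi, then normalized) can be sketched as follows. This is an illustrative reimplementation, not code from the Java tool; the function name and the assumed total of molecules per cell are placeholders.

```python
def apex_abundances(spectral_counts, o_values, total_molecules=1e6):
    """APEX-style protein abundances: observed MS/MS spectral counts
    corrected by each protein's expected spectral count per molecule (Oi),
    then normalized so abundances sum to an assumed molecule total."""
    corrected = {p: spectral_counts[p] / o_values[p] for p in spectral_counts}
    norm = sum(corrected.values())
    return {p: total_molecules * v / norm for p, v in corrected.items()}
```

A protein with many easily detected tryptic peptides (high Oi) is thus down-weighted relative to one whose peptides are rarely observed.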

  7. Quantitative Measurement of Oxygen in Microgravity Combustion

    Science.gov (United States)

    Silver, Joel A.

    1997-01-01

    A low-gravity environment, in space or in ground-based facilities such as drop towers, provides a unique setting for studying combustion mechanisms. Understanding the physical phenomena controlling the ignition and spread of flames in microgravity has importance for space safety as well as for better characterization of dynamical and chemical combustion processes which are normally masked by buoyancy and other gravity-related effects. Due to restrictions associated with performing measurements in reduced gravity, diagnostic methods which have been applied to microgravity combustion studies have generally been limited to capture of flame emissions on film or video, laser Schlieren imaging and (intrusive) temperature measurements using thermocouples. Given the development of detailed theoretical models, more sophisticated diagnostic methods are needed to provide the kind of quantitative data necessary to characterize the properties of microgravity combustion processes as well as provide accurate feedback to improve the predictive capabilities of the models. When the demands of space flight are considered, the need for improved diagnostic systems which are rugged, compact, reliable, and operate at low power becomes apparent. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in both microgravity combustion research and as a sensor on-board Spacelab as either an air quality monitor or as part of a fire detection system. In our prior microgravity work, an eight line-of-sight fiber optic system measured

  8. Quantitative imaging of coronary blood flow

    Directory of Open Access Journals (Sweden)

    Adam M. Alessio

    2010-04-01

    Full Text Available Adam M. Alessio received his PhD in Electrical Engineering from the University of Notre Dame in 2003. During his graduate studies he developed tomographic reconstruction methods for correlated data and helped construct a high-resolution PET system. He is currently a Research Assistant Professor in Radiology at the University of Washington. His research interests focus on improved data processing and reconstruction algorithms for PET/CT systems with an emphasis on quantitative imaging. Erik Butterworth received the BA degree in Mathematics from the University of Chicago in 1977. Between 1977 and 1987 he worked as a computer programmer/analyst for several small commercial software firms. Since 1988, he has worked as a software engineer on various research projects at the University of Washington. Between 1988 and 1993 he developed a real-time data acquisition system for the analysis of estuarine sediment transport in the department of Geophysics. Between 1988 and 2002 he developed I4, a system for the display and analysis of cardiac PET images in the department of Cardiology. Since 1993 he has worked on physiological simulation systems (XSIM from 1993 to 1999, JSim since 1999) at the National Simulation Resource Facility in Circulatory Mass Transport and Exchange, in the Department of Bioengineering. His research interests include simulation systems and medical imaging. James H. Caldwell, MD, University of Missouri-Columbia 1970, is Professor of Medicine (Cardiology) and Radiology and Adjunct Professor of Bioengineering at the University of Washington School of Medicine and Acting Head, Division of Cardiology and Director of Nuclear Cardiology for the University of Washington Hospitals, Seattle WA, USA. James B. Bassingthwaighte, MD, Toronto 1955, PhD Mayo Grad Sch Med 1964, was Professor of Physiology and of Medicine at Mayo Clinic until 1975 when he moved to the University of Washington to chair Bioengineering. He is Professor of Bioengineering and

  9. Computer code for quantitative ALARA evaluations

    International Nuclear Information System (INIS)

    Voilleque, P.G.

    1984-01-01

    A FORTRAN computer code has been developed to simplify the determination of whether dose reduction actions meet the as low as is reasonably achievable (ALARA) criterion. The calculations are based on the methodology developed for the Atomic Industrial Forum. The code is used for analyses of eight types of dose reduction actions, characterized as follows: reduce dose rate, reduce job frequency, reduce productive working time, reduce crew size, increase administrative dose limit for the task, and increase the workers' time utilization and dose utilization through (a) improved working conditions, (b) basic skill training, or (c) refresher training for special skills. For each type of action, two analysis modes are available. The first is a generic analysis in which the program computes potential benefits (in dollars) for a range of possible improvements, e.g., for a range of lower dose rates. Generic analyses are most useful in the planning stage and for evaluating the general feasibility of alternative approaches. The second is a specific analysis in which the potential annual benefits of a specific level of improvement and the annual implementation cost are compared. The potential benefits reflect savings in operational and societal costs that can be realized if occupational radiation doses are reduced. Because the potential benefits depend upon many variables which characterize the job, the workplace, and the workers, there is no unique relationship between the potential dollar savings and the dose savings. The computer code permits rapid quantitative analyses of alternatives and is a tool that supplements the health physicist's professional judgment. The program output provides a rational basis for decision-making and a record of the assumptions employed.
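The specific-analysis mode described above, comparing annual benefits with annual implementation cost, can be sketched in a few lines. This is an illustrative calculation only, not the AIF methodology itself; the monetary value assigned per person-sievert (alpha) is an assumed policy parameter, not a value from the source.

```python
def alara_net_benefit(dose_savings_person_sv, alpha_per_person_sv,
                      operational_savings, implementation_cost):
    """Annual net benefit of a dose-reduction action: monetized collective
    dose savings plus operational-cost savings, minus the annual cost of
    implementing the action. A positive result favors the action."""
    benefit = dose_savings_person_sv * alpha_per_person_sv + operational_savings
    return benefit - implementation_cost
```

For example, saving 0.5 person-Sv/yr at an assumed $20,000 per person-Sv, plus $1,000 operational savings, against a $5,000 annual cost yields a $6,000 net benefit.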

  10. Quantitative histological models suggest endothermy in plesiosaurs

    Directory of Open Access Journals (Sweden)

    Corinna V. Fleischle

    2018-06-01

    Full Text Available Background Plesiosaurs are marine reptiles that arose in the Late Triassic and survived to the Late Cretaceous. They have a unique and uniform bauplan and are known for their very long neck and hydrofoil-like flippers. Plesiosaurs are among the most successful vertebrate clades in Earth’s history. Based on bone mass decrease and cosmopolitan distribution, both of which affect lifestyle, indications of parental care, and oxygen isotope analyses, evidence for endothermy in plesiosaurs has accumulated. Recent bone histological investigations also provide evidence of fast growth and elevated metabolic rates. However, quantitative estimations of metabolic rates and bone growth rates in plesiosaurs have not been attempted before. Methods Phylogenetic eigenvector maps is a method for estimating trait values from a predictor variable while taking into account phylogenetic relationships. As predictor variable, this study employs vascular density, measured in bone histological sections of fossil eosauropterygians and extant comparative taxa. We quantified vascular density as primary osteon density, thus, the proportion of vascular area (including lamellar infillings of primary osteons) to total bone area. Our response variables are bone growth rate (expressed as local bone apposition rate) and resting metabolic rate (RMR). Results Our models reveal bone growth rates and RMRs for plesiosaurs that are in the range of birds, suggesting that plesiosaurs were endothermic. Even for basal eosauropterygians we estimate values in the range of mammals or higher. Discussion Our models are influenced by the availability of comparative data, which are lacking for large marine amniotes, potentially skewing our results. However, our statistically robust inference of fast growth and fast metabolism is in accordance with other evidence for plesiosaurian endothermy. Endothermy may explain the success of plesiosaurs, including their survival of the end-Triassic extinction

  11. Rapid quantitative assay for chloramphenicol acetyltransferase

    International Nuclear Information System (INIS)

    Neumann, J.R.; Morency, C.A.; Russian, K.O.

    1987-01-01

    Measuring the expression of exogenous genetic material in mammalian cells is commonly done by fusing the DNA of interest to a gene encoding an easily-detected enzyme. Chloramphenicol acetyltransferase (CAT) is a convenient marker because it is not normally found in eukaryotes. CAT activity has usually been detected using a thin-layer chromatographic separation followed by autoradiography. An organic solvent extraction-based method for CAT detection has also been described, as well as a procedure utilizing HPLC analysis. Building on the extraction technique, they developed a rapid, sensitive kinetic method for measuring CAT activity in cell homogenates. The method exploits the differential organic solubility of the substrate ([³H]- or [¹⁴C]acetyl CoA) and the product (labeled acetylchloramphenicol). The assay is a simple one-vial, two-phase procedure and requires no tedious manipulations after the initial setup. Briefly, a 0.25 ml reaction with 100 mM Tris-HCl, 1 mM chloramphenicol, 0.1 mM [¹⁴C]acetyl CoA and variable amounts of cell homogenate is pipetted into a miniscintillation vial, overlaid with 5 ml of a water-immiscible fluor, and incubated at 37 °C. At suitable intervals the vial is counted and the CAT level is quantitatively determined as the rate of increase in counts/min of the labeled product as it diffuses into the fluor phase, compared to a standard curve. When used to measure CAT in transfected Balb 3T3 cells the method correlated well with the other techniques.
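The kinetic readout described above, the rate of increase in counts/min converted to activity via a standard curve, amounts to a linear fit. This is an illustrative sketch (the function name and standard-curve slope are assumptions, not from the source):

```python
import numpy as np

def cat_activity(times_min, counts_per_min, standard_slope):
    """CAT activity from the two-phase kinetic assay: fit the linear rise
    in cpm as labeled acetylchloramphenicol diffuses into the fluor phase,
    then convert the fitted slope (cpm/min) to enzyme units using the
    slope of a standard curve prepared with known CAT amounts."""
    slope, _intercept = np.polyfit(times_min, counts_per_min, 1)
    return slope / standard_slope
```

For counts rising 10 cpm per minute against a standard-curve slope of 5 cpm/min per unit, this returns 2 units of activity.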

  12. Quantitative indicators of fruit and vegetable consumption

    Directory of Open Access Journals (Sweden)

    Dagmar Kozelová

    2015-12-01

    Full Text Available The quantitative research of the market is often based on surveys and questionnaires which find out the behavior of customers in observed areas. Before the purchasing process, consumers consider where they will buy fruit and vegetables, what kind to choose and in what quantity. Consumers' behavior is affected by factors such as: regional gastronomic traditions, price, product appearance, aroma, place of buying, own experience and knowledge, taste preferences as well as specific health issues of consumers and others. The consumption of fruit and vegetables brings into the human body biologically active substances that favorably affect the health of consumers. In the presented research study we were interested in differences of consumers' behavior in the consumption of fruit and vegetables according to the place of residence and gender. In the survey, 200 respondents participated; their place of residence was a city or a village. The existence of dependencies and statistical significance were examined by selected statistical testing methods. Firstly, we analyzed the responses via a statistical F-test to check whether the observed random samples have the same variance. Then we applied a two-sample unpaired t-test with equal variance and a χ2-test of statistical independence. The statistical significance was tested by corresponding p values. Correlations were proved by the Cramer's V coefficient. We found that place of residence has no impact on the respondents' consumption of fruit. The gender of respondents does not affect their consumption of fruit. Equally, the gender does not affect the respondents' consumption of vegetables. Only in one observed case did the significant differences prove that the place of respondent residence has an impact on the consumption of vegetables. 
Higher consumption of vegetables is due to the fact that the majority of citizens who live in villages have the possibility to grow their own vegetables and, thus, the demand for it in village
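The testing sequence described above (F-test for equal variances, pooled-variance t-test, χ²-test of independence with Cramér's V) can be sketched with numpy alone. The data here are fabricated for illustration; p-values would come from the corresponding reference distributions (e.g. `scipy.stats`).

```python
import numpy as np

# Illustrative (fabricated) fruit-consumption scores for two groups.
city = np.array([3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 3.2, 2.7])
village = np.array([3.0, 3.4, 2.9, 3.1, 3.3, 2.8, 3.2, 3.0])

# 1) F statistic for equality of variances.
f_stat = city.var(ddof=1) / village.var(ddof=1)

# 2) Two-sample unpaired t statistic with pooled (equal) variance.
n1, n2 = len(city), len(village)
sp2 = ((n1 - 1) * city.var(ddof=1) + (n2 - 1) * village.var(ddof=1)) / (n1 + n2 - 2)
t_stat = (city.mean() - village.mean()) / np.sqrt(sp2 * (1 / n1 + 1 / n2))

# 3) Chi-square statistic of independence on a 2x2 contingency table
#    (e.g. residence x vegetable consumption), with Cramer's V.
table = np.array([[40.0, 60.0], [55.0, 45.0]])
expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
chi2_stat = ((table - expected) ** 2 / expected).sum()
cramers_v = np.sqrt(chi2_stat / (table.sum() * (min(table.shape) - 1)))
```

For a 2×2 table, Cramér's V reduces to the phi coefficient, a value between 0 (independence) and 1 (perfect association).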

  13. Quantitative tomography simulations and reconstruction algorithms

    International Nuclear Information System (INIS)

    Martz, H.E.; Aufderheide, M.B.; Goodman, D.; Schach von Wittenau, A.; Logan, C.; Hall, J.; Jackson, J.; Slone, D.

    2000-01-01

    X-ray, neutron and proton transmission radiography and computed tomography (CT) are important diagnostic tools that are at the heart of LLNL's effort to meet the goals of the DOE's Advanced Radiography Campaign. This campaign seeks to improve radiographic simulation and analysis so that radiography can be a useful quantitative diagnostic tool for stockpile stewardship. Current radiographic accuracy does not allow satisfactory separation of experimental effects from the true features of an object's tomographically reconstructed image. This can lead to difficult and sometimes incorrect interpretation of the results. By improving our ability to simulate the whole radiographic and CT system, it will be possible to examine the contribution of system components to various experimental effects, with the goal of removing or reducing them. In this project, we are merging this simulation capability with a maximum-likelihood (constrained conjugate gradient, CCG) reconstruction technique, yielding a physics-based, forward-model image-reconstruction code. In addition, we seek to improve the accuracy of computed tomography from transmission radiographs by studying what physics is needed in the forward model. During FY 2000, an improved version of the LLNL ray-tracing code called HADES was coupled with a recently developed LLNL CT algorithm known as CCG. The problem of image reconstruction is expressed as a large matrix equation relating a model for the object being reconstructed to its projections (radiographs). Using a constrained-conjugate-gradient search algorithm, a maximum likelihood solution is sought. This search continues until the difference between the input measured radiographs or projections and the simulated or calculated projections is satisfactorily small.
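The matrix formulation above, iterating until simulated projections match measured ones, can be sketched with an unconstrained conjugate-gradient solve of the least-squares problem. This is a toy stand-in for CCG (it omits the constraints and the maximum-likelihood noise model); A plays the role of the forward projection model and b the measured projections.

```python
import numpy as np

def cg_reconstruct(A, b, iters=100, tol=1e-10):
    """Least-squares image reconstruction sketch: solve the normal
    equations A^T A x = A^T b by conjugate gradient, refining x until
    simulated projections A x agree with measured projections b."""
    AtA, Atb = A.T @ A, A.T @ b
    x = np.zeros(A.shape[1])
    r = Atb - AtA @ x          # residual of the normal equations
    p = r.copy()               # initial search direction
    for _ in range(iters):
        Ap = AtA @ p
        denom = p @ Ap
        if denom == 0:
            break
        alpha = (r @ r) / denom
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x
```

In exact arithmetic CG recovers an n-voxel object in at most n iterations; the real CCG code adds constraints (e.g. non-negativity) and a statistical likelihood term.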

  14. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity, in finance for instance, was quite parallel to its development in the manufacturing industry, it is not the same in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a University in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (χ² = 8.181; p = 0.300), which indicated that there was a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
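The Hosmer-Lemeshow goodness-of-fit test mentioned above can be sketched with numpy. This is an illustrative implementation (function name assumed, not from the paper): cases are sorted by predicted risk, split into equal-size groups, and observed event counts are compared with expected ones; the statistic is referred to a chi-square distribution with groups − 2 degrees of freedom (e.g. via `scipy.stats.chi2.sf`) for a p-value.

```python
import numpy as np

def hosmer_lemeshow_stat(y_true, y_prob, groups=10):
    """Hosmer-Lemeshow statistic for a fitted logistic regression:
    returns (statistic, degrees of freedom). A small statistic / large
    p-value indicates the data do not deviate from the model."""
    order = np.argsort(y_prob)
    y = np.asarray(y_true, float)[order]
    p = np.asarray(y_prob, float)[order]
    stat = 0.0
    for idx in np.array_split(np.arange(len(y)), groups):
        obs, exp, n = y[idx].sum(), p[idx].sum(), len(idx)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n))
    return stat, groups - 2
```

With the usual 10 risk deciles this gives 8 degrees of freedom, matching the study's reported χ² = 8.181, p = 0.300 regime of good fit.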

  15. The National Benchmark Test of quantitative literacy: Does it ...

    African Journals Online (AJOL)

    This article explores the relationship between these two standardised assessments in the domain of mathematical/quantitative literacy. This is accomplished through a Pearson correlation analysis of 6,363 test scores obtained by Grade 12 learners on the NSC Mathematical Literacy examination and the Quantitative ...
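The Pearson correlation underlying the analysis above is straightforward to compute. A minimal numpy sketch (variable names are illustrative, not the study's data):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation between two score sets,
    e.g. NSC Mathematical Literacy marks vs NBT quantitative-literacy
    scores for the same learners."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return (xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum())
```

Values near +1 indicate the two assessments rank learners almost identically; values near 0 indicate the tests measure largely unrelated skills.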

  16. PCA-based groupwise image registration for quantitative MRI

    NARCIS (Netherlands)

    Huizinga, W.; Poot, D. H. J.; Guyader, J.-M.; Klaassen, R.; Coolen, B. F.; van Kranenburg, M.; van Geuns, R. J. M.; Uitterdijk, A.; Polfliet, M.; Vandemeulebroucke, J.; Leemans, A.; Niessen, W. J.; Klein, S.

    2016-01-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different

  17. Resources on quantitative/statistical research for applied linguists

    OpenAIRE

    Brown , James Dean

    2004-01-01

    Abstract The purpose of this review article is to survey and evaluate existing books on quantitative/statistical research in applied linguistics. The article begins by explaining the types of texts that will not be reviewed, then it briefly describes nine books that address how to do quantitative/statistical applied linguistics research. The review then compares (in prose and tables) the general characteris...

  18. Resources on Quantitative/Statistical Research for Applied Linguists

    Science.gov (United States)

    Brown, James Dean

    2004-01-01

    The purpose of this review article is to survey and evaluate existing books on quantitative/statistical research in applied linguistics. The article begins by explaining the types of texts that will not be reviewed, then it briefly describes nine books that address how to do quantitative/statistical applied linguistics research. The review then…

  19. Quantitative trait loci (QTL) mapping for inflorescence length traits in ...

    African Journals Online (AJOL)

    Lablab purpureus (L.) sweet is an ancient legume species whose immature pods serve as a vegetable in south and south-east Asia. The objective of this study is to identify quantitative trait loci (QTLs) associated with quantitative traits such as inflorescence length, peduncle length from branch to axil, peduncle length from ...

  20. Using the Blended Learning Approach in a Quantitative Literacy Course

    Science.gov (United States)

    Botts, Ryan T.; Carter, Lori; Crockett, Catherine

    2018-01-01

    The efforts to improve the quantitative reasoning (quantitative literacy) skills of college students in the United States have been gaining momentum in recent years. At the same time, the blended learning approach to course delivery has gained in popularity, promising better learning with flexible modalities and pace. This paper presents the…

  1. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  2. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    Science.gov (United States)

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  3. Current status and assessment of quantitative and qualitative one leg ...

    African Journals Online (AJOL)

    ... of only a quantitative assessment. These findings indicate that, when evaluating the one leg balance in children aged 3-6 years, a quantitative and qualitative assessment should be used in combination together to assure a more accurate assessment. (S. African J. for Research in Sport, Physical Ed. and Recreation: 2001 ...

  4. Quantitative Theoretical and Conceptual Framework Use in Agricultural Education Research

    Science.gov (United States)

    Kitchel, Tracy; Ball, Anna L.

    2014-01-01

    The purpose of this philosophical paper was to articulate the disciplinary tenets for consideration when using theory in agricultural education quantitative research. The paper clarified terminology around the concept of theory in social sciences and introduced inaccuracies of theory use in agricultural education quantitative research. Finally,…

  5. Quantitative Approaches to Group Research: Suggestions for Best Practices

    Science.gov (United States)

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  6. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated and the results were consistent with those for pure elements.
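The elemental-standards approach amounts to dividing each element's Auger peak intensity by a sensitivity factor measured on a pure-element standard and normalizing. A minimal sketch (function name and numbers are illustrative, not the paper's data):

```python
def auger_atomic_fractions(intensities, sensitivities):
    """Quantitative Auger analysis via elemental standards: atomic
    fraction of each element = (peak intensity / elemental sensitivity
    factor), normalized so the fractions sum to 1."""
    corrected = {e: intensities[e] / sensitivities[e] for e in intensities}
    total = sum(corrected.values())
    return {e: v / total for e, v in corrected.items()}
```

For an ideal Nb3Ge film with equal sensitivity factors, a 3:1 intensity ratio yields atomic fractions of 0.75 Nb and 0.25 Ge.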

  7. Undergraduate Students' Quantitative Reasoning in Economic Contexts

    Science.gov (United States)

    Mkhatshwa, Thembinkosi Peter; Doerr, Helen M.

    2018-01-01

    Contributing to a growing body of research on undergraduate students' quantitative reasoning, the study reported in this article used task-based interviews to investigate business calculus students' quantitative reasoning when solving two optimization tasks situated in the context of revenue and profit maximization. Analysis of verbal responses…

  8. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  9. Exploring Phytoplankton Population Investigation Growth to Enhance Quantitative Literacy

    Science.gov (United States)

    Baumgartner, Erin; Biga, Lindsay; Bledsoe, Karen; Dawson, James; Grammer, Julie; Howard, Ava; Snyder, Jeffrey

    2015-01-01

    Quantitative literacy is essential to biological literacy (and is one of the core concepts in "Vision and Change in Undergraduate Biology Education: A Call to Action"; AAAS 2009). Building quantitative literacy is a challenging endeavor for biology instructors. Integrating mathematical skills into biological investigations can help build…

  10. Genetic variability, heritability and genetic advance of quantitative ...

    African Journals Online (AJOL)

    Genetic variation has led to an increase in the quantitative traits of crops. The variability on genome is induced by mutation, which enhances the productivity. We evaluated variability on quantitative characters such as, plant height, number of branches/plant, number of leaves/plant, number of fruit clusters/plant, number of ...

  11. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  12. Quantitative Methods Intervention: What Do the Students Want?

    Science.gov (United States)

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the competitive UK economy, public policy making effectiveness and the status the UK has as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  13. Mapcurves: a quantitative method for comparing categorical maps.

    Science.gov (United States)

    William W. Hargrove; Forrest M. Hoffman; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluates the degree of fit among any number of maps and quantifies a GOF for each polygon, as well as for the entire map. The Mapcurves method indicates a perfect fit even if...
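
    The per-category GOF score described above can be sketched in a few lines. The snippet below follows the published idea (a category's GOF is the sum, over categories of the comparison map, of the product of the shared fractions of each category's area); the toy label lists standing in for raster maps are illustrative only:

```python
from collections import Counter

def mapcurves_gof(map_a, map_b):
    """Goodness of fit of each category in map_a against map_b.

    For category A: GOF = sum over categories B of (C/B_total) * (C/A_total),
    where C is the number of cells shared by A and B. A perfect spatial
    match yields 1.0; no concordance yields values near 0.
    """
    assert len(map_a) == len(map_b), "maps must cover the same cells"
    area_a = Counter(map_a)                 # total area per category in map A
    area_b = Counter(map_b)                 # total area per category in map B
    shared = Counter(zip(map_a, map_b))     # cell-wise category overlaps
    gof = {}
    for a, a_total in area_a.items():
        gof[a] = sum(
            c * c / (area_b[b] * a_total)
            for (aa, b), c in shared.items() if aa == a
        )
    return gof

# Identical partitions score a perfect 1.0 per category, even with
# different category labels:
print(mapcurves_gof(list("xxyy"), list("ppqq")))  # → {'x': 1.0, 'y': 1.0}
```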

  14. Statistical mechanics and the evolution of polygenic quantitative traits

    NARCIS (Netherlands)

    Barton, N.H.; De Vladar, H.P.

    The evolution of quantitative characters depends on the frequencies of the alleles involved, yet these frequencies cannot usually be measured. Previous groups have proposed an approximation to the dynamics of quantitative traits, based on an analogy with statistical mechanics. We present a modified

  15. Quantitative Phase Determination by Using a Michelson Interferometer

    Science.gov (United States)

    Pomarico, Juan A.; Molina, Pablo F.; D'Angelo, Cristian

    2007-01-01

    The Michelson interferometer is one of the best-established tools for quantitative interferometric measurements. It has been, and still is, successfully used not only for scientific purposes but also in undergraduate courses, both for qualitative demonstrations and for quantitative determination of several properties such as…

  16. Unraveling possible association between quantitative trait loci (QTL ...

    African Journals Online (AJOL)

    Unraveling possible association between quantitative trait loci (QTL) for partial resistance and nonhost resistance in food barley (Hordeum vulgare L.) ... Abstract. Many quantitative trait loci (QTLs) in different barley populations were discovered for resistance to Puccinia hordei and heterologous rust species. Partial ...

  17. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  18. Optimization method for quantitative calculation of clay minerals in soil

    Indian Academy of Sciences (India)

    However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative ... The mineralogical constitution of soil is rather complex. ... K2O, MgO, and TFe as variables for the calculation.

  19. Quantitation of Proteinuria in Women With Pregnancy Induced ...

    African Journals Online (AJOL)

    This creates the need for a more accurate method for early detection and quantitation of proteinuria. Objective:To compare the accuracy of the Spot urine Protein to Creatinine ratio with that of Dipstick Tests in the quantitation of proteinuria in Nigerian women with Pregnancy Induced Hypertension. Methods: A cross-sectional ...

  20. Quantitative image of bone mineral content

    International Nuclear Information System (INIS)

    Katoh, Tsuguhisa

    1990-01-01

    A dual energy subtraction system was constructed on an experimental basis for the quantitative image of bone mineral content. The system consists of a radiographing system and an image processor. Two radiograms were taken with dual x-ray energy in a single exposure using an x-ray beam dichromized by a tin filter. In this system, a film cassette was used where a low speed film-screen system, a copper filter and a high speed film-screen system were layered on top of each other. The images were read by a microdensitometer and processed by a personal computer. The image processing included the corrections of the film characteristics and heterogeneity in the x-ray field, and the dual energy subtraction in which the effect of the high energy component of the dichromized beam on the tube side image was corrected. In order to determine the accuracy of the system, experiments using wedge phantoms made of mixtures of epoxy resin and bone mineral-equivalent materials in various fractions were performed for various tube potentials and film processing conditions. The results indicated that the relative precision of the system was within ±4% and that the propagation of the film noise was within ±11 mg/cm² for the 0.2 mm pixels. The results also indicated that the system response was independent of the tube potential and the film processing condition. The bone mineral weight in each phalanx of the freshly dissected hand of a rhesus monkey was measured by this system and compared with the ash weight. The results showed an error of ±10%, slightly larger than that of phantom experiments, which is probably due to the effect of fat and the variation of focus-object distance. The air kerma in free air at the object was approximately 0.5 mGy for one exposure. The results indicate that this system is applicable to clinical use and provides useful information for evaluating a time-course of localized bone disease. (author)
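
    The arithmetic behind dual-energy subtraction can be sketched as a two-material decomposition: at each pixel the low- and high-energy log-attenuations form a 2×2 linear system in bone and soft-tissue thickness. The attenuation coefficients below are made-up round numbers, not the calibration used in the study above:

```python
# Two-material decomposition underlying dual-energy subtraction:
# log-attenuation at each energy is a linear mix of bone and soft-tissue
# thickness, so each pixel yields a 2x2 linear system.
# The coefficients in `mu` are illustrative, not calibrated values.

def decompose(l_low, l_high, mu=(0.50, 0.30, 0.25, 0.20)):
    """Return (bone, soft) thickness from low/high-energy log-attenuations.

    mu = (mu_bone_low, mu_soft_low, mu_bone_high, mu_soft_high)
    """
    mbl, msl, mbh, msh = mu
    det = mbl * msh - msl * mbh          # system determinant (Cramer's rule)
    bone = (l_low * msh - l_high * msl) / det
    soft = (mbl * l_high - mbh * l_low) / det
    return bone, soft

# Forward-simulate a pixel with 1.0 unit of bone over 4.0 units of soft
# tissue, then recover both thicknesses from the two log-attenuations.
l_low = 0.50 * 1.0 + 0.30 * 4.0     # 1.70
l_high = 0.25 * 1.0 + 0.20 * 4.0    # 1.05
bone, soft = decompose(l_low, l_high)
print(round(bone, 6), round(soft, 6))  # → 1.0 4.0
```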

  1. An overview of quantitative approaches in Gestalt perception.

    Science.gov (United States)

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combined qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  3. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14–17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  4. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  5. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  6. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  7. Prospective Middle-School Mathematics Teachers' Quantitative Reasoning and Their Support for Students' Quantitative Reasoning

    Science.gov (United States)

    Kabael, Tangul; Akin, Ayca

    2018-01-01

    The aim of this research is to examine prospective mathematics teachers' quantitative reasoning, their support for students' quantitative reasoning and the relationship between them, if any. The teaching experiment was used as the research method in this qualitatively designed study. The data of the study were collected through a series of…

  8. Quantitative assessment of 201TlCl myocardial SPECT

    International Nuclear Information System (INIS)

    Uehara, Toshiisa

    1987-01-01

    Clinical evaluation of the quantitative analysis of Tl-201 myocardial tomography by SPECT (Single Photon Emission Computed Tomography) was performed in comparison with visual evaluation. The method of quantitative analysis has been already reported in our previous paper. In this study, the program of re-standardization in the case of lateral myocardial infarction was added. This program was useful mainly for the evaluation of lesions in the left circumflex coronary artery. Regarding the degree of diagnostic accuracy of myocardial infarction in general, quantitative evaluation of myocardial SPECT images was highest followed by visual evaluation of myocardial SPECT images, and visual evaluation of myocardial planar images. However, in the case of anterior myocardial infarction, visual evaluation of myocardial SPECT images has almost the same detectability as quantitative evaluation of myocardial SPECT images. In the case of infero-posterior myocardial infarction, quantitative evaluation was superior to visual evaluation. As for specificity, quantitative evaluation of SPECT images was slightly inferior to visual evaluation of SPECT images. An infarction map was made by quantitative analysis and this enabled us to determine the infarction site, extent and degree according to easily recognizable patterns. As a result, the responsible coronary artery lesion could be inferred correctly and the calculated infarction score could be correlated with the residual left ventricular function after myocardial infarction. (author)

  9. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting...... quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed...... relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS...

  10. Development of Quantitative Framework for Event Significance Evaluation

    International Nuclear Information System (INIS)

    Lee, Durk Hun; Kim, Min Chull; Kim, Inn Seock

    2010-01-01

    There is an increasing trend in quantitative evaluation of the safety significance of operational events using Probabilistic Safety Assessment (PSA) technique. An integrated framework for evaluation of event significance has been developed by Korea Institute of Nuclear Safety (KINS), which consists of an assessment hierarchy and a number of matrices. The safety significance of various events, e.g., internal or external initiating events that occurred during at-power or shutdown conditions, can be quantitatively analyzed using this framework, and then, the events rated according to their significance. This paper briefly describes the basic concept of the integrated quantitative framework for evaluation of event significance, focusing on the assessment hierarchy

  11. The value and limitation of quantitative safety goals

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1982-01-01

    Some of the philosophical and practical complexities of quantitative safety goals are reviewed with examples of how the problems have been dealt with in current safety objectives in Britain and by the International Commission on Radiological Protection. Where possible, quantitative comparisons are shown. It is concluded that progress towards quantitative safety goals should be deliberate rather than rapid and that attention should be paid to the possible implications for industries other than the nuclear power industry and countries other than the United States of America

  12. Problems of standardized handling and quantitative evaluation of autoradiograms

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1985-01-01

    In the last years autoradiography has gained increasing importance as a quantitative method of measuring radioactivity or element concentration. Mostly relative measurements are carried out. The optical density of the photographic emulsion produced by a calibrated radiation source is compared with that produced by a sample. The influences of different parameters, such as beta particle energy, backscattering, fading of the latent image, developing conditions, matrix effects and others on the results are described and the errors of the quantitative evaluation of autoradiograms are assessed. The performance of the method is demonstrated taking the quantitative determination of gold in silicon as an example

  13. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  14. USING RESPIROMETRY TO MEASURE HYDROGEN UTILIZATION IN SULFATE REDUCING BACTERIA IN THE PRESENCE OF COPPER AND ZINC

    Science.gov (United States)

    A respirometric method has been developed to measure hydrogen utilization by sulfate reducing bacteria (SRB). One application of this method has been to test the inhibitory effects of metals on the SRB culture used in a novel acid mine drainage treatment technology. As a control param...

  15. Respirometry applied for biological nitrogen removal process; Aplicacion de la respirometria al tratamiento biologico para la eliminacion del nitrogeno

    Energy Technology Data Exchange (ETDEWEB)

    Serrano, E.

    2004-07-01

    In waste water treatment plants, Biological Nitrogen Removal (BNR) has acquired fundamental importance. The BNR processes are nitrification (aerobic) and denitrification (anoxic). Since both processes are carried out by living microorganisms, a lack of information on their bioactivity can cause serious confusion about control criteria and follow-up. For this reason, respirometry applied to these processes has come to play an important role, providing essential information in a timely manner through respiration rate measurements in static and dynamic modes, and applications such as AUR (Ammonium Uptake Rate), nitrification capacity, RBCOD (Readily Biodegradable COD), as well as AUR related to SRT (sludge age), RBCOD related to NUR (Specific Nitrate Uptake Rate) and others. This article also introduces a little-known application related to denitrification, concerning methanol acclimatization and the resulting bioactivity. (Author) 6 refs.

  16. Solid respirometry to characterize nitrification kinetics: a better insight for modelling nitrogen conversion in vertical flow constructed wetlands.

    Science.gov (United States)

    Morvannou, Ania; Choubert, Jean-Marc; Vanclooster, Marnik; Molle, Pascal

    2011-10-15

    We developed an original method to measure nitrification rates at different depths of a vertical flow constructed wetland (VFCW) with variable contents of organic matter (sludge, colonized gravel). The method was adapted for organic matter sampled in constructed wetland (sludge, colonized gravel) operated under partially saturated conditions and is based on respirometric principles. Measurements were performed on a reactor, containing a mixture of organic matter (sludge, colonized gravel) mixed with a bulking agent (wood), on which an ammonium-containing liquid was applied. The oxygen demand was determined from analysing oxygen concentration of the gas passing through the reactor with an on-line analyzer equipped with a paramagnetic detector. Within this paper we present the overall methodology, the factors influencing the measurement (sample volume, nature and concentration of the applied liquid, number of successive applications), and the robustness of the method. The combination of this new method with a mass balance approach also allowed determining the concentration and maximum growth rate of the autotrophic biomass in different layers of a VFCW. These latter parameters are essential inputs for the VFCW plant modelling. Copyright © 2011 Elsevier Ltd. All rights reserved.
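
    Respirometric rates such as those underlying the method above are commonly estimated as the slope of the oxygen decline over the measurement window. The sketch below uses a plain least-squares fit with illustrative numbers, not the authors' data; the 4.57 g O2 per g N factor noted in the comment is the standard nitrification stoichiometry, an assumption here rather than a value from the paper:

```python
# Oxygen uptake rate (OUR) as the negated least-squares slope of the
# dissolved-O2 time series. Times and concentrations are made up.

def oxygen_uptake_rate(times_h, o2_mg_l):
    """Negated least-squares slope of O2 vs. time, in mg O2/L/h."""
    n = len(times_h)
    mean_t = sum(times_h) / n
    mean_c = sum(o2_mg_l) / n
    num = sum((t - mean_t) * (c - mean_c) for t, c in zip(times_h, o2_mg_l))
    den = sum((t - mean_t) ** 2 for t in times_h)
    return -num / den

times = [0.0, 0.25, 0.5, 0.75, 1.0]   # h
o2 = [8.0, 7.1, 6.2, 5.3, 4.4]        # mg O2/L, a linear decline
our = oxygen_uptake_rate(times, o2)
print(round(our, 3))                  # → 3.6 (mg O2/L/h)
# Dividing by ~4.57 g O2 per g N gives the corresponding nitrification rate.
```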

  17. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  18. Collecting data for quantitative research on pluvial flooding

    NARCIS (Netherlands)

    Spekkers, M.H.; Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.

    2011-01-01

    Urban pluvial flood management requires detailed spatial and temporal information on flood characteristics and damaging consequences. There is lack of quantitative field data on pluvial flooding resulting in large uncertainties in urban flood model calculations and ensuing decisions for investments

  19. Quantitative stem cell biology: the threat and the glory.

    Science.gov (United States)

    Pollard, Steven M

    2016-11-15

    Major technological innovations over the past decade have transformed our ability to extract quantitative data from biological systems at an unprecedented scale and resolution. These quantitative methods and associated large datasets should lead to an exciting new phase of discovery across many areas of biology. However, there is a clear threat: will we drown in these rivers of data? On 18th July 2016, stem cell biologists gathered in Cambridge for the 5th annual Cambridge Stem Cell Symposium to discuss 'Quantitative stem cell biology: from molecules to models'. This Meeting Review provides a summary of the data presented by each speaker, with a focus on quantitative techniques and the new biological insights that are emerging. © 2016. Published by The Company of Biologists Ltd.

  20. Quantitative aspects of oxygen and carbon dioxide exchange ...

    African Journals Online (AJOL)

    Quantitative aspects of oxygen and carbon dioxide exchange through the ... ceratophthalmus (Crustacea: Decapoda) during rest and exercise in water and ... intersects zero time on the x-axis, indicating rapid gas exchange at the lung surface.

  1. Variation in quantitative characters of faba bean after seed ...

    African Journals Online (AJOL)

    Variation in quantitative characters of faba bean after seed irradiation and associated molecular changes. Sonia Mejri, Yassine Mabrouk, Marie Voisin, Philippe Delavault, Philippe Simier, Mouldi Saidi, Omrane Belhadj ...

  2. Quantitative Market Research Regarding Funding of District 8 Construction Projects

    Science.gov (United States)

    1995-05-01

    The primary objective of this quantitative research is to provide information for more effective decision making regarding the level of investment in various transportation systems in District 8. This objective was accomplished by establishing ...

  3. A quantitative assessment of Arctic shipping in 2010–2014

    KAUST Repository

    Eguíluz, Víctor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant

  4. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  5. The National Benchmark Test of quantitative literacy: Does it ...

    African Journals Online (AJOL)

    determine whether Grade 12 learners have mastered subject knowledge at the ... the NSC Mathematical Literacy examination and the Quantitative Literacy test of the ..... Method. Sample. The sample for this study consisted of 6,363 Grade 12 ...

  6. Quantitative determination of grain sizes by means of scattered ultrasound

    International Nuclear Information System (INIS)

    Goebbels, K.; Hoeller, P.

    1976-01-01

    The scattering of ultrasound makes possible the quantitative determination of grain sizes in metallic materials. Examples of measurements on steels with grain sizes between ASTM 1 and ASTM 12 are given

  7. Quantitative reconstruction from a single diffraction-enhanced image

    International Nuclear Information System (INIS)

    Paganin, D.M.; Lewis, R.A.; Kitchen, M.

    2003-01-01

    Full text: We develop an algorithm for using a single diffraction-enhanced image (DEI) to obtain a quantitative reconstruction of the projected thickness of a single-material sample which is embedded within a substrate of approximately constant thickness. This algorithm is used to quantitatively map inclusions in a breast phantom, from a single synchrotron DEI image. In particular, the reconstructed images quantitatively represent the projected thickness in the bulk of the sample, in contrast to DEI images which greatly emphasise sharp edges (high spatial frequencies). In the context of an ultimate aim of improved methods for breast cancer detection, the reconstructions are potentially of greater diagnostic value compared to the DEI data. Lastly, we point out that the methods of analysis presented here are also applicable to the quantitative analysis of differential interference contrast (DIC) images

  8. Quantitative Assays for RAS Pathway Proteins and Phosphorylation States

    Science.gov (United States)

    The NCI CPTAC program is applying its expertise in quantitative proteomics to develop assays for RAS pathway proteins. Targets include key phosphopeptides that should increase our understanding of how the RAS pathway is regulated.

  9. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    Science.gov (United States)

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  10. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ..... the residuals of a regression on centroid size produced.

  11. Normal values for quantitative muscle ultrasonography in adults.

    NARCIS (Netherlands)

    Arts, I.M.P.; Pillen, S.; Schelhaas, H.J.; Overeem, S.; Zwarts, M.J.

    2010-01-01

    Ultrasonography can detect structural muscle changes caused by neuromuscular disease. Quantitative analysis is the preferred method to determine if ultrasound findings are within normal limits, but normative data are incomplete. The purpose of this study was to provide normative muscle

  12. Quantile-Based Permutation Thresholds for Quantitative Trait Loci Hotspots

    NARCIS (Netherlands)

    Neto, Elias Chaibub; Keller, Mark P.; Broman, Andrew F.; Attie, Alan D.; Jansen, Ritsert C.; Broman, Karl W.; Yandell, Brian S.; Borevitz, J.

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key

  13. Mapping of quantitative trait loci controlling Orobanche foetida Poir ...

    African Journals Online (AJOL)

    Mapping of quantitative trait loci controlling Orobanche foetida Poir. resistance in faba bean (Vicia faba L.) R Díaz-Ruiz, A Torres, MV Gutierrez, D Rubiales, JI Cubero, M Kharrat, Z Satovic, B Román ...

  14. Cloning and semi-quantitative expression of endochitinase ( ech42 ...

    African Journals Online (AJOL)

    Cloning and semi-quantitative expression of endochitinase (ech42) gene from Trichoderma spp. Pratibha Sharma, K Saravanan, R Ramesh, P Vignesh Kumar, Dinesh Singh, Manika Sharma, Monica S. Henry, Swati Deep ...

  15. A quantitative framework for estimating water resources in India

    Digital Repository Service at National Institute of Oceanography (India)

    Shankar, D.; Kotamraju, V.; Shetye, S.R

    of information on the variables associated with hydrology, and second, the absence of an easily accessible quantitative framework to put these variables in perspective. In this paper, we discuss a framework that has been assembled to address both these issues...

  16. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  17. Book Review: Qualitative-Quantitative Analyses of Dutch and ...

    African Journals Online (AJOL)

    Abstract. Book Title: Qualitative-Quantitative Analyses of Dutch and Afrikaans Grammar and Lexicon. Book Author: Robert S. Kirsner. 2014. John Benjamins Publishing Company ISBN 9789027215772, price ZAR481.00. 239 pages ...

  18. Quantitative assessment of target dependence of pion fluctuation in ...

    Indian Academy of Sciences (India)

    Pramana – Journal of Physics, December 2012, pp. 1395–1405. Quantitative assessment ... The analysis reveals the erratic behaviour of the produced pions signifying ..... One of the authors (Sitaram Pal) gratefully acknowledges the financial help from the University.

  19. Quantitative trait loci mapping for stomatal traits in interspecific ...

    Indian Academy of Sciences (India)

    M. Sumathi

    2018-02-23

    Feb 23, 2018 ... Journal of Genetics, Vol. ... QTL analysis was carried out to identify the chromosomal regions affecting ... Keywords. linkage map; quantitative trait loci; stomata; stress ..... of India for providing financial support for the project.

  20. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percent as content of milk are high-priority criteria for financial aims and selection of programs in dairy cattle.

  1. Integrated quantitative pharmacology for treatment optimization in oncology

    NARCIS (Netherlands)

    Hasselt, J.G.C. van

    2014-01-01

    This thesis describes the development and application of quantitative pharmacological models in oncology for treatment optimization and for the design and analysis of clinical trials with respect to pharmacokinetics, toxicity, efficacy and cost-effectiveness. A recurring theme throughout this

  2. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for design of rapid methods for quality control of suppositories with different components

  3. Lesion detection and quantitation of positron emission mammography

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2001-01-01

    A Positron Emission Mammography (PEM) scanner dedicated to breast imaging is being developed at our laboratory. We have developed a list mode likelihood reconstruction algorithm for this scanner. Here we theoretically study the lesion detection and quantitation. The lesion detectability is studied theoretically using computer observers. We found that for the zero-order quadratic prior, the region of interest observer can achieve the performance of the prewhitening observer with a properly selected smoothing parameter. We also study the lesion quantitation using the test statistic of the region of interest observer. The theoretical expressions for the bias, variance, and ensemble mean squared error of the quantitation are derived. Computer simulations show that the theoretical predictions are in good agreement with the Monte Carlo results for both lesion detection and quantitation

  4. A quantitative exploration of the effects of workplace bullying on ...

    African Journals Online (AJOL)

    types of hostile communication and behaviour are used (Tracy, .... Qualitative and quantitative studies explored the effects of WPB on educators. Whereas ... An array of research methods has thus been used to investigate the effects of WPB on.

  5. Doing Quantitative Grounded Theory: A theory of trapped travel consumption

    Directory of Open Access Journals (Sweden)

    Mark S. Rosenbaum, Ph.D.

    2008-11-01

    Full Text Available All is data. Grounded theorists employ this sentence in their quest to create original theoretical frameworks. Yet researchers typically interpret the word "data" to mean qualitative data or, more specifically, interview data collected from respondents. This is not to say that qualitative data is deficient; however, grounded theorists may be missing vast opportunities to create pioneering theories from quantitative data. Indeed, Glaser and Strauss (1967) argued that researchers would use qualitative and/or quantitative data to fashion original frameworks and related hypotheses, and Glaser's (2008) recently published book, titled Doing Quantitative Grounded Theory, is an attempt to help researchers understand how to use quantitative data for grounded theory (GT).

  6. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331nm for chloroquine. ... HOW TO USE AJOL.

  7. Effects of feed forms, levels of quantitative feed restriction on ...

    African Journals Online (AJOL)

    Nigerian Journal of Animal Production ... Data were collected on growth performance, carcass characteristics, and cost benefits were calculated. Data were subjected to ... Keywords: Broilers, carcass, performance, quantitative feed restriction ...

  8. Methods and instrumentation for quantitative microchip capillary electrophoresis

    NARCIS (Netherlands)

    Revermann, T.

    2007-01-01

    The development of novel instrumentation and analytical methodology for quantitative microchip capillary electrophoresis (MCE) is described in this thesis. Demanding only small quantities of reagents and samples, microfluidic instrumentation is highly advantageous. Fast separations at high voltages

  9. Smile line assessment comparing quantitative measurement and visual estimation

    NARCIS (Netherlands)

    Geld, P. Van der; Oosterveld, P.; Schols, J.; Kuijpers-Jagtman, A.M.

    2011-01-01

    INTRODUCTION: Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation

  10. Quantitation of glial fibrillary acidic protein in human brain tumours

    DEFF Research Database (Denmark)

    Rasmussen, S; Bock, E; Warecka, K

    1980-01-01

    The glial fibrillary acidic protein (GFA) content of 58 human brain tumours was determined by quantitative immunoelectrophoresis, using monospecific antibody against GFA. Astrocytomas, glioblastomas, oligodendrogliomas, spongioblastomas, ependymomas and medulloblastomas contained relatively high...

  11. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  12. A quantitative comparison of corrective and perfective maintenance

    Science.gov (United States)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  13. Spectroscopic Tools for Quantitative Studies of DNA Structure and Dynamics

    DEFF Research Database (Denmark)

    Preus, Søren

    The main objective of this thesis is to develop quantitative fluorescence-based, spectroscopic tools for probing the 3D structure and dynamics of DNA and RNA. The thesis is founded on six peer-reviewed papers covering mainly the development, characterization and use of fluorescent nucleobase...... analogues. In addition, four software packages are presented for the simulation and quantitative analysis of time-resolved and steady-state UV-Vis absorption and fluorescence experiments....

  14. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the quantitative personality research carried out by the 'Prognoz' Laboratory and in Taiwan, and presents the primary results of psychological personality assessment of Chinese Nuclear Power Plant (NPP) operators, based on the MMPI survey, together with the main contents of quantitative personality research in Chinese NPPs. The need to carry out psychological selection and training in the nuclear industry is emphasized.

  15. Quantitative Information Flow as Safety and Liveness Hyperproperties

    Directory of Open Access Journals (Sweden)

    Hirotoshi Yasuoka

    2012-07-01

    Full Text Available We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self composition.

  16. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    OpenAIRE

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-01-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connecti...

  17. Quantitative method for determination of body inorganic iodine

    International Nuclear Information System (INIS)

    Filatov, A.A.; Tatsievskij, V.A.

    1991-01-01

    An original method for quantitation of body inorganic iodine is proposed, based upon simultaneous administration of known doses of stable and radioactive iodine with subsequent radiometry of the thyroid. The calculation is based upon the principle of dilution of radioactive iodine in the human inorganic iodine space. The method permits quantitation of inorganic iodine with regard to individual features of the inorganic iodine space, and is simple and non-invasive for the patient.

  18. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    OpenAIRE

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2007-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often ...

  19. Sexual Harassment Prevention Initiatives: Quantitative and Qualitative Approaches

    Science.gov (United States)

    2010-10-28

    Quantitative Approach: The Survey. The quantitative approach appears to be the dominant form of mainstream psychological research today, and Gelo et al. (2008...that viewpoint and remark that the characteristics of today's psychological research demonstrate realities that can be replicated through studies...2000). The right of passage? The experiences of female pilots in commercial aviation. Feminism & Psychology, 10, 195-225. Davis, A., & Bremner, G

  20. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE

    OpenAIRE

    Ojagbemi, A.

    2017-01-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan...