WorldWideScience

Sample records for yfp 10c quantitative

  1. Highly sensitive and quantitative FRET-FLIM imaging in single dendritic spines using improved non-radiative YFP.

    Science.gov (United States)

    Murakoshi, Hideji; Lee, Seok-Jin; Yasuda, Ryohei

    2008-08-01

    Two-photon fluorescence lifetime imaging microscopy (TPFLIM) enables quantitative measurement of fluorescence resonance energy transfer (FRET) in small subcellular compartments in light-scattering tissue. We evaluated and optimized the FRET pair of mEGFP (monomeric EGFP with the A206K mutation) and REACh (a non-radiative YFP variant) for TPFLIM. We characterized several mutants of REACh in terms of their "darkness" and their ability to act as a FRET acceptor for mEGFP in HeLa cells and hippocampal neurons. Since the commonly used monomeric mutation A206K increases the brightness of REACh, we introduced a different monomeric mutation (F223R) which does not affect the brightness. We also found that the folding efficiency of the original REACh, as measured by the fluorescence lifetime of a mEGFP-REACh tandem dimer, was low and variable from cell to cell. Introducing two folding mutations (F46L, Q69M) into REACh increased the folding efficiency by approximately 50% and reduced the variability of the FRET signal. Pairing mEGFP with the new REACh (super-REACh, or sREACh) improved the signal-to-noise ratio compared with the mEGFP-mRFP or mEGFP-original REACh pair by approximately 50%. Using this new pair, we demonstrated that the fraction of actin monomers in filamentous and globular forms in single dendritic spines can be quantitatively measured with high sensitivity. Thus, the mEGFP-sREACh pair is suited for quantitative FRET measurement by TPFLIM, and enables us to measure protein-protein interactions in individual dendritic spines in brain slices with high sensitivity.
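
    For orientation only (standard FLIM-FRET arithmetic, not code from the paper): the FRET efficiency of the interacting population follows from the donor lifetime measured with and without acceptor, E = 1 - τ_DA/τ_D. The lifetimes below are hypothetical values chosen for illustration.

```python
# Illustrative sketch, not the authors' analysis pipeline.
def fret_efficiency(tau_donor_acceptor_ns, tau_donor_ns):
    """FRET efficiency from fluorescence lifetimes: E = 1 - tau_DA / tau_D."""
    return 1.0 - tau_donor_acceptor_ns / tau_donor_ns

tau_d = 2.6    # assumed lifetime (ns) of the free mEGFP donor
tau_da = 1.4   # assumed lifetime (ns) of the FRET-quenched donor population
print(f"E = {fret_efficiency(tau_da, tau_d):.2f}")   # -> E = 0.46
```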

  2. A genetically-encoded YFP sensor with enhanced chloride sensitivity, photostability and reduced pH interference demonstrates augmented transmembrane chloride movement by gerbil prestin (SLC26a5).

    Directory of Open Access Journals (Sweden)

    Sheng Zhong

    Full Text Available Chloride is the major anion in cells, with many diseases arising from disordered Cl- regulation. For the non-invasive investigation of Cl- flux, YFP-H148Q and its derivatives chameleon and Cl-Sensor were previously introduced as genetically encoded chloride indicators. Neither the Cl- sensitivity nor the pH-susceptibility of these modifications to YFP is optimal for precise measurements of Cl- under physiological conditions. Furthermore, the relatively poor photostability of YFP derivatives hinders their application for dynamic and quantitative Cl- measurements. Dynamic and accurate measurement of physiological concentrations of chloride would significantly affect our ability to study effects of chloride on cellular events. In this study, we developed a series of YFP derivatives to remove pH interference, increase photostability and enhance chloride sensitivity. The final product, EYFP-F46L/Q69K/H148Q/I152L/V163S/S175G/S205V/A206K (monomeric Cl-YFP), has a chloride Kd of 14 mM and a pKa of 5.9. The bleach time constant of 175 seconds is over 15-fold greater than that of wild-type EYFP. We have used the sensor fused to the transmembrane protein prestin (gerbil prestin, SLC26a5), and shown for the first time physiological (mM) chloride flux in HEK cells expressing this protein. This modified fluorescent protein will facilitate investigations of the dynamics of chloride ions and their mediation of cell function. Modifications to YFP (EYFP-F46L/Q69K/H148Q/I152L/V163S/S175G/S205V/A206K, monomeric Cl-YFP) result in a photostable fluorescent protein that allows measurement of physiological changes in chloride concentration while remaining minimally affected by changes in pH.

  3. A genetically-encoded YFP sensor with enhanced chloride sensitivity, photostability and reduced pH interference demonstrates augmented transmembrane chloride movement by gerbil prestin (SLC26a5).

    Science.gov (United States)

    Zhong, Sheng; Navaratnam, Dhasakumar; Santos-Sacchi, Joseph

    2014-01-01

    Chloride is the major anion in cells, with many diseases arising from disordered Cl- regulation. For the non-invasive investigation of Cl- flux, YFP-H148Q and its derivatives chameleon and Cl-Sensor were previously introduced as genetically encoded chloride indicators. Neither the Cl- sensitivity nor the pH-susceptibility of these modifications to YFP is optimal for precise measurements of Cl- under physiological conditions. Furthermore, the relatively poor photostability of YFP derivatives hinders their application for dynamic and quantitative Cl- measurements. Dynamic and accurate measurement of physiological concentrations of chloride would significantly affect our ability to study effects of chloride on cellular events. In this study, we developed a series of YFP derivatives to remove pH interference, increase photostability and enhance chloride sensitivity. The final product, EYFP-F46L/Q69K/H148Q/I152L/V163S/S175G/S205V/A206K (monomeric Cl-YFP), has a chloride Kd of 14 mM and a pKa of 5.9. The bleach time constant of 175 seconds is over 15-fold greater than that of wild-type EYFP. We have used the sensor fused to the transmembrane protein prestin (gerbil prestin, SLC26a5), and shown for the first time physiological (mM) chloride flux in HEK cells expressing this protein. This modified fluorescent protein will facilitate investigations of the dynamics of chloride ions and their mediation of cell function. Modifications to YFP (EYFP-F46L/Q69K/H148Q/I152L/V163S/S175G/S205V/A206K, monomeric Cl-YFP) result in a photostable fluorescent protein that allows measurement of physiological changes in chloride concentration while remaining minimally affected by changes in pH.

  4. Förster Resonance Energy Transfer (FRET) Analysis of Dual CFP/YFP Labeled AMPA Receptors Reveals Structural Rearrangement within the C-Terminal Domain during Receptor Activation

    DEFF Research Database (Denmark)

    Zachariassen, Linda Grønborg; Katchan, Mila; Plested, Andrew

    2014-01-01

    that retain function and display intrareceptor FRET. This includes a construct (GluA2-6Y-10C) containing YFP in the intracellular loop between the M1 and M2 membrane-embedded segments and CFP inserted in the C-terminal domain (CTD). GluA2-6Y-10C displays FRET with an efficiency of 0.11 while retaining wild-type receptor expression and kinetic properties. We have used GluA2-6Y-10C to study conformational changes in homomeric GluA2 receptors during receptor activation. Our results show that the FRET efficiency is dependent on the functional state of GluA2-6Y-10C and hereby indicate that the intracellular CTD...
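
    As a back-of-envelope illustration (not from the record itself), the reported intrareceptor FRET efficiency can be converted into an approximate CFP-YFP separation with the Förster equation; the Förster radius R0 used below is an assumed, typical value for a CFP/YFP pair and is not stated in the record.

```python
# Hedged sketch: distance estimate from FRET efficiency, r = R0 * ((1 - E) / E)**(1/6).
E = 0.11    # intrareceptor FRET efficiency reported for GluA2-6Y-10C
R0 = 4.9    # assumed Forster radius (nm) for a CFP/YFP pair; not given in the record

r_nm = R0 * ((1.0 - E) / E) ** (1.0 / 6.0)
print(f"approximate CFP-YFP separation: {r_nm:.1f} nm")   # roughly 7 nm
```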

  5. Isolation of hair follicle bulge stem cells from YFP-expressing reporter mice.

    Science.gov (United States)

    Nakrieko, Kerry-Ann; Irvine, Timothy S; Dagnino, Lina

    2013-01-01

    In this article we provide a method to isolate hair follicle stem cells that have undergone targeted gene inactivation. The mice from which these cells are isolated are bred into a Rosa26-yellow fluorescent protein (YFP) reporter background, which results in YFP expression in the targeted stem cell population. These cells are isolated and purified by fluorescence-activated cell sorting, using epidermal stem cell-specific markers in conjunction with YFP fluorescence. The purified cells can be used for gene expression studies, clonogenic experiments, and biological assays, such as viability and capacity for directional migration.

  6. A Genetically-Encoded YFP Sensor with Enhanced Chloride Sensitivity, Photostability and Reduced pH Interference Demonstrates Augmented Transmembrane Chloride Movement by Gerbil Prestin (SLC26a5)

    Science.gov (United States)

    Zhong, Sheng; Navaratnam, Dhasakumar; Santos-Sacchi, Joseph

    2014-01-01

    Background: Chloride is the major anion in cells, with many diseases arising from disordered Cl− regulation. For the non-invasive investigation of Cl− flux, YFP-H148Q and its derivatives chameleon and Cl-Sensor were previously introduced as genetically encoded chloride indicators. Neither the Cl− sensitivity nor the pH-susceptibility of these modifications to YFP is optimal for precise measurements of Cl− under physiological conditions. Furthermore, the relatively poor photostability of YFP derivatives hinders their application for dynamic and quantitative Cl− measurements. Dynamic and accurate measurement of physiological concentrations of chloride would significantly affect our ability to study effects of chloride on cellular events. Methodology/Principal Findings: In this study, we developed a series of YFP derivatives to remove pH interference, increase photostability and enhance chloride sensitivity. The final product, EYFP-F46L/Q69K/H148Q/I152L/V163S/S175G/S205V/A206K (monomeric Cl-YFP), has a chloride Kd of 14 mM and a pKa of 5.9. The bleach time constant of 175 seconds is over 15-fold greater than that of wild-type EYFP. We have used the sensor fused to the transmembrane protein prestin (gerbil prestin, SLC26a5), and shown for the first time physiological (mM) chloride flux in HEK cells expressing this protein. This modified fluorescent protein will facilitate investigations of the dynamics of chloride ions and their mediation of cell function. Conclusions: Modifications to YFP (EYFP-F46L/Q69K/H148Q/I152L/V163S/S175G/S205V/A206K, monomeric Cl-YFP) result in a photostable fluorescent protein that allows measurement of physiological changes in chloride concentration while remaining minimally affected by changes in pH. PMID:24901231
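
    A minimal sketch of how a chloride Kd of 14 mM translates into sensor readout, assuming simple one-site binding in which the Cl−-bound state is dark (an assumption made here for illustration, not the authors' calibration code):

```python
# Hedged illustration: expected relative fluorescence of a Cl(-)-quenched YFP sensor.
def relative_fluorescence(cl_mM, kd_mM=14.0):
    """F/F0 = Kd / (Kd + [Cl-]) for one-site binding with a dark bound state."""
    return kd_mM / (kd_mM + cl_mM)

for cl in (5, 14, 50, 140):   # hypothetical chloride concentrations in mM
    print(f"[Cl-] = {cl:3d} mM -> F/F0 = {relative_fluorescence(cl):.2f}")
```

    Under this assumption the sensor loses half its fluorescence at [Cl-] equal to the Kd, which is why a Kd near physiological chloride levels makes the readout most sensitive in that range.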

  7. Fluorescence resonance energy transfer imaging of CFP/YFP labeled NDH in cyanobacterium cell

    International Nuclear Information System (INIS)

    Ji Dongmei; Lv Wei; Huang Zhengxi; Xia Andong; Xu Min; Ma Weimin; Mi Hualing; Ogawa Teruo

    2007-01-01

    Laser confocal scanning microscopy combined with a time-correlated single-photon counting imaging technique to obtain fluorescence intensity and fluorescence lifetime images for fluorescence resonance energy transfer measurements is reported. Both the fluorescence lifetime imaging microscopy (FLIM) and intensity images show inhomogeneous cyan fluorescent protein and yellow fluorescent protein (CFP/YFP) expression or inhomogeneous energy transfer between CFP and YFP over the whole cell. The results presented in this work show that FLIM could be a potential method to reveal the structure-function behavior of NAD(P)H dehydrogenase complexes in living cells.

  8. Birbeck granule-like "organized smooth endoplasmic reticulum" resulting from the expression of a cytoplasmic YFP-tagged langerin.

    Directory of Open Access Journals (Sweden)

    Cédric Lenormand

    Full Text Available Langerin is required for the biogenesis of Birbeck granules (BGs, the characteristic organelles of Langerhans cells. We previously used a Langerin-YFP fusion protein having a C-terminal luminal YFP tag to dynamically decipher the molecular and cellular processes which accompany the traffic of Langerin. In order to elucidate the interactions of Langerin with its trafficking effectors and their structural impact on the biogenesis of BGs, we generated a YFP-Langerin chimera with an N-terminal, cytosolic YFP tag. This latter fusion protein induced the formation of YFP-positive large puncta. Live cell imaging coupled to a fluorescence recovery after photobleaching approach showed that this coalescence of proteins in newly formed compartments was static. In contrast, the YFP-positive structures present in the pericentriolar region of cells expressing Langerin-YFP chimera, displayed fluorescent recovery characteristics compatible with active membrane exchanges. Using correlative light-electron microscopy we showed that the coalescent structures represented highly organized stacks of membranes with a pentalaminar architecture typical of BGs. Continuities between these organelles and the rough endoplasmic reticulum allowed us to identify the stacks of membranes as a form of "Organized Smooth Endoplasmic Reticulum" (OSER, with distinct molecular and physiological properties. The involvement of homotypic interactions between cytoplasmic YFP molecules was demonstrated using an A206K variant of YFP, which restored most of the Langerin traffic and BG characteristics observed in Langerhans cells. Mutation of the carbohydrate recognition domain also blocked the formation of OSER. Hence, a "double-lock" mechanism governs the behavior of YFP-Langerin, where asymmetric homodimerization of the YFP tag and homotypic interactions between the lectin domains of Langerin molecules participate in its retention and the subsequent formation of BG-like OSER. These

  9. Integrating Energy Efficiency into the 10 Year Framework of Programmes on Sustainable Consumption and Production Patterns (10YFP)

    DEFF Research Database (Denmark)

    This report summarises the discussions and conclusions from the workshop 'Integrating Energy Efficiency into the 10 Year Framework of Programmes on Sustainable Consumption and Production Patterns (10YFP)' jointly organised by the 10YFP Secretariat and the Copenhagen Centre on Energy Efficiency (C2E2).

  10. Development of myenteric cholinergic neurons in ChAT-Cre;R26R-YFP mice.

    Science.gov (United States)

    Hao, Marlene M; Bornstein, Joel C; Young, Heather M

    2013-10-01

    Cholinergic neurons are the major excitatory neurons of the enteric nervous system (ENS), and include intrinsic sensory neurons, interneurons, and excitatory motor neurons. Cholinergic neurons have been detected in the embryonic ENS; however, the development of these neurons has been difficult to study as they are difficult to detect prior to birth using conventional immunohistochemistry. In this study we used ChAT-Cre;R26R-YFP mice to examine the development of cholinergic neurons in the gut of embryonic and postnatal mice. Cholinergic (YFP+) neurons were first detected at embryonic day (E)11.5, and the proportion of cholinergic neurons gradually increased during pre- and postnatal development. At birth, myenteric cholinergic neurons comprised less than half of their adult proportions in the small intestine (25% of myenteric neurons were YFP+ at P0 compared to 62% in adults). The earliest cholinergic neurons appear to mainly project anally. Projections into the presumptive circular muscle were first observed at E14.5. A subpopulation of cholinergic neurons coexpress calbindin through embryonic and postnatal development, but only a small proportion coexpressed neuronal nitric oxide synthase. Our study shows that cholinergic neurons in the ENS develop over a protracted period of time. © 2013 Wiley Periodicals, Inc.

  11. Thy1.2 YFP-16 transgenic mouse labels a subset of large-diameter sensory neurons that lack TRPV1 expression.

    Directory of Open Access Journals (Sweden)

    Thomas E Taylor-Clark

    Full Text Available The Thy1.2 YFP-16 mouse expresses yellow fluorescent protein (YFP) in specific subsets of peripheral and central neurons. The original characterization of this model suggested that YFP was expressed in all sensory neurons, and this model has subsequently been used to study sensory nerve structure and function. Here, we have characterized the expression of YFP in the sensory ganglia (DRG, trigeminal and vagal) of the Thy1.2 YFP-16 mouse, using biochemical, functional and anatomical analyses. Despite previous reports, we found that YFP was only expressed in approximately half of DRG and trigeminal neurons and less than 10% of vagal neurons. YFP expression was found only in medium- and large-diameter neurons that expressed neurofilament but not TRPV1. YFP-expressing neurons failed to respond to selective agonists for TRPV1, P2X(2/3) and TRPM8 channels in Ca2+ imaging assays. Confocal analysis of glabrous skin, hairy skin of the back and ear, and skeletal muscle indicated that YFP was expressed in some peripheral terminals with structures consistent with their presumed non-nociceptive nature. In summary, the Thy1.2 YFP-16 mouse shows robust YFP expression in only a subset of sensory neurons, and it is therefore not suitable for the study of nociceptive nerves or the function of such nerves in pain and neuropathies.

  12. Local-circuit phenotypes of layer 5 neurons in motor-frontal cortex of YFP-H mice

    Directory of Open Access Journals (Sweden)

    Jianing Yu

    2008-12-01

    Full Text Available Layer 5 pyramidal neurons comprise an important but heterogeneous group of cortical projection neurons. In motor-frontal cortex, these neurons are centrally involved in the cortical control of movement. Recent studies indicate that local excitatory networks in mouse motor-frontal cortex are dominated by descending pathways from layer 2/3 to 5. However, those pathways were identified in experiments involving unlabeled neurons in wild type mice. Here, to explore the possibility of class-specific connectivity in this descending pathway, we mapped the local sources of excitatory synaptic input to a genetically labeled population of cortical neurons: YFP-positive layer 5 neurons of YFP-H mice. We found, first, that in motor cortex, YFP-positive neurons were distributed in a double blade, consistent with the idea of layer 5B having greater thickness in frontal neocortex. Second, whereas unlabeled neurons in upper layer 5 received their strongest inputs from layer 2, YFP-positive neurons in the upper blade received prominent layer 3 inputs. Third, YFP-positive neurons exhibited distinct electrophysiological properties, including low spike frequency adaptation, as reported previously. Our results with this genetically labeled neuronal population indicate the presence of distinct local-circuit phenotypes among layer 5 pyramidal neurons in mouse motor-frontal cortex, and present a paradigm for investigating local circuit organization in other genetically labeled populations of cortical neurons.

  13. Selectable high-yield recombinant protein production in human cells using a GFP/YFP nanobody affinity support.

    Science.gov (United States)

    Schellenberg, Matthew J; Petrovich, Robert M; Malone, Christine C; Williams, R Scott

    2018-03-25

    Recombinant protein expression systems that produce high yields of pure proteins and multi-protein complexes are essential to meet the needs of biologists, biochemists, and structural biologists using X-ray crystallography and cryo-electron microscopy. An ideal expression system for recombinant human proteins is cultured human cells, where the correct translation and chaperone machinery are present. However, compared to bacterial expression systems, human cell cultures present several technical challenges to their use as an expression system. We developed a method that utilizes a YFP fusion tag to generate recombinant proteins using suspension-cultured HEK293F cells. YFP is a dual-function tag that enables direct visualization, fluorescence-based selection of high-expressing clones, and rapid purification using a high-stringency, high-affinity anti-GFP/YFP nanobody support. We demonstrate the utility of this system by expressing two large human proteins, TOP2α (340 kDa dimer) and a TOP2β catalytic core (260 kDa dimer). This robustly and reproducibly yields >10 mg per liter of cell culture using transient expression or 2.5 mg per liter using stable expression. Published 2018. This article is a US Government work and is in the public domain in the USA.

  14. Some secrets of fluorescent proteins: distinct bleaching in various mounting fluids and photoactivation of cyan fluorescent proteins at YFP-excitation.

    Science.gov (United States)

    Malkani, Naila; Schmid, Johannes A

    2011-04-07

    The use of spectrally distinct variants of green fluorescent protein (GFP) such as cyan or yellow mutants (CFP and YFP, respectively) is very common in all different fields of life sciences, e.g. for marking specific proteins or cells or to determine protein interactions. In the latter case, the quantum physical phenomenon of fluorescence resonance energy transfer (FRET) is exploited by specific microscopy techniques to visualize proximity of proteins. When we applied a commonly used FRET microscopy technique, measuring the increase in donor (CFP) fluorescence after bleaching of the acceptor fluorophore (YFP), we obtained good signals in live cells, but very weak signals for the same samples after fixation and mounting in commercial microscopy mounting fluids. This observation could be traced back to much faster bleaching of CFP in these mounting media. Strikingly, the opposite effect of the mounting fluid was observed for YFP and also for other proteins such as Cerulean, TFP or Venus. The changes in photostability of CFP and YFP were not caused by the fixation but directly dependent on the mounting fluid. Furthermore, we made the interesting observation that the CFP fluorescence intensity increases by about 10-15% after illumination at the YFP-excitation wavelength, a phenomenon which was also observed for Cerulean. This photoactivation of cyan fluorescent proteins at the YFP excitation can cause false-positive signals in the FRET microscopy technique that is based on bleaching of a yellow FRET acceptor. Our results show that the photostability of fluorescent proteins differs significantly between various media and that CFP bleaches significantly faster in commercial mounting fluids, while the opposite is observed for YFP and some other proteins. Moreover, we show that the FRET microscopy technique based on bleaching of the YFP is prone to artifacts due to photoactivation of cyan fluorescent proteins under these conditions.
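
    A hedged sketch of the acceptor-photobleaching arithmetic discussed above (generic formulas with hypothetical intensities, not the authors' analysis code), including the kind of no-acceptor control that the reported CFP photoactivation makes necessary:

```python
# Illustrative only: apparent FRET efficiency from donor dequenching.
def apparent_fret(donor_pre, donor_post):
    """E_app = 1 - F_donor(pre-bleach) / F_donor(post-bleach)."""
    return 1.0 - donor_pre / donor_post

E_raw = apparent_fret(100.0, 118.0)        # hypothetical CFP intensities before/after YFP bleach
# The paper warns that YFP-excitation light alone photoactivates CFP by ~10-15%,
# so a donor-only (or no-bleach) control should be subtracted from the raw value.
E_control = apparent_fret(100.0, 112.0)    # hypothetical CFP-only control
print(f"raw E = {E_raw:.3f}, control-corrected E = {E_raw - E_control:.3f}")
```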

  15. Photochemical properties and sensor applications of modified yellow fluorescent protein (YFP) covalently attached to the surfaces of etched optical fibers (EOFs).

    Science.gov (United States)

    Veselov, Alexey A; Abraham, Bobin George; Lemmetyinen, Helge; Karp, Matti T; Tkachenko, Nikolai V

    2012-01-01

    Fluorescent proteins have the inherent ability to act as sensing components which function both in vitro and inside living cells. We describe here a novel study on covalent site-specific bonding of fluorescent proteins to form self-assembled monolayers (SAMs) on the surface of etched optical fibers (EOFs). Deposition of fluorescent proteins on EOFs gives the opportunity to increase the interaction of guided light with deposited molecules relative to plane glass surfaces. The EOF modification is carried out by surface activation using 3-aminopropyltrimethoxysilane (APTMS) and the bifunctional crosslinker sulfosuccinimidyl 4-[N-maleimidomethyl]cyclohexane-1-carboxylate (sulfo-SMCC), which exposes sulfhydryl-reactive maleimide groups, followed by covalent site-specific coupling of modified yellow fluorescent protein (YFP). Steady-state and fluorescence lifetime measurements confirm the formation of the SAM. The sensor applications of YFP SAMs on EOFs are demonstrated by the gradual increase of emission intensity upon addition of Ca(2+) ions in the concentration range from a few tens of micromolar up to a few tens of millimolar. Studies on the effect of pH, divalent cations, denaturing agents, and proteases reveal the stability of YFP on EOFs under normal physiological conditions. However, treatments with 0.5% SDS at pH 8.5 and the protease trypsin are found to denature or cleave the YFP from the fiber surfaces.

  16. A bradykinin-potentiating peptide (BPP-10c) from Bothrops jararaca induces changes in seminiferous tubules.

    Science.gov (United States)

    Gilio, Joyce M; Portaro, Fernanda Cv; Borella, Maria I; Lameu, Claudiana; Camargo, Antonio Cm; Alberto-Silva, Carlos

    2013-11-06

    The testis-specific isoform of angiotensin-converting enzyme (tACE) is exclusively expressed in germ cells during spermatogenesis. Although the exact role of tACE in male fertility is unknown, it clearly plays a critical function in spermatogenesis. The dipeptidase domain of tACE is identical to the C-terminal catalytic domain of somatic ACE (sACE). Bradykinin-potentiating peptides (BPPs) from snake venoms are the first natural sACE inhibitors described, and their structure-activity relationship studies were the basis for the development of antihypertensive drugs such as captopril. In recent years, it has been shown that a number of BPPs - including BPP-10c - are able to distinguish between the N- and C-active sites of sACE, which is not the case for captopril. Considering the similarity between tACE and sACE (and since BPPs are able to distinguish between the two active sites of sACE), the effects of BPP-10c and captopril on the structure and function of the seminiferous epithelium were characterized in the present study. BPP-10c and captopril were administered to male Swiss mice by intraperitoneal injection (4.7 μmol/kg for 15 days) and histological sections of testes were analyzed. Classification of seminiferous tubules and stage analysis were carried out for quantitative evaluation of germ cells of the seminiferous epithelium. The blood-testis barrier (BTB) permeability and distribution of claudin-1 in the seminiferous epithelium were analyzed by the hypertonic fixative method and immunohistochemical analyses of testes, respectively. The morphology of seminiferous tubules from animals treated with BPP-10c showed an intense disruption of the epithelium, presence of atypical multinucleated cells in the lumen and degenerated germ cells in the adluminal compartment. BPP-10c led to an increase in the number of round spermatids and the total support capacity of Sertoli cells in stages I, V, VII/VIII of the seminiferous epithelium cycle, without affecting BTB permeability

  17. A bradykinin-potentiating peptide (BPP-10c) from Bothrops jararaca induces changes in seminiferous tubules

    Science.gov (United States)

    2013-01-01

    Background: The testis-specific isoform of angiotensin-converting enzyme (tACE) is exclusively expressed in germ cells during spermatogenesis. Although the exact role of tACE in male fertility is unknown, it clearly plays a critical function in spermatogenesis. The dipeptidase domain of tACE is identical to the C-terminal catalytic domain of somatic ACE (sACE). Bradykinin-potentiating peptides (BPPs) from snake venoms are the first natural sACE inhibitors described, and their structure–activity relationship studies were the basis for the development of antihypertensive drugs such as captopril. In recent years, it has been shown that a number of BPPs – including BPP-10c – are able to distinguish between the N- and C-active sites of sACE, which is not the case for captopril. Considering the similarity between tACE and sACE (and since BPPs are able to distinguish between the two active sites of sACE), the effects of BPP-10c and captopril on the structure and function of the seminiferous epithelium were characterized in the present study. BPP-10c and captopril were administered to male Swiss mice by intraperitoneal injection (4.7 μmol/kg for 15 days) and histological sections of testes were analyzed. Classification of seminiferous tubules and stage analysis were carried out for quantitative evaluation of germ cells of the seminiferous epithelium. The blood-testis barrier (BTB) permeability and distribution of claudin-1 in the seminiferous epithelium were analyzed by the hypertonic fixative method and immunohistochemical analyses of testes, respectively. Results: The morphology of seminiferous tubules from animals treated with BPP-10c showed an intense disruption of the epithelium, presence of atypical multinucleated cells in the lumen and degenerated germ cells in the adluminal compartment. BPP-10c led to an increase in the number of round spermatids and the total support capacity of Sertoli cells in stages I, V, VII/VIII of the seminiferous epithelium cycle, without...

  18. Measurement of the super-allowed branching ratio of 10C

    CERN Multimedia

    We propose to measure the super-allowed branching ratio of 10C, the lightest of all nuclei decaying by a 0+ → 0+ transition. The light nuclei have a much stronger impact on limits of physics beyond the standard model than heavier nuclei. We propose a measurement which should reach a precision similar to the two latest measurements, however, with a different method employing a precisely efficiency-calibrated germanium detector. As no method exists to greatly improve on previous results, the branching ratio has to be measured with independent methods.

  19. Tissue distribution in mice of BPP 10c, a potent proline-rich anti-hypertensive peptide of Bothrops jararaca.

    Science.gov (United States)

    Silva, Carlos A; Portaro, Fernanda C V; Fernandes, Beatriz L; Ianzer, Danielle A; Guerreiro, Juliano R; Gomes, Claudiana L; Konno, Katsuhiro; Serrano, Solange M T; Nascimento, Nanci; Camargo, Antonio C M

    2008-03-15

    The snake venom proline-rich peptide BPP 10c is an active somatic angiotensin-converting enzyme (sACE) inhibitor. Recently we demonstrated that the anti-hypertensive effect of BPP 10c is not related to the inhibition of sACE alone, thus suggesting that this enzyme is not its only target for blood pressure reduction. In the present work, a biodistribution study in Swiss mice of [(125)I]-BPP 10c in the absence or in the presence of a saturating concentration of captopril, a selective active-site inhibitor of sACE, demonstrated that: (1) [(125)I]-BPP 10c was present in several organs and the renal absorption was significantly high; (2) [(125)I]-BPP 10c showed a clear preference for the kidney, maintaining a high concentration in this organ in the presence of captopril for at least 3 h; (3) the residual amount of [(125)I]-BPP 10c in the kidney of animals simultaneously treated with captopril suggests that the peptide can interact with targets other than sACE in this organ. We also showed that Cy3-labeled BPP 10c was internalized by human embryonic kidney cells (HEK-293T). Taken together, these results suggest that sACE inhibition by captopril affects the tissue distribution of [(125)I]-BPP 10c and that the anti-hypertensive effects of BPP 10c are not only dependent on sACE inhibition.

  20. 10C survey of radio sources at 15.7 GHz - II. First results

    Science.gov (United States)

    AMI Consortium; Davies, Matthew L.; Franzen, Thomas M. O.; Waldram, Elizabeth M.; Grainge, Keith J. B.; Hobson, Michael P.; Hurley-Walker, Natasha; Lasenby, Anthony; Olamaie, Malak; Pooley, Guy G.; Riley, Julia M.; Rodríguez-Gonzálvez, Carmen; Saunders, Richard D. E.; Scaife, Anna M. M.; Schammel, Michel P.; Scott, Paul F.; Shimwell, Timothy W.; Titterington, David J.; Zwart, Jonathan T. L.

    2011-08-01

    In a previous paper (Paper I), the observational, mapping and source-extraction techniques used for the Tenth Cambridge (10C) Survey of Radio Sources were described. Here, the first results from the survey, carried out using the Arcminute Microkelvin Imager Large Array (LA) at an observing frequency of 15.7 GHz, are presented. The survey fields cover an area of ≈27 deg² to a flux-density completeness of 1 mJy. Results for some deeper areas, covering ≈12 deg², wholly contained within the total areas and complete to 0.5 mJy, are also presented. The completeness for both areas is estimated to be at least 93 per cent. The 10C survey is the deepest radio survey of any significant extent (≳0.2 deg²) above 1.4 GHz. The 10C source catalogue contains 1897 entries and is available online. The source catalogue has been combined with that of the Ninth Cambridge Survey to calculate the 15.7-GHz source counts. A broken power law is found to provide a good parametrization of the differential count between 0.5 mJy and 1 Jy. The measured source count has been compared with that predicted by de Zotti et al. - the model is found to display good agreement with the data at the highest flux densities. However, over the entire flux-density range of the measured count (0.5 mJy to 1 Jy), the model is found to underpredict the integrated count by ≈30 per cent. Entries from the source catalogue have been matched with those contained in the catalogues of the NRAO VLA Sky Survey and the Faint Images of the Radio Sky at Twenty-cm survey (both of which have observing frequencies of 1.4 GHz). This matching provides evidence for a shift in the typical 1.4-to-15.7-GHz spectral index of the 15.7-GHz-selected source population with decreasing flux density towards sub-mJy levels - the spectra tend to become less steep. Automated methods for detecting extended sources, developed in Paper I, have been applied to the data; ≈5 per cent of the sources are found to be extended.
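
    A minimal sketch of the kind of broken power-law parametrization mentioned above for the differential source count n(S) = dN/dS; the break flux density, normalization and indices below are placeholders for illustration, not the fitted 10C values.

```python
# Illustrative broken power law for a differential source count dN/dS.
import numpy as np

def broken_power_law(S_Jy, S_break=0.01, A=100.0, alpha_faint=2.0, alpha_bright=2.6):
    """dN/dS with one index below the break flux density and another above it."""
    S = np.asarray(S_Jy, dtype=float)
    faint = A * (S / S_break) ** (-alpha_faint)
    bright = A * (S / S_break) ** (-alpha_bright)
    return np.where(S < S_break, faint, bright)

print(broken_power_law([0.0005, 0.005, 0.05, 0.5]))   # 0.5 mJy up to 0.5 Jy
```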

  1. KEPLER-10 c: A 2.2 EARTH RADIUS TRANSITING PLANET IN A MULTIPLE SYSTEM

    International Nuclear Information System (INIS)

    Fressin, Francois; Torres, Guillermo; Desert, Jean-Michel; Charbonneau, David; Holman, Matthew J.; Batalha, Natalie M.; Fortney, Jonathan J.; Fabrycky, Daniel C.; Rowe, Jason F.; Allen, Christopher; Borucki, William J.; Bryson, Stephen T.; Henze, Christopher E.; Brown, Timothy M.; Ciardi, David R.; Cochran, William D.; Deming, Drake; Dunham, Edward W.; Gautier III, Thomas N.; Gilliland, Ronald L.

    2011-01-01

    The Kepler mission has recently announced the discovery of Kepler-10 b, the smallest exoplanet discovered to date and the first rocky planet found by the spacecraft. A second transit-like signal with a 45 day period, present in the photometry from the first eight months of data, could not be confirmed as being caused by a planet at the time of that announcement. Here we apply the light curve modeling technique known as BLENDER to explore the possibility that the signal might be due to an astrophysical false positive (blend). To aid in this analysis we report the observation of two transits with the Spitzer Space Telescope at 4.5 μm. When combined, they yield a transit depth of 344 ± 85 ppm that is consistent with the depth in the Kepler passband (376 ± 9 ppm, ignoring limb darkening), which rules out blends with an eclipsing binary of a significantly different color than the target. Using these observations along with other constraints from high-resolution imaging and spectroscopy, we are able to exclude the vast majority of possible false positives. We assess the likelihood of the remaining blends, and arrive conservatively at a false alarm rate of 1.6 x 10^-5 that is small enough to validate the candidate as a planet (designated Kepler-10 c) with a very high level of confidence. The radius of this object is measured to be R_p = 2.227 +0.052/-0.057 R⊕ (in which the error includes the uncertainty in the stellar properties), but currently available radial-velocity measurements only place an upper limit on its mass of about 20 M⊕. Kepler-10 c represents another example (with Kepler-9 d and Kepler-11 g) of statistical 'validation' of a transiting exoplanet, as opposed to the usual 'confirmation' that can take place when the Doppler signal is detected or transit timing variations are measured. It is anticipated that many of Kepler's smaller candidates will receive a similar treatment since dynamical confirmation may be difficult or impractical with the sensitivity of...
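
    As a rough consistency check (not taken from the record), the quoted Kepler-band depth pins down the planet-to-star radius ratio directly, since the depth is approximately (Rp/Rstar)^2 when limb darkening is ignored; the stellar radius used below is an assumed value for illustration.

```python
# Back-of-envelope transit arithmetic with a hypothetical stellar radius.
import math

depth_ppm = 376.0                      # Kepler-band transit depth quoted above
ratio = math.sqrt(depth_ppm * 1e-6)    # Rp / Rstar ~ sqrt(depth)
r_star_in_earth_radii = 1.06 * 109.2   # assumed ~1.06 R_sun star, expressed in Earth radii
print(f"Rp/Rstar ~ {ratio:.4f}, Rp ~ {ratio * r_star_in_earth_radii:.1f} Earth radii")
```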

  2. Trichoderma reesei FS10-C enhances phytoremediation of Cd-contaminated soil by Sedum plumbizincicola and associated soil microbial activities

    Science.gov (United States)

    Teng, Ying; Luo, Yang; Ma, Wenting; Zhu, Lingjia; Ren, Wenjie; Luo, Yongming; Christie, Peter; Li, Zhengao

    2015-01-01

    This study aimed to explore the effects of Trichoderma reesei FS10-C on the phytoremediation of Cd-contaminated soil by the hyperaccumulator Sedum plumbizincicola and on soil fertility. The Cd tolerance of T. reesei FS10-C was characterized and then a pot experiment was conducted to investigate the growth and Cd uptake of S. plumbizincicola with the addition of inoculation agents in the presence and absence of T. reesei FS10-C. The results indicated that FS10-C possessed high Cd resistance (up to 300 mg L-1). All inoculation agents investigated enhanced plant shoot biomass by 6–53% of fresh weight and 16–61% of dry weight and Cd uptake by the shoots by 10–53% compared with the control. All inoculation agents also played critical roles in increasing soil microbial biomass and microbial activities (such as biomass C, dehydrogenase activity and fluorescein diacetate hydrolysis activity). Two inoculation agents accompanied by FS10-C were also superior to the inoculation agents, indicating that T. reesei FS10-C was effective in enhancing both Cd phytoremediation by S. plumbizincicola and soil fertility. Furthermore, solid fermentation powder of FS10-C showed the greatest capacity to enhance plant growth, Cd uptake, nutrient release, microbial biomass and activities, as indicated by its superior ability to promote colonization by Trichoderma. The solid fermentation powder of FS10-C might serve as a suitable inoculation agent for T. reesei FS10-C to enhance both the phytoremediation efficiency of Cd-contaminated soil and soil fertility. PMID:26113858

  3. In vivo imaging and quantitative monitoring of autophagic flux in tobacco BY-2 cells.

    Science.gov (United States)

    Hanamata, Shigeru; Kurusu, Takamitsu; Okada, Masaaki; Suda, Akiko; Kawamura, Koki; Tsukada, Emi; Kuchitsu, Kazuyuki

    2013-01-01

    Autophagy has been shown to play essential roles in the growth, development and survival of eukaryotic cells. However, simple methods for quantification and visualization of autophagic flux remain to be developed in living plant cells. Here, we analyzed autophagic flux in transgenic tobacco BY-2 cell lines expressing fluorescence-tagged NtATG8a as a marker for autophagosome formation. Under sucrose-starved conditions, the number of punctate signals of YFP-NtATG8a increased, and the fluorescence intensity of the cytoplasm and nucleoplasm decreased. Conversely, these changes were not observed in BY-2 cells expressing a C-terminal glycine deletion mutant of the NtATG8a protein (NtATG8aΔG). To monitor autophagic flux more easily, we generated a transgenic BY-2 cell line expressing NtATG8a fused to a pH-sensitive fluorescent tag, a tandem fusion of the acid-insensitive RFP and the acid-sensitive YFP. Under sucrose-rich conditions, both fluorescent signals were detected in the cytoplasm and only weakly in the vacuole. In contrast, under sucrose-starved conditions, the fluorescence intensity of the cytoplasm decreased, and the RFP signal clearly increased in the vacuole, corresponding to the fusion of the autophagosome with the vacuole and translocation of ATG8 from the cytoplasm to the vacuole. Moreover, we introduce a simple way to monitor autophagic flux non-invasively by measuring only the ratio of RFP to YFP fluorescence in the cell suspension using a fluorescence image analyzer, without microscopy. The present in vivo quantitative monitoring system for autophagic flux offers a powerful tool for determining the physiological functions and molecular mechanisms of plant autophagy induced by environmental stimuli.
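
    A minimal sketch of the ratiometric readout described above, with hypothetical plate-reader intensities (illustrative arithmetic, not the authors' software): vacuolar delivery quenches the acid-sensitive YFP but not the acid-insensitive RFP, so the RFP/YFP ratio rises as flux increases.

```python
# Illustrative only: RFP/YFP ratio as a proxy for autophagic flux.
def flux_ratio(rfp_intensity, yfp_intensity):
    """Higher RFP/YFP means more tandem-tagged ATG8 delivered to the acidic vacuole."""
    return rfp_intensity / yfp_intensity

baseline = flux_ratio(rfp_intensity=1.00, yfp_intensity=1.00)   # sucrose-rich (hypothetical)
starved = flux_ratio(rfp_intensity=1.30, yfp_intensity=0.65)    # sucrose-starved (hypothetical)
print(f"fold change in RFP/YFP on starvation: {starved / baseline:.1f}x")   # 2.0x
```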

  4. TENCompetence Learning Design Toolkit, Runtime component, ccsi_v3_2_10c_v1_4

    NARCIS (Netherlands)

    Sharples, Paul; Popat, Kris; Llobet, Lau; Santos, Patricia; Hernández-Leo, Davinia; Miao, Yongwu; Griffiths, David; Beauvoir, Phillip

    2010-01-01

    Sharples, P., Popat, K., Llobet, L., Santos, P., Hernandez-Leo, D., Miao, Y., Griffiths, D. & Beauvoir, P. (2009) TENCompetence Learning Design Toolkit, Runtime component, ccsi_v3_2_10c_v1_4 This release is composed of three files corresponding to CopperCore Service Integration (CCSI) v3.2-10cv1.4,

  5. 76 FR 37059 - Siuslaw National Forest; Oregon; Oregon Dunes NRA Management Area 10 (C) Route and Area Designation

    Science.gov (United States)

    2011-06-24

    ... DEPARTMENT OF AGRICULTURE Forest Service Siuslaw National Forest; Oregon; Oregon Dunes NRA Management Area 10 (C) Route and Area Designation AGENCY: Forest Service, USDA. ACTION: Notice of intent to... (C) today are not designated routes. This has in turn led to greater and unnecessary impacts to...

  6. Search for resonant states in 10C and 11C and their impact on the primordial 7Li abundance

    Science.gov (United States)

    Hammache, F.; Coc, A.; de Séréville, N.; Stefan, I.; Roussel, P.; Assié, M.; Audouin, L.; Beaumel, D.; Franchoo, S.; Fernandez-Dominguez, B.; Fox, S.; Hamadache, C.; Kiener, J.; Laird, A.; Le Crom, B.; Lefebvre-Schuhl, A.; Lefebvre, L.; Matea, I.; Matta, A.; Mavilla, G.; Mrazek, J.; Morfouace, P.; de Oliveira Santos, F.; Parikh, A.; Perrot, L.; Sanchez-Benitez, A. M.; Suzuki, D.; Tatischeff, V.; Ujic, P.; Vandebrouck, Marine

    2018-01-01

    The cosmological 7Li problem arises from the significant discrepancy of about a factor of 3 between the predicted primordial 7Li abundance and the observed one. The main process for the production of 7Li during Big-Bang nucleosynthesis is the decay of 7Be. Many key nuclear reactions involved in the production and destruction of 7Be were investigated in an attempt to explain the 7Li deficit, but none of them led to successful conclusions. However, some authors recently suggested the possibility that the destruction of 7Be by 3He and 4He may reconcile the predictions and observations if missing resonant states in the compound nuclei 10C and 11C exist. Hence, a search for these missing resonant states in 10C and 11C was carried out at the Orsay Tandem-Alto facility through the 10B(3He,t)10C and 11B(3He,t)11C charge-exchange reactions, respectively. After a short overview of the cosmological 7Li problem from a nuclear physics point of view, a description of the Orsay experiment will be given as well as the obtained results and their impact on the 7Li problem.

  7. A quantitative characterization of the yeast heterotrimeric G protein cycle

    Science.gov (United States)

    Yi, Tau-Mu; Kitano, Hiroaki; Simon, Melvin I.

    2003-01-01

    The yeast mating response is one of the best understood heterotrimeric G protein signaling pathways. Yet, most descriptions of this system have been qualitative. We have quantitatively characterized the heterotrimeric G protein cycle in yeast based on direct in vivo measurements. We used fluorescence resonance energy transfer to monitor the association state of cyan fluorescent protein (CFP)-Gα and Gβγ-yellow fluorescent protein (YFP), and we found that receptor-mediated G protein activation produced a loss of fluorescence resonance energy transfer. Quantitative time course and dose–response data were obtained for both wild-type and mutant cells possessing an altered pheromone response. These results paint a quantitative portrait of how regulators such as Sst2p and the C-terminal tail of α-factor receptor modulate the kinetics and sensitivity of G protein signaling. We have explored critical features of the dynamics including the rapid rise and subsequent decline of active G proteins during the early response, and the relationship between the G protein activation dose–response curve and the downstream dose–response curves for cell-cycle arrest and transcriptional induction. Fitting the data to a mathematical model produced estimates of the in vivo rates of heterotrimeric G protein activation and deactivation in yeast. PMID:12960402
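
    The fitting to a mathematical model mentioned above can be pictured with a toy two-state version of the G protein cycle, in which the active (dissociated) species reported by the FRET signal is produced by receptor-driven activation and removed by Sst2p-accelerated deactivation; the rate constants below are invented for illustration and are not the values estimated in the paper.

```python
# Toy heterotrimeric G protein cycle: inactive <-> active, with made-up rate constants.
import numpy as np
from scipy.integrate import solve_ivp

def g_cycle(t, y, k_act, k_deact):
    g_inactive, g_active = y
    act = k_act * g_inactive      # receptor/pheromone-driven activation
    deact = k_deact * g_active    # Sst2p-accelerated GTP hydrolysis and re-association
    return [deact - act, act - deact]

sol = solve_ivp(g_cycle, (0.0, 600.0), [1.0, 0.0], args=(0.02, 0.1),
                t_eval=np.linspace(0.0, 600.0, 7))
print(np.round(sol.y[1], 3))   # fraction of active G protein sampled over 10 minutes
```

    Fitting such a model to FRET time courses is what yields in vivo estimates of the activation and deactivation rates; the steady-state active fraction here is simply k_act / (k_act + k_deact).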

  8. Bradykinin-potentiating peptide-10C, an argininosuccinate synthetase activator, protects against H2O2-induced oxidative stress in SH-SY5Y neuroblastoma cells.

    Science.gov (United States)

    Querobino, Samyr Machado; Ribeiro, César Augusto João; Alberto-Silva, Carlos

    2018-05-01

    Bradykinin-potentiating peptides (BPPs - 5a, 7a, 9a, 10c, 11e, and 12b) of Bothrops jararaca (Bj) were described as argininosuccinate synthase (AsS) activators, improving l-arginine availability. Agmatine and polyamines, which are products of l-arginine metabolism, have neuroprotective properties. Here, we investigated the neuroprotective effects of a low molecular mass fraction from Bj venom (LMMF) and two synthetic BPPs (BPP-10c and BPP-12b). BPP-10c showed higher protective capacity than BPP-12b. LMMF pretreatment was unable to prevent the reduction of cell viability caused by H2O2. The neuroprotective mechanism of BPP-10c against oxidative stress was investigated. BPP-10c reduced ROS generation and lipid peroxidation relative to cells treated only with H2O2. BPP-10c increased AsS expression and was not neuroprotective in the presence of MDLA, a specific inhibitor of AsS. BPP-10c also reduced iNOS expression and nitrate levels and decreased NF-κB expression. Furthermore, BPP-10c protected the mitochondrial membrane against oxidation. Overall, we demonstrated for the first time neuroprotective mechanisms of BPPs against oxidative stress, opening new perspectives for the study and application of these peptides in the treatment of neurodegenerative diseases. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. A method for determining the branching ratio for the superallowed decay: 10C(0+, g.s.) → 10B(0+, 1.74 MeV) + e+ + ν

    International Nuclear Information System (INIS)

    Kroupa, M.A.; Freedman, S.J.; Barker, P.H.; Ferguson, S.M.

    1991-01-01

    We describe a new way of determining the strength of the superallowed branch of the β-decay of 10C, 10C(0+, g.s.) → 10B(0+, 1.74 MeV) + e+ + ν. Precise knowledge of the branching ratio is needed to compute the experimental ft-value and the weak vector coupling constant. (orig.)
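
    For orientation only (standard superallowed-decay bookkeeping, not taken from this record), the branching ratio enters the ft-value through the partial half-life of the 0+ → 0+ branch:

```latex
% Standard relations; |M_F|^2 = 2 holds for superallowed 0+ -> 0+ transitions within a T = 1 multiplet.
\begin{align}
  t_{\mathrm{partial}} &= \frac{t_{1/2}}{\mathrm{BR}}, &
  ft &= f\, t_{\mathrm{partial}} \;\propto\; \frac{1}{G_V^{2}\,\lvert M_F\rvert^{2}} .
\end{align}
```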

  10. Structure and dynamics of mica-confined films of [C10C1Pyrr][NTf2] ionic liquid

    Science.gov (United States)

    Freitas, Adilson Alves de; Shimizu, Karina; Smith, Alexander M.; Perkin, Susan; Canongia Lopes, José Nuno

    2018-05-01

    The structure of the ionic liquid 1-decyl-1-methylpyrrolidinium bis[(trifluoromethane)sulfonyl]imide, [C10C1Pyrr][NTf2], has been probed using Molecular Dynamics (MD) simulations. The simulations endeavour to model the behaviour of the ionic liquid in bulk isotropic conditions and also at interfaces and in confinement. The MD results have been compared with, and validated against, scattering and surface force experiments reported in the literature. The calculated structure factors, distribution functions, and density profiles were able to provide molecular and mechanistic insights into the properties of these long-chain ionic liquids under different conditions, in particular those that lead to the formation of multi-layered ionic liquid films in confinement. Other properties inaccessible to experiment, such as in-plane structures and relaxation rates within the films, have also been analysed. Overall the work contributes structural and dynamic information relevant to many applications of ionic liquids with long alkyl chains, ranging from nanoparticle synthesis to lubrication.
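
    As an illustration of the kind of post-processing the abstract refers to (a sketch with randomly generated coordinates standing in for a trajectory frame, not the authors' analysis scripts), a number-density profile across a confined film can be built by histogramming particle positions along the surface normal:

```python
# Hedged sketch: rho(z) across a film from particle z coordinates.
import numpy as np

def density_profile(z_coords_nm, area_nm2, z_min, z_max, n_bins=100):
    """Histogram particle positions along the surface normal into rho(z) in nm^-3."""
    counts, edges = np.histogram(z_coords_nm, bins=n_bins, range=(z_min, z_max))
    bin_width = edges[1] - edges[0]
    rho = counts / (area_nm2 * bin_width)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, rho

# Fake coordinates standing in for, e.g., cation head-group positions in a 6 nm film
z = np.random.default_rng(0).uniform(0.0, 6.0, size=5000)
centers, rho = density_profile(z, area_nm2=25.0, z_min=0.0, z_max=6.0)
print(rho[:5])
```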

  11. Production and Extraction of [10C]-CO2 From Proton Bombardment of Molten 10B2O3

    International Nuclear Information System (INIS)

    Schueller, M.J.; Nickles, R.J.; Roberts, A.D.; Jensen, M.

    2003-01-01

    This work describes the production of 10C (t1/2 = 19 s) from an enriched 10B2O3 target using a CTI RDS-112 11 MeV proton cyclotron. Proton beam heating is used to raise the target to a molten state (∼1300 °C), enabling the activity to diffuse to the surface of the melt. An infrared thermocouple monitors the melt temperature. Helium sweep gas then transports the activity to flow-through chemistry processing for human inhalation of [10C]-CO2 for blood flow imaging with Positron Emission Tomography. The temperature-related diffusion of activity out of the white-hot molten glass target is discussed.
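
    With a 19 s half-life, irradiation and transport times dominate the usable yield; the arithmetic below is for orientation only, with hypothetical irradiation and transfer times rather than values from this record.

```python
# Saturation during bombardment and decay losses during sweep-gas transport for 10C.
import math

T_HALF = 19.0                    # s, 10C half-life quoted above
lam = math.log(2) / T_HALF

saturation = 1.0 - math.exp(-lam * 60.0)    # a 60 s irradiation reaches ~89% of saturation
transport_loss = math.exp(-lam * 10.0)      # ~69% of the activity survives a 10 s transfer
print(f"saturation fraction: {saturation:.2f}, surviving fraction: {transport_loss:.2f}")
```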

  12. Multiple proton decays of 6Be, 8C, 8B(IAS) and excited states in 10C

    Science.gov (United States)

    Sobotka, Lee

    2011-10-01

    Recent technical advances have allowed high-order correlation experiments to be done. We have primarily focused on experiments in which the final channels are composed of only alphas and protons. The four cases we have studied are 6Be, 10C*, 8C, and 8B*(IAS), via 3-, 4-, 5-, and 3-particle correlation measurements, respectively. While the first case had been studied before, our work presents very high statistics in the full Jacobi coordinates (the coordinates needed to describe 3-body decay). Our study of 10C excited states provides isolable examples of correlated 2p decay from one state, and of the decay of another state which is unusually highly correlated, a "ménage à quatre." 8C decay presents the only case of sequential 3-body 2p decay steps (i.e. 2p-2p). The intermediate in this 2-step process is the first example (6Be) mentioned above. Unlike the well-studied second step (6Be decay), the first step in this 2p-2p process provides another example of correlated 2p emission. 8B(IAS), the isobaric analog of 8C, also decays overwhelmingly by 2p emission, in this case to 6Li(IAS). This IAS-to-IAS 2p decay is one for which decay to the potential 1p intermediates is energetically allowed but isospin forbidden. This represents an expansion, over that originally envisioned by Goldanskii, of the conceivable nuclear territory for 2p decay.
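
    For readers unfamiliar with the term, the Jacobi coordinates mentioned above are the standard relative coordinates for a three-body final state (textbook material, given here for orientation rather than as the speaker's own notation):

```latex
% Jacobi coordinates for three fragments with positions r_i and masses m_i:
% x connects one chosen pair, y connects the third fragment to that pair's centre of mass.
\begin{align}
  \vec{x} &= \vec{r}_2 - \vec{r}_1, &
  \vec{y} &= \vec{r}_3 - \frac{m_1 \vec{r}_1 + m_2 \vec{r}_2}{m_1 + m_2} .
\end{align}
```

    Choosing the proton-proton pair or a core-proton pair for x gives the so-called "T" and "Y" systems, and the decay is then characterized by how the energy is shared between the x and y motions and by the angle between them.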

  13. Design of Online Spheroidization Process for 1.0C-1.5Cr Bearing Steel and Microstructure Analysis

    Science.gov (United States)

    Li, Zhen-Xing; Li, Chang-Sheng; Ren, Jin-Yi; Li, Bin-Zhou; Suh, Dong-Woo

    2018-02-01

    Using thermo-mechanical control process, the online spheroidization annealing process of 1.0C-1.5Cr bearing steel was designed. Apart from intercritical online spheroidization (IS), a novel subcritical online spheroidization (SS) process was proposed, which is characterized by water-cooling to around 773 K (500 °C) after the final rolling pass, and then directly reheating to 973 K (700 °C) for isothermal holding. Compared with the results from the traditional offline spheroidization (TS) process, the size of spheroidized carbides is similar in both the TS and IS processes, whereas it is much smaller in the SS process. After spheroidization annealing, microstructure evolution during austenitization and quenching treatment was examined. It is shown that the refining of spheroidized carbides accelerates the dissolution of carbides during the austenitizing process, and decreases the size of undissolved carbides. In addition, the SS process can obtain finer prior austenite grain after quenching, which contributes to the enhancement of final hardness.

  14. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  15. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.

  16. Visualization and quantitative analysis of reconstituted tight junctions using localization microscopy.

    Directory of Open Access Journals (Sweden)

    Rainer Kaufmann

    Full Text Available Tight junctions (TJ) regulate paracellular permeability of tissue barriers. Claudins (Cld) form the backbone of TJ strands. Pore-forming claudins determine the permeability for ions, whereas that for solutes and macromolecules is assumed to be crucially restricted by the strand morphology (i.e., density, branching and continuity). To investigate determinants of the morphology of TJ strands we established a novel approach using localization microscopy. TJ strands were reconstituted by stable transfection of HEK293 cells with the barrier-forming Cld3 or Cld5. Strands were investigated at cell-cell contacts by Spectral Position Determination Microscopy (SPDM), a method of localization microscopy using standard fluorophores. Extended TJ networks of Cld3-YFP and Cld5-YFP were observed. For each network, 200,000 to 1,100,000 individual molecules were detected with a mean localization accuracy of ∼20 nm, yielding a mean structural resolution of ∼50 nm. Compared to conventional fluorescence microscopy, this strongly improved the visualization of strand networks and enabled quantitative morphometric analysis. Two populations of elliptic meshes (mean diameter <100 nm and 300-600 nm, respectively) were revealed. For Cld5 the two populations were more separated than for Cld3. Discrimination of non-polymeric molecules and molecules within polymeric strands was achieved. For both subtypes of claudins the mean density of detected molecules was similar and estimated to be ∼24 times higher within the strands than outside the strands. The morphometry and single-molecule information provided here advance the mechanistic analysis of paracellular barriers. Applying this novel method to different TJ proteins is expected to significantly improve the understanding of TJ at the molecular level.
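
    The step from a ∼20 nm mean localization accuracy to a ∼50 nm structural resolution is consistent with the common rule of thumb that the effective resolution scales with the full width at half maximum of the localization-error distribution (an assumption made here for orientation, not necessarily the authors' exact definition):

```python
# Rule-of-thumb conversion: FWHM = 2 * sqrt(2 * ln 2) * sigma ~ 2.35 * sigma.
import math

sigma_localization_nm = 20.0
fwhm_nm = 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma_localization_nm
print(f"effective structural resolution ~ {fwhm_nm:.0f} nm")   # ~47 nm
```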

  17. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  18. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  19. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiencies, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used for calculation of an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2D%) and patients with lymphedema (2.05 ± 2.5D%) was highly significant using prefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)

  20. INT WFC photometry of a Galactic flare star spatially coincident with the recurrent nova candidate M31N 1966-08a = 1968-10c

    Science.gov (United States)

    Arce-Tord, C.; Esteban-Gutierrez, A.; Garcia-Broock, E.; Garcia-Rivas, M.; Gonzalez-Cuesta, L.; Hermosa-Muñoz, L.; Hernandez-Sanchez, M.; Jimenez-Gallardo, A.; Lopez-Navas, E.; Mantero-Castañeda, E. A.; Otero-Santos, J.; Prendin, M. G.; Rodriguez-Sanchez, M.; Perez-Fournon, I.

    2017-12-01

    We report photometry in the Sloan g (3x150s), r (3x100s), and i (3x100s) bands of the recurrent nova candidate M31N 1966-08a = 1968-10c (= PNV J00412371+4114594) from observations with the Wide Field Camera of the Isaac Newton Telescope on the night of 28 October 2017.

  1. CLA isomer t10,c12 induce oxidation and apoptosis in 3t3 adipocyte cells in a similar effect as omega-3 linolenic acid and DHA.

    Directory of Open Access Journals (Sweden)

    Jon Meadus

    2017-02-01

    Full Text Available Background: Commercial conjugated linoleic acid (CLA) dietary supplements contain an equal mixture of the C18:2 isomers cis-9,trans-11 and trans-10,cis-12. The predominant isomer, CLA-c9t11, occurs naturally in meat and dairy products at ~0.5% of total fat, whereas CLA-t10c12 occurs at >0.1%. Recent studies show that CLA-c9t11 generally promotes lipid accumulation, but CLA-t10c12 may inhibit lipid accumulation and may also promote inflammation. The omega-3 fatty acids α-linolenic acid (C18:3n-3) and docosahexaenoic acid (DHA) have also been observed to inhibit lipid accumulation and affect inflammation; we therefore examined the effects of the two main isomers of CLA and the omega-3 fatty acids C18:3n-3 and DHA at the molecular level to determine whether they cause similar oxidative stresses. Methods: Purified CLA-c9t11 and CLA-t10c12 were added at 100 µM to 3T3 cells induced into mature adipocyte cultures and compared with 100 µM C18:3n-3 (α-linolenic acid) and 50 µM docosahexaenoic acid (DHA) to observe their effects on growth, gene transcription and general oxidation. The results of multiple separate trials were averaged and compared for significance at P < 0.05 using one-way ANOVA and Student's t-test. Results: C18:3n-3, DHA and CLA-t10c12 inhibited 3T3 adipose cell growth and caused a significant increase in lipid hydroperoxide activity. CLA-t10c12 and c9t11 increased AFABP, FAS and ACOX1 mRNA expression, whereas DHA and C18:3n-3 decreased the same mRNAs. CLA-c9t11, but not t10c12, stimulated adipoQ expression, even though CLA-c9t11 had only a slightly greater affinity for PPARγ than CLA-t10c12 according to TR-FRET assays. The expression of the xenobiotic metabolism genes aldo-keto reductase 1C1 (akr1c1) and superoxide dismutase (SOD), and the secretion of the inflammation chemokines eotaxin (CCL11), RANTES (CCL5), MIG (CCL9) and MCP-1, were increased by DHA, C18:3n-3 and CLA-t10c12 but not CLA-c9t11. This correlated with an increase in apoptosis factors

  2. How protein recognizes ladder-like polycyclic ethers. Interactions between ciguatoxin (CTX3C) fragments and its specific antibody 10C9.

    Science.gov (United States)

    Ui, Mihoko; Tanaka, Yoshikazu; Tsumuraya, Takeshi; Fujii, Ikuo; Inoue, Masayuki; Hirama, Masahiro; Tsumoto, Kouhei

    2008-07-11

    Ciguatoxins are a family of marine toxins composed of trans-fused polycyclic ethers. Neither the pathogenic mechanism of these toxins nor the interaction between a polycyclic ether compound and a protein has yet been clarified at the atomic level. Using the crystal structures of the anti-ciguatoxin antibody 10C9 Fab in ligand-free form and in complexes with the ABCD-ring (CTX3C-ABCD) and ABCDE-ring (CTX3C-ABCDE) fragments of the antigen CTX3C at resolutions of 2.6, 2.4, and 2.3 angstroms, respectively, we elucidated the mechanism of the interaction between the polycyclic ethers and the antibody. 10C9 Fab has an extraordinarily large and deep binding pocket at the center of the variable region, where CTX3C-ABCD or CTX3C-ABCDE binds longitudinally in the pocket via hydrogen bonds and van der Waals interactions. Upon antigen-antibody complexation, 10C9 Fab adjusts to the antigen fragments by means of rotational motion in the variable region. In addition, the antigen fragment lacking the E-ring induces a large motion in the constant region. Consequently, the thermostability of 10C9 Fab is enhanced by 10 degrees C upon complexation with CTX3C-ABCDE but not with CTX3C-ABCD. The crystal structures presented in this study also show that 10C9 Fab recognition of CTX3C antigens requires molecular rearrangements over the entire antibody structure. These results further expand the fundamental understanding of how ladder-like polycyclic ethers are recognized, and may be useful for the design of novel antibody-based therapeutic agents against marine toxins, or of new diagnostic reagents for the detection and targeting of members of the polycyclic ether family.

  3. Expression, Purification and Characterization of GMZ2'.10C, a Complex Disulphide-Bonded Fusion Protein Vaccine Candidate against the Asexual and Sexual Life-Stages of the Malaria-Causing Plasmodium falciparum Parasite

    NARCIS (Netherlands)

    Mistarz, U.H.; Singh, S.K; Nguyen, T.; Roeffen, W.; Lissau, C.; Madsen, S.M.; Vrang, A.; Tiendrebeogo, R.W.; Kana, I.H.; Sauerwein, R.W.; Theisen, M.; Rand, K.D.

    2017-01-01

    PURPOSE: Production and characterization of a chimeric fusion protein (GMZ2'.10C) which combines epitopes of key malaria parasite antigens: glutamate-rich protein (GLURP), merozoite surface protein 3 (MSP3), and the highly disulphide bonded Pfs48/45 (10C). GMZ2'.10C is a potential candidate for a

  4. Quantitative Thermochronology

    Science.gov (United States)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach with a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. The book offers numerous case studies that help geologists to interpret age data and relate them to Earth processes, essential background material to aid in understanding and using thermochronological data, and a thorough treatise on numerical modeling of heat transport in the Earth's crust, and it is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.
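
    As a flavour of the heat-transport calculations the book automates, the sketch below solves a one-dimensional steady-state heat equation with uplift (exhumation) by finite differences and reads off the depth of a nominal closure temperature, from which a cooling age follows. It is a schematic Python illustration only; the boundary conditions, material constants, uplift rate and the 120 °C closure temperature are assumed values, not parameters taken from the book or its programs.

```python
import numpy as np

# 1-D steady-state heat transport with exhumation (depth coordinate z, all values assumed):
#     kappa * T'' + u * T' + H = 0,   T(0) = 0 degC,  T(L) = 600 degC
L, n = 30e3, 301                       # model depth (m) and number of grid points
z = np.linspace(0.0, L, n)
dz = z[1] - z[0]
kappa = 1.0e-6                         # thermal diffusivity (m^2/s)
u = 0.5e-3 / 3.15e7                    # uplift rate: 0.5 mm/yr expressed in m/s
H = 3.0e-13                            # heat production / (rho * c), in K/s

A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0], b[0] = 1.0, 0.0               # fixed surface temperature
A[-1, -1], b[-1] = 1.0, 600.0          # fixed basal temperature
pe = u * dz / (2.0 * kappa)            # cell Peclet number (advection vs diffusion)
for i in range(1, n - 1):
    A[i, i - 1] = 1.0 - pe
    A[i, i] = -2.0
    A[i, i + 1] = 1.0 + pe
    b[i] = -H * dz**2 / kappa
T = np.linalg.solve(A, b)

# Depth of a nominal 120 degC closure isotherm and the implied cooling age
z_closure = np.interp(120.0, T, z)
age_myr = z_closure / 0.5e-3 / 1e6     # depth / uplift rate, in Myr
print(f"closure depth ~ {z_closure / 1e3:.1f} km, cooling age ~ {age_myr:.1f} Myr")
```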

  5. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  6. Structural and energetic hot-spots for the interaction between a ladder-like polycyclic ether and the anti-ciguatoxin antibody 10C9Fab.

    Science.gov (United States)

    Ui, Mihoko; Tanaka, Yoshikazu; Tsumuraya, Takeshi; Fujii, Ikuo; Inoue, Masayuki; Hirama, Masahiro; Tsumoto, Kouhei

    2011-03-01

    The mechanism by which anti-ciguatoxin antibody 10C9Fab recognizes a fragment of ciguatoxin CTX3C (CTX3C-ABCDE) was investigated by mutational analysis based on structural data. 10C9Fab has an extraordinarily large and deep antigen-binding pocket at the center of its variable region. We mutated several residues located at the antigen-binding pocket to Ala, and kinetic analysis of the interactions between the mutant proteins and the antigen fragment was performed. The results indicate that some residues associated with the rigid antigen-binding pocket are structural hot-spots and that L-N94 is an energetic hot-spot for association of the antibody with the antigen fragment CTX3C-ABCDE, suggesting the importance of structural complementarity and energetic hot-spot interactions for specific recognition of polycyclic ethers.

  7. Low-Temperature (10 °C) Anaerobic Digestion of Dilute Dairy Wastewater in an EGSB Bioreactor: Microbial Community Structure, Population Dynamics, and Kinetics of Methanogenic Populations

    OpenAIRE

    Bialek, Katarzyna; Cysneiros, Denise; O'Flaherty, Vincent

    2013-01-01

    The feasibility of anaerobic digestion of dairy wastewater at 10 °C was investigated in a high height:diameter ratio EGSB reactor. Stable performance was observed at an applied organic loading rate (OLR) of 0.5–2 kg COD m⁻³ d⁻¹ with chemical oxygen demand (COD) removal efficiencies above 85%. When the applied OLR increased to values above 2 kg COD m⁻³ d⁻¹, biotreatment efficiency deteriorated, with methanogenesis being the rate-limiting step. The bioreactor recovered quickly (3 days) after reduc...

  8. Expression, Purification and Characterization of GMZ2'.10C, a Complex Disulphide-Bonded Fusion Protein Vaccine Candidate against the Asexual and Sexual Life-Stages of the Malaria-Causing Plasmodium falciparum Parasite

    DEFF Research Database (Denmark)

    Mistarz, Ulrik H; Singh, Susheel K; Nguyen, Tam T T N

    2017-01-01

    PURPOSE: Production and characterization of a chimeric fusion protein (GMZ2'.10C) which combines epitopes of key malaria parasite antigens: glutamate-rich protein (GLURP), merozoite surface protein 3 (MSP3), and the highly disulphide bonded Pfs48/45 (10C). GMZ2'.10C is a potential candidate...... was analysed by RP-HPLC, SEC-HPLC, 2-site ELISA, gel-electrophoresis and Western blotting. Structural characterization (mass analysis, peptide mapping and cysteine connectivity mapping) was performed by LC-MS/MS. RESULTS: CP-GMZ2'.10C resulted in similar purity, yield, structure and stability as compared to IP...

  9. Low Dose Radiation Response Curves, Networks and Pathways in Human Lymphoblastoid Cells Exposed from 1 to 10 cGy of Acute Gamma Radiation

    Energy Technology Data Exchange (ETDEWEB)

    Wyrobek, A. J.; Manohar, C. F.; Nelson, D. O.; Furtado, M. R.; Bhattacharya, M. S.; Marchetti, F.; Coleman, M.A.

    2011-04-18

    We investigated the low dose dependency of the transcriptional response of human cells to characterize the shape and biological functions associated with the dose response curve and to identify common and conserved functions of low dose expressed genes across cells and tissues. Human lymphoblastoid (HL) cells from two unrelated individuals were exposed to graded doses of radiation spanning the range of 1-10 cGy and analyzed by transcriptome profiling, qPCR and bioinformatics, in comparison to sham-irradiated samples. A set of ~80 genes showed consistent responses in both cell lines; these genes were associated with homeostasis mechanisms (e.g., membrane signaling, molecule transport), subcellular locations (e.g., Golgi and endoplasmic reticulum), and involved diverse signal transduction pathways. The majority of radiation-modulated genes had plateau-like responses across 1-10 cGy, some with suggestive evidence that transcription was modulated at doses below 1 cGy. MYC, FOS and TP53 were the major network nodes of the low-dose response in HL cells. Comparison of our low dose expression findings in HL cells with those of prior studies in mouse brain after whole-body exposure, in human keratinocyte cultures, and in endothelial cell cultures indicates that certain components of the low dose radiation response are broadly conserved across cell types and tissues, independent of proliferation status.

  10. Expression, Purification and Characterization of GMZ2'.10C, a Complex Disulphide-Bonded Fusion Protein Vaccine Candidate against the Asexual and Sexual Life-Stages of the Malaria-Causing Plasmodium falciparum Parasite.

    Science.gov (United States)

    Mistarz, Ulrik H; Singh, Susheel K; Nguyen, Tam T T N; Roeffen, Will; Yang, Fen; Lissau, Casper; Madsen, Søren M; Vrang, Astrid; Tiendrebeogo, Régis W; Kana, Ikhlaq H; Sauerwein, Robert W; Theisen, Michael; Rand, Kasper D

    2017-09-01

    Production and characterization of a chimeric fusion protein (GMZ2'.10C) which combines epitopes of key malaria parasite antigens: glutamate-rich protein (GLURP), merozoite surface protein 3 (MSP3), and the highly disulphide bonded Pfs48/45 (10C). GMZ2'.10C is a potential candidate for a multi-stage malaria vaccine that targets both transmission and asexual life-cycle stages of the parasite. GMZ2'.10C was produced in Lactococcus lactis and purified using either an immunoaffinity purification (IP) or a conventional purification (CP) method. Protein purity and stability were analysed by RP-HPLC, SEC-HPLC, 2-site ELISA, gel-electrophoresis and Western blotting. Structural characterization (mass analysis, peptide mapping and cysteine connectivity mapping) was performed by LC-MS/MS. CP-GMZ2'.10C resulted in similar purity, yield, structure and stability as compared to IP-GMZ2'.10C. CP-GMZ2'.10C and IP-GMZ2'.10C both elicited a high titer of transmission blocking (TB) antibodies in rodents. The intricate disulphide-bond connectivity of the C-terminus of Pfs48/45 was analysed by tandem mass spectrometry and was established for GMZ2'.10C and two reference fusion proteins encompassing similar parts of Pfs48/45. GMZ2'.10C, combining GMZ2' and correctly-folded Pfs48/45, can be produced by the Lactococcus lactis P170-based expression system in a purity and quality suitable for pharmaceutical development, and elicits a high level of TB antibodies. The cysteine connectivity for the 10C region of Pfs48/45 was revealed experimentally, providing an important guideline for employing the Pfs48/45 antigen in vaccine design.

  11. Stability of Mg-sulfates at -10 °C and the rates of dehydration/rehydration processes under conditions relevant to Mars

    Science.gov (United States)

    Wang, A.; Freeman, J.J.; Chou, I.-Ming; Jolliff, B.L.

    2011-01-01

    We report the results of low temperature (-10 °C) experiments on the stability fields and phase transition pathways of five hydrous Mg-sulfates. A low temperature form of MgSO4·7H2O (LT-7w) was found to have a wide stability field that extends to low relative humidity (~13% RH at -10 °C). Using information on the timing of phase transitions, we extracted information on the reaction rates of five important dehydration and rehydration processes. We found that the temperature dependencies of the rate constants for dehydration differ from those for rehydration, which reflects differences in reaction mechanisms. By extrapolating these rate-constant-versus-T correlations into the temperature range relevant to Mars, we can evaluate the possibility of occurrence of specific processes and the presence of common Mg-sulfate species on Mars in different periods and locations. We anticipate that in a moderate obliquity period, starkeyite and LH-MgSO4·H2O should be the two common Mg-sulfates at the surface, while another polymorph, MH-MgSO4·H2O, can exist at locations where hydrothermal processes may have occurred. In polar regions, or within the subsurface of other regions, meridianiite (coexisting with water ice, near 100% RH) and LT-7w (over a large RH range) are the stable phases. During a high obliquity period, meridianiite and LT-7w should exhibit widespread occurrence. The correlations of reaction rates versus temperature found in this study imply that dehydration and rehydration of hydrous Mg-sulfates would always be slower than the sublimation and crystallization of water ice, which is supported by mission observations from Odyssey and by the Mars Exploration Rovers. Copyright 2011 by the American Geophysical Union.
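
    The extrapolation of laboratory rate constants to Mars-relevant temperatures described above is, in essence, an Arrhenius extrapolation. The sketch below shows the procedure on invented numbers; the rate constants, the fitted activation energy and the 210 K target temperature are placeholders, not values reported in the study.

```python
import numpy as np

# Hypothetical laboratory rate constants k (1/s) measured at a few temperatures (K)
T_lab = np.array([263.0, 278.0, 294.0, 308.0])
k_lab = np.array([2.1e-8, 1.5e-7, 9.8e-7, 4.6e-6])

# Arrhenius behaviour: ln k = ln A - Ea/(R*T)  ->  fit ln k linearly against 1/T
R = 8.314  # J/(mol*K)
slope, intercept = np.polyfit(1.0 / T_lab, np.log(k_lab), 1)
Ea = -slope * R          # apparent activation energy (J/mol)
A = np.exp(intercept)    # pre-exponential factor (1/s)

# Extrapolate to a Mars-relevant surface temperature, e.g. 210 K
T_mars = 210.0
k_mars = A * np.exp(-Ea / (R * T_mars))
print(f"Ea ~ {Ea / 1e3:.0f} kJ/mol, k(210 K) ~ {k_mars:.2e} 1/s "
      f"(characteristic time ~ {1.0 / k_mars / 3.154e7:.1f} yr)")
```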

  12. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  13. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  14. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  15. Solid-state Fermentation of Xylanase from Penicillium canescens 10-10c in a Multi-layer-packed Bed Reactor

    Science.gov (United States)

    Assamoi, Antoine A.; Destain, Jacqueline; Delvigne, Frank; Lognay, Georges; Thonart, Philippe

    Xylanase is produced by Penicillium canescens 10-10c from soya oil cake under static conditions using solid-state fermentation. The impact of several parameters, such as the nature and size of the inoculum, bed loading, and aeration, is evaluated during the fermentation process. A mycelial inoculum gives higher production than a conidial inoculum. Increasing the quantity of inoculum slightly enhances xylanase production. Forced aeration induces more sporulation of the strain and reduces xylanase production; however, forced moistened air improves production compared with forced dry air. In addition, increasing bed loading reduces the specific xylanase production, likely due to the incapacity of the Penicillium strain to grow deeply in the fermented soya oil cake mass. Thus, the best cultivation conditions involve a mycelial inoculum, a bed loading of 1-cm height and passive aeration. The maximum xylanase activity is obtained after 7 days of fermentation and reaches 10,200 U/g of soya oil cake. These levels are higher than those reported in the literature and therefore show the potential of this strain and this technique for the production of xylanase.

  16. Characterization of d-succinylase from Cupriavidus sp. P4-10-C and its application in d-amino acid synthesis.

    Science.gov (United States)

    Sumida, Yosuke; Iwai, Sachio; Nishiya, Yoshiaki; Kumagai, Shinya; Yamada, Toshihide; Azuma, Masayuki

    2018-03-01

    d-Amino acids are important building blocks for various compounds, such as pharmaceuticals and agrochemicals. A more cost-effective enzymatic method for d-amino acid production is needed in industry. We improved a one-pot enzymatic method for d-amino acid production by the dynamic kinetic resolution of N-succinyl amino acids using two enzymes: d-succinylase (DSA) from Cupriavidus sp. P4-10-C, which hydrolyzes N-succinyl-d-amino acids enantioselectively to the corresponding d-amino acids, and N-succinyl amino acid racemase (NSAR, EC 4.2.1.113) from Geobacillus stearothermophilus NCA1503. In this study, DSA and NSAR were purified and their properties were investigated. The optimum temperature of DSA was 50°C and it was stable up to 55°C. The optimum pH of both DSA and NSAR was around 7.5. In d-phenylalanine production, the optical purity of the product was improved to 91.6% ee by optimizing the enzyme concentrations. Moreover, 100 mM N-succinyl-dl-tryptophan was converted to d-tryptophan at 81.8% yield with 94.7% ee. This enzymatic method could be useful for the industrial production of various d-amino acids. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  17. Application for Permit to Operate a Class III Solid Waste Disposal Site at the Nevada Test Site - U10c Disposal Site

    Energy Technology Data Exchange (ETDEWEB)

    NSTec Environmental Programs

    2010-08-05

    The NTS is located approximately 105 km (65 mi) northwest of Las Vegas, Nevada. NNSA/NSO is the federal lands management authority for the NTS and NSTec is the Management & Operations contractor. Access on and off the NTS is tightly controlled, restricted, and guarded on a 24-hour basis. The NTS is posted with signs along its entire perimeter. NSTec is the operator of all solid waste disposal sites on the NTS. The U10C Disposal Site is located in the northwest corner of Area 9 at the NTS (Figure 1) and is located in a subsidence crater created by two underground nuclear events, one in October 1962 and another in April 1964. The disposal site opened in 1971 for the disposal of rubbish, refuse, pathological waste, asbestos-containing material, and industrial solid waste. A Notice of Intent form to operate the disposal site as a Class II site was submitted to the state of Nevada on January 26, 1994, and was acknowledged in a letter to the DOE on February 8, 1994. It operated as a state of Nevada Class II Solid Waste Disposal Site (SWDS) until it closed on October 5, 1995, for retrofit as a Class III SWDS. The retrofit consisted of the installation of a minimum four-foot compacted soil layer to segregate the different waste types and function as a liner to inhibit leachate and water flow into the lower waste zone. Five neutron monitoring tubes were installed in this layer to monitor possible leachate production and water activity. Upon acceptance of the installed barrier and approval of an Operating Plan by NDEP/BFF, the site reopened in January 1996 as a Class III SWDS for the disposal of industrial solid waste and other inert waste.

  18. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book covers quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of matter and the meaning of analytical chemistry together with SI units, chemical equilibrium, basic preparations for quantitative analysis, an introduction to volumetric analysis, acid-base titration with an outline and experimental examples, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration and quantitative analysis.

  19. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative...... equational theory whose free algebras correspond to well known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...

  20. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  1. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects
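
    The digitize-interpret-display chain described above can be sketched compactly: convert the digitized film signal to optical density and render the density image in pseudo-colour. This is an illustrative Python sketch only; the 16-bit scanner range, the simple log-ratio density calibration and the matplotlib colormap are assumptions, not details of the system described in the report.

```python
import numpy as np
import matplotlib.pyplot as plt

def film_to_density(scan, white_level=65535.0):
    """Convert a digitized film scan (transmitted intensity) to optical
    density, OD = log10(I0 / I); zeros are clipped to avoid log(0)."""
    I = np.clip(scan.astype(float), 1.0, white_level)
    return np.log10(white_level / I)

# Hypothetical 16-bit digitized radiograph with a subtle density variation
rng = np.random.default_rng(0)
scan = (50000 + 3000 * rng.random((256, 256))).astype(np.uint16)
scan[100:140, 100:140] -= 4000          # a slightly denser (darker) region

od = film_to_density(scan)
plt.imshow(od, cmap="viridis")          # pseudo-colour image in density units
plt.colorbar(label="optical density")
plt.title("quantitative radiograph (pseudo-colour)")
plt.show()
```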

  2. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

    Quantitative Secondary Electron Detection (QSED) using an array of solid-state-device (SSD) based electron counters enables critical-dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. The methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid-state detectors: the array senses secondary electrons with a plurality of solid-state detectors and counts them with a time-to-digital converter circuit operating in counter mode.

  3. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. Drawing on huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, almost all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modification of proteins and metabolic and enzymatic isotope labeling.

  4. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? or the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of the (each) climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  5. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of ECB quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  6. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe3+/Fe2+ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
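
    The temperature-extrapolation idea can be illustrated with a short sketch: in the high-temperature (Debye) limit the recoil-free fraction, and hence the absorption area, falls off roughly exponentially with temperature, so ln(area) is approximately linear in T and areas measured at several temperatures can be extrapolated back to a common reference before site populations are compared. The numbers and the linear-in-T assumption below are illustrative only and are not claimed to reproduce the method's exact formulation.

```python
import numpy as np

# Hypothetical doublet areas (arb. units) for two iron sites at several temperatures (K)
T = np.array([80.0, 140.0, 200.0, 260.0, 300.0])
area_site1 = np.array([1.00, 0.88, 0.77, 0.67, 0.61])   # e.g. an Fe3+ site
area_site2 = np.array([0.50, 0.41, 0.33, 0.27, 0.24])   # e.g. an Fe2+ site in a softer lattice

def extrapolated_area(T, area):
    """Fit ln(area) ~ a + b*T and return exp(a), the area extrapolated toward
    vanishing <x^2> in the high-temperature Debye approximation."""
    b, a = np.polyfit(T, np.log(area), 1)
    return np.exp(a)

A1, A2 = extrapolated_area(T, area_site1), extrapolated_area(T, area_site2)
print(f"raw area ratio at 300 K : {area_site1[-1] / area_site2[-1]:.2f}")
print(f"extrapolated area ratio : {A1 / A2:.2f}   (proportional to the site population ratio)")
```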

  7. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  8. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  9. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with 99mTc-pyrophosphate and 99mTc-methylene diphosphonate in the years between 1977 and 1979. Course control examinations were carried out in 12 patients. The collective of patients presented with primary skeletal tumours, metastases, inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the "region of interest" technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as far as diagnosis, course control, and differential diagnosis are concerned. Quantitative skeletal scintiscanning by means of the "region of interest" technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG)

  10. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O`Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 µmol/100 g/min; right = 25.6 ± 7.0 µmol/100 g/min) was slightly reduced compared to the ipsilateral hemispherical rate (left = 30.4 ± 6.8 µmol/100 g/min; right = 29.5 ± 7.2 µmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  11. Quantitative FDG in depression

    International Nuclear Information System (INIS)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D.

    1998-01-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 µmol/100 g/min; right = 25.6 ± 7.0 µmol/100 g/min) was slightly reduced compared to the ipsilateral hemispherical rate (left = 30.4 ± 6.8 µmol/100 g/min; right = 29.5 ± 7.2 µmol/100 g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right-to-left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  12. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
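
    The birth-death-with-diffusing-trait model that underlies QuaSSE is straightforward to simulate forward in time, which is also how such methods are typically validated on simulated phylogenies. The sketch below evolves a clade whose speciation rate is a logistic function of a Brownian trait with constant extinction; all parameter values and the discrete-time approximation are illustrative assumptions, and the sketch is not the QuaSSE likelihood calculation itself.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_trait_dependent_bd(t_max=20.0, dt=0.01, x0=0.0,
                                sigma=0.1, mu=0.02,
                                lam_lo=0.02, lam_hi=0.12, x_mid=0.5, slope=5.0):
    """Forward simulation of a birth-death process in which each lineage carries a
    trait x evolving by Brownian motion and speciates at a logistic rate lambda(x)."""
    lam = lambda x: lam_lo + (lam_hi - lam_lo) / (1.0 + np.exp(-slope * (x - x_mid)))
    traits = [x0]                                   # one trait value per living lineage
    t = 0.0
    while t < t_max and 0 < len(traits) < 5000:
        traits = [x + sigma * np.sqrt(dt) * rng.standard_normal() for x in traits]
        next_gen = []
        for x in traits:
            u = rng.random()
            if u < lam(x) * dt:                     # speciation: two daughters inherit x
                next_gen += [x, x]
            elif u < (lam(x) + mu) * dt:            # extinction: lineage dropped
                continue
            else:
                next_gen.append(x)
        traits = next_gen
        t += dt
    return np.array(traits)

tips = simulate_trait_dependent_bd()
print(f"surviving lineages: {tips.size}, mean tip trait: {tips.mean():.2f}"
      if tips.size else "clade went extinct in this replicate")
```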

  13. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields
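
    The resputtering-loss model referred to above lends itself to a compact illustration. The sketch below integrates a deliberately simplified balance, dN/dφ = 1 − Yσ·N, in which every incident ion is retained but previously implanted atoms within the sputtered surface layer (characterised by an effective cross-section σ) can be lost with sputtering yield Y. This toy model and its parameter values are assumptions made for illustration; they are not the report's actual formulation or numerical data.

```python
import numpy as np

# Toy resputtering-loss model (illustrative only):
#   dN/dphi = 1 - Y * sigma * N
# N     : retained implanted atoms per cm^2
# phi   : incident ion dose per cm^2
# Y     : sputtering yield (atoms removed per incident ion)
# sigma : effective cross-section for an implanted atom to lie in the sputtered layer (cm^2)
Y, sigma = 2.0, 1.0e-15

def retained(phi):
    """Closed-form solution of the toy model: N(phi) = (1 - exp(-Y*sigma*phi)) / (Y*sigma)."""
    return (1.0 - np.exp(-Y * sigma * phi)) / (Y * sigma)

for dose in (1e12, 1e13, 1e14, 1e15):
    N = retained(dose)
    print(f"dose {dose:8.1e} ions/cm^2 -> retained {N:8.2e} /cm^2 "
          f"(loss {100.0 * (1.0 - N / dose):5.2f} %)")
```

    In this simplified picture the loss is negligible at small doses and grows toward saturation, consistent with the report's restriction to doses too small for significant resputtering loss.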

  14. Quantitative cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Thelen, M.; Dueber, C.; Wolff, P.; Erbel, R.; Hoffmann, T.

    1985-06-01

    The scope and limitations of quantitative cardiac CT have been evaluated in a series of experimental and clinical studies. The left ventricular muscle mass was estimated by computed tomography in 19 dogs (using volumetric methods, measurements in two axes and planes and reference volume). There was good correlation with anatomical findings. The enddiastolic volume of the left ventricle was estimated in 22 patients with cardiomyopathies; using angiography as a reference, CT led to systematic under-estimation. It is also shown that ECG-triggered magnetic resonance tomography results in improved visualisation and may be expected to improve measurements of cardiac morphology.

  15. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format.If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  16. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  17. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent-time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability or normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are: tumor volume, histopathologic differentiation and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)

  18. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through the description of the Micro-Videomat automatic image analysis system and its application to the volumetric percentage of pearlite in nodular cast irons, porosity and average grain size in high-density sintered pellets of UO2, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with the corresponding ones from the direct counting processes: counting of systematic points (grid) to measure volume fraction, and the intercept method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron resulted from the small difference in optical reflectivity between graphite and pearlite. Porosity evaluation of sintered UO2 pellets is also analyzed.
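
    The two manual reference procedures mentioned above, systematic point counting for volume fraction and the circular-intercept method for average grain size, reduce to very small calculations. The sketch below shows both on made-up counts; the grid size, circle radius, magnification and counted intersections are placeholder values, not data from the study.

```python
import math

# --- Systematic point counting (volume fraction) ----------------------------
# A regular grid of P points is overlaid on the micrograph; P_on_phase of them
# fall on the phase of interest.  Stereology gives  V_V ~ P_P = P_on_phase / P.
P_total, P_on_phase = 400, 57            # hypothetical counts
volume_fraction = P_on_phase / P_total
print(f"volume fraction of phase ~ {100 * volume_fraction:.1f} %")

# --- Circular-intercept method (mean grain size) ----------------------------
# A circle of known radius is overlaid and the grain-boundary intersections
# are counted.  Mean lineal intercept = (circumference / magnification) / N.
radius_mm, magnification, intersections = 40.0, 500.0, 63   # placeholders
circumference_mm = 2.0 * math.pi * radius_mm
mean_intercept_um = 1000.0 * circumference_mm / magnification / intersections
print(f"mean lineal intercept (grain size) ~ {mean_intercept_um:.1f} um")
```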

  19. Quantitive DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.

  20. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)


    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  1. Mechanical behavior of Fe75Mo5P10C7.5B2.5 bulk-metallic glass under torsional loading

    Energy Technology Data Exchange (ETDEWEB)

    Zhang Xinjian [School of Chemical Engineering and Technology, Tianjin University, 92 Weijin Road, Tianjin 300072 (China); Huang Lu [Department of Materials Science and Engineering, University of Tennessee, TN 37996 (United States); Key Laboratory of Aerospace Materials and Performance (Ministry of Education), School of Materials Science and Engineering, Beijing University of Aeronautics and Astronautics, Beijing 100191 (China); Chen Xu, E-mail: xchen@tju.edu.cn [School of Chemical Engineering and Technology, Tianjin University, 92 Weijin Road, Tianjin 300072 (China); Liaw, Peter K. [Department of Materials Science and Engineering, University of Tennessee, TN 37996 (United States); An Ke [Neutron Scattering Sciences Division, Oak Ridge National Laboratory, TN 37831 (United States); Zhang Tao [Key Laboratory of Aerospace Materials and Performance (Ministry of Education), School of Materials Science and Engineering, Beijing University of Aeronautics and Astronautics, Beijing 100191 (China); Wang Gongyao [Department of Materials Science and Engineering, University of Tennessee, TN 37996 (United States)

    2010-11-15

    Research highlights: (1) Fe75Mo5P10C7.5B2.5 bulk-metallic glass exhibits a brittle characteristic under torsional loading. (2) The BMG fails in a tensile mode under torsional loading. (3) A slight cyclic-hardening behavior was observed in the initial loading cycles during torsional-fatigue tests. (4) The torsional fatigue-fracture surface consists of three main regions. - Abstract: Pure- and cyclic-torsional studies were conducted on a Fe75Mo5P10C7.5B2.5 (atomic percent, at.%) bulk-metallic glass at room temperature for an understanding of its damage and fracture mechanisms. Under pure-torsional loading, the metallic glass exhibited very little plastic strain before fracture. The fracture initiated along the maximum tensile-stress plane, which is about 45° to the axial direction. The shear-fracture strength (~510 MPa) is much lower than the compressive-fracture strength (~3280 MPa), which suggests that different deformation mechanisms are present under various loading modes. Instead of an apparent vein-type structure, the fracture morphologies revealed a crack-initiation site, a mirror region, a mist region, and a hackle region. Under cyclic-torsional loading, fatigue cracks initiated from casting defects and propagated generally along the maximum tensile-stress plane. A slight cyclic-hardening behavior was observed in the initial loading steps. The fatigue-fracture surface consists of three main regions: the fatigue crack-initiation, crack-propagation, and final-fast-fracture areas. The striations resulting from the blunting and re-sharpening of the fatigue crack tip were observed in the crack-propagation region. Based on these results, the damage and fracture mechanisms of the metallic glass induced by torsional loading are elucidated.

  2. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model based methodology. This approach has been found to be very flexible and provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
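
    The calibration step described above, mapping semi-quantitative index scores onto a peer failure-rate distribution, can be sketched as a simple quantile mapping: rank the segments by likelihood score and assign each the failure frequency at the corresponding quantile of the peer distribution. The lognormal peer distribution and every number below are placeholder assumptions for illustration, not the authors' calibration procedure.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Hypothetical semi-quantitative likelihood scores for pipeline segments (higher = worse)
scores = rng.uniform(10.0, 90.0, size=200)

# Hypothetical peer failure-rate distribution (failures per km-year): lognormal,
# described by a median and a geometric standard deviation
peer_median, peer_gsd = 1.0e-4, 3.0

def peer_quantile(q):
    """Inverse CDF of the assumed lognormal peer failure-rate distribution."""
    return peer_median * np.exp(np.log(peer_gsd) * norm.ppf(q))

# Quantile mapping: a segment's rank by index score fixes its quantile in the peer distribution
ranks = scores.argsort().argsort()          # 0 = lowest (best) score
quantiles = (ranks + 0.5) / scores.size
failure_freq = peer_quantile(quantiles)     # calibrated failure frequency per segment

for p in (10, 50, 90):
    print(f"{p:2d}th percentile segment: {np.percentile(failure_freq, p):.2e} failures per km-yr")
```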

  3. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single photon. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  4. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. If you wish to perfectly take up the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  5. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.
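
    The "graphical" family of methods mentioned above can be illustrated with the Logan plot, one widely used graphical analysis for reversible tracers (chosen here as a representative example rather than the specific formulation reviewed in this article). The sketch below generates synthetic one-tissue-compartment data and recovers the total distribution volume VT from the slope of the late-time regression; all kinetic parameters and the t* cut-off are invented values.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

# --- Synthetic data: 1-tissue compartment model, dCt/dt = K1*Cp - k2*Ct ------
t = np.linspace(0.0, 90.0, 901)                            # minutes, fine grid
Cp = 100.0 * np.exp(-0.3 * t) + 8.0 * np.exp(-0.02 * t)    # plasma input (arb. units)
K1, k2 = 0.6, 0.05                                         # so VT = K1/k2 = 12
Ct = np.zeros_like(t)
dt = t[1] - t[0]
for i in range(1, t.size):                                 # simple Euler integration
    Ct[i] = Ct[i - 1] + dt * (K1 * Cp[i - 1] - k2 * Ct[i - 1])

# --- Logan graphical analysis ------------------------------------------------
# For t > t*:  int_0^T Ct / Ct(T)  =  VT * int_0^T Cp / Ct(T)  +  b
intCp = cumulative_trapezoid(Cp, t, initial=0.0)
intCt = cumulative_trapezoid(Ct, t, initial=0.0)
late = t > 30.0                                            # after pseudo-equilibrium
x = intCp[late] / Ct[late]
y = intCt[late] / Ct[late]
VT_est, b = np.polyfit(x, y, 1)
print(f"true VT = {K1 / k2:.1f}, Logan estimate = {VT_est:.2f}")
```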

  6. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    1969-03-15

    value are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  7. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    are useful for the differentiation of various renal diseases, however, qualitative analysis of the renogram with one or two parameters is not accurate. 3) In bilateral non-functioning kidney groups, a positive correlation between anemia and nitrogen retention was observed, although the quantitative assessment of the degree of non-functioning was impossible.

  8. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  9. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  10. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  11. Isolation of the cyanobacterial YFP-tagged photosystem I using GFP-Trap (R)

    Czech Academy of Sciences Publication Activity Database

    Strašková, Adéla; Knoppová, Jana; Komenda, Josef

    2018-01-01

    Roč. 56, č. 1 (2018), s. 300-305 ISSN 0300-3604 R&D Projects: GA MŠk(CZ) LO1416; GA MŠk(CZ) LM2015055 Institutional support: RVO:61388971 Keywords: assembly factor * pigment-protein complex * two-dimensional electrophoresis Subject RIV: EE - Microbiology, Virology OBOR OECD: Microbiology Impact factor: 1.507, year: 2016

  12. Synthesis and structural characterization of Al4SiC4-homeotypic aluminum silicon oxycarbide, [Al4.4Si0.6][O1.0C2.0]C

    International Nuclear Information System (INIS)

    Kaga, Motoaki; Iwata, Tomoyuki; Nakano, Hiromi; Fukuda, Koichiro

    2010-01-01

    A new quaternary layered oxycarbide, [Al4.39(5)Si0.61(5)]Σ5[O1.00(2)C2.00(2)]Σ3C, has been synthesized and characterized by X-ray powder diffraction, transmission electron microscopy and energy dispersive X-ray spectroscopy (EDX). The title compound was found to be hexagonal with space group P63/mmc, Z=2, and unit-cell dimensions a=0.32783(1) nm, c=2.16674(7) nm and V=0.20167(1) nm3. The atom ratios Al:Si were determined by EDX, and the initial structural model was derived by the direct methods. The final structural model showed the positional disordering of one of the three types of Al/Si sites. The maximum-entropy methods-based pattern fitting (MPF) method was used to confirm the validity of the split-atom model, in which conventional structure bias caused by assuming intensity partitioning was minimized. The reliability indices calculated from the MPF were Rwp=3.73% (S=1.20), Rp=2.94%, RB=1.04% and RF=0.81%. The crystal was an inversion twin. Each twin-related individual was isostructural with Al4SiC4 (space group P63mc, Z=2). - Graphical abstract: A new oxycarbide discovered in the Al-Si-O-C system, Al4SiC4-homeotypic [Al4.4Si0.6][O1.0C2.0]C. The crystal is an inversion twin, and hence the structure is represented by a split-atom model. The three-dimensional electron density distributions are determined by the maximum-entropy methods-based pattern fitting, being consistent with the disordered structural model.

  13. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  14. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  15. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [3H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against 3H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [14C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparisons of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)

  16. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives - solar in particular - that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative - especially with nonscience students or the general public - is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.

  17. Quantitative nature of overexpression experiments

    Science.gov (United States)

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  18. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    proteins involved in the Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide...

  19. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  20. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  1. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  2. La quantité en islandais moderne

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of modern Icelandic. From a phonological point of view it seems that nothing new can be expected, the theoretical possibilities having been practically exhausted, as we recalled in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is without doubt the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is, however, still premature to speak of true quantitative zones, since neither their limits nor their geographical extent are yet known.

  3. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  4. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  5. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  6. Quantitative (real-time) PCR

    International Nuclear Information System (INIS)

    Denman, S.E.; McSweeney, C.S.

    2005-01-01

    Many nucleic acid-based probe and PCR assays have been developed for the detection and tracking of specific microbes within the rumen ecosystem. Conventional PCR assays detect PCR products at the end stage of each PCR reaction, where exponential amplification is no longer being achieved. This approach can result in different end product (amplicon) quantities being generated. In contrast, using quantitative, or real-time PCR, quantification of the amplicon is performed not at the end of the reaction, but rather during exponential amplification, where theoretically each cycle will result in a doubling of the product being created. For real-time PCR, the cycle at which fluorescence is deemed to be detectable above the background during the exponential phase is termed the cycle threshold (Ct). The Ct values obtained are then used for quantitation, which will be discussed later
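
    As an illustration of how Ct values translate into relative abundance, the sketch below implements the widely used 2^(-ΔΔCt) (Livak) calculation. It is not taken from the record above: the gene roles, the Ct values and the assumption of roughly 100% amplification efficiency are all illustrative and would need to be verified (e.g. with a standard curve) in practice.

```python
# Hypothetical illustration: relative quantification from real-time PCR Ct values
# using the 2^(-ddCt) (Livak) method. Assumes ~100% amplification efficiency
# (i.e. a doubling of product per cycle), which should be checked with a
# standard curve in practice.

def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Fold change of the target relative to the control condition,
    normalized to a reference gene."""
    d_ct_sample = ct_target_sample - ct_ref_sample      # normalize sample to reference gene
    d_ct_control = ct_target_control - ct_ref_control   # normalize control to reference gene
    dd_ct = d_ct_sample - d_ct_control                  # difference between conditions
    return 2.0 ** (-dd_ct)                              # one cycle ~ a factor of two

# Illustrative Ct values (not from the record above)
print(fold_change(ct_target_sample=22.1, ct_ref_sample=18.0,
                  ct_target_control=24.6, ct_ref_control=18.2))  # ~5-fold increase
```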

  7. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.
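
    As a concrete example of the kind of quantity a FLIM-based FRET analysis yields, the sketch below computes FRET efficiency from donor fluorescence lifetimes measured with and without an acceptor (E = 1 - τ_DA/τ_D). The lifetime values are illustrative and not taken from the record above.

```python
# Hypothetical illustration: FRET efficiency from fluorescence lifetimes (FLIM).
# E = 1 - tau_DA / tau_D, where tau_DA is the donor lifetime in the presence of
# the acceptor and tau_D the donor-only lifetime. Values below are illustrative.

def fret_efficiency(tau_da_ns: float, tau_d_ns: float) -> float:
    """FRET efficiency from donor lifetimes measured with and without acceptor."""
    return 1.0 - tau_da_ns / tau_d_ns

print(fret_efficiency(tau_da_ns=1.8, tau_d_ns=2.6))  # ~0.31, i.e. ~31% energy transfer
```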

  8. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  9. Qualitative discussion of quantitative radiography

    International Nuclear Information System (INIS)

    Berger, H.; Motz, J.W.

    1975-01-01

    Since radiography yields an image that can be easily related to the tested object, it is superior to many nondestructive testing techniques in revealing the size, shape, and location of certain types of discontinuities. The discussion is limited to a description of the radiographic process, examination of some of the quantitative aspects of radiography, and an outline of some of the new ideas emerging in radiography. The advantages of monoenergetic x-ray radiography and neutron radiography are noted

  10. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a nondestructive testing method that furnishes quantitative information, permitting the detection and accurate localization of defects, the measurement of internal dimensions, and the measurement and mapping of the density distribution. CT is highly versatile, imposing no restrictions on the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  11. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author derives the equation relating coupler frequency deviation Δf and coupling coefficient β, instead of only giving the adjustment direction during coupler matching. Based on this equation, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  12. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method for the quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method permits evaluation of teaching efficiency within one group of students and comparison of teaching efficiency across two or more groups. Heterogeneity, stability and total variability indices, both for a single group and for comparing different groups, are used as the basic characteristics of teaching efficiency. The method is easy to use and permits ranking the results of teaching review which...

  13. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The size of some important classes is studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  14. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, the observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in a strict sense in vivo. Quantitative measurement with the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The kinetic model of the mapping tracer for central nervous receptors is discussed. Quantitative analysis using a steady-state kinetic model, the measurement of dopamine receptors by the reference method, the measurement of D2 receptors using 11C-raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  15. Quantitative Analysis of cardiac SPECT

    International Nuclear Information System (INIS)

    Nekolla, S.G.; Bengel, F.M.

    2004-01-01

    The quantitative analysis of myocardial SPECT images is a powerful tool to extract the highly specific radiotracer uptake in these studies. If compared to normal databases, the uptake values can be calibrated on an individual basis. Doing so increases the reproducibility of the analysis substantially. Based on the development over the last three decades, starting from planar scintigraphy, this paper discusses the methods used today, incorporating the changes due to tomographic image acquisition. Finally, the limitations of these approaches as well as consequences from the most recent hardware developments, commercial analysis packages and a wider view of the description of the left ventricle are discussed. (orig.)
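
    The comparison of uptake values against a normal database described above can be illustrated with a minimal sketch: segmental counts are normalized (here to the maximal segment) and expressed as z-scores relative to normal-database means and standard deviations. The segment counts and normal-database values below are invented for illustration and do not come from the paper.

```python
# Hypothetical illustration of comparing regional tracer uptake to a normal
# database: segment uptake is normalized to the maximal segment and expressed
# as a z-score relative to a normal mean and SD. All numbers are illustrative.
import numpy as np

def relative_uptake(segment_counts):
    counts = np.asarray(segment_counts, float)
    return 100.0 * counts / counts.max()          # percent of maximal uptake

def z_scores(rel_uptake, normal_mean, normal_sd):
    return (np.asarray(rel_uptake) - np.asarray(normal_mean)) / np.asarray(normal_sd)

segments = [510, 480, 300, 495]                    # counts in four myocardial segments
normal_mean = [95.0, 92.0, 90.0, 93.0]             # normal database (% of max)
normal_sd = [5.0, 6.0, 7.0, 5.5]

rel = relative_uptake(segments)
print(np.round(rel, 1))                                     # [100.   94.1  58.8  97.1]
print(np.round(z_scores(rel, normal_mean, normal_sd), 1))   # third segment clearly abnormal
```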

  16. Quantitative Trait Loci in Inbred Lines

    NARCIS (Netherlands)

    Jansen, R.C.

    2001-01-01

    Quantitative traits result from the influence of multiple genes (quantitative trait loci) and environmental factors. Detecting and mapping the individual genes underlying such 'complex' traits is a difficult task. Fortunately, populations obtained from crosses between inbred lines are relatively

  17. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  18. Operations management research methodologies using quantitative modeling

    NARCIS (Netherlands)

    Bertrand, J.W.M.; Fransoo, J.C.

    2002-01-01

    Gives an overview of quantitative model-based research in operations management, focusing on research methodology. Distinguishes between empirical and axiomatic research, and furthermore between descriptive and normative research. Presents guidelines for doing quantitative model-based research in

  19. Qualitative and quantitative modifications of root mitochondria during senescence of above-ground parts of Arabidopsis thaliana.

    Science.gov (United States)

    Fanello, Diego Darío; Bartoli, Carlos Guillermo; Guiamet, Juan José

    2017-05-01

    This work studied modifications experienced by root mitochondria during whole-plant senescence or under light deprivation, using Arabidopsis thaliana plants with mitochondria-targeted YFP. During post-bolting development, root respiratory activity started to decline after aboveground organs (i.e., rosette leaves) had senesced. This suggests that carbohydrate starvation may induce root senescence. Similarly, darkening the whole plant induced a decrease in root respiration. This was partially due to a decrease in the total number of mitochondria (YFP-labelled mitochondria) and most probably to a decrease in the number of mitochondria with a developed inner membrane potential (ΔΨm, i.e., MitoTracker Red-labelled mitochondria). Also, the lower number of mitochondria with ΔΨm compared with YFP-labelled mitochondria at 10 d of whole-plant darkening suggests the presence of mitochondria in a "standby state". The experiments also suggest that small mitochondria made the main contribution to the respiratory activity lost during root senescence. Sugar supplementation partially restored mitochondrial respiration after 10 d of whole-plant dark treatment. These results suggest that root senescence is triggered by carbohydrate starvation, with loss of ΔΨm mitochondria and changes in mitochondrial size distribution. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat

  1. Methods for Quantitative Creatinine Determination.

    Science.gov (United States)

    Moore, John F; Sharer, J Daniel

    2017-04-06

    Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have made modifications to the basic method to remove or account for these interfering substances. This appendix will summarize the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 by John Wiley & Sons, Inc.
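
    The stable isotope dilution approach mentioned above can be sketched as follows: the analyte/internal-standard peak-area ratio is mapped through a linear calibration fitted to standards of known concentration. All peak areas, calibration points and the resulting concentration below are illustrative, not values from the unit described.

```python
# Hypothetical illustration of stable isotope dilution quantification: the
# creatinine / labelled-internal-standard peak-area ratio is mapped through a
# linear calibration fitted to standards of known concentration.
import numpy as np

# Calibration standards: known concentration (mg/dL) vs. measured area ratio
cal_conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
cal_ratio = np.array([0.24, 0.51, 0.99, 2.52, 5.01])

slope, intercept = np.polyfit(cal_ratio, cal_conc, 1)   # concentration as f(area ratio)

def creatinine_mg_dl(area_analyte, area_internal_std):
    ratio = area_analyte / area_internal_std
    return slope * ratio + intercept

print(round(creatinine_mg_dl(area_analyte=8.4e5, area_internal_std=6.0e5), 2))  # ~2.8 mg/dL
```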

  2. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
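
    As a hedged illustration of the kind of computation such a baseline supports (this is not QRAS itself), the sketch below quantifies lowest-level scenarios stored as event-tree paths by multiplying conditional branch probabilities and then ranks them. The scenario names and probabilities are invented.

```python
# Hypothetical illustration (not QRAS itself): quantifying and ranking
# lowest-level scenarios stored in an event-tree structure. Each scenario is a
# path of conditional branch probabilities; its likelihood is their product.
from math import prod

scenarios = {
    # scenario name: conditional branch probabilities along the event-tree path
    "valve fails, backup works": [1e-3, 0.99],
    "valve fails, backup fails": [1e-3, 0.01],
    "sensor drift, undetected":  [5e-4, 0.10],
}

likelihoods = {name: prod(branches) for name, branches in scenarios.items()}
ranked = sorted(likelihoods.items(), key=lambda kv: kv[1], reverse=True)

for name, p in ranked:
    print(f"{name}: {p:.2e}")   # highest-likelihood scenario printed first
```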

  3. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  4. Quantitative information in medical imaging

    International Nuclear Information System (INIS)

    Deconinck, F.

    1985-01-01

    When developing new imaging or image processing techniques, one constantly has in mind that the new technique should provide a better, or more optimal, answer to medical tasks than existing techniques do. 'Better' or 'more optimal' imply some kind of standard by which one can measure imaging or image processing performance. The choice of a particular imaging modality to answer a diagnostic task, such as the detection of coronary artery stenosis, is also based on an implicit optimalisation of performance criteria. Performance is measured by the ability to provide information about an object (patient) to the person (referring doctor) who ordered a particular task. In medical imaging the task is generally to find quantitative information on bodily function (biochemistry, physiology) and structure (histology, anatomy). In medical imaging, a wide range of techniques is available. Each technique has its own characteristics. The techniques discussed in this paper are: nuclear magnetic resonance, X-ray fluorescence, scintigraphy, positron emission tomography, applied potential tomography, computerized tomography, and Compton tomography. This paper provides a framework for the comparison of imaging performance, based on the way the quantitative information flow is altered by the characteristics of the modality

  5. Digital radiography: a quantitative approach

    International Nuclear Information System (INIS)

    Retraint, F.

    2004-01-01

    In a radiograph the value of each pixel is related to the material thickness crossed by the x-rays. Using this relationship, an object can be characterized by parameters such as depth, surface and volume. Assuming a locally linear detector response and using a radiograph of a reference object, the quantitative thickness map of the object can be obtained by applying offset and gain corrections. However, for an acquisition system composed of a cooled CCD camera optically coupled to a scintillator screen, the radiographic image formation process generates biases which prevent the quantitative information from being obtained: non-uniformity of the x-ray source, beam hardening, Compton scattering, the scintillator screen, and the optical system response. In a first section, we propose a complete model of the radiographic image formation process taking these biases into account. In a second section, we present an inversion scheme of this model for a single-material object, which enables the thickness map of the object crossed by the x-rays to be obtained. (author)
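
    The offset and gain corrections mentioned above can be sketched, under the stated assumption of a locally linear detector response, as a standard dark/flat-field normalization followed by a single-material Beer-Lambert conversion to thickness. This is only an illustration, not the author's full inversion scheme; the attenuation coefficient and pixel values are invented.

```python
# Hypothetical illustration of an offset/gain correction under a locally linear
# detector response. 'dark' is an image with no exposure (offset) and 'flat' a
# reference exposure without the object (gain). This is a generic flat-field
# correction, not the author's full model inversion.
import numpy as np

def offset_gain_correct(raw, dark, flat, eps=1e-9):
    """Normalize a radiograph so pixel values approximate x-ray transmission."""
    gain = flat - dark
    return (raw - dark) / np.maximum(gain, eps)

def thickness_map(transmission, mu_per_mm, eps=1e-9):
    """Convert transmission to an equivalent thickness map (Beer-Lambert,
    single material, effective attenuation coefficient mu_per_mm in 1/mm)."""
    return -np.log(np.clip(transmission, eps, None)) / mu_per_mm

# Illustrative synthetic data
raw = np.full((4, 4), 900.0); dark = np.full((4, 4), 100.0); flat = np.full((4, 4), 1700.0)
t = offset_gain_correct(raw, dark, flat)     # transmission = 0.5 everywhere
print(thickness_map(t, mu_per_mm=0.05))      # ~13.9 mm equivalent thickness
```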

  6. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of the temperature distribution of an object's surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body's surface can contribute to the diagnosis and follow up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing the temperature distribution. Analysis and comparison of these indices allow the temperature distribution pattern of the back trunk to be evaluated against that expected in subjects who are healthy with regard to spinal problems.
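
    As an illustration of first-order statistical indices for symmetric ROIs (the exact indices used in the study are not reproduced here), the sketch below computes means, standard deviations and the left/right mean temperature difference for two invented ROIs.

```python
# Hypothetical illustration of first-order statistical indices for two symmetric
# ROIs on a thermogram. The temperatures (degC) are invented.
import numpy as np

def roi_indices(left_roi, right_roi):
    """Simple first-order statistics and a left/right asymmetry measure."""
    left, right = np.asarray(left_roi, float), np.asarray(right_roi, float)
    return {
        "mean_left": left.mean(), "mean_right": right.mean(),
        "std_left": left.std(ddof=1), "std_right": right.std(ddof=1),
        "delta_T": left.mean() - right.mean(),   # mean temperature difference
    }

left = [33.1, 33.4, 33.0, 33.6]
right = [32.2, 32.5, 32.4, 32.1]
print(roi_indices(left, right))   # delta_T ~ 1.0 degC between symmetric ROIs
```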

  7. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, offering the low cost and small size needed for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  8. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.

  9. Quantitative evaluation of dermatological antiseptics.

    Science.gov (United States)

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. © 2015 British Association of Dermatologists.
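
    For reference, the log10 reduction factor that the EN 1276 criterion refers to is computed from viable counts before and after exposure, as in the minimal sketch below; the counts are illustrative.

```python
# Hypothetical illustration: log10 reduction factor from viable counts
# before and after antiseptic exposure (counts below are illustrative).
import math

def log_reduction(cfu_initial: float, cfu_surviving: float) -> float:
    """Return the log10 reduction in viable count."""
    return math.log10(cfu_initial / cfu_surviving)

# A drop from 1e7 to 50 CFU/mL corresponds to a ~5.3 log10 reduction,
# which would meet the >= 5 log10 criterion described above.
print(log_reduction(1e7, 50))
```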

  10. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
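
    The relationship between the observed binary scale and the liability scale mentioned above is commonly applied as a heritability transformation, h²_liability = h²_observed · K(1-K)/z², with K the population prevalence and z the standard normal density at the liability threshold. The sketch below implements this classical transformation; the heritability and prevalence values are illustrative, and no ascertainment correction is included.

```python
# Hypothetical illustration: converting heritability on the observed (0/1) scale
# to the liability scale, h2_l = h2_obs * K * (1 - K) / z**2, where K is the
# population prevalence and z the standard normal density at the liability
# threshold. Values below are illustrative; no ascertainment correction applied.
from statistics import NormalDist

def observed_to_liability_h2(h2_obs: float, prevalence: float) -> float:
    nd = NormalDist()
    threshold = nd.inv_cdf(1.0 - prevalence)      # liability threshold
    z = nd.pdf(threshold)                         # normal density at the threshold
    return h2_obs * prevalence * (1.0 - prevalence) / z**2

print(observed_to_liability_h2(h2_obs=0.1, prevalence=0.10))  # ~0.29 on the liability scale
```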

  11. Quantitative Activities for Introductory Astronomy

    Science.gov (United States)

    Keohane, Jonathan W.; Bartlett, J. L.; Foy, J. P.

    2010-01-01

    We present a collection of short lecture-tutorial (or homework) activities, designed to be both quantitative and accessible to the introductory astronomy student. Each of these involves interpreting some real data, solving a problem using ratios and proportionalities, and making a conclusion based on the calculation. Selected titles include: "The Mass of Neptune” "The Temperature on Titan” "Rocks in the Early Solar System” "Comets Hitting Planets” "Ages of Meteorites” "How Flat are Saturn's Rings?” "Tides of the Sun and Moon on the Earth” "The Gliese 581 Solar System"; "Buckets in the Rain” "How Hot, Bright and Big is Betelgeuse?” "Bombs and the Sun” "What Forms Stars?” "Lifetimes of Cars and Stars” "The Mass of the Milky Way” "How Old is the Universe?” "Is The Universe Speeding up or Slowing Down?"

  12. Quantitative patterns in drone wars

    Science.gov (United States)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.

  13. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  14. Quantitative variation in natural populations

    International Nuclear Information System (INIS)

    Parsons, P.A.

    1975-01-01

    Quantitative variation is considered in natural populations using Drosophila as the example. A knowledge of such variation enables its rapid exploitation in directional selection experiments, as shown for scutellar chaeta number. Where evidence has been obtained, genetic architectures are in qualitative agreement with Mather's concept of balance for traits under stabilizing selection. Additive genetic control is found for acute environmental stresses, but not for less acute stresses, as shown by exposure to 60Co γ-rays. D. simulans probably has a narrower ecological niche than its sibling species D. melanogaster, associated with lower genetic heterogeneity. One specific environmental stress to which D. simulans is sensitive in nature is ethyl alcohol, as shown by winery data. (U.S.)

  15. Quantitative pulsed eddy current analysis

    International Nuclear Information System (INIS)

    Morris, R.A.

    1975-01-01

    The potential of pulsed eddy current testing for furnishing more information than conventional single-frequency eddy current methods has been known for some time. However, a fundamental problem has been analyzing the pulse shape with sufficient precision to produce accurate quantitative results. Accordingly, the primary goal of this investigation was to: demonstrate ways of digitizing the short pulses encountered in PEC testing, and to develop empirical analysis techniques that would predict some of the parameters (e.g., depth) of simple types of defect. This report describes a digitizing technique using a computer and either a conventional nuclear ADC or a fast transient analyzer; the computer software used to collect and analyze pulses; and some of the results obtained. (U.S.)

  16. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent –but often controversially discussed– in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and the end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference Risk Management Reloaded and this proceedings volume contribute to bridging the gap between academia –providing methodological advances– and practice –having a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  17. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand the driving forces of social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented toward recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  18. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  19. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods such as perfusion techniques, which measure flow, require the introduction of a tube assembly into the gastrointestinal tract, with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux

  20. Quantitative organ visualization using SPECT

    International Nuclear Information System (INIS)

    Kircos, L.T.; Carey, J.E. Jr.; Keyes, J.W. Jr.

    1987-01-01

    Quantitative organ visualization (QOV) was performed using single photon emission computed tomography (SPECT). Organ size was calculated from serial, contiguous ECT images taken through the organ of interest, with image boundaries determined using a maximum directional gradient edge-finding technique. Organ activity was calculated using ECT counts bounded by the directional gradient, imaging system efficiency, and imaging time. The technique used to perform QOV was evaluated using phantom studies, in vivo canine liver, spleen, bladder, and kidney studies, and in vivo human bladder studies. It was demonstrated that absolute organ activity and organ size could be determined with this system, with total imaging time restricted to less than 45 min, to an accuracy of about +/- 10%, provided the minimum dimensions of the organ are greater than the FWHM of the imaging system and the total radioactivity within the organ of interest exceeds 15 nCi/cc for dog-sized torsos. In addition, effective half-lives of approximately 1.5 hr or greater could be determined
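
    The two quantities described above - organ size from contiguous ECT slices and absolute activity from counts, system efficiency and imaging time - can be sketched as below. The gradient-based edge finding itself is not implemented; the segmentation mask, calibration factor and counts are illustrative.

```python
# Hypothetical illustration of the quantities described above: organ volume from
# the number of voxels inside the segmented boundary, and absolute activity from
# ECT counts, system efficiency and imaging time. 'organ_mask' is assumed to come
# from the edge-finding step, which is not implemented here.
import numpy as np

def organ_volume_ml(organ_mask, voxel_volume_ml):
    """Volume of the segmented organ in millilitres."""
    return organ_mask.sum() * voxel_volume_ml

def organ_activity_uci(counts_in_mask, system_efficiency_cps_per_uci, imaging_time_s):
    """Absolute activity (microcuries) from total counts within the organ boundary."""
    return counts_in_mask / (system_efficiency_cps_per_uci * imaging_time_s)

# Illustrative numbers
mask = np.zeros((64, 64, 16), dtype=bool); mask[20:40, 20:40, 4:12] = True
print(organ_volume_ml(mask, voxel_volume_ml=0.05))                     # 160 ml
print(organ_activity_uci(counts_in_mask=2.4e6,
                         system_efficiency_cps_per_uci=100.0,
                         imaging_time_s=1800.0))                       # ~13.3 uCi
```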

  1. Quantitative isotopes miction cystoureterography (QIMCU)

    International Nuclear Information System (INIS)

    Szy, D.A.G.; Stroetges, M.W.; Funke-Voelkers, R.

    1982-01-01

    A simple method for the quantitative evaluation of vesicoureteral reflux was developed. It allows the determination of a) the volume of reflux and b) the volume of the bladder at each point of time during the examination. The QIMCU gives an insight into the dynamics of reflux, reflux volume, and actual bladder volume. The clinical application in 37 patients with 53 insufficient ureteral orifices (i.e. reflux) showed that the onset of reflux occurred in 60% as early as the first five minutes of the examination, but later in the remaining 40%. The maximal reflux was found in only 26% during the first five minutes. The reflux volume exceeded 3.5 ml in more than 50% of cases. The international grading corresponds with the reflux volume determined by this method. Radionuclide cystoureterography can be used in children as well as in adults. Because the radiation exposure is low, the method can be recommended for the initial examination and for follow-up studies. (Author)

  2. A quantitative philology of introspection

    Directory of Open Access Journals (Sweden)

    Carlos eDiuk

    2012-09-01

    The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the "Axial Age", saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy - which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single "arrow of time" in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the 20th century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology, and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus.

  3. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, these five generally apply to most programs: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs.
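
    A minimal sketch of how indices (1)-(4) above could be computed from a table of dose records, assuming annual whole-body dose equivalents per worker; the worker records and job labels are invented for illustration.

        # Indices (1)-(4): mean individual dose equivalent (MIDE), its distribution,
        # cumulative collective dose, and MIDE by job classification. Records invented.
        from collections import defaultdict
        from statistics import mean, median, pstdev

        # (worker id, job classification, annual whole-body dose equivalent in mSv)
        records = [("w1", "operator", 1.2), ("w2", "operator", 0.8),
                   ("w3", "maintenance", 3.5), ("w4", "maintenance", 2.9),
                   ("w5", "health physics", 0.4)]

        doses = [dose for _, _, dose in records]
        print("MIDE (mSv):", round(mean(doses), 2))
        print("distribution: median", median(doses), "sd", round(pstdev(doses), 2))
        print("cumulative collective dose (person-mSv):", round(sum(doses), 1))

        by_job = defaultdict(list)
        for _, job, dose in records:
            by_job[job].append(dose)
        print("MIDE by job:", {job: round(mean(ds), 2) for job, ds in by_job.items()})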

  4. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    The formation of cognitive schemes of plant anatomy concepts is supported by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurements of plant anatomy guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy was assessed with a quantitative literacy test scored with the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy with a test according to Marzano, and additional course data with the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students is better than that of biology students.

  5. Quantitative Ultrasound Measurements at the Heel

    DEFF Research Database (Denmark)

    Daugschies, M.; Brixen, K.; Hermann, P.

    2015-01-01

    Calcaneal quantitative ultrasound can be used to predict osteoporotic fracture risk, but its ability to monitor therapy is unclear possibly because of its limited precision. We developed a quantitative ultrasound device (foot ultrasound scanner) that measures the speed of sound at the heel...... with the foot ultrasound scanner reduced precision errors by half (p quantitative ultrasound measurements is feasible. (E-mail: m.daugschies@rad.uni-kiel.de) (C) 2015 World Federation for Ultrasound in Medicine & Biology....

  6. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction Research in the area of health has traditionally been dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other ways to approach knowledge. Aim To discuss the complementarities of qualitative and quantitative research methods in the generation of knowledge. Contents The purpose of quantitative research is to measure the magnitude of an event,...

  7. Scientific Opinion on the substantiation of a health claim related to an equimolar mixture of the CLA isomers c9,t11 and t10,c12 (marketed as Clarinol® and Tonalin®) and “contributes to a reduction in body fat mass” pursuant to Article 13(5) of Regulation

    DEFF Research Database (Denmark)

    Tetens, Inge

    2015-01-01

    Following an application from BASF SE and Stepan Lipid Nutrition, submitted for the authorisation of a health claim pursuant to Article 13(5) of Regulation (EC) No 1924/2006 via the Competent Authority of the Netherlands, the EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA) was asked...... to deliver an opinion on the scientific substantiation of a health claim related to an equimolar mixture (marketed under the trade names Clarinol® and Tonalin®) of the two conjugated linoleic acid (CLA) isomers c9,t11 and t10,c12. The Panel considers that the food is sufficiently characterised. The claimed...... of the CLA isomers c9,t11 and t10,c12, marketed under the trade names of Clarinol® and Tonalin®, and a beneficial physiological effect....

  8. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In recent years, autoradiography has been developed into a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. The influences of irradiation quality, of backscattering in sample and detector materials, and of the sensitivity and fading of the detectors are considered. Furthermore, questions of the quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distributions in radioactive foil samples. (author)

  9. Quantitative PET of liver functions.

    Science.gov (United States)

    Keiding, Susanne; Sørensen, Michael; Frisch, Kim; Gormsen, Lars C; Munk, Ole Lajord

    2018-01-01

    Improved understanding of liver physiology and pathophysiology is urgently needed to assist the choice of new and upcoming therapeutic modalities for patients with liver diseases. In this review, we focus on functional PET of the liver: 1) Dynamic PET with 2-deoxy-2-[18F]fluoro-D-galactose (18F-FDGal) provides quantitative images of the hepatic metabolic clearance K_met (mL blood/min/mL liver tissue) of regional and whole-liver hepatic metabolic function. Standard-uptake-value (SUV) from a static liver 18F-FDGal PET/CT scan can replace K_met and is currently used clinically. 2) Dynamic liver PET/CT in humans with 11C-palmitate and with the conjugated bile acid tracer [N-methyl-11C]cholylsarcosine (11C-CSar) can distinguish between individual intrahepatic transport steps in hepatic lipid metabolism and in hepatic transport of bile acid from blood to bile, respectively, showing diagnostic potential for individual patients. 3) Standard compartment analysis of dynamic PET data can lead to physiological inconsistencies, such as a unidirectional hepatic clearance of tracer from blood (K_1; mL blood/min/mL liver tissue) greater than the hepatic blood perfusion. We developed a new microvascular compartment model with more physiology, by including tracer uptake into the hepatocytes from the blood flowing through the sinusoids, backflux from hepatocytes into the sinusoidal blood, and re-uptake along the sinusoidal path. Dynamic PET data include information on liver physiology which cannot be extracted using a standard compartment model. In conclusion, SUV of non-invasive static PET with 18F-FDGal provides a clinically useful measurement of regional and whole-liver hepatic metabolic function. Secondly, assessment of individual intrahepatic transport steps is a notable feature of dynamic liver PET.
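
    The standard-uptake-value mentioned here has a conventional definition: tissue activity concentration normalized to the injected dose per unit body weight, usually with the dose decay-corrected to scan time. The sketch below applies that convention with illustrative numbers; it is not taken from the review.

        # SUV = tissue activity concentration / (decay-corrected injected dose / body
        # weight), assuming a tissue density of 1 g/mL so that kBq/mL and kBq/g match.
        import math

        F18_HALF_LIFE_MIN = 109.8  # physical half-life of fluorine-18

        def decay_corrected_dose_MBq(injected_MBq, minutes_since_injection):
            return injected_MBq * math.exp(-math.log(2) * minutes_since_injection / F18_HALF_LIFE_MIN)

        def suv(tissue_kBq_per_mL, dose_at_scan_MBq, body_weight_kg):
            return tissue_kBq_per_mL / (dose_at_scan_MBq * 1000.0 / (body_weight_kg * 1000.0))

        dose = decay_corrected_dose_MBq(injected_MBq=200.0, minutes_since_injection=60.0)
        print(round(suv(tissue_kBq_per_mL=8.5, dose_at_scan_MBq=dose, body_weight_kg=75.0), 2))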

  11. Qualitative and quantitative analysis of detonation products

    International Nuclear Information System (INIS)

    Xie Yun

    2005-01-01

    Different sampling and injection methods were used when analyzing unknown detonation products in an obturator. The samples were analyzed by gas chromatography and gas chromatography/mass spectrometry. Qualitative analysis covered CO, NO, C2H2, C6H6 and so on; quantitative analysis covered C3H5N, C10H10, C8H8N2 and so on. The method used in the article is feasible. The results show that the detonation composition studied has a negative oxygen balance and that there were many pollutants in the detonation products. (authors)

  12. Validating quantitative precipitation forecast for the Flood ...

    Indian Academy of Sciences (India)

    In order to issue an accurate warning for flood, a better or appropriate quantitative forecasting of precipitation is required. In view of this, the present study intends to validate the quantitative precipitation forecast (QPF) issued during southwest monsoon season for six river catchments (basin) under the flood meteorological ...

  13. 78 FR 64202 - Quantitative Messaging Research

    Science.gov (United States)

    2013-10-28

    ... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... comments. Please submit your comments using only one method and identify that it is for the "Quantitative...

  14. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among the hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales is essential. Quantitative remote sensing plays an important role in this respect. The

  15. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can easily ensure traceability to the SI unit system, and discussions about its accuracy and uncertainty have also started. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual analysis cases in quantitative NMR. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it is possible to obtain precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method can easily prevent contamination of the samples and permits sample recovery, there are many reported cases related to the quantitative analysis of biologically related samples and highly scarce natural products for which the NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads widely, discussions are also progressing on how to utilize this analytical method as an official method in various countries around the world. In Japan, this method is listed in the Pharmacopoeia and the Japanese Standard of Food Additives, and it is also used as the official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analysis method that can ensure traceability to the SI unit system. (A.O.)
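
    For the internal reference method mentioned above, the commonly used qNMR relation converts an integral ratio into analyte purity via proton counts, molar masses and weighed-in masses. The sketch below assumes that standard relation; the integrals, masses and the choice of maleic acid as reference are illustrative only.

        # Sketch of the usual internal-standard qNMR relation: analyte purity from the
        # ratio of integrated signals, corrected for proton counts, molar masses and
        # weighed-in masses. All numbers are illustrative.
        def qnmr_purity(I_analyte, I_std, N_analyte, N_std,
                        M_analyte, M_std, m_analyte_mg, m_std_mg, purity_std):
            return ((I_analyte / I_std) * (N_std / N_analyte)
                    * (M_analyte / M_std) * (m_std_mg / m_analyte_mg) * purity_std)

        # e.g. maleic acid (2 protons, 116.07 g/mol, certified purity 0.999) as reference
        print(round(qnmr_purity(I_analyte=1.63, I_std=1.00, N_analyte=3, N_std=2,
                                M_analyte=180.16, M_std=116.07,
                                m_analyte_mg=15.0, m_std_mg=8.7, purity_std=0.999), 3))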

  16. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

    The quantitative imaging of a phase object using 16 keV x rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  17. A Primer on Disseminating Applied Quantitative Research

    Science.gov (United States)

    Bell, Bethany A.; DiStefano, Christine; Morgan, Grant B.

    2010-01-01

    Transparency and replication are essential features of scientific inquiry, yet scientific communications of applied quantitative research are often lacking in much-needed procedural information. In an effort to promote researchers dissemination of their quantitative studies in a cohesive, detailed, and informative manner, the authors delineate…

  18. Using Popular Culture to Teach Quantitative Reasoning

    Science.gov (United States)

    Hillyard, Cinnamon

    2007-01-01

    Popular culture provides many opportunities to develop quantitative reasoning. This article describes a junior-level, interdisciplinary, quantitative reasoning course that uses examples from movies, cartoons, television, magazine advertisements, and children's literature. Some benefits from and cautions to using popular culture to teach…

  19. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative...... geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships......) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each...

  20. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  1. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron absorbing honeycomb collimator. By setting the neutron absorbing honeycomb collimator between the object and the imaging system, neutrons scattered in the object were absorbed by the honeycomb material and eliminated before reaching the imaging system, whereas neutrons transmitted through the object without interaction could reach the imaging system. The image formed by purely transmitted neutrons gives the quantitative information. Two honeycombs were prepared with coatings of boron nitride and gadolinium oxide and evaluated for the quantitative application. The relation between the neutron total cross section and the attenuation coefficient confirmed that they were in fairly good agreement. Application to quantitative computed tomography was also successfully conducted. The new neutron radiography method using the neutron-absorbing honeycomb collimator for the elimination of scattered neutrons remarkably improved the quantitativeness of neutron radiography and computed tomography. (author)
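
    The cross-section check mentioned above rests on the standard relation that the macroscopic attenuation coefficient equals the number density times the microscopic total cross section, which in turn sets the purely transmitted fraction. The sketch below assumes that relation; the numbers are illustrative values roughly appropriate for water, not the study's measurements.

        # Macroscopic attenuation coefficient = number density x total cross section;
        # the unscattered (purely transmitted) fraction follows Beer-Lambert.
        import math

        AVOGADRO = 6.022e23

        def attenuation_coefficient_per_cm(density_g_cm3, molar_mass_g_mol, sigma_total_barn):
            number_density = density_g_cm3 / molar_mass_g_mol * AVOGADRO  # molecules per cm^3
            return number_density * sigma_total_barn * 1e-24              # 1 barn = 1e-24 cm^2

        def transmitted_fraction(mu_per_cm, thickness_cm):
            return math.exp(-mu_per_cm * thickness_cm)

        mu = attenuation_coefficient_per_cm(density_g_cm3=1.0, molar_mass_g_mol=18.0,
                                            sigma_total_barn=103.0)  # illustrative, thermal neutrons in water
        print(round(mu, 2), round(transmitted_fraction(mu, thickness_cm=0.5), 3))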

  2. Induced mutations for quantitative traits in rice

    International Nuclear Information System (INIS)

    Chakrabarti, B.N.

    1974-01-01

    The characteristics and frequency of micro-mutations induced in quantitative traits by radiation treatment, and the extent of the heterozygotic effects of different recessive chlorophyll-mutant genes on quantitative traits, are presented. Mutagenic treatments increased the variance for quantitative traits in all cases, although the magnitude of the increase varied depending on the treatment and the selection procedure adopted. The overall superiority of the chlorophyll-mutant heterozygotes over the corresponding wild homozygotes, as noted in two consecutive seasons, was not observed when these were grown at a high level of nitrogen fertiliser. (author)

  3. Quantitative determination of uranium by SIMS

    International Nuclear Information System (INIS)

    Kuruc, J.; Harvan, D.; Galanda, D.; Matel, L.; Aranyosiova, M.; Velic, D.

    2008-01-01

    The paper presents results of quantitative measurements of uranium-238 by secondary ion mass spectrometry (SIMS), with alpha spectrometry used as a complementary technique. Samples with a specific activity of uranium-238 were prepared by electrodeposition from an aqueous solution of UO2(NO3)2·6H2O. We tried to apply SIMS to quantitative analysis, searching for a correlation between the intensity obtained from SIMS and the activity of uranium-238 as a function of the surface weight, and exploring the possibility of using SIMS in the quantitative analysis of environmental samples. The obtained results and correlations, as well as the results of two real sample measurements, are presented in this paper. (authors)

  4. Replication of linkage to quantitative trait loci: variation in location and magnitude of the lod score.

    Science.gov (United States)

    Hsueh, W C; Göring, H H; Blangero, J; Mitchell, B D

    2001-01-01

    Replication of linkage signals from independent samples is considered an important step toward verifying the significance of linkage signals in studies of complex traits. The purpose of this empirical investigation was to examine the variability in the precision of localizing a quantitative trait locus (QTL) by analyzing multiple replicates of a simulated data set with the use of variance components-based methods. Specifically, we evaluated across replicates the variation in both the magnitude and the location of the peak lod scores. We analyzed QTLs whose effects accounted for 10-37% of the phenotypic variance in the quantitative traits. Our analyses revealed that the precision of QTL localization was directly related to the magnitude of the QTL effect. For a QTL with effect accounting for > 20% of total phenotypic variation, > 90% of the linkage peaks fall within 10 cM from the true gene location. We found no evidence that, for a given magnitude of the lod score, the presence of interaction influenced the precision of QTL localization.

  5. Comparative mapping reveals quantitative trait loci that affect spawning time in coho salmon (Oncorhynchus kisutch)

    Directory of Open Access Journals (Sweden)

    Cristian Araneda

    2012-01-01

    Full Text Available Spawning time in salmonids is a sex-limited quantitative trait that can be modified by selection. In rainbow trout (Oncorhynchus mykiss), various quantitative trait loci (QTL) that affect the expression of this trait have been discovered. In this study, we describe four microsatellite loci associated with two possible spawning time QTL regions in coho salmon (Oncorhynchus kisutch). The four loci were identified in females from two populations (early and late spawners) produced by divergent selection from the same base population. Three of the loci (OmyFGT34TUF, One2ASC and One19ASC) that were strongly associated with spawning time in coho salmon (p < 0.0002) were previously associated with QTL for the same trait in rainbow trout; a fourth locus (Oki10) with a suggestive association (p = 0.00035) mapped 10 cM from locus OmyFGT34TUF in rainbow trout. The changes in allelic frequency observed after three generations of selection were greater than expected from genetic drift alone. This work shows that comparing information from closely-related species is a valid strategy for identifying QTLs for marker-assisted selection in species whose genomes are poorly characterized or lack a saturated genetic map.

  6. Quantitative traits in wheat (Triticum aestivum L.)

    African Journals Online (AJOL)

    MSS

    2012-11-13

    Nov 13, 2012 ... Of the quantitative traits in wheat, spike length, number of spikes per m2, grain mass per spike, number ... design with four liming variants along with three replications, in which the experimental field .... The sampling was done.

  7. Quantitative Fundus Autofluorescence in Recessive Stargardt Disease

    OpenAIRE

    Burke, Tomas R.; Duncker, Tobias; Woods, Russell L.; Greenberg, Jonathan P.; Zernant, Jana; Tsang, Stephen H.; Smith, R. Theodore; Allikmets, Rando; Sparrow, Janet R.; Delori, François C.

    2014-01-01

    Quantitative fundus autofluorescence (qAF) is significantly increased in Stargardt disease, consistent with previous reports of increased RPE lipofuscin. QAF will help to establish genotype-phenotype correlations and may serve as an outcome measure in clinical trials.

  8. Quantitative Microbial Risk Assessment Tutorial - Primer

    Science.gov (United States)

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  9. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. A Quantitative Technique for Beginning Microscopists.

    Science.gov (United States)

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  11. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
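
    A generic sketch of the 5 x 5 risk matrix referenced above, assuming the common likelihood-times-consequence scoring; the thresholds, colour bands and example risks are invented for illustration and are not part of the paper.

        # 5 x 5 risk matrix: 1-5 likelihood times 1-5 consequence; thresholds illustrative.
        def risk_cell(likelihood, consequence):
            assert 1 <= likelihood <= 5 and 1 <= consequence <= 5
            score = likelihood * consequence
            if score >= 15:
                return score, "red"
            if score >= 6:
                return score, "yellow"
            return score, "green"

        risks = {"late hardware delivery": (4, 3),
                 "single-point staffing": (2, 5),
                 "minor documentation gaps": (2, 1)}
        for name, (likelihood, consequence) in risks.items():
            print(name, risk_cell(likelihood, consequence))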

  12. Quantitative data extraction from transmission electron micrographs

    International Nuclear Information System (INIS)

    Sprague, J.A.

    1982-01-01

    The discussion will cover an overview of quantitative TEM, the digital image analysis process, coherent optical processing, and finally a summary of the author's views on potentially useful advances in TEM image processing

  13. Quantitative Ability as Correlates of Students' Academic ...

    African Journals Online (AJOL)

    Nekky Umera

    The introduction of quantitative topics into the secondary school economics curriculum has ... since the quality of education at any level is highly dependent on the quality and dedication of ...

  14. Laboratory technique for quantitative thermal emissivity ...

    Indian Academy of Sciences (India)

    Emission of radiation from a sample occurs due to thermal vibration of its ...

  15. A Quantitative Gas Chromatographic Ethanol Determination.

    Science.gov (United States)

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)
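
    The record does not give the quantitation details, so the sketch below only illustrates one common approach, a linear calibration of peak area against standards of known volume percent; all standards, areas and the unknown value are invented.

        # Linear calibration of GC peak area against ethanol standards of known
        # volume percent, then inversion for an unknown. Requires Python 3.10+.
        from statistics import linear_regression

        standards_vol_percent = [2.0, 5.0, 10.0, 15.0, 20.0]
        standards_peak_area = [410, 1030, 2050, 3110, 4120]

        slope, intercept = linear_regression(standards_vol_percent, standards_peak_area)
        unknown_peak_area = 2560
        print(round((unknown_peak_area - intercept) / slope, 1), "vol % ethanol")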

  16. Qualitative vs. quantitative atopic dermatitis criteria

    DEFF Research Database (Denmark)

    Andersen, R M; Thyssen, J P; Maibach, H I

    2016-01-01

    This review summarizes historical aspects, clinical expression and pathophysiology leading to coining of the terms atopy and atopic dermatitis, current diagnostic criteria and further explore the possibility of developing quantitative diagnostic criteria of atopic dermatitis (AD) based on the imp...

  17. Strategies for quantitation of phosphoproteomic data

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Thingholm, Tine Engberg

    2010-01-01

    Recent developments in phosphoproteomic sample-preparation techniques and sensitive mass spectrometry instrumentation have led to large-scale identifications of phosphoproteins and phosphorylation sites from highly complex samples. This has facilitated the implementation of different quantitation...

  18. Quantitative Methods to Evaluate Timetable Attractiveness

    DEFF Research Database (Denmark)

    Schittenhelm, Bernd; Landex, Alex

    2009-01-01

    The article describes how the attractiveness of timetables can be evaluated quantitatively to ensure a consistent evaluation of timetables. Since the different key stakeholders (infrastructure manager, train operating company, customers, and society) have different opinions on what an attractive...

  19. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  20. Quantitative approaches in climate change ecology

    DEFF Research Database (Denmark)

    Brown, Christopher J.; Schoeman, David S.; Sydeman, William J.

    2011-01-01

    Contemporary impacts of anthropogenic climate change on ecosystems are increasingly being recognized. Documenting the extent of these impacts requires quantitative tools for analyses of ecological observations to distinguish climate impacts in noisy data and to understand interactions between...... climate variability and other drivers of change. To assist the development of reliable statistical approaches, we review the marine climate change literature and provide suggestions for quantitative approaches in climate change ecology. We compiled 267 peer‐reviewed articles that examined relationships...

  1. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two dimensional information

  2. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de

  3. Quantitative autoradiography of semiconductor base material

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1983-01-01

    Autoradiographic methods for the quantitative determination of elements of interest in semiconductor technology, and of their distribution in silicon, are described. Whereas the local concentration and distribution of phosphorus have been determined with the aid of silver halide films, neutron-induced autoradiography has been applied in the case of boron. Silicon disks containing diffused phosphorus or implanted or diffused boron have been used as standard samples. Different possibilities for the quantitative evaluation of autoradiograms are considered and compared.

  4. Quantitative methods in psychology: inevitable and useless

    Directory of Open Access Journals (Sweden)

    Aaro Toomela

    2010-07-01

    Full Text Available Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. Research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  5. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  6. Improved Protein Arrays for Quantitative Systems Analysis of the Dynamics of Signaling Pathway Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Chin-Rang [National Inst. of Health (NIH), Bethesda, MD (United States). National Heart, Lung and Blood Inst.

    2013-12-11

    Astronauts and workers in nuclear plants who are repeatedly exposed to low doses of ionizing radiation (IR, <10 cGy) are likely to incur specific changes in signal transduction and gene expression in various tissues of their body. Remarkable advances in high throughput genomics and proteomics technologies enable researchers to broaden their focus from examining single gene/protein kinetics to better understanding global gene/protein expression profiling and biological pathway analyses, namely Systems Biology. An ultimate goal of systems biology is to develop dynamic mathematical models of interacting biological systems capable of simulating living systems in a computer. This Glue Grant complements Dr. Boothman’s existing DOE grant (No. DE-FG02-06ER64186), entitled “The IGF1/IGF-1R-MAPK-Secretory Clusterin (sCLU) Pathway: Mediator of a Low Dose IR-Inducible Bystander Effect”, by developing sensitive and quantitative proteomic technology suitable for low dose radiobiology research. An improved version of a quantitative protein array platform utilizing linear Quantum dot signaling for systematically measuring protein levels and phosphorylation states for systems biology modeling is presented. The signals are amplified by a confocal laser Quantum dot scanner, resulting in ~1000-fold more sensitivity than traditional Western blots, and show good linearity, which is not achievable with HRP-amplified signals. This improved protein array technology is therefore suitable for detecting weak responses to low dose radiation. Software is developed to facilitate the quantitative readout of signaling network activities. Kinetics of EGFRvIII mutant signaling was analyzed to quantify cross-talk between EGFR and other signaling pathways.

  7. Qualitative versus quantitative methods in psychiatric research.

    Science.gov (United States)

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining their credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, it is considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of the research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitude and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed through quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of interpretative characteristics of qualitative methods as well as experimental dimensions of quantitative methods.

  8. Quantitative Appearance Inspection for Film Coated Tablets.

    Science.gov (United States)

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  9. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  10. The rise of quantitative methods in Psychology

    Directory of Open Access Journals (Sweden)

    Denis Cousineau

    2005-09-01

    Full Text Available Quantitative methods have a long history in some scientific fields. Indeed, no one today would consider a qualitative data set in physics or a qualitative theory in chemistry. Quantitative methods are so central in these fields that they are often labelled "hard sciences". Here, we examine the question whether psychology is ready to enter the "hard science club" like biology did in the forties. The facts that (a) over half of the statistical techniques used in psychology are less than 40 years old and that (b) the number of simulations in empirical papers has followed an exponential growth since the eighties both suggest that the answer is yes. The purpose of Tutorials in Quantitative Methods for Psychology is to provide concise and easy access to the current methods.

  11. Comparison between culture and a multiplex quantitative real-time polymerase chain reaction assay detecting Ureaplasma urealyticum and U. parvum.

    Science.gov (United States)

    Frølund, Maria; Björnelius, Eva; Lidbrink, Peter; Ahrens, Peter; Jensen, Jørgen Skov

    2014-01-01

    A novel multiplex quantitative real-time polymerase chain reaction (qPCR) for simultaneous detection of U. urealyticum and U. parvum was developed and compared with quantitative culture in Shepard's 10 C medium for ureaplasmas in urethral swabs from 129 men and 66 women, and cervical swabs from 61 women. Using culture as the gold standard, the sensitivity of the qPCR was 96% and 95% for female urethral and cervical swabs, respectively. In male urethral swabs the sensitivity was 89%. The corresponding specificities were 100%, 87% and 99%. The qPCR showed a linear increasing DNA copy number with increasing colour-changing units. Although slightly less sensitive than culture, this multiplex qPCR assay detecting U. urealyticum and U. parvum constitutes a simple and fast alternative to the traditional methods for identification of ureaplasmas and allows simultaneous species differentiation and quantitation in clinical samples. Furthermore, specimens overgrown by other bacteria using the culture method can be evaluated in the qPCR.
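
    A minimal sketch of how the reported sensitivity and specificity are derived from a 2 x 2 comparison against culture as the gold standard; the counts below are hypothetical and are not the study's data.

        # Sensitivity and specificity of qPCR against culture as gold standard,
        # from a 2 x 2 table. Hypothetical counts, not the study's data.
        def sensitivity_specificity(tp, fp, fn, tn):
            sensitivity = tp / (tp + fn)  # qPCR-positive among culture-positive samples
            specificity = tn / (tn + fp)  # qPCR-negative among culture-negative samples
            return sensitivity, specificity

        tp, fp, fn, tn = 48, 1, 2, 78
        sens, spec = sensitivity_specificity(tp, fp, fn, tn)
        print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")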

  13. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  14. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  15. Quantitative whole body scintigraphy - a simplified approach

    International Nuclear Information System (INIS)

    Marienhagen, J.; Maenner, P.; Bock, E.; Schoenberger, J.; Eilles, C.

    1996-01-01

    In this paper we present investigations on a simplified method of quantitative whole body scintigraphy using a dual-head LFOV gamma camera and a calibration algorithm, without the need for additional attenuation or scatter correction. Validation of this approach with an anthropomorphic phantom as well as in patient studies showed high accuracy for the quantification of whole body activity (102.8% and 97.72%, respectively); by contrast, organ activities were recovered with errors of up to 12%. The described method can be easily performed using commercially available software packages and is recommended especially for quantitative whole body scintigraphy in a clinical setting. (orig.) [de

  16. Aspects of quantitative secondary ion mass spectrometry

    International Nuclear Information System (INIS)

    Grauer, R.

    1982-05-01

    Parameters which have an influence on the formation of secondary ions by ion bombardment of a solid matrix are discussed. Quantitative SIMS analysis with the help of calibration standards necessitates stringent control of these parameters. This is particularly true for the oxygen partial pressure, which for metal analysis has to be kept constant even under ultra-high vacuum. The performance of the theoretical LTE model (Local Thermal Equilibrium) using internal standards is compared with analysis with the help of external standards. The LTE model does not satisfy the requirements for quantitative analysis. (Auth.)

  17. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible, when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, of which half were significant. For the reproducibility study, a group of 9 soil scientists and 7
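
    The validation step described above reduces to correlating a visual score with a standardized measurement and fitting a regression line. The sketch below shows that calculation with invented data; it does not reproduce the study's observations.

        # Correlate a visual score with a laboratory measurement and fit a line.
        # Requires Python 3.10+ for statistics.correlation / linear_regression.
        from statistics import correlation, linear_regression

        visual_root_score = [2, 3, 5, 4, 1, 5, 3, 2]          # e.g. number-of-roots class
        root_dry_weight_g = [0.8, 1.1, 2.0, 1.6, 0.5, 2.3, 1.2, 0.9]

        r = correlation(visual_root_score, root_dry_weight_g)
        slope, intercept = linear_regression(visual_root_score, root_dry_weight_g)
        print(round(r, 2), round(slope, 2), round(intercept, 2))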

  18. Review of progress in quantitative nondestructive evaluation

    International Nuclear Information System (INIS)

    Thompson, D.O.; Chimenti, D.E.

    1983-01-01

    A comprehensive review of the current state of quantitative nondestructive evaluation (NDE), this volume brings together papers by researchers working in government, private industry, and university laboratories. Their papers cover a wide range of interests and concerns for researchers involved in theoretical and applied aspects of quantitative NDE. Specific topics examined include: reliability and probability of detection (ultrasonics and eddy currents); weldments; closure effects in fatigue cracks; technology transfer; ultrasonic scattering theory; acoustic emission; ultrasonic scattering, reliability and penetrating radiation; metal matrix composites; ultrasonic scattering from near-surface flaws; and ultrasonic multiple scattering.

  19. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibanez, Noelia; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp......

  20. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-Ray Diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
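
    For the Internal Standard method listed above, a commonly used form relates the intensity ratio of an analyte peak and a standard peak to their weight-fraction ratio via a calibration constant determined from a mixture of known composition. The sketch below assumes that form; the intensity ratios and weight fractions are illustrative, not values from the paper.

        # Internal-standard method: I_analyte / I_standard = K * (w_analyte / w_standard),
        # with K taken from a calibration mixture of known composition. Values illustrative.
        def calibration_constant(intensity_ratio_cal, w_analyte_cal, w_standard_cal):
            return intensity_ratio_cal * w_standard_cal / w_analyte_cal

        def analyte_weight_fraction(intensity_ratio, w_standard, K):
            return intensity_ratio * w_standard / K

        K = calibration_constant(intensity_ratio_cal=1.8, w_analyte_cal=0.50, w_standard_cal=0.20)
        print(round(analyte_weight_fraction(intensity_ratio=1.1, w_standard=0.20, K=K), 3))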

  1. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
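
    A minimal sketch of the quantity described above: each measured voltage difference divided by the known antenna separation gives a local field estimate along the array. The positions and voltages below are illustrative, not data from the patent.

        # Each voltage difference divided by the known antenna spacing gives a local
        # field estimate along the one-dimensional array. Values illustrative.
        antenna_positions_m = [0.00, 0.05, 0.10, 0.15, 0.20]
        antenna_voltages_V = [0.00, 0.62, 1.21, 1.83, 2.40]

        field_estimates_V_per_m = [
            (v2 - v1) / (x2 - x1)
            for (x1, v1), (x2, v2) in zip(zip(antenna_positions_m, antenna_voltages_V),
                                          zip(antenna_positions_m[1:], antenna_voltages_V[1:]))
        ]
        print([round(e, 1) for e in field_estimates_V_per_m])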

  2. Quantitative Mapping of Large Area Graphene Conductance

    DEFF Research Database (Denmark)

    Buron, Jonas Christian Due; Petersen, Dirch Hjorth; Bøggild, Peter

    2012-01-01

    We present quantitative mapping of large area graphene conductance by terahertz time-domain spectroscopy and micro four point probe. We observe a clear correlation between the techniques and identify the observed systematic differences to be directly related to imperfections of the graphene sheet...

  3. The Sampling Issues in Quantitative Research

    Science.gov (United States)

    Delice, Ali

    2010-01-01

    A concern for generalization dominates quantitative research. For generalizability and repeatability, identification of sample size is essential. The present study investigates 90 qualitative master's theses submitted for the Primary and Secondary School Science and Mathematics Education Departments, Mathematic Education Discipline in 10…

  4. Critical Race Quantitative Intersections: A "testimonio" Analysis

    Science.gov (United States)

    Covarrubias, Alejandro; Nava, Pedro E.; Lara, Argelia; Burciaga, Rebeca; Vélez, Verónica N.; Solorzano, Daniel G.

    2018-01-01

    The educational pipeline has become a commonly referenced depiction of educational outcomes for racialized groups across the country. While visually impactful, an overreliance on decontextualized quantitative data often leads to majoritarian interpretations. Without sociohistorical contexts, these interpretations run the risk of perpetuating…

  5. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  6. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed

  7. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánez, N.; Sorensen, Daniel

    2006-01-01

    Given observations of a trait and a pedigree for a group of animals, the basic model in quantitative genetics is a linear mixed model with genetic random effects. The correlation matrix of the genetic random effects is determined by the pedigree and is typically very high-dimensional but with a sp...

  8. Proteomic approaches for quantitative cancer cell signaling

    DEFF Research Database (Denmark)

    Voellmy, Franziska

    studies in an effort to contribute to the study of signaling dynamics in cancer systems. This thesis is divided into two parts. Part I begins with a brief introduction in the use of omics in systems cancer research with a focus on mass spectrometry as a means to quantitatively measure protein...

  9. Quantitative analyses of shrinkage characteristics of neem ...

    African Journals Online (AJOL)

    Quantitative analyses of shrinkage characteristics of neem (Azadirachta indica A. Juss.) wood were carried out. Forty five wood specimens were prepared from the three ecological zones of north eastern Nigeria, viz: sahel savanna, sudan savanna and guinea savanna for the research. The results indicated that the wood ...

  10. Quantitative multiplex detection of pathogen biomarkers

    Energy Technology Data Exchange (ETDEWEB)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.

    2016-02-09

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  11. Quantitative angiography after directional coronary atherectomy

    NARCIS (Netherlands)

    P.W.J.C. Serruys (Patrick); V.A.W.M. Umans (Victor); B.H. Strauss (Bradley); R-J. van Suylen (Robert-Jan); M.J.B.M. van den Brand (Marcel); H. Suryapranata (Harry); P.J. de Feyter (Pim); J.R.T.C. Roelandt (Jos)

    1991-01-01

    OBJECTIVE: To assess by quantitative analysis the immediate angiographic results of directional coronary atherectomy. To compare the effects of successful atherectomy with those of successful balloon dilatation in a series of patients with matched lesions. DESIGN--Case series.

  12. Deforestation since independence: A quantitative assessment of ...

    African Journals Online (AJOL)

    Deforestation since independence: A quantitative assessment of four decades of land-cover change in Malawi. ... pressure and demographic factors are important predictors of deforestation rate within our study area. Keywords: afforestation, Africa, deforestation, drivers, land-use change, reforestation, rural, urban ...

  13. Quantitative SPECT reconstruction of iodine-123 data

    International Nuclear Information System (INIS)

    Gilland, D.R.; Jaszczak, R.J.; Greer, K.L.; Coleman, R.E.

    1991-01-01

    Many clinical and research studies in nuclear medicine require quantitation of iodine-123 (123I) distribution for the determination of kinetics or localization. The objective of this study was to implement several reconstruction methods designed for single-photon emission computed tomography (SPECT) using 123I and to evaluate their performance in terms of quantitative accuracy, image artifacts, and noise. The methods consisted of four attenuation and scatter compensation schemes incorporated into both the filtered backprojection/Chang (FBP) and maximum likelihood-expectation maximization (ML-EM) reconstruction algorithms. The methods were evaluated on data acquired from a phantom containing a hot sphere of 123I activity in a lower level background 123I distribution and nonuniform density media. For both reconstruction algorithms, nonuniform attenuation compensation combined with either scatter subtraction or Metz filtering produced images that were quantitatively accurate to within 15% of the true value. The ML-EM algorithm demonstrated quantitative accuracy comparable to FBP and smaller relative noise magnitude for all compensation schemes.
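
    The record above compares FBP with ML-EM reconstruction. As a reminder of what the basic ML-EM update looks like, here is a minimal, generic sketch for a linear system model; it omits the attenuation and scatter compensation schemes that are the actual subject of the study, so it is illustrative only.

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Generic ML-EM reconstruction for emission tomography.
    A : (n_measurements, n_voxels) system matrix, y : measured counts."""
    x = np.ones(A.shape[1])                      # flat initial estimate
    sensitivity = A.T @ np.ones(A.shape[0])      # back-projection of ones
    for _ in range(n_iter):
        expected = A @ x                         # forward projection
        ratio = y / np.maximum(expected, eps)    # measured / expected counts
        x *= (A.T @ ratio) / np.maximum(sensitivity, eps)
    return x
```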

  14. Values in Qualitative and Quantitative Research

    Science.gov (United States)

    Duffy, Maureen; Chenail, Ronald J.

    2008-01-01

    The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…

  15. 78 FR 52166 - Quantitative Messaging Research

    Science.gov (United States)

    2013-08-22

    COMMODITY FUTURES TRADING COMMISSION. Quantitative Messaging Research. AGENCY: Commodity Futures Trading Commission. The survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...

  16. Quantitative grading of store separation trajectories

    CSIR Research Space (South Africa)

    Jamison, Kevin A

    2017-09-01

    Full Text Available. This paper describes the development of an automated analysis process and software that can run a multitude of separation scenarios. A key enabler for this software is the development of a quantitative grading algorithm that scores the outcome of each release...

  17. Subjective Quantitative Studies of Human Agency

    Science.gov (United States)

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  18. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  19. Quantitative X-ray analysis of pigments

    International Nuclear Information System (INIS)

    Araujo, M. Marrocos de

    1987-01-01

    The 'matrix-flushing' and the 'adiabatic principle' methods have been applied to the quantitative analysis, through X-ray diffraction patterns, of pigment and extender mixtures frequently used in the paint industry. The results obtained show the usefulness of these methods, but their accuracy still needs improvement. (Author)

  20. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms, to take into account the behavior of systems in the presence of approximate, or quantitative information, provided...

  1. Quantitative multiplex detection of pathogen biomarkers

    Science.gov (United States)

    Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K

    2014-10-14

    The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules, e.g., pathogen biomarkers, through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multi-sensor elements per channel.

  2. A quantitative lubricant test for deep drawing

    DEFF Research Database (Denmark)

    Olsson, David Dam; Bay, Niels; Andreasen, Jan L.

    2010-01-01

    A tribological test for deep drawing has been developed by which the performance of lubricants may be evaluated quantitatively measuring the maximum backstroke force on the punch owing to friction between tool and workpiece surface. The forming force is found not to give useful information...

  3. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Ferguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  4. Quantitative muscle ultrasonography in amyotrophic lateral sclerosis.

    NARCIS (Netherlands)

    Arts, I.M.P.; Rooij, F.G. van; Overeem, S.; Pillen, S.; Janssen, H.M.; Schelhaas, H.J.; Zwarts, M.J.

    2008-01-01

    In this study, we examined whether quantitative muscle ultrasonography can detect structural muscle changes in early-stage amyotrophic lateral sclerosis (ALS). Bilateral transverse scans were made of five muscles or muscle groups (sternocleidomastoid, biceps brachii/brachialis, forearm flexor group,

  5. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Pieters, W.; Arnold, F.; Stoelinga, M.I.A.

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Therefore, penetration testing has thus far been used as a qualitative research method. To enable quantitative approaches to security risk management,

  6. QUANTITATIVE EXTRACTION OF MEIOFAUNA: A COMPARISON ...

    African Journals Online (AJOL)

    Two methods for the quantitative extraction of meiofauna from natural sandy sediments were investigated and compared: Cobb's decanting and sieving technique and the Oostenbrink elutriator. Both...

  7. Development of Three Methods for Simultaneous Quantitative ...

    African Journals Online (AJOL)

    Development of Three Methods for Simultaneous Quantitative Determination of Chlorpheniramine Maleate and Dexamethasone in the Presence of Parabens in ... Tropical Journal of Pharmaceutical Research ... Results: All the proposed methods were successfully applied to the analysis of raw materials and dosage form.

  8. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code for coarse network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  9. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.

  10. Quantitative image analysis of synovial tissue

    NARCIS (Netherlands)

    van der Hall, Pascal O.; Kraan, Maarten C.; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the

  11. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  12. Uncertainties in elemental quantitative analysis by PIXE

    International Nuclear Information System (INIS)

    Montenegro, E.C.; Baptista, G.B.; Paschoa, A.S.; Barros Leite, C.V.

    1979-01-01

    The effects of the degree of non-uniformity of the particle beam, matrix composition and matrix thickness in a quantitative elemental analysis by particle induced X-ray emission (PIXE) are discussed and a criterion to evaluate the resulting degree of uncertainty in the mass determination by this method is established. (Auth.)

  13. Quantitative and Qualitative Extensions of Event Structures

    NARCIS (Netherlands)

    Katoen, Joost P.

    1996-01-01

    An important application of formal methods is the specification, design, and analysis of functional aspects of (distributed) systems. Recently the study of quantitative aspects of such systems based on formal methods has come into focus. Several extensions of formal methods where the occurrence of

  14. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior

  15. Quantitative Penetration Testing with Item Response Theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle Ida Antoinette

    2014-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  16. Quantitative penetration testing with item response theory

    NARCIS (Netherlands)

    Arnold, Florian; Pieters, Wolter; Stoelinga, Mariëlle

    2013-01-01

    Existing penetration testing approaches assess the vulnerability of a system by determining whether certain attack paths are possible in practice. Thus, penetration testing has so far been used as a qualitative research method. To enable quantitative approaches to security risk management, including

  17. Engaging Business Students in Quantitative Skills Development

    Science.gov (United States)

    Cronin, Anthony; Carroll, Paula

    2015-01-01

    In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…

  18. Leaderless Covert Networks : A Quantitative Approach

    NARCIS (Netherlands)

    Husslage, B.G.M.; Lindelauf, R.; Hamers, H.J.M.

    2012-01-01

    Abstract: Lindelauf et al. (2009a) introduced a quantitative approach to investigate optimal structures of covert networks. This approach used an objective function which is based on the secrecy versus information trade-off these organizations face. Sageman (2008) hypothesized that covert networks

  19. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m²) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging (T1w, T2w, FIESTA) and quantitative imaging (2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE)). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) showed statistically significant differences in the quantitative parenchymal measures, including lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2); reproducible quantitative measurements were obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE, indicating the potential for detecting and following renal disease at an earlier stage than the conventional qualitative imaging techniques.
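
    The reproducibility figures quoted above are mean ± SD absolute percent differences between paired scans. A minimal sketch of that calculation is given below; the choice of the pair mean as the denominator is an assumption (a common convention), since the abstract does not state which reference value was used.

```python
import numpy as np

def absolute_percent_difference(scan1, scan2):
    """Test-retest reproducibility: mean and SD of the absolute percent
    difference between paired measurements acquired >= 24 h apart."""
    a = np.asarray(scan1, dtype=float)
    b = np.asarray(scan2, dtype=float)
    apd = 100.0 * np.abs(a - b) / ((a + b) / 2.0)   # percent of the pair mean (assumed)
    return apd.mean(), apd.std()
```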

  20. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  1. Bringing quality and meaning to quantitative data - Bringing quantitative evidence to qualitative observation

    DEFF Research Database (Denmark)

    Karpatschof, Benny

    2007-01-01

    Based on the author's methodological theory defining the distinctive properties of quantitative and qualitative methods, the article demonstrates the possibilities and advantages of combining the two types of investigation in the same research project. The project being an effect study...

  2. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high contrast imaging capability, we designed a quantitative phase microscope using the transport of intensity equation method based on a smartphone. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases, and then red blood cell smear, Pap smear, broad bean epidermis sections and monocot root were also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone based quantitative phase microscope is a promising tool which could be adopted in the future for remote healthcare and medical diagnosis.
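
    For readers unfamiliar with the transport-of-intensity equation (TIE) mentioned above, the sketch below shows the standard FFT-based solution under a uniform-intensity approximation, recovering phase from two defocused images. It is a generic textbook formulation, not the authors' Android implementation; all parameter names are illustrative.

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pixel_size, eps=1e-6):
    """Recover phase from two defocused intensity images via the
    transport-of-intensity equation, assuming a nearly uniform intensity I0.
    I_minus, I_plus : images at -dz/2 and +dz/2 defocus (consistent units)."""
    k = 2.0 * np.pi / wavelength                  # wavenumber
    dIdz = (I_plus - I_minus) / dz                # axial intensity derivative
    i0 = 0.5 * (I_plus + I_minus).mean()          # uniform-intensity approximation

    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    q2 = 4.0 * np.pi**2 * (FX**2 + FY**2)         # Fourier-space (-Laplacian) kernel

    # Solve  laplacian(phi) = -(k / I0) * dI/dz  with an FFT Poisson solver
    phi_hat = (k / i0) * np.fft.fft2(dIdz) / (q2 + eps)
    return np.real(np.fft.ifft2(phi_hat))
```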

  3. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  4. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in the situation where qualitative and quantitative measures are applied. Game theory was referred to investigate the relation of detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, Analytical Hierarchy Process. Then, it became possible to estimate detection probability under integrated safeguards which had equivalent deterrence capability for detection probability under conventional safeguards. In addition the distribution of inspection effort for qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful to reconsider the detection probabilities under integrated safeguards. (author)

  5. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  6. Credit Institutions Management Evaluation using Quantitative Methods

    Directory of Open Access Journals (Sweden)

    Nicolae Dardac

    2006-02-01

    Full Text Available The supervision of credit institutions by state authorities is mostly assimilated with systemic risk prevention. At present, the mission is oriented toward analyzing the risk profile of the credit institutions, the mechanisms and existing systems as management tools providing to bank rules the proper instruments to avoid and control specific bank risks. Rating systems are sophisticated measurement instruments capable of assuring the above objectives, such as success in banking risk management. Management quality is one of the most important elements in the set of variables used in the quoting process in credit operations. Evaluation of this quality is, generally speaking, founded on qualitative appreciations, which can induce subjectivism and heterogeneity in quotation. The problem can be solved by using, complementarily, quantitative techniques such as DEA (Data Envelopment Analysis).

  7. Quantitative analysis of thallium-201 myocardial scintigraphy

    International Nuclear Information System (INIS)

    Kanemoto, Nariaki; Hoer, G.; Johost, S.; Maul, F.-D.; Standke, R.

    1981-01-01

    The method of quantitative analysis of thallium-201 myocardial scintigraphy using a computer-assisted technique is described. The calculated indices are the washout factor, vitality index and redistribution factor. The washout factor is the ratio of counts at a certain period of time after exercise to the counts immediately after exercise. This value is necessary for the evaluation of redistribution to the ischemic areas in serial imaging, to correct for the Tl-201 washout from the myocardium under the assumption that the washout is constant over the whole myocardium. The vitality index is the ratio between the Tl-201 uptake in the region of interest and that of the maximum. The redistribution factor is the ratio of the redistribution in the region of interest in serial imaging after exercise to that immediately after exercise. Four examples of exercise Tl-201 myocardial scintigrams and the quantitative analyses before and after percutaneous transluminal coronary angioplasty are presented. (author)
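
    The three indices defined above are simple count ratios. The sketch below writes them out; applying the washout factor as a correction inside the redistribution factor is an assumption based on the description, since the exact correction formula is not given in the abstract.

```python
def washout_factor(counts_delayed, counts_initial):
    # Ratio of myocardial counts at a later time after exercise
    # to the counts immediately after exercise.
    return counts_delayed / counts_initial

def vitality_index(roi_uptake, max_uptake):
    # Tl-201 uptake in a region of interest relative to the maximum uptake.
    return roi_uptake / max_uptake

def redistribution_factor(roi_delayed, roi_initial, washout):
    # Delayed-to-initial ROI ratio, corrected for global washout
    # (assumed constant over the whole myocardium, as stated above).
    return (roi_delayed / washout) / roi_initial
```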

  8. Nanostructured surfaces investigated by quantitative morphological studies

    International Nuclear Information System (INIS)

    Perani, Martina; Carapezzi, Stefania; Mutta, Geeta Rani; Cavalcoli, Daniela

    2016-01-01

    The morphology of different surfaces has been investigated by atomic force microscopy and quantitatively analyzed in this paper. Two different tools have been employed to this end: the analysis of the height–height correlation function and the determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiOxNy, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure can be of general use. Our results show that the quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from the comparison of the lateral correlation length and the grain size. (paper)
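
    The height–height correlation function used above is defined as H(r) = ⟨[h(x + r) − h(x)]²⟩. The sketch below estimates it along the fast-scan direction of an AFM height map; this is a generic estimator, not the authors' analysis code.

```python
import numpy as np

def height_height_correlation(h, pixel_size):
    """Estimate H(r) = <(h(x + r) - h(x))^2> row-by-row from a 2-D AFM
    height map, returning lag distances and the corresponding H values."""
    ny, nx = h.shape
    lags = np.arange(1, nx // 2)
    H = np.empty(lags.size)
    for i, lag in enumerate(lags):
        diff = h[:, lag:] - h[:, :-lag]          # height differences at this lag
        H[i] = np.mean(diff ** 2)
    return lags * pixel_size, H
```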

  9. Quantitative phosphoproteomics to characterize signaling networks

    DEFF Research Database (Denmark)

    Rigbolt, Kristoffer T G; Blagoev, Blagoy

    2012-01-01

    Reversible protein phosphorylation is involved in the regulation of most, if not all, major cellular processes via dynamic signal transduction pathways. During the last decade quantitative phosphoproteomics have evolved from a highly specialized area to a powerful and versatile platform for analyzing protein phosphorylation at a system-wide scale and has become the intuitive strategy for comprehensive characterization of signaling networks. Contemporary phosphoproteomics use highly optimized procedures for sample preparation, mass spectrometry and data analysis algorithms to identify and quantify thousands of phosphorylations, thus providing extensive overviews of the cellular signaling networks. As a result of these developments quantitative phosphoproteomics have been applied to study processes as diverse as immunology, stem cell biology and DNA damage. Here we review the developments...

  10. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1979-01-01

    Although the overall aim of radiobiology is to understand the biological effects of radiation, it also has the implied practical purpose of developing rational measures for the control of radiation exposure in man. The emphasis in this presentation is to show that the enormous effort expended over the years to develop quantitative dose-effect relationships in biochemical and cellular systems, animals, and human beings now seems to be paying off. The pieces appear to be falling into place, and a framework is evolving to utilize these data. Specifically, quantitative risk assessments will be discussed in terms of the cellular, animal, and human data on which they are based; their use in the development of radiation protection standards; and their present and potential impact and meaning in relation to the quantity dose equivalent and its special unit, the rem

  11. Quantitative sputter profiling at surfaces and interfaces

    International Nuclear Information System (INIS)

    Kirschner, J.; Etzkorn, H.W.

    1981-01-01

    The key problem in quantitative sputter profiling, that of a sliding depth scale, has been solved by combined Auger/X-ray microanalysis. By means of this technique, and for the model system Ge/Si (amorphous), the following questions are treated quantitatively: shape of the sputter profiles when sputtering through an interface and origin of their asymmetry; precise location of the interface plane on the depth profile; broadening effects due to limited depth of information and their correction; origin and amount of bombardment-induced broadening for different primary ions and energies; depth dependence of the broadening; and basic limits to depth resolution. Comparisons are made to recent theoretical calculations based on recoil mixing in the collision cascade and very good agreement is found

  12. Quantitative image processing in fluid mechanics

    Science.gov (United States)

    Hesselink, Lambertus; Helman, James; Ning, Paul

    1992-01-01

    The current status of digital image processing in fluid flow research is reviewed. In particular, attention is given to a comprehensive approach to the extraction of quantitative data from multivariate databases and examples of recent developments. The discussion covers numerical simulations and experiments, data processing, generation and dissemination of knowledge, traditional image processing, hybrid processing, fluid flow vector field topology, and isosurface analysis using Marching Cubes.

  13. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique in analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have been used only to qualitatively map whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how exactly elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  14. Quantitative indicators of fruit and vegetable consumption

    OpenAIRE

    Dagmar Kozelová; Dana Országhová; Milan Fiľa; Zuzana Čmiková

    2015-01-01

    The quantitative research of the market is often based on surveys and questionnaires which examine the behavior of customers in the observed areas. Before the purchasing process, consumers consider where they will buy fruit and vegetables, what kind to choose and in what quantity. Consumers' behavior is affected by factors such as: regional gastronomic traditions, price, product appearance, aroma, place of buying, own experience and knowledge, taste preferences as well as specific heal...

  15. Development of a quantitative risk standard

    International Nuclear Information System (INIS)

    Temme, M.I.

    1982-01-01

    IEEE Working Group SC-5.4 is developing a quantitative risk standard for LWR plant design and operation. The paper describes the Working Group's conclusions on significant issues, including the scope of the standard, the need to define the process (i.e., PRA calculation) for meeting risk criteria, the need for PRA quality requirements and the importance of distinguishing standards from goals. The paper also describes the Working Group's approach to writing this standard

  16. Quantitative possibility analysis. Present status in ESCA

    International Nuclear Information System (INIS)

    Brion, D.

    1981-01-01

    A short review of the recent developments in quantification of X-ray photoelectron spectroscopy (ESCA) is presented. The basic equations are recalled. Each parameter involved (photoionisation, inelastic mean free paths, the 'response function' of the instruments, intensity measurement) is discussed separately in relation to the accuracy and the precision of the method. Other topics are considered, such as roughness, surface contamination, matrix effects and inhomogeneous composition. Some aspects of quantitative ESCA analysis and AES analysis are compared.

  17. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both without and with blood mixed. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  18. Quantitative Risk Assessment of Contact Sensitization

    DEFF Research Database (Denmark)

    Api, Anne Marie; Belsito, Donald; Bickers, David

    2010-01-01

    Background: Contact hypersensitivity quantitative risk assessment (QRA) for fragrance ingredients is being used to establish new international standards for all fragrance ingredients that are potential skin sensitizers. Objective: The objective was to evaluate the retrospective clinical data...... as potential sensitizers. Methods: This article reviews clinical data for three fragrance ingredients cinnamic aldehyde, citral, and isoeugenol to assess the utility of the QRA approach for fragrance ingredients. Results: This assessment suggests that had the QRA approach been available at the time standards...

  19. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  20. Quantitative maps of groundwater resources in Africa

    International Nuclear Information System (INIS)

    MacDonald, A M; Bonsor, H C; Dochartaigh, B É Ó; Taylor, R G

    2012-01-01

    In Africa, groundwater is the major source of drinking water and its use for irrigation is forecast to increase substantially to combat growing food insecurity. Despite this, there is little quantitative information on groundwater resources in Africa, and groundwater storage is consequently omitted from assessments of freshwater availability. Here we present the first quantitative continent-wide maps of aquifer storage and potential borehole yields in Africa based on an extensive review of available maps, publications and data. We estimate total groundwater storage in Africa to be 0.66 million km³ (0.36–1.75 million km³). Not all of this groundwater storage is available for abstraction, but the estimated volume is more than 100 times estimates of annual renewable freshwater resources on Africa. Groundwater resources are unevenly distributed: the largest groundwater volumes are found in the large sedimentary aquifers in the North African countries Libya, Algeria, Egypt and Sudan. Nevertheless, for many African countries appropriately sited and constructed boreholes can support handpump abstraction (yields of 0.1–0.3 l s⁻¹), and contain sufficient storage to sustain abstraction through inter-annual variations in recharge. The maps show further that the potential for higher yielding boreholes (>5 l s⁻¹) is much more limited. Therefore, strategies for increasing irrigation or supplying water to rapidly urbanizing cities that are predicated on the widespread drilling of high yielding boreholes are likely to be unsuccessful. As groundwater is the largest and most widely distributed store of freshwater in Africa, the quantitative maps are intended to lead to more realistic assessments of water security and water stress, and to promote a more quantitative approach to mapping of groundwater resources at national and regional level. (letter)

  1. Review of progress in quantitative NDE

    International Nuclear Information System (INIS)

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques

  2. Radioimmunoassay to quantitatively measure cell surface immunoglobulins

    International Nuclear Information System (INIS)

    Krishman, E.C.; Jewell, W.R.

    1975-01-01

    A radioimmunoassay technique developed to quantitatively measure the presence of immunoglobulins on the surface of cells is described. The amount of immunoglobulins found on different tumor cells varied from 200 to 1140 ng/10⁶ cells. Immunoglobulins determined on the peripheral lymphocytes obtained from different cancer patients varied between 340 and 1040 ng/10⁶ cells. Cultured tumor cells, on the other hand, were found to contain negligible quantities of human IgG.

  3. Quantitative analysis of untreated bio-samples

    International Nuclear Information System (INIS)

    Sera, K.; Futatsugawa, S.; Matsuda, K.

    1999-01-01

    A standard-free method of quantitative analysis for untreated samples has been developed. For hair samples, measurements were performed by irradiating a few hairs, as they are, with a proton beam, and quantitative analysis was carried out by means of a standard-free method developed by ourselves. First, quantitative values of the concentration of zinc were derived; then the concentrations of other elements were obtained by regarding zinc as an internal standard. As a result, the values of the concentration of sulphur for 40 samples agree well with the average value for a typical Japanese person and also with each other within 20%, and the validity of the present method could be confirmed. Accuracy was confirmed by comparing the results with those obtained by the usual internal standard method, too. For the purpose of a surface analysis of a bone sample, a very small incidence angle of the proton beam was used, so that both energy loss of the projectile and self-absorption of X-rays become negligible. As a result, consistent values of concentration for many elements were obtained by the standard-free method

  4. Quantitative evaluation of dysphagia using scintigraphy

    International Nuclear Information System (INIS)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae

    1998-01-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration, we studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium at three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during the videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively. We calculated 7 quantitative parameters from scintigraphy. According to the videofluoroscopic findings, we divided patients into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration. The videofluoroscopy results revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration could be reduced in all of 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE were also improved after the position change. Aspiration and laryngeal penetration occurred more frequently with thin liquid swallowing than with thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed. We could decrease the chance of aspiration by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes

  5. Quantitative evaluation of dysphagia using scintigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seok Gun; Hyun, Jung Keun; Lee, Seong Jae [College of Medicine, Dankook Univ., Cheonnon (Korea, Republic of)

    1998-08-01

    To evaluate dysphagia objectively and quantitatively, and to clarify the effect of neck position and viscosity changes in patients with aspiration and laryngeal penetration, we studied 35 patients with dysphagia and 21 normal controls using videofluoroscopy and scintigraphy. Videofluoroscopy was performed with barium at three different viscosities, and scintigraphy was done with water, yogurt, and steamed egg mixed with Tc-99m tin colloid. If aspiration was found during the videofluoroscopic examination, the patient's neck position was changed and the study repeated. Videofluoroscopy was analyzed qualitatively. We calculated 7 quantitative parameters from scintigraphy. According to the videofluoroscopic findings, we divided patients into 3 subgroups: aspiration, laryngeal penetration, and no-aspiration. The videofluoroscopy results revealed that the most common finding was a delay in triggering the pharyngeal swallow. Pharyngeal transit time (PTT) and pharyngeal swallowing efficiency (PSE) in patients with aspiration were significantly different from the other groups. After the neck position change, aspiration could be reduced in all of 7 patients, and laryngeal penetration was reduced by about 82%. PTT and PSE were also improved after the position change. Aspiration and laryngeal penetration occurred more frequently with thin liquid swallowing than with thick liquid and solid swallowing. PTT and PSE were useful for the evaluation of dysphagia. Aspiration and laryngeal penetration could be reduced when an appropriate position was assumed. We could decrease the chance of aspiration by changing the patient's diet consistency. Scintigraphy might be a useful tool to quantitate and follow up these changes.

  6. Quantitative and qualitative coronary arteriography. 1

    International Nuclear Information System (INIS)

    Brown, B.G.; Simpson, Paul; Dodge, J.T. Jr; Bolson, E.L.; Dodge, H.T.

    1991-01-01

    The clinical objectives of arteriography are to obtain information that contributes to an understanding of the mechanisms of the clinical syndrome, provides prognostic information, facilitates therapeutic decisions, and guides invasive therapy. Quantitative and improved qualitative assessments of arterial disease provide us with a common descriptive language which has the potential to accomplish these objectives more effectively and thus to improve clinical outcome. In certain situations, this potential has been demonstrated. Clinical investigation using quantitative techniques has definitely contributed to our understanding of disease mechanisms and of atherosclerosis progression/regression. Routine quantitation of clinical images should permit more accurate and repeatable estimates of disease severity and promises to provide useful estimates of coronary flow reserve. But routine clinical QCA awaits more cost- and time-efficient methods and clear proof of a clinical advantage. Careful inspection of highly magnified, high-resolution arteriographic images reveals morphologic features related to the pathophysiology of the clinical syndrome and to the likelihood of future progression or regression of obstruction. Features that have been found useful include thrombus in its various forms, ulceration and irregularity, eccentricity, flexing and dissection. The description of such high-resolution features should be included among, rather than excluded from, the goals of image processing, since they contribute substantially to the understanding and treatment of the clinical syndrome. (author). 81 refs.; 8 figs.; 1 tab

  7. Rational quantitative safety goals: a summary

    International Nuclear Information System (INIS)

    Unwin, S.D.; Hayns, M.R.

    1984-08-01

    We introduce the notion of a Rational Quantitative Safety Goal. Such a goal reflects the imprecision and vagueness inherent in any reasonable notion of adequate safety and permits such vagueness to be incorporated into the formal regulatory decision-making process. A quantitative goal of the form "the parameter x, characterizing the safety level of the nuclear plant, shall not exceed the value x₀", for example, is of a non-rational nature in that it invokes a strict binary logic in which the parameter space underlying x is cut sharply into two portions: that containing those values of x that comply with the goal and that containing those that do not. Here, we utilize an alternative form of logic which, in accordance with any intuitively reasonable notion of safety, permits a smooth transition of a safety-determining parameter between the adequately safe and inadequately safe domains. Fuzzy set theory provides a suitable mathematical basis for the formulation of rational quantitative safety goals. The decision-making process proposed here is compatible with current risk assessment techniques and produces results in a transparent and useful format. Our methodology is illustrated with reference to the NUS Corporation risk assessment of the Limerick Generating Station

  8. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
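
    As one concrete example of the kind of sample normalization discussed above, the sketch below implements probabilistic quotient normalization, a commonly used method for correcting differences in total sample amount; it is shown here as a generic illustration, not as the method recommended by the review.

```python
import numpy as np

def pqn_normalize(X):
    """Probabilistic quotient normalization of a samples-by-features intensity matrix.
    Each sample is scaled by the median ratio of its features to a reference
    (median) profile, reducing the effect of differing total sample amounts."""
    X = np.asarray(X, dtype=float)
    reference = np.median(X, axis=0)                 # median profile across samples
    quotients = X / np.where(reference > 0, reference, np.nan)
    dilution = np.nanmedian(quotients, axis=1)       # per-sample dilution factor
    return X / dilution[:, None]
```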

  9. Quantitative trait loci and metabolic pathways

    Science.gov (United States)

    McMullen, M. D.; Byrne, P. F.; Snook, M. E.; Wiseman, B. R.; Lee, E. A.; Widstrom, N. W.; Coe, E. H.

    1998-01-01

    The interpretation of quantitative trait locus (QTL) studies is limited by the lack of information on metabolic pathways leading to most economic traits. Inferences about the roles of the underlying genes with a pathway or the nature of their interaction with other loci are generally not possible. An exception is resistance to the corn earworm Helicoverpa zea (Boddie) in maize (Zea mays L.) because of maysin, a C-glycosyl flavone synthesized in silks via a branch of the well characterized flavonoid pathway. Our results using flavone synthesis as a model QTL system indicate: (i) the importance of regulatory loci as QTLs, (ii) the importance of interconnecting biochemical pathways on product levels, (iii) evidence for “channeling” of intermediates, allowing independent synthesis of related compounds, (iv) the utility of QTL analysis in clarifying the role of specific genes in a biochemical pathway, and (v) identification of a previously unknown locus on chromosome 9S affecting flavone level. A greater understanding of the genetic basis of maysin synthesis and associated corn earworm resistance should lead to improved breeding strategies. More broadly, the insights gained in relating a defined genetic and biochemical pathway affecting a quantitative trait should enhance interpretation of the biological basis of variation for other quantitative traits. PMID:9482823

  10. Quantitative learning strategies based on word networks

    Science.gov (United States)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires a considerable effort, but the way that vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, learning process optimization will have a significant impact on English learning and teaching. The recent developments of big data analysis and complex network science provide additional opportunities to design and further investigate strategies in English learning. In this paper, quantitative English learning strategies based on word networks and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for the unlearned words and the dynamic updating of the network are studied and analyzed. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and segmented strategies outperform other strategies. The results provide opportunities for researchers and practitioners to reconsider the way of English teaching and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
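
    The abstract describes combining word frequency with the influence of already-learned neighbouring words, but does not give the exact weighting rule. The sketch below shows one plausible scoring of unlearned words on a co-occurrence network, purely to make the idea concrete; the formula and the networkx-based representation are assumptions, not the authors' algorithm.

```python
import networkx as nx

def learning_weights(graph, frequency, learned):
    """Score unlearned words by combining corpus frequency with the number of
    already-learned neighbours in a word co-occurrence network (illustrative rule)."""
    weights = {}
    for word in graph.nodes:
        if word in learned:
            continue
        learned_neighbours = sum(1 for n in graph.neighbors(word) if n in learned)
        weights[word] = frequency.get(word, 0) * (1 + learned_neighbours)
    return weights

# Hypothetical usage: pick the highest-weight word, learn it, recompute weights.
g = nx.Graph([("take", "off"), ("take", "care"), ("care", "about")])
print(learning_weights(g, {"off": 120, "care": 300, "about": 500}, learned={"take"}))
```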

  11. Quantitative tools for addressing hospital readmissions

    Directory of Open Access Journals (Sweden)

    Lagoe Ronald J

    2012-11-01

    Full Text Available Abstract. Background: Increased interest in health care cost containment is focusing attention on reduction of hospital readmissions. Major payors have already developed financial penalties for providers that generate excess readmissions. This subject has benefitted from the development of resources such as the Potentially Preventable Readmissions software. This process has encouraged hospitals to renew efforts to improve these outcomes. The aim of this study was to describe quantitative tools such as definitions, risk estimation, and tracking of patients for reducing hospital readmissions. Findings: This study employed the Potentially Preventable Readmissions software to develop quantitative tools for addressing hospital readmissions. These tools included two definitions of readmissions that support identification and management of patients. They also included analytical approaches for estimation of the risk of readmission for individual patients by age, discharge status of the initial admission, and severity of illness. They also included patient-specific spreadsheets for tracking of target populations and for evaluation of the impact of interventions. Conclusions: The study demonstrated that quantitative tools including the development of definitions of readmissions, estimation of the risk of readmission, and patient-specific spreadsheets could contribute to the improvement of patient outcomes in hospitals.

  12. Some exercises in quantitative NMR imaging

    International Nuclear Information System (INIS)

    Bakker, C.J.G.

    1985-01-01

    The articles represented in this thesis result from a series of investigations that evaluate the potential of NMR imaging as a quantitative research tool. In the first article the possible use of the proton spin-lattice relaxation time T1 in tissue characterization, tumor recognition and monitoring tissue response to radiotherapy is explored. The next article addresses the question whether water proton spin-lattice relaxation curves of biological tissues are adequately described by a single time constant T1, and analyzes the implications of multi-exponentiality for quantitative NMR imaging. In the third article the use of NMR imaging as a quantitative research tool is discussed on the basis of phantom experiments. The fourth article describes a method which enables unambiguous retrieval of sign information in a set of magnetic resonance images of the inversion recovery type. The next article shows how this method can be adapted to allow accurate calculation of T1 pictures on a pixel-by-pixel basis. The sixth article, finally, describes a simulation procedure which enables a straightforward determination of NMR imaging pulse sequence parameters for optimal tissue contrast. (orig.)
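
    A minimal sketch of a pixel-by-pixel T1 calculation from sign-restored inversion recovery images, assuming the ideal model S(TI) = M0(1 - 2 exp(-TI/T1)); the thesis' actual sign-retrieval and fitting procedures are not reproduced here, and the initial guesses are arbitrary.

    ```python
    # Hedged sketch: per-pixel T1 fit to sign-restored inversion recovery data.
    import numpy as np
    from scipy.optimize import curve_fit

    def ir_model(ti, m0, t1):
        # Ideal inversion recovery signal (perfect inversion assumed).
        return m0 * (1.0 - 2.0 * np.exp(-ti / t1))

    def fit_t1_map(images, inversion_times):
        """images: array of shape (n_TI, ny, nx) with signs already restored."""
        n_ti, ny, nx = images.shape
        t1_map = np.zeros((ny, nx))
        for y in range(ny):
            for x in range(nx):
                signal = images[:, y, x]
                try:
                    (m0, t1), _ = curve_fit(ir_model, inversion_times, signal,
                                            p0=(signal.max(), 0.5), maxfev=2000)
                    t1_map[y, x] = t1
                except RuntimeError:
                    t1_map[y, x] = np.nan   # fit failed for this pixel
        return t1_map
    ```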

  13. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. The technique was developed using MATLAB (Version 6.1) based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the fibroglandular tissue area divided by the total breast area in the mammogram, expressed as a percentage. The technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchymal patterns of Tabar's scheme (Class I-V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r2 = 0.84, p<0.05), and 71.3% of the subjective classifications were correctly reproduced by the computer technique. We have developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of large datasets of mammograms to predict breast cancer risk based on density. Furthermore, it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
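
    A minimal sketch of the density calculation defined above (fibroglandular area over total breast area, as a percentage), assuming simple fixed-threshold segmentation; the original tool used gradient correction and semi-automated segmentation in MATLAB, so the thresholds and masks here are placeholders only.

    ```python
    # Hedged sketch: percent density from threshold-based segmentation masks.
    import numpy as np

    def breast_density(mammogram, breast_thresh, fibro_thresh):
        """mammogram: 2-D array of pixel intensities (higher = denser tissue)."""
        breast_mask = mammogram > breast_thresh            # breast vs. background
        fibro_mask = (mammogram > fibro_thresh) & breast_mask
        return 100.0 * fibro_mask.sum() / breast_mask.sum()

    img = np.random.default_rng(0).integers(0, 255, size=(512, 512))
    density_percent = breast_density(img, breast_thresh=30, fibro_thresh=150)
    ```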

  14. Quantitative Reasoning Learning Progressions for Environmental Science: Developing a Framework

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2013-01-01

    Full Text Available Quantitative reasoning is a complex concept with many definitions and a diverse account in the literature. The purpose of this article is to establish a working definition of quantitative reasoning within the context of science, construct a quantitative reasoning framework, and summarize research on key components in that framework. Context underlies all quantitative reasoning; for this review, environmental science serves as the context. In the framework, we identify four components of quantitative reasoning: the quantification act, quantitative literacy, quantitative interpretation of a model, and quantitative modeling. Within each of these components, the framework provides elements that comprise the four components. The quantification act includes the elements of variable identification, communication, context, and variation. Quantitative literacy includes the elements of numeracy, measurement, proportional reasoning, and basic probability/statistics. Quantitative interpretation includes the elements of representations, science diagrams, statistics and probability, and logarithmic scales. Quantitative modeling includes the elements of logic, problem solving, modeling, and inference. A brief comparison of the quantitative reasoning framework with the AAC&U Quantitative Literacy VALUE rubric is presented, demonstrating a mapping of the components and illustrating differences in structure. The framework serves as a precursor for a quantitative reasoning learning progression which is currently under development.

  15. Quantitative fluorescence microscopy and image deconvolution.

    Science.gov (United States)

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used
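
    As an illustration of a restoration-type algorithm of the kind described above, the sketch below implements Richardson-Lucy deconvolution, which iteratively seeks an object that, when convolved with the point-spread function, reproduces the observed image. This is one common choice and is not claimed to be the algorithm used by any particular package discussed in the chapter; the PSF is assumed to be normalised.

    ```python
    # Hedged sketch: Richardson-Lucy restoration with a known, normalised PSF.
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(image, psf, iterations=25, eps=1e-12):
        estimate = np.full(image.shape, image.mean(), dtype=float)
        psf_flipped = psf[::-1, ::-1]                     # mirrored PSF for the update step
        for _ in range(iterations):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = image / (blurred + eps)
            estimate *= fftconvolve(ratio, psf_flipped, mode="same")
        return estimate

    # Toy usage: blur a synthetic object, then restore it.
    obj = np.zeros((64, 64)); obj[30:34, 30:34] = 1.0
    psf = np.ones((5, 5)) / 25.0
    observed = fftconvolve(obj, psf, mode="same")
    restored = richardson_lucy(observed, psf)
    ```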

  16. Simple preparations of Pd6Cl12, Pt6Cl12, and Qn[Pt2Cl8+n], n=1, 2 (Q=TBA+, PPN+) and structural characterization of [TBA][Pt2Cl9] and [PPN]2[Pt2Cl10].C7H8.

    Science.gov (United States)

    Dell'Amico, Daniela Belli; Calderazzo, Fausto; Marchetti, Fabio; Ramello, Stefano; Samaritani, Simona

    2008-02-04

    The hexanuclear Pd6Cl12, i.e., the crystal phase classified as beta-PdCl2, was obtained by reacting [TBA]2[Pd2Cl6] with AlCl3 (or FeCl3) in CH2Cl2. The action of AlCl3 on [PtCl4]2-, followed by digestion of the resulting solid in 1,2-C2H4Cl2 (DCE), CHCl3, or benzene, produced Pt6Cl12.DCE, Pt6Cl12.CHCl3, or Pt6Cl12.C6H6, respectively. Treating [TBA]2[PtCl6] with a slight excess of AlCl3 afforded [TBA][Pt2Cl9], whose anion was established crystallographically to be constituted by two "PtCl6" octahedra sharing a face. Dehydration of H2PtCl6.nH2O with SOCl2 gave an amorphous compound closely analyzing as PtCl4, reactive with [Q]Cl in SOCl2 to yield [Q][Pt2Cl9] or [Q]2[Pt2Cl10], depending on the [Q]Cl/Pt molar ratio (Q=TBA+, PPN+). A single-crystal X-ray diffraction study has shown [PPN]2[Pt2Cl10].C7H8 to contain dinuclear anions formed by two edge-sharing PtCl6 octahedra.

  17. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancer problems (Head and Neck cancer, Prostate, Breast, Brain, Lung, Liver, Colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  18. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072
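
    A minimal sketch of two of the corrections mentioned above: linearising the camera response with a measured look-up table and flat-field correcting uneven illumination. The gamma-like response curve and the calibration frames are assumptions standing in for a user's own calibration data, not a procedure published with the microscope.

    ```python
    # Hedged sketch: response linearisation plus flat-field correction.
    import numpy as np

    def linearize(raw, response_lut):
        """response_lut[i] maps raw pixel value i to relative scene radiance."""
        return response_lut[raw]

    def flat_field(linear_img, dark_frame, flat_frame):
        gain = flat_frame - dark_frame
        return (linear_img - dark_frame) / np.where(gain > 0, gain, 1.0)

    lut = (np.arange(256) / 255.0) ** 2.2          # assumed gamma-like response
    raw = np.random.default_rng(1).integers(0, 256, size=(480, 640))
    corrected = flat_field(linearize(raw, lut), dark_frame=0.01, flat_frame=0.9)
    ```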

  19. Quantitative Imaging in Cancer Evolution and Ecology

    Science.gov (United States)

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral

  20. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  1. Quantitative safety goals for the regulatory process

    International Nuclear Information System (INIS)

    Joksimovic, V.; O'Donnell, L.F.

    1981-01-01

    The paper offers a brief summary of the current regulatory background in the USA, emphasizing nuclear, related to the establishment of quantitative safety goals as a way to respond to the key issue of 'how safe is safe enough'. General Atomic has taken a leading role in advocating the use of probabilistic risk assessment techniques in the regulatory process. This has led to understanding of the importance of quantitative safety goals. The approach developed by GA is discussed in the paper. It is centred around the definition of quantitative safety regions, termed: design basis, safety margin or design capability, and safety research. The design basis region is bounded by a frequency of 10⁻⁴/reactor-year and consequences of no identifiable public injury; 10⁻⁴/reactor-year is associated with the total projected lifetime of a commercial US nuclear power programme. Events which have a 50% chance of happening are included in the design basis region. In the safety margin region, which extends below the design basis region, protection is provided against some events whose probability of not happening during the expected course of the US nuclear power programme is within the range of 50 to 90%. Setting the lower mean frequency of this region to 10⁻⁵/reactor-year is equivalent to offering 90% assurance that an accident of given severity will not happen. Rare events with a mean frequency below 10⁻⁵ can be predicted to occur. However, accidents predicted to have a probability of less than 10⁻⁶ are 99% certain not to happen at all, and are thus not anticipated to affect public health and safety. The area between 10⁻⁵ and 10⁻⁶ defines the frequency portion of the safety research region. Safety goals associated with individual risk to a maximum-exposed member of the public, general societal risk and property risk are proposed in the paper.
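
    A minimal sketch of the frequency-region screening described above, using the boundaries quoted in the text (10⁻⁴, 10⁻⁵ and 10⁻⁶ per reactor-year). The function is only an illustration of how such a classification rule could be applied; it is not part of the GA methodology itself.

    ```python
    # Hedged sketch: classify an event frequency into the safety regions
    # named in the abstract, using the stated boundaries.
    def safety_region(frequency_per_reactor_year: float) -> str:
        if frequency_per_reactor_year >= 1e-4:
            return "design basis"
        if frequency_per_reactor_year >= 1e-5:
            return "safety margin / design capability"
        if frequency_per_reactor_year >= 1e-6:
            return "safety research"
        return "below safety research region (not anticipated)"

    print(safety_region(3e-5))   # -> "safety margin / design capability"
    ```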

  2. Quantitative imaging of turbulent and reacting flows

    Energy Technology Data Exchange (ETDEWEB)

    Paul, P.H. [Sandia National Laboratories, Livermore, CA (United States)

    1993-12-01

    Quantitative digital imaging, using planar laser light scattering techniques, is being developed for the analysis of turbulent and reacting flows. Quantitative image data, implying both a direct relation to flowfield variables as well as sufficient signal and spatial dynamic range, can be readily processed to yield two-dimensional distributions of flowfield scalars and in turn two-dimensional images of gradients and turbulence scales. Much of the development of imaging techniques to date has concentrated on understanding the requisite molecular spectroscopy and collision dynamics to be able to determine how flowfield variable information is encoded into the measured signal. From this standpoint the image is seen as a collection of single point measurements. The present effort aims at realizing necessary improvements in signal and spatial dynamic range, signal-to-noise ratio and spatial resolution in the imaging system, as well as developing excitation/detection strategies which provide for a quantitative measure of particular flowfield scalars. The standard camera used for the study is an intensified CCD array operated in a conventional video format. The design of the system was based on detailed modeling of signal and image transfer properties of fast UV imaging lenses, image intensifiers and CCD detector arrays. While this system is suitable for direct scalar imaging, derived quantities (e.g. temperature or velocity images) require an exceptionally wide dynamic range imaging detector. Applying these diagnostics to reacting flows also requires a very fast shuttered camera. The authors have developed and successfully tested a new type of gated low-light level detector. This system relies on fast switching of a proximity-focused image diode that is directly fiber-optically coupled to a cooled CCD array. Tests on this new detector show significant improvements in detection limit, dynamic range and spatial resolution as compared to microchannel plate intensified arrays.

  3. Quantitative microanalysis with a nuclear microprobe

    International Nuclear Information System (INIS)

    Themner, Klas.

    1989-01-01

    The analytical techniques of particle induced X-ray emission (PIXE) and Rutherford backscattering (RBS), together with the nuclear microprobe, form a very powerful tool for performing quantitative microanalysis of biological material. Calibration of the X-ray detection system in the microprobe set-up has been performed and the accuracy of the quantitative procedure using RBS for determination of the areal mass density was investigated. The accuracy of the analysis can be affected by alteration of the elemental concentrations during irradiation due to the radiation damage induced by the very intense beams of ionizing radiation. Loss of matrix elements from freeze-dried tissue sections and polymer films has been studied during proton and photon irradiation and the effect on the accuracy discussed. Scanning the beam over an area of the target, with e.g. 32x32 pixels, in order to produce an elemental map yields a great deal of information, and, to be able to make an accurate quantification, a fast algorithm using descriptions of the different spectral contributions is needed. The production of continuum X-rays by 2.55 MeV protons has been studied and absolute cross-sections for the bremsstrahlung production from thin carbon and some polymer films determined. For the determination of the bremsstrahlung background, knowledge of the amounts of the matrix elements is important, and a fast program for the evaluation of spectra of proton back- and forward scattering from biological samples has been developed. Quantitative microanalysis with the nuclear microprobe has been performed on brain tissue from rats subjected to different pathological conditions. Increases in calcium levels and decreases in potassium levels for animals subjected to cerebral ischaemia and for animals suffering from epileptic seizures were observed coincidentally with or, in some cases, before visible signs of cell necrosis. (author)

  4. Quantitative transmission electron microscopy at atomic resolution

    International Nuclear Information System (INIS)

    Allen, L J; D'Alfonso, A J; Forbes, B D; Findlay, S D; LeBeau, J M; Stemmer, S

    2012-01-01

    In scanning transmission electron microscopy (STEM) it is possible to operate the microscope in bright-field mode under conditions which, by the quantum mechanical principle of reciprocity, are equivalent to those in conventional transmission electron microscopy (CTEM). The results of such an experiment will be presented which are in excellent quantitative agreement with theory for specimens up to 25 nm thick. This is at variance with the large contrast mismatch (typically between two and five) noted in equivalent CTEM experiments. The implications of this will be discussed.

  5. Quantitative spectrographic determination of zirconium minerals

    International Nuclear Information System (INIS)

    Rocal Adell, M.; Alvarez Gonzalez, F.; Fernandez Cellini, R.

    1958-01-01

    The method described in the following report permits the quantitative determination of zirconium in minerals and rocks over a 0.02-100% ZrO2 concentration range. Excitation is carried out with a 10 A direct-current arc between carbon electrodes, placing the sample in a crater of 2 mm depth. For low concentrations the sample is diluted with an equal weight of carbon powder and with 1/25 of its weight of Co3O4 (internal standard). The lines Zr 2571.4, Co 2585.3 and Co 2587.2 are used. (Author) 6 refs

  6. Quantitative angiography methods for bifurcation lesions

    DEFF Research Database (Denmark)

    Collet, Carlos; Onuma, Yoshinobu; Cavalcante, Rafael

    2017-01-01

    Bifurcation lesions represent one of the most challenging lesion subsets in interventional cardiology. The European Bifurcation Club (EBC) is an academic consortium whose goal has been to assess and recommend the appropriate strategies to manage bifurcation lesions. The quantitative coronary angiography (QCA) methods for the evaluation of bifurcation lesions have been subject to extensive research. Single-vessel QCA has been shown to be inaccurate for the assessment of bifurcation lesion dimensions. For this reason, dedicated bifurcation software has been developed and validated. These software...

  7. Software performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2009-01-01

    Praise from the reviewers: "The practicality of the subject in a real-world situation distinguishes this book from others available on the market."—Professor Behrouz Far, University of Calgary. "This book could replace the computer organization texts now in use that every CS and CpE student must take. . . . It is much needed, well written, and thoughtful."—Professor Larry Bernstein, Stevens Institute of Technology. A distinctive, educational text on software performance and scalability, this is the first book to take a quantitative approach to the subject of software performance and scalability.

  8. MR Fingerprinting for Rapid Quantitative Abdominal Imaging.

    Science.gov (United States)

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas

    2016-04-01

    To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue
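
    A minimal sketch of the dictionary-matching step on which MR fingerprinting reconstructions generally rely: each measured signal evolution is matched, by maximum inner product, to a simulated dictionary entry whose T1 and T2 are then assigned to the voxel. The dictionary here is random placeholder data rather than a Bloch or FISP simulation, and the B1 correction described in the paper is omitted.

    ```python
    # Hedged sketch: assign (T1, T2) by matching a measured fingerprint to
    # the closest (maximum inner product) entry of a simulated dictionary.
    import numpy as np

    def match_fingerprint(measured, dictionary, t1_t2_grid):
        """measured: (n_timepoints,); dictionary: (n_entries, n_timepoints)."""
        d_norm = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
        m_norm = measured / np.linalg.norm(measured)
        best = np.argmax(np.abs(d_norm @ m_norm))      # best-matching entry
        return t1_t2_grid[best]

    rng = np.random.default_rng(2)
    grid = [(t1, t2) for t1 in range(100, 2100, 200)   # T1 values in msec (assumed)
                     for t2 in range(10, 110, 1)]      # T2 values in msec (assumed)
    dictionary = rng.standard_normal((len(grid), 48))  # placeholder for Bloch simulations
    t1, t2 = match_fingerprint(rng.standard_normal(48), dictionary, grid)
    ```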

  9. Chinese legal texts – Quantitative Description

    Directory of Open Access Journals (Sweden)

    Ľuboš GAJDOŠ

    2017-06-01

    Full Text Available The aim of the paper is to provide a quantitative description of legal Chinese. The study adopts a corpus-based approach and reports basic statistical parameters of legal texts in Chinese, namely sentence length, the proportions of parts of speech, etc. The research is conducted on the Chinese monolingual corpus Hanku. The paper also discusses issues of statistical data processing from various corpora, e.g. tokenisation and part-of-speech tagging, and their relevance to the study of register variation.

  10. Enhancing quantitative approaches for assessing community resilience

    Science.gov (United States)

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  11. Quantitative interface models for simulating microstructure evolution

    International Nuclear Information System (INIS)

    Zhu, J.Z.; Wang, T.; Zhou, S.H.; Liu, Z.K.; Chen, L.Q.

    2004-01-01

    To quantitatively simulate microstructural evolution in real systems, we investigated three different interface models: a sharp-interface model implemented by the software DICTRA and two diffuse-interface models which use either physical order parameters or artificial order parameters. A particular example is considered, the diffusion-controlled growth of a γ' precipitate in a supersaturated γ matrix in Ni-Al binary alloys. All three models use the thermodynamic and kinetic parameters from the same databases. The temporal evolution profiles of composition from different models are shown to agree with each other. The focus is on examining the advantages and disadvantages of each model as applied to microstructure evolution in alloys

  12. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1978-01-01

    The bases for developing a quantitative assessment of exposure risks in the human being, and the several problems that accompany the assessment and the introduction of the risk of exposure to high and low LET radiation into radiation protection, will be evaluated. The extension of the pioneering radiation protection philosophies to the control of other hazardous agents that cannot be eliminated from the environment will be discussed, as will the serious misunderstandings and misuse of concepts and facts that have inevitably surrounded the application, to one agent alone, of the protection philosophy that must in time be applied to a broad spectrum of potentially hazardous agents. (orig.) [de]

  13. Quantitative methods for management and economics

    CERN Document Server

    Chakravarty, Pulak

    2009-01-01

    ""Quantitative Methods for Management and Economics"" is specially prepared for the MBA students in India and all over the world. It starts from the basics, such that even a beginner with out much mathematical sophistication can grasp the ideas and then comes forward to more complex and professional problems. Thus, both the ordinary students as well as ""above average: i.e., ""bright and sincere"" students would be benefited equally through this book.Since, most of the problems are solved or hints are given, students can do well within the short duration of the semesters of their busy course.

  14. Quantitative proteomic analysis of intact plastids.

    Science.gov (United States)

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  15. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x-ray diffraction for both the substrate layer and the electrodeposits yielded experimental evidence for epitaxy between Cu and Au. An orientation relation between film and substrate was discussed with respect to various concepts of epitaxy. While the conventional mode of epitaxy fails for the Cu...

  16. A Quantitative Scale of Oxophilicity and Thiophilicity

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2016-01-01

    Oxophilicity and thiophilicity are widely used concepts with no quantitative definition. In this paper, a simple, generic scale is developed that solves issues with reference states and system dependencies and captures empirically known tendencies toward oxygen. This enables a detailed analysis...; ionic bonding is stronger to metals of low electronegativity. Left-side d-block elements with low effective nuclear charges and electronegativities are thus highly oxophilic, and the f-block elements, not because of their hardness, which is normal, but as a result of the small ionization energies...

  17. Path to development of quantitative safety goals

    International Nuclear Information System (INIS)

    Joksimovic, V.; Houghton, W.J.

    1980-04-01

    There is a growing interest in defining numerical safety goals for nuclear power plants, as exemplified by an ACRS recommendation. This paper proposes a lower frequency limit of approximately 10⁻⁴/reactor-year for design basis events. Below this frequency, down to a small frequency such as 10⁻⁵/reactor-year, safety margin can be provided by, say, site emergency plans. Accident sequences below 10⁻⁵ should not impact public safety, but it is prudent that safety research programs examine sequences with significant consequences. Once tentatively agreed upon, quantitative safety goals together with associated implementation tools would be factored into regulatory and design processes.

  18. Experimental Studies of quantitative evaluation using HPLC

    Directory of Open Access Journals (Sweden)

    Ki Rok Kwon

    2005-06-01

    Full Text Available Methods: This study was conducted to carry out a quantitative evaluation using HPLC; content analysis was performed by HPLC. Results: According to the HPLC analysis, each BVA-1 needle carried approximately 0.36 µg of melittin, and each BVA-2 needle approximately 0.54 µg. Because the coated volume is so minute, slight differences exist between individual needles. Conclusion: The above results indicate that bee venom acupuncture can complement the shortcomings of syringe usage as a part of Oriental medicine treatment, but extensive research should be done for further verification.

  19. Quantitative Assessment of the IT Agile Transformation

    Directory of Open Access Journals (Sweden)

    Orłowski Cezary

    2017-03-01

    Full Text Available The aim of this paper is to present a quantitative perspective on agile transformation processes in IT organisations. Agile transformation is a complex challenge for an IT organisation and has not been analysed in detail so far; there is no research on the readiness of IT organisations to realise agile transformation processes, and such processes often prove to be uncontrolled. Therefore, to minimise the risk of failure in realising transformation processes, it is necessary to monitor them. It is also necessary to identify and analyse such processes to ensure their continuous character.

  20. Quantitative Communication Research: Review, Trends, and Critique

    Directory of Open Access Journals (Sweden)

    Timothy R. Levine

    2013-01-01

    Full Text Available Trends in quantitative communication research are reviewed. A content analysis of 48 articles reporting original communication research published in 1988-1991 and 2008-2011 is reported. Survey research and self-report measurement remain common approaches to research. Null hypothesis significance testing remains the dominant approach to statistical analysis. Reporting the shapes of distributions, estimates of statistical power, and confidence intervals remain uncommon. Trends over time include the increased popularity of health communication and computer mediated communication as topics of research, and increased attention to mediator and moderator variables. The implications of these practices for scientific progress are critically discussed, and suggestions for the future are provided.

  1. Quantitative Methods in the Study of Local History

    Science.gov (United States)

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment roles, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  2. Developing quantitative tools for measuring aspects of prisonization

    DEFF Research Database (Denmark)

    Kjær Minke, Linda

    2013-01-01

    The article describes and discusses the preparation and completion of a quantitative study among prison officers and prisoners.

  3. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D - Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge)

  4. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  5. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data was pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
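
    A minimal sketch of the basic accuracy measures used in such a meta-analysis, pooling 2x2 counts across studies into sensitivity and specificity; the study counts are placeholders, and real pooled estimates (and the AUC) would normally come from a bivariate random-effects model rather than simple summation.

    ```python
    # Hedged sketch: sensitivity and specificity from pooled 2x2 counts.
    def pooled_accuracy(studies):
        """studies: iterable of (TP, FP, TN, FN) tuples, one per study."""
        tp = sum(s[0] for s in studies)
        fp = sum(s[1] for s in studies)
        tn = sum(s[2] for s in studies)
        fn = sum(s[3] for s in studies)
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity, specificity

    sens, spec = pooled_accuracy([(45, 10, 80, 5), (30, 8, 60, 7)])
    ```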

  6. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to measure the darkness (density) of spectral lines. By analyzing these lines, the content of an element in a sample and its concentration can be determined; the analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in three steps, as follows. 1. Emulsion calibration: a photographic emulsion is calibrated to determine the intensity variations in terms of the incident radiation. A least-squares fit is applied to the data obtained in order to produce a graph, making it possible to relate the density of a dark spectral line, as read by the microphotometer, to the incident light intensity. 2. Working curves: the values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, a working curve is needed for each of them. 3. Analytical results: the calibration curve and the working curves are compared and the concentration of the studied element is determined. Automatic data acquisition, calculation and production of results are handled by a computer (PC) and a computer program. The signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to make communication between the microphotometer and the computer possible.
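
    A minimal sketch of the three steps listed above: a least-squares emulsion calibration, a working curve from standards of known concentration, and the analytical result for an unknown sample. Straight-line fits in log coordinates and all numerical values are assumptions for illustration only; the real calibration and working curves need not be linear.

    ```python
    # Hedged sketch: emulsion calibration -> working curve -> concentration.
    import numpy as np

    def fit_line(x, y):
        slope, intercept = np.polyfit(x, y, 1)
        return slope, intercept

    # 1. Emulsion calibration: line density vs. log(relative intensity).
    log_intensity = np.array([0.0, 0.3, 0.6, 0.9])
    density = np.array([0.10, 0.45, 0.82, 1.20])
    cal_slope, cal_icept = fit_line(log_intensity, density)

    # 2. Working curve: log(intensity) of standards vs. log(concentration).
    std_log_int = np.array([0.15, 0.55, 0.95])
    std_log_conc = np.array([-1.0, 0.0, 1.0])
    wc_slope, wc_icept = fit_line(std_log_int, std_log_conc)

    # 3. Analytical result: measured density -> intensity -> concentration.
    sample_density = 0.70
    sample_log_int = (sample_density - cal_icept) / cal_slope
    concentration = 10 ** (wc_slope * sample_log_int + wc_icept)
    ```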

  7. Fundamental quantitative security in quantum key generation

    International Nuclear Information System (INIS)

    Yuen, Horace P.

    2010-01-01

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.

  8. Quantitative image fusion in infrared radiometry

    Science.gov (United States)

    Romm, Iliya; Cukurel, Beni

    2018-05-01

    Towards high-accuracy infrared radiance estimates, measurement practices and processing techniques aimed to achieve quantitative image fusion using a set of multi-exposure images of a static scene are reviewed. The conventional non-uniformity correction technique is extended, as the original is incompatible with quantitative fusion. Recognizing the inherent limitations of even the extended non-uniformity correction, an alternative measurement methodology, which relies on estimates of the detector bias using self-calibration, is developed. Combining data from multi-exposure images, two novel image fusion techniques that ultimately provide high tonal fidelity of a photoquantity are considered: ‘subtract-then-fuse’, which conducts image subtraction in the camera output domain and partially negates the bias frame contribution common to both the dark and scene frames; and ‘fuse-then-subtract’, which reconstructs the bias frame explicitly and conducts image fusion independently for the dark and the scene frames, followed by subtraction in the photoquantity domain. The performances of the different techniques are evaluated for various synthetic and experimental data, identifying the factors contributing to potential degradation of the image quality. The findings reflect the superiority of the ‘fuse-then-subtract’ approach, conducting image fusion via per-pixel nonlinear weighted least squares optimization.
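
    A minimal sketch in the spirit of the 'fuse-then-subtract' approach: each exposure is normalised by its integration time, saturated pixels are excluded, and the photoquantity is estimated as a weighted average before the dark estimate is subtracted. The weighting scheme and the saturation limit are simplifying assumptions, not the paper's per-pixel nonlinear weighted least squares estimator.

    ```python
    # Hedged sketch: weighted fusion of multi-exposure frames, then subtraction.
    import numpy as np

    def fuse_exposures(frames, exposure_times, saturation=0.95):
        """frames: (n, ny, nx) normalised camera outputs in [0, 1]."""
        frames = np.asarray(frames, dtype=float)
        times = np.asarray(exposure_times, dtype=float)[:, None, None]
        weights = (frames < saturation) * times   # longer, unsaturated exposures count more
        rates = frames / times                    # photoquantity per unit exposure time
        return (weights * rates).sum(axis=0) / np.maximum(weights.sum(axis=0), 1e-12)

    def fuse_then_subtract(scene_frames, dark_frames, exposure_times):
        # Fuse scene and dark stacks independently, then subtract in the
        # photoquantity domain.
        return (fuse_exposures(scene_frames, exposure_times)
                - fuse_exposures(dark_frames, exposure_times))
    ```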

  9. QUANTITATIVE INDICATORS OF THE SECURITIZATION OF ASSETS

    Directory of Open Access Journals (Sweden)

    Denis VOSTRICOV

    2018-02-01

    Full Text Available Securitization is instrumental in increasing return on capital through the withdrawal of lending activities from the balance sheet, accompanied by off-balance income flows from fees, which are less capital-intensive. The purpose of this paper is to analyze the quantitative indicators characterizing the securitization of assets. In drafting this article, the methods of analysis and synthesis, the logical and dialectical method, the normative method, the study of statistical sampling and time series, expert evaluations (Standard and Poor's), personal observations, and monographic studies have been used. The main difference between the securitization of assets and traditional ways of financing is the achievement of many secondary goals in attracting financial resources, which can play a significant role in choosing between the securitization of assets and other types of financing. In particular, it makes it possible to write off the assets from the balance sheet along with the relevant obligations under the securities, to expand the range of potential investors while reducing credit risk, interest rate risk and liquidity risk, and to improve the quality of management of assets, liabilities and risks. All of these secondary effects are achieved by isolating the selected assets from the total credit risk of the enterprise raising the funds, which constitutes the pivotal relevance and significance of asset securitization. The article contains demonstrations of quantitative and qualitative indicators characterizing the securitization of assets.

  10. Quantitating cellular immune responses to cancer vaccines.

    Science.gov (United States)

    Lyerly, H Kim

    2003-06-01

    While the future of immunotherapy in the treatment of cancer is promising, it is difficult to compare the various approaches because monitoring assays have not been standardized in approach or technique. Common assays for measuring the immune response need to be established so that these assays can one day serve as surrogate markers for clinical response. Assays that accurately detect and quantitate T-cell-mediated, antigen-specific immune responses are particularly desired. However, to date, increases in the number of cytotoxic T cells through immunization have not been correlated with clinical tumor regression. Ideally, then, a T-cell assay not only needs to be sensitive, specific, reliable, reproducible, simple, and quick to perform, it must also demonstrate close correlation with clinical outcome. Assays currently used to measure T-cell response are delayed-type hypersensitivity testing, flow cytometry using peptide major histocompatibility complex tetramers, lymphoproliferation assay, enzyme-linked immunosorbent assay, enzyme-linked immunospot assay, cytokine flow cytometry, direct cytotoxicity assay, measurement of cytokine mRNA by quantitative reverse transcriptase polymerase chain reaction, and limiting dilution analysis. The purpose of this review is to describe the attributes of each test and compare their advantages and disadvantages.

  11. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet these regulatory requirements, the software used in the nuclear safety field has been ensured through development, validation, safety analysis, and quality assurance activities throughout the entire life cycle, from the planning phase to the installation phase. A variety of activities, such as quality assurance activities, are also required to improve the quality of the software, but there are limits to how much assurance such qualitative measures can provide. Therefore, efforts continue to calculate the reliability of the software so that a quantitative evaluation can complement the qualitative one. In this paper, we propose a quantitative calculation method for software used for a specific operation of a digital controller in an NPP. After injecting random faults into the internal memory space of a developed controller and calculating the ability of the diagnostic software to detect the injected faults, we can evaluate the software reliability of a digital controller in an NPP. We calculate the software reliability of the controller using a new method that differs from the traditional approach: it calculates the fault detection coverage after injecting faults into the software memory space, rather than assessing activities throughout the life cycle process. We attempt to differentiate the approach by creating a new definition of a fault, imitating software faults using the hardware, and assigning considerations and weights to the injected faults.
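
    A minimal sketch of the coverage figure at the heart of the proposed method: the fraction of injected faults that the diagnostic software detects. The fault representation and the toy detector below are stand-ins; the paper's injections target the memory space of a real digital controller, not a Python data structure.

    ```python
    # Hedged sketch: fault detection coverage from a fault-injection campaign.
    import random

    def fault_detection_coverage(injected_faults, detector):
        detected = sum(1 for fault in injected_faults if detector(fault))
        return detected / len(injected_faults)

    # Placeholder fault list: random (address, bit) pairs in a memory space.
    faults = [{"address": random.randrange(0x10000), "bit": random.randrange(8)}
              for _ in range(1000)]
    toy_detector = lambda fault: fault["bit"] != 7     # placeholder diagnostic rule
    coverage = fault_detection_coverage(faults, toy_detector)
    ```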

  12. Quantitative stratification of diffuse parenchymal lung diseases.

    Directory of Open Access Journals (Sweden)

    Sushravya Raghunath

    Full Text Available Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients.

  13. Quantitative Stratification of Diffuse Parenchymal Lung Diseases

    Science.gov (United States)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Maldonado, Fabien; Peikert, Tobias; Moua, Teng; Ryu, Jay H.; Bartholmai, Brian J.; Robb, Richard A.

    2014-01-01

    Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients. PMID:24676019

  14. The Quantitative Nature of Autistic Social Impairment

    Science.gov (United States)

    Constantino, John N.

    2011-01-01

    Autism, like intellectual disability, represents the severe end of a continuous distribution of developmental impairments that occur in nature, that are highly inherited, and that are orthogonally related to other parameters of development. A paradigm shift in understanding the core social abnormality of autism as a quantitative trait rather than as a categorically-defined condition has key implications for diagnostic classification, the measurement of change over time, the search for underlying genetic and neurobiologic mechanisms, and public health efforts to identify and support affected children. Here a recent body of research in genetics and epidemiology is presented to examine a dimensional reconceptualization of autistic social impairment—as manifested in clinical autistic syndromes, the broader autism phenotype, and normal variation in the general population. It illustrates how traditional categorical approaches to diagnosis may lead to misclassification of subjects (especially girls and mildly affected boys in multiple-incidence autism families), which can be particularly damaging to biological studies, and proposes continued efforts to derive a standardized quantitative system by which to characterize this family of conditions. PMID:21289537

  15. Quantitative analysis by nuclear magnetic resonance spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wainai, T; Mashimo, K [Nihon Univ., Tokyo. Coll. of Science and Engineering

    1976-04-01

    Recent papers on practical quantitative analysis by nuclear magnetic resonance spectroscopy (NMR) are reviewed. Specifically, the determination of moisture in liquid N/sub 2/O/sub 4/ as an oxidizing agent for rocket propulsion, the analysis of hydroperoxides, quantitative analysis using a shift reagent, the analysis of aromatic sulfonates, and the determination of acids and bases are reviewed. Attention is paid to accuracy. The sweep velocity and RF level, in addition to the other factors, must be set to optimal conditions to eliminate errors, particularly when computation is performed by machine. A higher sweep velocity is preferable in view of the S/N ratio, but it may be limited to 30 Hz/s. The relative error in the measurement of peak area is generally 1%, but for dilute samples measured with signal integration the error becomes smaller by about one order of magnitude. If impurities are treated carefully, the water content in N/sub 2/O/sub 4/ can be determined with an accuracy of about 0.002%. The comparison of peak heights is as accurate as the comparison of areas when the uniformity of the magnetic field and T/sub 2/ are not in question. When the chemical shift varies with concentration, the substance can be determined from the position of the chemical shift. Oil and water contents in rape-seed, peanuts, and sunflower-seed are determined by measuring T/sub 1/ with 90 deg pulses.
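
    The quantitative comparisons of peak areas described above follow the standard internal-standard qNMR relation, in which the analyte mass is obtained from the ratio of integrals scaled by proton counts and molar masses. A minimal sketch is given below; the compounds, integrals and masses are illustrative assumptions, not values from the reviewed papers.

```python
# Hedged sketch: classical internal-standard qNMR quantification,
#   m_x = (A_x / A_std) * (N_std / N_x) * (M_x / M_std) * m_std.
# All numerical inputs below are invented example values.

def qnmr_mass_fraction(area_x, area_std, n_h_x, n_h_std,
                       mass_std_mg, mw_std, mw_x, sample_mass_mg):
    """Analyte mass fraction from integrated peak areas.

    area_*        : integrated signal areas (same arbitrary scale)
    n_h_*         : protons contributing to each signal
    mass_std_mg   : weighed-in mass of the internal standard (mg)
    mw_*          : molar masses (g/mol)
    sample_mass_mg: weighed-in sample mass (mg)
    """
    mass_x = (area_x / area_std) * (n_h_std / n_h_x) * (mw_x / mw_std) * mass_std_mg
    return mass_x / sample_mass_mg

# Hypothetical analyte quantified against maleic acid (2 olefinic protons).
frac = qnmr_mass_fraction(area_x=1.85, area_std=1.00, n_h_x=3, n_h_std=2,
                          mass_std_mg=10.0, mw_std=116.07, mw_x=180.16,
                          sample_mass_mg=50.0)
print(f"estimated analyte mass fraction ~ {frac:.3f}")
```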

  16. Immune chromatography: a quantitative radioimmunological assay

    International Nuclear Information System (INIS)

    Davis, J.W.; Demetriades, M.; Bowen, J.M.

    1984-01-01

    Immune chromatography, a radioimmunological binding assay, employs paper chromatography to separate immune complexes from free antigen and antibodies. During chromatography free antigen and antibodies become distributed throughout the paper, while immune complexes remain near the bottoms of the strips. The chromatographic differences can be made quantitative by using either iodinated antigens or antibodies. Under these conditions nanogram quantities of antigen can be detected, as can antibodies in sera diluted several thousand-fold. The immune chromatography assay can also be performed as an indirect assay, since the paper strips are cut from nitrocellulose paper. In this case the immune components are absorbed by the paper during chromatography. Antigen is then detected with an iodinated second antibody. The indirect immune chromatography assay is particularly useful for identifying different sera that react with the same antigen. Reaction with the first serum before chromatography reduces the amount of antigen available to the second serum following chromatography. In addition to characterizing the immune chromatography procedure, we discuss the possible applications of chromatography assays for the quantitation of other types of molecular binding interactions. (Auth.)

  17. Quantitative fluorescence nanoscopy for cancer biomedicine

    Science.gov (United States)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in a large part due to our incomplete understanding of how key processes including oncogenesis and drug response are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  18. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations for calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied to the process reliability of a ship's shaft gear box installation, which demonstrates the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.

  19. Quantitative diagnosis of skeletons with demineralizing osteopathy

    International Nuclear Information System (INIS)

    Banzer, D.

    1979-01-01

    The quantitative diagnosis of bone diseases must be assessed according to the accuracy of the applied method, the expense in apparatus, personnel and financial resources, and the comparability of results. Nuclide absorptiometry and, in the future, perhaps computed tomography represent the most accurate methods for determining the mineral content of bones. Their application is the clinics' prerogative because of the costs. Morphometry provides quantitative information, in particular in course control, and enables an objective judgement of visual pictures. It requires little expenditure and should be combined with microradioscopy. Direct comparability of the findings of different working groups is easiest in morphometry; it depends on the equipment in computed tomography and is still hardly possible in nuclide absorptiometry. For fundamental physical reasons, it will hardly be possible to produce a low-cost, fast and easy-to-handle instrument for the determination of the mineral salt concentration in bones. Instead, there is rather a trend towards more expensive equipment, e.g. CT instruments; the universal use of these instruments, however, will help to promote quantitative diagnoses. (orig.) [de

  20. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan
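
    The toxic-equivalence step mentioned above amounts to rescaling an exposure to an untested inhibitor into a fadrozole-equivalent concentration via a relative potency factor before feeding it to the dose–response component. The sketch below shows only that generic scaling step with an assumed Hill-type response; the potency value and curve parameters are placeholders, not values from the qAOP publication, and the full linked HPG-axis, fecundity and population models are not reproduced.

```python
# Hedged sketch of toxic-equivalence (TEQ) scaling for an aromatase-inhibition qAOP.
# Relative potency, EC50 and Hill slope below are invented placeholders.

def toxic_equivalent(conc_ugL, relative_potency):
    """Fadrozole-equivalent concentration = exposure concentration x relative potency."""
    return conc_ugL * relative_potency

def hill_response(teq_ugL, ec50=10.0, hill=1.5, top=1.0):
    """Assumed Hill-type fractional inhibition of E2 synthesis as a function of TEQ."""
    return top * teq_ugL**hill / (ec50**hill + teq_ugL**hill)

# Untested inhibitor assumed ~0.2x as potent as fadrozole (placeholder value).
exposure_ugL = 25.0
teq = toxic_equivalent(exposure_ugL, relative_potency=0.2)
print(f"TEQ = {teq:.1f} ug/L fadrozole-equivalents; "
      f"predicted fractional inhibition = {hill_response(teq):.2f}")
```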

  1. Quantitative fluorescence angiography for neurosurgical interventions.

    Science.gov (United States)

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--as an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurement, phantom experiment, and computer simulation under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool without complex additional measurement technology.
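
    Software of this kind derives perfusion parameters from the ICG time-intensity curve of a region of interest. A minimal, generic sketch computing a few commonly used curve parameters (baseline, time to peak, maximum rise slope) from a synthetic curve is shown below; the parameter set and the curve are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

# Hedged sketch: generic perfusion parameters from an ICG time-intensity curve.
# The synthetic curve and the chosen parameters are illustrative only.

def perfusion_parameters(t, intensity):
    """Return baseline, time-to-peak and maximum rise slope of an ROI curve."""
    baseline = intensity[:10].mean()               # pre-injection signal level
    i_peak = int(np.argmax(intensity))
    time_to_peak = t[i_peak] - t[0]
    max_slope = np.max(np.gradient(intensity[:i_peak + 1], t[:i_peak + 1]))
    return {"baseline": baseline,
            "time_to_peak_s": time_to_peak,
            "max_slope_per_s": max_slope}

# Synthetic inflow curve sampled at 25 frames per second.
t = np.arange(0, 20, 0.04)
curve = 5.0 + 100.0 * (t / 6.0) ** 2 * np.exp(-t / 3.0)
print(perfusion_parameters(t, curve))
```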

  2. Developments in quantitative electron probe microanalysis

    International Nuclear Information System (INIS)

    Tixier, R.

    1977-01-01

    A study of the range of validity of the correction formulae used in massive specimen analysis is made. The method used is original; we have shown that it is possible to use a property of invariability of corrected intensity ratios for standards. This invariance property provides a test for the self-consistency of the theory. The theoretical and experimental conditions required for quantitative electron probe microanalysis of thin transmission electron microscope specimens are examined. The correction formulae for atomic number, absorption and fluorescence effects are calculated. Several examples of experimental results are given, relating to the quantitative analysis of intermetallic precipitates and carbides in steels. Advances in applications of electron probe instruments related to the use of computers and the present development of fully automated instruments are reviewed. The statistics necessary for measurements of X-ray count data are studied. Estimation procedures and tests are developed. These methods are used to perform a statistical check of electron probe microanalysis measurements and to reject rogue values. An estimator of the confidence interval of the apparent concentration is derived. Formulae were also obtained to optimize the counting time in order to obtain the best precision in a minimum amount of time [fr

  3. Quantitative phase analysis by neutron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chang Hee; Song, Su Ho; Lee, Jin Ho; Shim, Hae Seop [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-06-01

    This study applies quantitative phase analysis (QPA) by neutron diffraction to the round robin samples provided by the International Union of Crystallography (IUCr). We measured neutron diffraction patterns for mixed samples which have several different weight percentages and their unique characteristic features. The neutron diffraction method has been known to be superior to complementary methods such as X-ray or synchrotron diffraction, but it is still accepted as highly reliable only under limited conditions or for limited samples. Neutron diffraction has strong capability especially for oxides due to the scattering cross-section of oxygen, and with these quantitative phase analysis techniques it can become an even stronger tool for the analysis of industrial materials. By doing this study, we hope not only to perform an instrument performance test on our HRPD but also to improve our ability in the analysis of neutron diffraction data by comparing our QPA results with those from other advanced reactor facilities. 14 refs., 4 figs., 6 tabs. (Author)
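
    In Rietveld-based QPA of this kind, phase weight fractions are commonly obtained from the refined scale factors via the Hill–Howard relation, W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j. A small sketch is given below; the scale factors and cell data are invented for illustration and are not the refinement values from this study.

```python
# Hedged sketch: Hill-Howard weight fractions from Rietveld scale factors,
#   W_i = S_i * (Z * M * V)_i / sum_j S_j * (Z * M * V)_j
# S: refined scale factor, Z: formula units per unit cell, M: formula mass,
# V: unit-cell volume. All numbers below are invented for illustration.

phases = {
    # name: (S, Z, M in g/mol, V in Angstrom^3)
    "corundum": (1.20e-4, 6, 101.96, 254.8),
    "fluorite": (0.85e-4, 4, 78.07, 163.0),
    "zincite":  (2.10e-4, 2, 81.38, 47.6),
}

zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
total = sum(zmv.values())
for name, value in zmv.items():
    print(f"{name:10s} weight fraction = {value / total:.3f}")
```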

  4. Quantitative evaluations of male pattern baldness.

    Science.gov (United States)

    Tsuji, Y; Ishino, A; Hanzawa, N; Uzuka, M; Okazaki, K; Adachi, K; Imamura, S

    1994-07-01

    Several methods for the evaluation of hair growth have been reported; however, none of the hitherto reported methods are satisfactory as unbiased double blind studies to evaluate the efficacy of hair growth agents. In the present paper, we describe quantitative evaluation methods for hair growth by measuring the anagen ratio and hair diameters in 56 Japanese subjects aged 23-56 for 3 years. The average anagen ratio decreased by 3.8% in 3 years. The average hair diameters showed a statistically significant decrease each year totalling 3.4 microns. Subjects were sorted according to their anagen ratio into 4 groups. Each group showed different distribution patterns of hair diameters. The higher anagen ratio group has a high frequency peak at thicker hair diameters and the lower anagen ratio group has a high frequency peak at thinner hair diameters. The number of thicker hairs decreased and the high frequency peak shifted to thinner hair diameters in 3 years. These methods are useful to evaluate both the progression of male pattern baldness and the effects of hair growth agents with double blind studies in an unbiased quantitative fashion.

  5. Technological innovation in neurosurgery: a quantitative study.

    Science.gov (United States)

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-07-01

    Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for articles published between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.

  6. Quantitative assessment of growth plate activity

    International Nuclear Information System (INIS)

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies.

  7. Quantitative tomographic measurements of opaque multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN; O' HERN,TIMOTHY J.; CECCIO,STEVEN L.

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
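
    The formula relating conductivity to phase volume fraction mentioned above is, in many EIT studies, a Maxwell-type mixture relation for a non-conducting dispersed phase in a conducting liquid. The sketch below shows that commonly used baseline relation with invented conductivities; the minor modification reported by the authors is not reproduced here.

```python
# Hedged sketch: Maxwell's relation for a non-conducting dispersed phase
# (gas bubbles or insulating solids) in a conducting liquid:
#   sigma_mix / sigma_liquid = 2 * (1 - alpha) / (2 + alpha),
# which inverts to
#   alpha = 2 * (sigma_liquid - sigma_mix) / (2 * sigma_liquid + sigma_mix).
# Conductivity values below are invented for illustration.

def maxwell_volume_fraction(sigma_mix, sigma_liquid):
    """Dispersed-phase volume fraction from mixture and liquid conductivities."""
    return 2.0 * (sigma_liquid - sigma_mix) / (2.0 * sigma_liquid + sigma_mix)

sigma_liquid = 1.20   # S/m, conducting liquid (illustrative)
sigma_mix = 1.02      # S/m, reconstructed mixture conductivity (illustrative)
alpha = maxwell_volume_fraction(sigma_mix, sigma_liquid)
print(f"estimated gas volume fraction: {alpha:.3f}")
```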

  8. Allometric trajectories and "stress": a quantitative approach

    Directory of Open Access Journals (Sweden)

    Tommaso Anfodillo

    2016-11-01

    Full Text Available The term stress is an important but vague term in plant biology. We show situations in which thinking in terms of stress is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, too little leaf area (e.g. due to herbivory or disease) per unit of active stem mass would be expected to incur low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to stress, without need for recourse to this term. Our approach contrasts with traditional approaches for studying stress, e.g. revealing that small stressed plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as stress, plasticity, adaptation, and acclimation.

  9. Allometric Trajectories and "Stress": A Quantitative Approach.

    Science.gov (United States)

    Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E

    2016-01-01

    The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between sources and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform negatively. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to incur to low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants likely are in fact well suited to local conditions. We thus offer a quantitative perspective to the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.

  10. Quantitative standard-less XRF analysis

    International Nuclear Information System (INIS)

    Ulitzka, S.

    2002-01-01

    Full text: For most analytical tasks in the mining and associated industries matrix-matched calibrations are used for the monitoring of ore grades and process control. In general, such calibrations are product specific (iron ore, bauxite, alumina, mineral sands, cement etc.) and apply to a relatively narrow concentration range but give the best precision and accuracy for those materials. A wide range of CRMs is available and for less common materials synthetic standards can be made up from 'pure' chemicals. At times, analysis of materials with varying matrices (powders, scales, fly ash, alloys, polymers, liquors, etc.) and diverse physical shapes (non-flat, metal drillings, thin layers on substrates etc.) is required that could also contain elements which are not part of a specific calibration. A qualitative analysis can provide information about the presence of certain elements and the relative intensities of element peaks in a scan can give a rough idea about their concentrations. More often however, quantitative values are required. The paper will look into the basics of quantitative standardless analysis and show results for some well-defined CRMs. Copyright (2002) Australian X-ray Analytical Association Inc

  11. Analysis of Ingredient Lists to Quantitatively Characterize ...

    Science.gov (United States)

    The EPA’s ExpoCast program is developing high throughput (HT) approaches to generate the needed exposure estimates to compare against HT bioactivity data generated from the US inter-agency Tox21 and the US EPA ToxCast programs. Assessing such exposures for the thousands of chemicals in consumer products requires data on product composition. This is a challenge since quantitative product composition data are rarely available. We developed methods to predict the weight fractions of chemicals in consumer products from weight fraction-ordered chemical ingredient lists, and curated a library of such lists from online manufacturer and retailer sites. The probabilistic model predicts weight fraction as a function of the total number of reported ingredients, the rank of the ingredient in the list, the minimum weight fraction for which ingredients were reported, and the total weight fraction of unreported ingredients. Weight fractions predicted by the model compared very well to available quantitative weight fraction data obtained from Material Safety Data Sheets for products with 3-8 ingredients. Lists were located from the online sources for 5148 products containing 8422 unique ingredient names. A total of 1100 of these names could be located in EPA’s HT chemical database (DSSTox), and linked to 864 unique Chemical Abstract Service Registration Numbers (392 of which were in the Tox21 chemical library). Weight fractions were estimated for these 864 CASRN. Using a
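
    The published model predicts weight fractions from the rank-ordered ingredient list together with reporting thresholds. The toy sketch below is not that model; it only illustrates the constraint structure (descending fractions that, with unreported ingredients, sum to one) using a simple 1/rank allocation, with all parameter names and values assumed.

```python
# Hedged sketch (NOT the published ExpoCast model): a toy allocation that
# respects the constraints implied by a rank-ordered ingredient list, namely
# that weight fractions decrease with rank and, together with unreported
# ingredients, sum to one. A 1/rank (Zipf-like) allocation is used here.

def toy_weight_fractions(n_reported, unreported_total=0.05):
    """Assign descending weight fractions to n_reported ranked ingredients."""
    raw = [1.0 / rank for rank in range(1, n_reported + 1)]
    scale = (1.0 - unreported_total) / sum(raw)
    return [w * scale for w in raw]

fractions = toy_weight_fractions(n_reported=6)
for rank, frac in enumerate(fractions, start=1):
    print(f"rank {rank}: estimated weight fraction {frac:.3f}")
print(f"unreported ingredients: {1 - sum(fractions):.3f}")
```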

  12. Another Curriculum Requirement? Quantitative Reasoning in Economics: Some First Steps

    Science.gov (United States)

    O'Neill, Patrick B.; Flynn, David T.

    2013-01-01

    In this paper, we describe first steps toward focusing on quantitative reasoning in an intermediate microeconomic theory course. We find student attitudes toward quantitative aspects of economics improve over the duration of the course (as we would hope). Perhaps more importantly, student attitude toward quantitative reasoning improves, in…

  13. Quantitative Literacy Courses as a Space for Fusing Literacies

    Science.gov (United States)

    Tunstall, Samuel Luke; Matz, Rebecca L.; Craig, Jeffrey C.

    2016-01-01

    In this article, we examine how students in a general education quantitative literacy course reason with public issues when unprompted to use quantitative reasoning. Michigan State University, like many institutions, not only has a quantitative literacy requirement for all undergraduates but also offers two courses specifically for meeting the…

  14. Videodensitometric quantitative angiography after coronary balloon angioplasty, compared to edge-detection quantitative angiography and intracoronary ultrasound imaging

    NARCIS (Netherlands)

    Peters, R. J.; Kok, W. E.; Pasterkamp, G.; von Birgelen, C.; Prins, M. [=Martin H.; Serruys, P. W.

    2000-01-01

    AIMS: To assess the value of videodensitometric quantification of the coronary lumen after angioplasty by comparison to two other techniques of coronary artery lumen quantification. METHODS AND RESULTS: Videodensitometric quantitative angiography, edge detection quantitative angiography and 30 MHz

  15. Good practices for quantitative bias analysis.

    Science.gov (United States)

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage
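
    One of the basic calculations in this family is a simple (deterministic) bias analysis that corrects an observed 2x2 table for exposure misclassification using assumed sensitivity and specificity. A minimal sketch is below; the counts and bias parameters are invented, and a full probabilistic bias analysis would draw those parameters from distributions rather than use point values.

```python
# Hedged sketch: deterministic bias analysis for non-differential exposure
# misclassification in a case-control study. True exposed count is
#   A = (a_obs - (1 - Sp) * N) / (Se + Sp - 1),
# where N is the row total. All counts and bias parameters are invented.

def corrected_exposed(exposed_obs, row_total, se, sp):
    return (exposed_obs - (1.0 - sp) * row_total) / (se + sp - 1.0)

def odds_ratio(a, b, c, d):
    return (a * d) / (b * c)

# Observed table (invented): a = exposed cases, b = unexposed cases,
# c = exposed controls, d = unexposed controls.
a_obs, b_obs, c_obs, d_obs = 215, 1449, 668, 4296
se, sp = 0.85, 0.95                       # assumed classification parameters

a_true = corrected_exposed(a_obs, a_obs + b_obs, se, sp)
c_true = corrected_exposed(c_obs, c_obs + d_obs, se, sp)
b_true = (a_obs + b_obs) - a_true
d_true = (c_obs + d_obs) - c_true

print(f"observed OR  = {odds_ratio(a_obs, b_obs, c_obs, d_obs):.2f}")
print(f"corrected OR = {odds_ratio(a_true, b_true, c_true, d_true):.2f}")
```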

  16. [Progress in stable isotope labeled quantitative proteomics methods].

    Science.gov (United States)

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2013-06-01

    Quantitative proteomics is an important research field in post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support the fast development in biology research. In this work, we discuss the progress in the stable isotope labeling methods for quantitative proteomics including relative and absolute quantitative proteomics, and then give our opinions on the outlook of proteome quantification methods.

  17. Real time quantitative amplification detection on a microarray: towards high multiplex quantitative PCR.

    NARCIS (Netherlands)

    Pierik, A.; Moamfa, M; van Zelst, M.; Clout, D.; Stapert, H.; Dijksman, Johan Frederik; Broer, D.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that

  18. Real time quantitative amplification detection on a microarray : towards high multiplex quantitative PCR

    NARCIS (Netherlands)

    Pierik, Anke; Boamfa, M.; Zelst, van M.; Clout, D.; Stapert, H.R.; Dijksman, J.F.; Broer, D.J.; Wimberger-Friedl, R.

    2012-01-01

    Quantitative real-time polymerase chain reaction (qrtPCR) is widely used as a research and diagnostic tool. Notwithstanding its many powerful features, the method is limited in the degree of multiplexing to about 6 due to spectral overlap of the available fluorophores. A new method is presented that

  19. Quantitative Determination of Aluminum in Deodorant Brands: A Guided Inquiry Learning Experience in Quantitative Analysis Laboratory

    Science.gov (United States)

    Sedwick, Victoria; Leal, Anne; Turner, Dea; Kanu, A. Bakarr

    2018-01-01

    The monitoring of metals in commercial products is essential for protecting public health against the hazards of metal toxicity. This article presents a guided inquiry (GI) experimental lab approach in a quantitative analysis lab class that enabled students' to determine the levels of aluminum in deodorant brands. The utility of a GI experimental…

  20. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    International Nuclear Information System (INIS)

    Lasaygues, Philippe; Ouedraogo, Edgard; Lefebvre, Jean-Pierre; Gindre, Marcel; Talmant, Marilyne; Laugier, Pascal

    2005-01-01

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and cortical thickness measurements. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography that we developed using correction algorithms of the wavepaths and compensation procedures is presented. Also shown are the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol.
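
    In transmission mode, velocity inside the bone follows from the change in time of flight relative to propagation through water alone; for a single slab of known thickness the relation is 1/c_bone = 1/c_water - Δt/d. The one-ray sketch below uses invented numbers and does not attempt the full iterative tomographic reconstruction with wave-path corrections described in the paper.

```python
# Hedged sketch: single-ray transmission estimate of sound velocity in a bone
# slab of known thickness d from the time-of-flight advance relative to water:
#   1 / c_bone = 1 / c_water - dt / d.
# All values below are invented for illustration.

def slab_velocity(c_water, thickness_m, tof_advance_s):
    """Velocity in the slab from the TOF advance (water TOF minus sample TOF)."""
    return 1.0 / (1.0 / c_water - tof_advance_s / thickness_m)

c_water = 1480.0        # m/s, approximate value at room temperature
thickness = 6.0e-3      # m, cortical wall thickness (illustrative)
tof_advance = 2.45e-6   # s, earlier arrival with bone in the path (illustrative)

print(f"estimated cortical sound velocity: "
      f"{slab_velocity(c_water, thickness, tof_advance):.0f} m/s")
```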

  1. Winston-Lutz Test: A quantitative analysis

    International Nuclear Information System (INIS)

    Pereira, Aline Garcia; Nandi, Dorival Menegaz; Saraiva, Crystian Wilian Chagas

    2017-01-01

    Objective: To describe a method of quantitative analysis for the Winston-Lutz test. Materials and methods: The research is a qualitative exploratory study. The materials used were: portal film, Winston-Lutz test tools and Omni Pro software. Sixteen portal films were used as samples and were analyzed by five different technicians to measure the deviation between the radiation and mechanical isocenters. Results: Among the results, two combinations with offset values greater than 1 mm were identified. In addition, when the method developed was compared with the previously studied one, it was observed that the data obtained are very close, with a maximum percentage deviation of 32.5%, which demonstrates its efficacy in reducing dependence on the performer. Conclusion: The results show that the method is reproducible and practical, which constitutes one of the fundamental factors for its implementation. (author)
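
    The quantity measured in a Winston-Lutz analysis is the displacement between the radiation field centre and the ball-bearing (mechanical isocentre) image on each portal film. The sketch below computes that displacement from two centre coordinates under an assumed pixel spacing; the coordinates are invented, and automatic centre detection on the film (thresholding and centroiding) is not shown.

```python
import math

# Hedged sketch: Winston-Lutz deviation as the 2D distance between the detected
# radiation-field centre and the ball-bearing centre on a portal image.
# Pixel coordinates, pixel spacing and tolerance are invented examples.

def wl_deviation_mm(field_center_px, bb_center_px, pixel_mm=0.392):
    dx = (field_center_px[0] - bb_center_px[0]) * pixel_mm
    dy = (field_center_px[1] - bb_center_px[1]) * pixel_mm
    return math.hypot(dx, dy)

# One gantry/collimator/couch combination (illustrative centre positions).
deviation = wl_deviation_mm(field_center_px=(512.4, 509.8),
                            bb_center_px=(511.1, 511.6))
tolerance_mm = 1.0
print(f"radiation-to-mechanical isocenter deviation: {deviation:.2f} mm")
print("within tolerance" if deviation <= tolerance_mm else "exceeds tolerance")
```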

  2. Quantitative recurrence for free semigroup actions

    Science.gov (United States)

    Carvalho, Maria; Rodrigues, Fagner B.; Varandas, Paulo

    2018-03-01

    We consider finitely generated free semigroup actions on a compact metric space and obtain quantitative information on Poincaré recurrence, average first return time and hitting frequency for the random orbits induced by the semigroup action. Besides, we relate the recurrence to balls with the rates of expansion of the semigroup generators and the topological entropy of the semigroup action. Finally, we establish a partial variational principle and prove an ergodic optimization for this kind of dynamical action. MC has been financially supported by CMUP (UID/MAT/00144/2013), which is funded by FCT (Portugal) with national (MEC) and European structural funds (FEDER) under the partnership agreement PT2020. FR and PV were partially supported by BREUDS. PV has also benefited from a fellowship awarded by CNPq-Brazil and is grateful to the Faculty of Sciences of the University of Porto for the excellent research conditions.

  3. Nailfold capillaroscopic report: qualitative and quantitative methods

    Directory of Open Access Journals (Sweden)

    S. Zeni

    2011-09-01

    Full Text Available Nailfold capillaroscopy (NVC) is a simple and non-invasive method used for the assessment of patients with Raynaud’s phenomenon (RP) and in the differential diagnosis of various connective tissue diseases. The scleroderma pattern abnormalities (giant capillaries, haemorrhages and/or avascular areas) have a positive predictive value for the development of scleroderma spectrum disorders. Thus, an analytical approach to nailfold capillaroscopy can be useful in quantitatively and reproducibly recording various parameters. We developed a new method to assess patients with RP that is capable of predicting the 5-year transition from isolated RP to RP secondary to scleroderma spectrum disorders. This model is a weighted combination of different capillaroscopic parameters (giant capillaries, microhaemorrhages, number of capillaries) that allows physicians to stratify RP patients easily, using a relatively simple diagram to deduce prognosis.

  4. Quantitative radiation monitors for containment and surveillance

    International Nuclear Information System (INIS)

    Fehlau, P.E.

    1983-01-01

    Quantitative radiation monitors make it possible to differentiate between shielded and unshielded nuclear materials. The hardness of the gamma-ray spectrum is the attribute that characterizes bare or shielded material. Separate high- and low-energy gamma-ray regions are obtained from a single-channel analyzer through its window and discriminator outputs. The monitor counts both outputs and computes a ratio of the high- and low-energy region counts whenever an alarm occurs. The ratio clearly differentiates between shielded and unshielded nuclear material so that the net alarm count may be identified with a small quantity of unshielded material or a large quantity of shielded material. Knowledge of the diverted quantity helps determine whether an inventory should be called to identify the loss
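
    The decision logic described, classifying an alarm by the "hardness" of the spectrum through the ratio of high-energy to low-energy window counts, can be sketched in a few lines. The counts, backgrounds and threshold below are invented for illustration.

```python
# Hedged sketch: classify an alarm as shielded or unshielded material from the
# hardness ratio of net high-energy to net low-energy window counts.
# All counts and the decision threshold are invented examples.

def hardness_ratio(high_counts, low_counts, high_bkg, low_bkg):
    net_high = max(high_counts - high_bkg, 0)
    net_low = max(low_counts - low_bkg, 1)       # avoid division by zero
    return net_high / net_low

def classify(ratio, threshold=0.6):
    # Shielding preferentially removes low-energy photons, hardening the spectrum.
    return "shielded material suspected" if ratio > threshold else "unshielded material suspected"

ratio = hardness_ratio(high_counts=420, low_counts=2950, high_bkg=180, low_bkg=1300)
print(f"hardness ratio = {ratio:.2f} -> {classify(ratio)}")
```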

  5. Quantitative phosphoproteomic analysis of postmortem muscle development

    DEFF Research Database (Denmark)

    Huang, Honggang

    Meat quality development is highly dependent on postmortem (PM) metabolism and rigor mortis development in PM muscle. PM glycometabolism and rigor mortis fundamentally determine most of the important qualities of raw meat, such as ultimate pH, tenderness, color and water-holding capacity. Protein...... phosphorylation is known to play essential roles on regulating metabolism, contraction and other important activities in muscle systems. However, protein phosphorylation has rarely been systematically explored in PM muscle in relation to meat quality. In this PhD project, both gel-based and mass spectrometry (MS......)-based quantitative phosphoproteomic strategies were employed to analyze PM muscle with the aim to intensively characterize the protein phosphorylation involved in meat quality development. Firstly, gel-based phosphoproteomic studies were performed to analyze the protein phosphorylation in both sarcoplasmic proteins...

  6. The quantitative modelling of human spatial habitability

    Science.gov (United States)

    Wise, J. A.

    1985-01-01

    A model for the quantitative assessment of human spatial habitability is presented in the space station context. The visual aspect assesses how interior spaces appear to the inhabitants. This aspect concerns criteria such as sensed spaciousness and the affective (emotional) connotations of settings' appearances. The kinesthetic aspect evaluates the available space in terms of its suitability to accommodate human movement patterns, as well as the postural and anthropometric changes due to microgravity. Finally, social logic concerns how the volume and geometry of available space either affirms or contravenes established social and organizational expectations for spatial arrangements. Here, the criteria include privacy, status, social power, and proxemics (the uses of space as a medium of social communication).

  7. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  8. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  9. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  10. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region of interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation

  11. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities.

  12. Quantitative Efficiency Evaluation Method for Transportation Networks

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2014-11-01

    Full Text Available An effective evaluation of transportation network efficiency/performance is essential to the establishment of sustainable development in any transportation system. Based on a redefinition of transportation network efficiency, a quantitative efficiency evaluation method for transportation network is proposed, which could reflect the effects of network structure, traffic demands, travel choice, and travel costs on network efficiency. Furthermore, the efficiency-oriented importance measure for network components is presented, which can be used to help engineers identify the critical nodes and links in the network. The numerical examples show that, compared with existing efficiency evaluation methods, the network efficiency value calculated by the method proposed in this paper can portray the real operation situation of the transportation network as well as the effects of main factors on network efficiency. We also find that the network efficiency and the importance values of the network components both are functions of demands and network structure in the transportation network.
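
    As background to the redefined measure above, a widely used baseline is the global efficiency of Latora and Marchiori, E = 1/(N(N-1)) Σ_{i≠j} 1/d_ij over shortest-path travel costs. The sketch below computes that baseline on a toy network; it is not the paper's proposed efficiency, which additionally reflects demand and travel choice.

```python
import itertools
import networkx as nx

# Hedged sketch: classical Latora-Marchiori global efficiency,
#   E = 1 / (N * (N - 1)) * sum over ordered pairs i != j of 1 / d_ij,
# computed on a toy weighted network (edge weights = travel costs, invented).

G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 4), ("B", "C", 3), ("A", "C", 9), ("C", "D", 2), ("B", "D", 7),
])

lengths = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))
n = G.number_of_nodes()
efficiency = sum(1.0 / lengths[u][v]
                 for u, v in itertools.permutations(G.nodes, 2)) / (n * (n - 1))
print(f"global efficiency of the toy network: {efficiency:.3f}")
```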

  13. Quantitative Accelerated Life Testing of MEMS Accelerometers.

    Science.gov (United States)

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-11-20

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as application-driven stress, because the tilt movement is a natural environment for devices used for automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the "worst case" being smaller than 10⁻⁷ h⁻¹.

  14. Nonparametric functional mapping of quantitative trait loci.

    Science.gov (United States)

    Yang, Jie; Wu, Rongling; Casella, George

    2009-03-01

    Functional mapping is a useful tool for mapping quantitative trait loci (QTL) that control dynamic traits. It incorporates mathematical aspects of biological processes into the mixture model-based likelihood setting for QTL mapping, thus increasing the power of QTL detection and the precision of parameter estimation. However, in many situations there is no obvious functional form and, in such cases, this strategy will not be optimal. Here we propose to use nonparametric function estimation, typically implemented with B-splines, to estimate the underlying functional form of phenotypic trajectories, and then construct a nonparametric test to find evidence of existing QTL. Using the representation of a nonparametric regression as a mixed model, the final test statistic is a likelihood ratio test. We consider two types of genetic maps: dense maps and general maps, and the power of nonparametric functional mapping is investigated through simulation studies and demonstrated by examples.

  15. Immune adherence: a quantitative and kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sekine, T [National Cancer Center, Tokyo (Japan). Research Inst.

    1978-09-01

    Quantitative and kinetic analysis of the immune-adherence reaction (IA) between C3b fragments and IA receptors as an agglutination reaction is difficult. Analysis is possible, however, by use of radio-iodinated bovine serum albumin as antigen at low concentrations (less than 200 ng/ml) and optimal concentration of antibody to avoid precipitation of antigen-antibody complexes with human erythrocytes without participation of complement. Antigen and antibody are reacted at 37/sup 0/C, complement is added, the mixture incubated and human erythrocytes added; after further incubation, ice-cold EDTA containing buffer is added and the erythrocytes centrifuged and assayed for radioactivity. Control cells reacted with heated guinea pig serum retained less than 5% of the added radioactivity. The method facilitates measurement of IA reactivity and permits more detailed analysis of the mechanism underlying the reaction.

  16. Quantitative MFM on superconducting thin films

    Energy Technology Data Exchange (ETDEWEB)

    Stopfel, Henry; Vock, Silvia; Shapoval, Tetyana; Neu, Volker; Wolff, Ulrike; Haindl, Silvia; Engelmann, Jan; Schaefer, Rudolf; Holzapfel, Bernhard; Schultz, Ludwig [IFW Dresden, Institute for Metallic Material (Germany); Inosov, Dmytro S. [Max Planck Institute for Solid State Research, Stuttgart (Germany)

    2012-07-01

    Quantitative interpretation of magnetic force microscopy (MFM) data is a challenge, because the measured signal is a convolution between the magnetization of the tip and the stray field emanating from the sample. It was established theoretically that the field distribution just above the surface of the superconductor can be well approximated by the stray field of a magnetic monopole. The description of the MFM tip, however, needs a second approximation. The temperature-dependent vortex-distribution images on a NbN thin film were fitted using two different tip models. Firstly, the magnetic tip was assumed to be a monopole, which leads to the simple monopole-monopole model for the tip-sample interaction force. Performing a 2D fitting of the data with this model, we extracted λ, Δ and the vortex pinning force. Secondly, a geometrical model was applied to calculate the tip-transfer-function of the MFM tip using the numerical BEM method.
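
    In the monopole approximation, the stray field of a vortex above the film surface looks like that of a magnetic monopole buried a penetration depth λ below the surface, B_z(ρ, z) = (Φ0/2π)(z+λ)/(ρ²+(z+λ)²)^{3/2}. The sketch below fits an MFM-like radial profile with this form to extract λ; the synthetic data, the fixed scan height and the treatment of the tip as a point monopole of constant strength are assumptions, not the authors' calibrated analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: extract the penetration depth lambda by fitting a radial
# vortex profile with the monopole approximation for the stray field,
#   B_z(rho, z) = (Phi0 / 2 pi) * (z + lam) / (rho^2 + (z + lam)^2)^1.5.
# The "measured" profile is synthetic; real MFM data also need a tip calibration.

PHI0 = 2.067e-15  # magnetic flux quantum, Wb

def vortex_bz(rho, lam, scan_height=60e-9):
    z_eff = scan_height + lam
    return (PHI0 / (2 * np.pi)) * z_eff / (rho**2 + z_eff**2) ** 1.5

rng = np.random.default_rng(1)
rho = np.linspace(-1.5e-6, 1.5e-6, 121)
true_lambda = 250e-9
signal = vortex_bz(np.abs(rho), true_lambda)
signal += rng.normal(0, 0.02 * signal.max(), size=rho.size)   # measurement noise

popt, _ = curve_fit(lambda r, lam: vortex_bz(np.abs(r), lam),
                    rho, signal, p0=[100e-9])
print(f"fitted penetration depth: {popt[0]*1e9:.0f} nm (true {true_lambda*1e9:.0f} nm)")
```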

  17. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges, largely, on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework--applicable to any infant gaze experiment--and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  18. Safety culture management and quantitative indicator evaluation

    International Nuclear Information System (INIS)

    Mandula, J.

    2002-01-01

    This report discusses the relationship between safety culture and the evaluation of quantitative indicators. It shows how systematic use of generally shared operational safety indicators may contribute to the formation and reinforcement of safety culture characteristics in routine plant operation. The report also briefly describes the system of operational safety indicators used at the Dukovany plant. It is a PC database application enabling effective work with the indicators and providing all users with an efficient tool for making synoptic overviews of indicator values in their links and hierarchical structure. Using color coding, the system allows quick indicator evaluation against predefined limits considering indicator value trends. The system, which has resulted from several years of development, was completely established at the plant during the years 2001 and 2002. (author)

  19. Geomorphology: now a more quantitative science

    International Nuclear Information System (INIS)

    Lal, D.

    1995-01-01

    Geomorphology, one of the oldest branches of planetary science, is now growing into a quantitative field with the development of a nuclear method capable of providing numeric time controls on a great variety of superficial processes. The method complements the conventional dating methods, e.g. ⁴⁰K/⁴⁰Ar and ⁸⁷Rb/⁸⁷Sr, by providing information on geomorphic processes, e.g. the dwell times of rocks on the earth's surface with strict geometrical constraints, rates of physical and chemical weathering in the past, and the chronology of events associated with glaciation. This article attempts to discuss the new possibilities that now exist for studying a wide range of geomorphic processes, with examples of some specific isotopic changes that allow one to model glacial chronology, and evolutionary histories of alluvial fans and sand dunes. (author). 9 refs., 3 figs., 4 tabs

  20. Investment appraisal using quantitative risk analysis.

    Science.gov (United States)

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
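
    The risk-adjusted net present value described combines the investment cost with the discounted expected reduction in fire losses that the safety system provides. The arithmetic sketch below uses invented monetary figures and point estimates; a full Bayesian treatment, as in the paper, would propagate distributions over the expected losses rather than single values.

```python
# Hedged sketch: risk-adjusted NPV of a fire safety investment as the
# discounted expected annual loss reduction (minus maintenance) less the
# up-front cost. All figures, the discount rate and the horizon are invented.

def risk_adjusted_npv(investment, annual_loss_without, annual_loss_with,
                      annual_maintenance, rate, years):
    npv = -investment
    risk_reduction = annual_loss_without - annual_loss_with
    for t in range(1, years + 1):
        npv += (risk_reduction - annual_maintenance) / (1.0 + rate) ** t
    return npv

npv = risk_adjusted_npv(investment=400_000,          # e.g. sprinkler system cost
                        annual_loss_without=90_000,  # expected yearly fire loss, no system
                        annual_loss_with=25_000,     # expected yearly loss with system
                        annual_maintenance=8_000,
                        rate=0.05, years=15)
print(f"risk-adjusted NPV: {npv:,.0f} (invest if positive)")
```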

  1. Quantitative measurement of the cerebral blood flow

    International Nuclear Information System (INIS)

    Houdart, R.; Mamo, H.; Meric, P.; Seylaz, J.

    1976-01-01

    The value of the cerebral blood flow (CBF) measurement is outlined, its limits are defined and some future prospects discussed. The xenon 133 brain clearance study is at present the most accurate quantitative method to evaluate the CBF in different regions of the brain simultaneously. The method and the progress it has led to in the physiological, physiopathological and therapeutic fields are described. The major disadvantage of the method is shown to be the need to puncture the internal carotid artery for each measurement. Prospects are discussed concerning methods derived from the same general principle but using a simpler, non-traumatic way to introduce the radiotracer, either by inhalation or intravenously. [fr]
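
    The record gives no equations; a common reduction of xenon-133 clearance data (assumed here purely as an illustration) is the initial-slope index: the early washout is fitted as a mono-exponential C(t) = C0*exp(-kt) and regional CBF is taken as roughly 100*lambda*k ml/100 g/min, where lambda is the brain-blood partition coefficient. A minimal sketch on synthetic data:

```python
import numpy as np

def cbf_initial_slope(t_min, counts, partition=1.0):
    """Regional CBF (ml/100 g/min) from the initial slope of a 133Xe clearance curve.

    t_min     -- sample times in minutes
    counts    -- detector counts (proportional to tracer concentration)
    partition -- brain/blood partition coefficient lambda (ml/g), tissue dependent
    """
    k = -np.polyfit(t_min, np.log(counts), 1)[0]   # clearance rate constant (1/min)
    return 100.0 * partition * k

t = np.array([0.25, 0.50, 0.75, 1.00, 1.25, 1.50])
c = 1000.0 * np.exp(-0.5 * t)                      # synthetic washout, k = 0.5/min
print(cbf_initial_slope(t, c))                     # -> ~50 ml/100 g/min
```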

  2. Quantitative Susceptibility Mapping in Parkinson's Disease.

    Science.gov (United States)

    Langkammer, Christian; Pirpamer, Lukas; Seiler, Stephan; Deistung, Andreas; Schweser, Ferdinand; Franthal, Sebastian; Homayoon, Nina; Katschnig-Winter, Petra; Koegl-Wallner, Mariella; Pendl, Tamara; Stoegerer, Eva Maria; Wenzel, Karoline; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen Rainer; Schmidt, Reinhold; Schwingenschuh, Petra

    2016-01-01

    Quantitative susceptibility mapping (QSM) and R2* relaxation rate mapping have demonstrated increased iron deposition in the substantia nigra of patients with idiopathic Parkinson's disease (PD). However, findings in other subcortical deep gray matter nuclei are conflicting, and the sensitivity of QSM and R2* to morphological changes and their relation to clinical measures of disease severity have so far been investigated only sparsely. The local ethics committee approved this study and all subjects gave written informed consent. 66 patients with idiopathic Parkinson's disease and 58 control subjects underwent quantitative MRI at 3T. Susceptibility and R2* maps were reconstructed from a spoiled multi-echo 3D gradient echo sequence. Mean susceptibilities and R2* rates were measured in subcortical deep gray matter nuclei and compared between patients with PD and controls as well as related to clinical variables. Compared to control subjects, patients with PD had increased R2* values in the substantia nigra. QSM also showed higher susceptibilities in patients with PD in the substantia nigra, nucleus ruber, thalamus, and globus pallidus. Magnetic susceptibility of several of these structures was correlated with the levodopa-equivalent daily dose (LEDD) and clinical markers of motor and non-motor disease severity (total MDS-UPDRS, MDS-UPDRS-I and II). Disease severity as assessed by the Hoehn & Yahr scale was correlated with magnetic susceptibility in the substantia nigra. The established finding of higher R2* rates in the substantia nigra was extended by QSM, which showed superior sensitivity for PD-related tissue changes in nigrostriatal dopaminergic pathways. QSM additionally reflected the levodopa dosage and disease severity. These results suggest a more widespread pathologic involvement and QSM as a novel means for its investigation, more sensitive than current MRI techniques.

  3. Quantitative Analysis of Thallium-201 Myocardial Tomograms

    International Nuclear Information System (INIS)

    Kim, Sang Eun; Nam, Gi Byung; Choi, Chang Woon

    1991-01-01

    The purpose of this study was to assess the ability of quantitative Tl-201 tomography to identify and localize coronary artery disease (CAD). The study population consisted of 41 patients (31 males, 10 females; mean age 55 ± 7 yr), including 14 with prior myocardial infarction, who underwent both exercise Tl-201 myocardial SPECT and coronary angiography for the evaluation of chest pain. From the short axis and vertical long axis tomograms, stress extent polar maps were generated by the Cedars-Sinai Medical Center program, and the stress defect extent (SDE) was quantified for each coronary artery territory. For the purpose of this study, the coronary circulation was divided into 6 arterial segments, and the myocardial ischemic score (MIS) was calculated from the coronary angiogram. Sensitivity for the detection of CAD (>50% coronary stenosis by angiography) by stress extent polar map was 95% in single vessel disease, and 100% in double and triple vessel disease. Overall sensitivity was 97%. Sensitivity and specificity for the detection of individual diseased vessels were, respectively, 87% and 90% for the left anterior descending artery (LAD), 36% and 93% for the left circumflex artery (LCX), and 71% and 70% for the right coronary artery (RCA). Concordance for the detection of individual diseased vessels between the coronary angiography and the stress polar map was fair for the LAD (kappa = 0.70) and RCA (kappa = 0.41) lesions, whereas it was poor for the LCX lesions (kappa = 0.32). There were significant correlations between the MIS and SDE in the LAD (rs = 0.56, p = 0.0027) and RCA territories (rs = 0.60, p = 0.0094). No significant correlation was found in the LCX territory. When total vascular territories were combined, there was a significant correlation between the MIS and SDE (rs = 0.42, p = 0.0116). In conclusion, the quantitative analysis of Tl-201 tomograms appears to be accurate for determining the presence and location of CAD.
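
    For readers wishing to reproduce the per-vessel agreement figures, the statistics quoted above (sensitivity, specificity, Cohen's kappa) follow from a 2x2 table of scan result versus angiographic result; the counts below are illustrative, not the study data.

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 table."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    po = (tp + tn) / n                                             # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sensitivity, specificity, kappa

print(diagnostic_stats(tp=20, fp=2, fn=3, tn=16))   # ~ (0.87, 0.89, 0.75)
```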

  4. Reproducibility of quantitative planar thallium-201 scintigraphy: quantitative criteria for reversibility of myocardial perfusion defects

    International Nuclear Information System (INIS)

    Sigal, S.L.; Soufer, R.; Fetterman, R.C.; Mattera, J.A.; Wackers, F.J.

    1991-01-01

    Fifty-two paired stress/delayed planar 201Tl studies (27 exercise studies, 25 dipyridamole studies) were processed twice by seven technologists to assess inter- and intraobserver variability. The reproducibility was inversely related to the size of 201Tl perfusion abnormalities. Intraobserver variability was not different between exercise and dipyridamole studies for lesions of similar size. Based upon intraobserver variability, objective quantitative criteria for reversibility of perfusion abnormalities were defined. These objective criteria were tested prospectively in a separate group of 35 201Tl studies and compared with the subjective interpretation of quantitative circumferential profiles. Overall, exact agreement existed in 78% of images (kappa statistic k = 0.66). We conclude that quantification of planar 201Tl scans is highly reproducible, with acceptable inter- and intraobserver variability. Objective criteria for lesion reversibility correlated well with analysis by experienced observers.

  5. Evolutionary Quantitative Genomics of Populus trichocarpa.

    Directory of Open Access Journals (Sweden)

    Ilga Porth

    Full Text Available Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST-FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods, and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show
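
    The QST statistic itself is not written out in the abstract; the standard definition for an outcrossing species (assumed here) is QST = V_B / (V_B + 2*V_W), where V_B is the among-population and V_W the additive within-population genetic variance, and the result is compared against the SNP-based FST. A minimal sketch with hypothetical variance components:

```python
def qst(var_between, var_within_additive):
    """Q_ST for an outcrossing species: among-population genetic variance
    relative to total additive genetic variance (standard formulation)."""
    return var_between / (var_between + 2.0 * var_within_additive)

# Hypothetical variance components for one trait
print(qst(var_between=0.8, var_within_additive=1.5))   # 0.21 -> compare with mean F_ST
```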

  6. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills that are required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways.

  7. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  8. Progress in quantitative GPR development at CNDE

    Energy Technology Data Exchange (ETDEWEB)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott [Center for Nondestructive Evaluation, Iowa State University, 1915 Scholl Road, Ames, IA 50011-3042 (United States)

    2014-02-18

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  9. Quantitative Ultrasound in the assessment of Osteoporosis

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Terlizzi, Francesca de

    2009-01-01

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increased interest among clinicians, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of the ability to monitor therapies has been reported; the usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices, and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, both in the clinical and in the experimental settings. The advantages and limitations of the present technique have been outlined, together with suggestions for its use in clinical practice.

  10. Quantitation of vitamin K in human milk

    International Nuclear Information System (INIS)

    Canfield, L.M.; Hopkinson, J.M.; Lima, A.F.; Martin, G.S.; Sugimoto, K.; Burr, J.; Clark, L.; McGee, D.L.

    1990-01-01

    A quantitative method was developed for the assay of vitamin K in human colostrum and milk. The procedure combines preparative and analytical chromatography on silica gel in a nitrogen atmosphere followed by reversed phase high performance liquid chromatography (HPLC). Two HPLC steps were used: gradient separation with ultraviolet (UV) detection followed by isocratic separation detected electrochemically. Due to co-migrating impurities, UV detection alone is insufficient for identification of vitamin K. Exogenous vitamin K was shown to equilibrate with endogenous vitamin K in the samples. A statistical method was incorporated to control for experimental variability. Vitamin K1 was analyzed in 16 pooled milk samples from 7 donors and in individual samples from 15 donors at 1 month post-partum. Vitamin K1 was present at 2.94 +/- 1.94 and 3.15 +/- 2.87 ng/mL in pools and in individuals, respectively. Menaquinones, the bacterial form of the vitamin, were not detected. The significance of experimental variation to studies of vitamin K in individuals is discussed

  11. Quantitative theory of driven nonlinear brain dynamics.

    Science.gov (United States)

    Roberts, J A; Robinson, P A

    2012-09-01

    Strong periodic stimuli such as bright flashing lights evoke nonlinear responses in the brain and interact nonlinearly with ongoing cortical activity, but the underlying mechanisms for these phenomena are poorly understood at present. The dominant features of these experimentally observed dynamics are reproduced by the dynamics of a quantitative neural field model subject to periodic drive. Model power spectra over a range of drive frequencies show agreement with multiple features of experimental measurements, exhibiting nonlinear effects including entrainment over a range of frequencies around the natural alpha frequency f(α), subharmonic entrainment near 2f(α), and harmonic generation. Further analysis of the driven dynamics as a function of the drive parameters reveals rich nonlinear dynamics that is predicted to be observable in future experiments at high drive amplitude, including period doubling, bistable phase-locking, hysteresis, wave mixing, and chaos indicated by positive Lyapunov exponents. Moreover, photosensitive seizures are predicted for physiologically realistic model parameters yielding bistability between healthy and seizure dynamics. These results demonstrate the applicability of neural field models to the new regime of periodically driven nonlinear dynamics, enabling interpretation of experimental data in terms of specific generating mechanisms and providing new tests of the theory. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Quantitative risk assessment of drinking water contaminants

    International Nuclear Information System (INIS)

    Cothern, C.R.; Coniglio, W.A.; Marcus, W.L.

    1986-01-01

    The development of criteria and standards for the regulation of drinking water contaminants involves a variety of processes, one of which is risk estimation. This estimation process, called quantitative risk assessment, involves combining data on the occurrence of the contaminant in drinking water and its toxicity. The human exposure to a contaminant can be estimated from occurrence data. Usually the toxicity or number of health effects per concentration level is estimated from animal bioassay studies using the multistage model. For comparison, other models will be used including the Weibull, probit, logit and quadratic ones. Because exposure and toxicity data are generally incomplete, assumptions need to be made and this generally results in a wide range of certainty in the estimates. This range can be as wide as four to six orders of magnitude in the case of the volatile organic compounds in drinking water and a factor of four to five for estimation of risk due to radionuclides in drinking water. As examples of the differences encountered in risk assessment of drinking water contaminants, discussions are presented on benzene, lead, radon and alachlor. The lifetime population risk estimates for these contaminants are, respectively, in the ranges of: <1 - 3000, <1 - 8000, 2000-40,000 and <1 - 80. 11 references, 1 figure, 1 table
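
    The multistage model mentioned above has the standard form P(d) = 1 - exp(-(q0 + q1*d + ... + qk*d^k)), and the extra (above-background) lifetime risk at dose d is [P(d) - P(0)] / [1 - P(0)]. A minimal sketch with hypothetical coefficients:

```python
import math

def multistage_extra_risk(dose, q):
    """Extra lifetime risk under the multistage model with coefficients q = [q0, q1, ...]."""
    def p(d):
        return 1.0 - math.exp(-sum(qi * d ** i for i, qi in enumerate(q)))
    return (p(dose) - p(0.0)) / (1.0 - p(0.0))

# Hypothetical linear term (per mg/kg/day) and a low chronic dose
risk = multistage_extra_risk(dose=1e-3, q=[0.0, 2.0e-2])
print(f"extra lifetime risk ~ {risk:.1e}")   # ~2e-05 per exposed person
```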

  13. Progress in quantitative GPR development at CNDE

    Science.gov (United States)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-02-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  14. Progress in quantitative GPR development at CNDE

    International Nuclear Information System (INIS)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-01-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  15. Quantitative phenotyping via deep barcode sequencing.

    Science.gov (United States)

    Smith, Andrew M; Heisler, Lawrence E; Mellor, Joseph; Kaper, Fiona; Thompson, Michael J; Chee, Mark; Roth, Frederick P; Giaever, Guri; Nislow, Corey

    2009-10-01

    Next-generation DNA sequencing technologies have revolutionized diverse genomics applications, including de novo genome sequencing, SNP detection, chromatin immunoprecipitation, and transcriptome analysis. Here we apply deep sequencing to genome-scale fitness profiling to evaluate yeast strain collections in parallel. This method, Barcode analysis by Sequencing, or "Bar-seq," outperforms the current benchmark barcode microarray assay in terms of both dynamic range and throughput. When applied to a complex chemogenomic assay, Bar-seq quantitatively identifies drug targets, with performance superior to the benchmark microarray assay. We also show that Bar-seq is well-suited for a multiplex format. We completely re-sequenced and re-annotated the yeast deletion collection using deep sequencing, found that approximately 20% of the barcodes and common priming sequences varied from expectation, and used this revised list of barcode sequences to improve data quality. Together, this new assay and analysis routine provide a deep-sequencing-based toolkit for identifying gene-environment interactions on a genome-wide scale.
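
    A minimal sketch of the core Bar-seq calculation as described (count barcode reads per strain in control and treatment pools, normalize to sequencing depth, and express relative fitness as a log2 ratio); the strain names, counts and pseudocount below are hypothetical choices, not the authors' pipeline:

```python
import math
from collections import Counter

def barseq_fitness(control_reads, treatment_reads, pseudocount=1.0):
    """log2(treatment/control) of depth-normalized barcode counts per strain."""
    ctrl, trt = Counter(control_reads), Counter(treatment_reads)
    n_ctrl, n_trt = sum(ctrl.values()), sum(trt.values())
    return {s: math.log2(((trt[s] + pseudocount) / n_trt) /
                         ((ctrl[s] + pseudocount) / n_ctrl))
            for s in set(ctrl) | set(trt)}

control = ["yfg1"] * 500 + ["yfg2"] * 480 + ["yfg3"] * 520
drug    = ["yfg1"] * 490 + ["yfg2"] * 60  + ["yfg3"] * 530   # yfg2 drops out under drug
print(barseq_fitness(control, drug))                          # yfg2 ~ -2.5 -> candidate target
```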

  16. Qualitative and quantitative descriptions of glenohumeral motion.

    Science.gov (United States)

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as predicting outcomes of alterations to joints in orthopaedic practice and research. Contemporary efforts in modelling have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to contain computation of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions of inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of intention is required in order to achieve the required application. Clinical applicability of a model requires both descriptive and predictive output potentials, and as such, a high level of validation is required. Without sufficient appreciation of the clinical intention of the arthrokinematic foundation to a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.

  17. An unconventional method of quantitative microstructural analysis

    International Nuclear Information System (INIS)

    Rastani, M.

    1995-01-01

    The experiment described here introduces a simple methodology which could be used to replace the time-consuming and expensive conventional methods of metallographic and quantitative analysis of the effect of thermal treatment on microstructure. The method is ideal for the microstructural evaluation of tungsten filaments and other wire samples, such as copper wire, which can be conveniently coiled. Ten such samples were heat treated by ohmic resistance at temperatures which were expected to span the recrystallization range. After treatment, the samples were evaluated in the elastic recovery test. The normalized elastic recovery factor was defined in terms of the measured deflections. Experimentally it has been shown that the elastic recovery factor depends on the degree of recrystallization; in other words, this factor is used to determine the fraction of unrecrystallized material. Because the elastic recovery method examines the whole filament rather than just one section through the filament, as in the metallographic method, it more accurately measures the degree of recrystallization. The method also requires considerably less time and cost than the conventional method.

  18. Limits of qualitative detection and quantitative determination

    International Nuclear Information System (INIS)

    Curie, L.A.

    1976-01-01

    The fact that one can find a series of disagreeing and limiting definitions of the detection limit leads to a reinvestigation of the problems of signal detection and signal processing in analytical and nuclear chemistry. Three cut-off levels were fixed: L_C, the net signal level (instrument sensitivity) above which an observed signal can be reliably recognized as 'detected'; L_D, the 'true' net signal level from which detection can be expected a priori; and L_Q, the level at which the measurement precision is sufficient for quantitative determination. Exact defining equations as well as a series of working formulae are given for the general analytical case and for radioactivity measurements. Since counting measurements follow Poisson statistics, the treatment allows precise limits to be derived for short-lived and long-lived radionuclides, with or without interference. The fundamentals are illustrated by simple examples from spectrophotometry and radioactivity and by a more complicated example from activation analysis, in which one must choose between alternative nuclear reactions. (orig./LH) [de]
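
    For the paired-blank radioactivity case with B blank counts and Poisson statistics, the commonly quoted working formulae for the three levels are L_C ~ 2.33*sqrt(B), L_D ~ 2.71 + 4.65*sqrt(B) and L_Q = 50*[1 + sqrt(1 + B/12.5)] (5% error rates, 10% relative precision for L_Q); the sketch below simply evaluates these expressions and is not a restatement of the paper's derivation:

```python
import math

def detection_limits(blank_counts):
    """Working expressions for a paired blank of B counts (Poisson counting,
    5 % false-positive/false-negative rates, 10 % relative precision for L_Q)."""
    B = blank_counts
    L_C = 2.33 * math.sqrt(B)                       # decision (critical) level
    L_D = 2.71 + 4.65 * math.sqrt(B)                # detection limit
    L_Q = 50.0 * (1.0 + math.sqrt(1.0 + B / 12.5))  # determination limit
    return L_C, L_D, L_Q

print(detection_limits(100))   # ~ (23.3, 49.2, 200.0) net counts
```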

  19. Precision of different quantitative ultrasound densitometers

    International Nuclear Information System (INIS)

    Pocock, N.A.; Harris, N.D.; Griffiths, M.R.

    1998-01-01

    Full text: Quantitative ultrasound (QUS) of the calcaneus, which measures speed of sound (SOS) and broadband ultrasound attenuation (BUA), is predictive of the risk of osteoporotic fracture. However, the utility of QUS for predicting fracture risk or for monitoring treatment efficacy depends on its precision and reliability. Published results and manufacturers' data vary significantly due to differences in statistical methodology. We have assessed the precision of the current model of the Lunar Achilles and the McCue Cuba QUS densitometers, the most commonly used QUS machines in Australia. Twenty-seven subjects had duplicate QUS measurements performed on the same day on both machines. These data were used to calculate the within-pair standard deviation (SD), the coefficient of variation (CV) and the standardised coefficient of variation (sCV), which is corrected for the dynamic range. In addition, the coefficient of reliability (R) was calculated as an index of reliability which is independent of the population mean value and the dynamic range of the measurements. R ranges from 0 (no reliability) to 1 (perfect reliability). The results indicate that the precision of QUS is dependent on the dynamic range and the instrument. Furthermore, they suggest that while QUS is a useful predictor of fracture risk, at present it has limited clinical value in monitoring short-term age-related bone loss of 1-2% per year.
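
    The precision statistics named above can be computed from duplicate measurements as in the sketch below (the usual definitions of within-pair SD and CV; the reliability coefficient is implemented as an intraclass-type ratio, which is one common choice and an assumption on our part; the BUA values are hypothetical):

```python
import numpy as np

def duplicate_precision(x1, x2):
    """Within-pair SD, CV% and an intraclass-type reliability coefficient R
    from duplicate measurements x1, x2 (one pair per subject)."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    pooled = np.concatenate([x1, x2])
    sd_within = np.sqrt(np.sum((x1 - x2) ** 2) / (2 * len(x1)))   # within-pair SD
    cv = 100.0 * sd_within / pooled.mean()
    R = 1.0 - sd_within ** 2 / np.var(pooled, ddof=1)             # 0 = none, 1 = perfect
    return sd_within, cv, R

# Hypothetical duplicate BUA measurements (dB/MHz) in five subjects
a = [110.0, 95.0, 120.0, 102.0, 88.0]
b = [108.0, 97.0, 118.0, 104.0, 90.0]
print(duplicate_precision(a, b))   # ~ (1.41, 1.4, 0.98)
```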

  20. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    scanner for pulsating flow mimicking the femoral artery from a CompuFlow 1000 pump (Shelley Medical). Data were used in four estimators based on directional transverse oscillation for velocity, flow angle, volume flow, and turbulence estimation and their respective precisions. An adaptive lag scheme gave the ability to estimate a large velocity range, or alternatively measure at two sites to find e.g. the degree of stenosis in a vessel. The mean angle at the vessel center was estimated to 90.9° ± 8.2°, indicating laminar flow, with a turbulence index close to zero (0.1 ± 0.1). Volume flow was 1.29 ± 0.26 mL/stroke (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently

  1. Quantitative NDE of Composite Structures at NASA

    Science.gov (United States)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

    The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over a long time. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase sensitive ultrasonic methods, infrared thermography and x-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper will focus on current research activities related to large area NDE for rapidly characterizing aerospace composites.

  2. Neuropathic pain: is quantitative sensory testing helpful?

    Science.gov (United States)

    Krumova, Elena K; Geber, Christian; Westermann, Andrea; Maier, Christoph

    2012-08-01

    Neuropathic pain arises as a consequence of a lesion or disease affecting the somatosensory system and is characterised by a combination of positive and negative sensory symptoms. Quantitative sensory testing (QST) examines sensory perception after application of different mechanical and thermal stimuli of controlled intensity and the function of both large (A-beta) and small (A-delta and C) nerve fibres, including the corresponding central pathways. QST can be used to determine detection and pain thresholds and stimulus-response curves and can thus detect both negative and positive sensory signs, the latter not being assessed by other methods. Like all other psychophysical tests, QST requires standardised examination, instructions and data evaluation to obtain valid and reliable results. Since normative data are available, QST can also contribute to the individual diagnosis of neuropathy, especially in the case of isolated small-fibre neuropathy, in contrast to conventional electrophysiology, which assesses only large myelinated fibres. For example, detection of early stages of subclinical neuropathy in symptomatic or asymptomatic patients with diabetes mellitus can be helpful to optimise treatment and identify a diabetic foot at risk of ulceration. QST assesses the individual's sensory profile and can thus be valuable for evaluating the underlying pain mechanisms, which occur with different frequencies even within the same neuropathic pain syndrome. Furthermore, assessing the exact sensory phenotype by QST might be useful in the future to identify responders to certain treatments in accordance with the underlying pain mechanisms.

  3. Quantitative topographic differentiation of the neonatal EEG.

    Science.gov (United States)

    Paul, Karel; Krajca, Vladimír; Roth, Zdenek; Melichar, Jan; Petránek, Svojmil

    2006-09-01

    To test the discriminatory topographic potential of a new method of automatic EEG analysis in neonates. A quantitative description of the neonatal EEG can contribute to the objective assessment of the functional state of the brain, and may improve the precision of diagnosing cerebral dysfunctions manifested by 'disorganization', 'dysrhythmia' or 'dysmaturity'. 21 healthy, full-term newborns were examined polygraphically during sleep (EEG with 8 referential derivations, respiration, ECG, EOG, EMG). From each EEG record, two 5-min samples (one from the middle of quiet sleep, the other from the middle of active sleep) were subjected to automatic analysis and were described by 13 variables: spectral features and features describing the shape and variability of the signal. The data from individual infants were averaged and the number of variables was reduced by factor analysis. All factors identified by factor analysis were statistically significantly influenced by the location of derivation. A large number of statistically significant differences were also established when comparing the effects of individual derivations on each of the 13 measured variables. Both spectral features and features describing the shape and variability of the signal largely account for the topographic differentiation of the neonatal EEG. The presented method of automatic EEG analysis is capable of assessing the topographic characteristics of the neonatal EEG; it is adequately sensitive and describes the neonatal electroencephalogram with sufficient precision. The discriminatory capability of the method represents a promise for its application in clinical practice.

  4. Marketing communications: Qualitative and quantitative paradigm

    Directory of Open Access Journals (Sweden)

    Uzelac Nikola

    2005-01-01

    Full Text Available This paper focuses on key issues related to the choice of the basic language of communication of marketing as a practical and academic field. In general, marketing managers prefer a descriptive way of expression, but they should make much more use of the advantages of the language of numbers. By doing so, they will improve the decision-making process and the communication with finance and top management. In this regard, models offered by the academic community could be helpful. This especially pertains to positive or normative verbal approaches and models in which mathematical and statistical solutions have been embedded, as well as to those which emphasize financial criteria in decision-making. Concerning the process of creation and verification of scientific knowledge, the choice between the languages of words and numbers is part of a much wider dimension, because it is inseparable from the decision on basic research orientation. The quantitative paradigm is more appropriate for testing hypotheses, while the qualitative paradigm makes a greater contribution to generating them. The competition factor could become the key driver of changes through which the existing "parallel worlds" of the main paradigms would be integrated, for the sake of advancing disciplinary knowledge.

  5. Quantitative multi-modal NDT data analysis

    International Nuclear Information System (INIS)

    Heideklang, René; Shokouhi, Parisa

    2014-01-01

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, multi-modal testing is often used, in which complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  6. Individual patient dosimetry using quantitative SPECT imaging

    International Nuclear Information System (INIS)

    Gonzalez, J.; Oliva, J.; Baum, R.; Fisher, S.

    2002-01-01

    An approach is described to provide individual patient dosimetry for routine clinical use. Accurate quantitative SPECT imaging was achieved using appropriate methods. The volume of interest (VOI) was defined semi-automatically using a fixed threshold value obtained from phantom studies. The calibration factor to convert the voxel counts from SPECT images into activity values was determined from a calibrated point source using the same threshold value as in the phantom studies. For the selected radionuclide, the dose within and outside a sphere of voxel dimension at different distances was computed from dose point-kernels to obtain a discrete absorbed dose kernel representation around the volume source with a uniform activity distribution. The spatial activity distribution from SPECT imaging was convolved with this kernel representation using the discrete Fourier transform method to yield the three-dimensional absorbed dose rate distribution. The accuracy of the dose rate calculation was validated with software phantoms. The absorbed dose was determined by integration of the dose rate distribution for each VOI. Parameters for treatment optimization such as dose rate volume histograms and dose rate statistics are provided. A patient example was used to illustrate our dosimetric calculations.
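
    The central step described, convolving the SPECT activity distribution with a discrete dose point-kernel via the discrete Fourier transform, can be sketched as follows; the kernel shape and activity map below are placeholders, not the kernels used by the authors:

```python
import numpy as np
from scipy.signal import fftconvolve

def dose_rate_map(activity_bq, kernel_gy_per_bq_s):
    """3-D absorbed dose rate (Gy/s): activity map convolved with a dose point-kernel,
    evaluated by FFT-based convolution (both arrays on the same voxel grid)."""
    return fftconvolve(activity_bq, kernel_gy_per_bq_s, mode="same")

# Toy example: a single hot voxel and an isotropic, rapidly decaying placeholder kernel
activity = np.zeros((32, 32, 32))
activity[16, 16, 16] = 1.0e6                        # Bq
z, y, x = np.mgrid[-3:4, -3:4, -3:4]
kernel = 1.0e-9 * np.exp(-(x**2 + y**2 + z**2))     # Gy/s per Bq, hypothetical shape
print(dose_rate_map(activity, kernel)[16, 16, 16])  # dose rate at the source voxel
```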

  7. Quantitative infrared analysis of hydrogen fluoride

    International Nuclear Information System (INIS)

    Manuta, D.M.

    1997-04-01

    This work was performed at the Portsmouth Gaseous Diffusion Plant, where hydrogen fluoride is produced upon the hydrolysis of UF6. This poses a problem in this setting, and a method for determining the mole percent concentration was desired. HF has been considered to be a non-ideal gas for many years. D. F. Smith utilized complex equations in his HF studies in the 1950s. We have evaluated HF behavior as a function of pressure from three different perspectives: (1) absorbance at 3877 cm⁻¹ as a function of pressure for 100% HF; (2) absorbance at 3877 cm⁻¹ as a function of increasing HF partial pressure, with the total pressure maintained at 300 mm HgA with nitrogen; (3) absorbance at 3877 cm⁻¹ at constant HF partial pressure, with the total pressure increased to greater than 800 mm HgA with nitrogen. These experiments have shown that at partial pressures up to 35 mm HgA, HF follows the ideal gas law. The absorbance at 3877 cm⁻¹ can be quantitatively analyzed via infrared methods.
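
    In the ideal-gas regime established above (HF partial pressures up to about 35 mm HgA), absorbance at 3877 cm⁻¹ is proportional to the HF partial pressure, so a linear calibration yields the mole percent directly; a sketch with hypothetical calibration data:

```python
import numpy as np

# Hypothetical calibration: absorbance at 3877 cm^-1 vs HF partial pressure (mm HgA)
p_cal = np.array([5.0, 10.0, 20.0, 30.0, 35.0])
a_cal = np.array([0.052, 0.101, 0.205, 0.298, 0.352])
slope, intercept = np.polyfit(p_cal, a_cal, 1)      # Beer-Lambert regime: A ~ k * p_HF

def mole_percent_hf(absorbance, total_pressure_mmhg):
    """Mole % HF from measured absorbance and total pressure (linear regime only)."""
    p_hf = (absorbance - intercept) / slope
    return 100.0 * p_hf / total_pressure_mmhg

print(mole_percent_hf(0.150, total_pressure_mmhg=300.0))   # ~5 mole %
```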

  8. Hepatic iron overload: Quantitative MR imaging

    International Nuclear Information System (INIS)

    Gomori, J.M.; Horev, G.; Tamary, H.; Zandback, J.; Kornreich, L.; Zaizov, R.; Freud, E.; Krief, O.; Ben-Meir, J.; Rotem, H.

    1991-01-01

    Iron deposits demonstrate characteristically shortened T2 relaxation times. Several previously published studies reported poor correlation between the in vivo hepatic 1/T2 measurements made by means of midfield magnetic resonance (MR) units and the hepatic iron content of iron-overloaded patients. In this study, the authors assessed the use of in vivo 1/T2 measurements obtained by means of MR imaging at 0.5 T using short echo times (13.4 and 30 msec) and single-echo-sequences as well as computed tomographic (CT) attenuation as a measure of liver iron concentration in 10 severely iron-overloaded patients with beta-thalassemia major. The iron concentrations in surgical wedge biopsy samples of the liver, which varied between 3 and 9 mg/g of wet weight (normal, less than or equal to 0.5 mg/g), correlated well (r = .93, P less than or equal to .0001) with the preoperative in vivo hepatic 1/T2 measurements. The CT attenuation did not correlate with liver iron concentration. Quantitative MR imaging is a readily available noninvasive method for the assessment of hepatic iron concentration in iron-overloaded patients, reducing the need for needle biopsies of the liver
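
    With the two echo times quoted (13.4 and 30 msec) and single-echo acquisitions, 1/T2 follows from mono-exponential decay, S(TE) = S0*exp(-TE/T2); a minimal sketch with hypothetical signal values:

```python
import math

def r2_from_two_echoes(s1, s2, te1_ms=13.4, te2_ms=30.0):
    """1/T2 in 1/s from signals at two echo times, assuming S = S0*exp(-TE/T2)."""
    return 1000.0 * math.log(s1 / s2) / (te2_ms - te1_ms)

# Hypothetical liver ROI signal intensities at TE = 13.4 ms and TE = 30 ms
print(f"1/T2 = {r2_from_two_echoes(820.0, 310.0):.1f} 1/s")   # ~58.6 1/s
```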

  9. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how individual risk and general public safety can be affected by the operation of a gas pipeline. If the individual or societal risks are considered intolerable compared with international standards, risk mitigation measures are recommended until the risk associated with the operation reaches levels compatible with best industry practice. A quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work has as its objective the development of a calculation methodology based on the previously mentioned publication. This methodology is centered on defining the frequencies of occurrence of events according to a database representative of each case under study. It also establishes the consequences according to the particular considerations of each area and the different possibilities of interference with the gas pipeline under study. For each interference, a typical ignition probability curve is developed as a function of the distance to the pipe. (author)

  10. A Quantitative Index of Forest Structural Sustainability

    Directory of Open Access Journals (Sweden)

    Jonathan A. Cale

    2014-07-01

    Full Text Available Forest health is a complex concept including many ecosystem functions, interactions and values. We develop a quantitative system applicable to many forest types to assess tree mortality with respect to stable forest structure and composition. We quantify impacts of observed tree mortality on structure by comparison to baseline mortality, and then develop a system that distinguishes between structurally stable and unstable forests. An empirical multivariate index of structural sustainability and a threshold value (70.6), derived from 22 nontropical tree species' datasets, differentiated structurally sustainable from unsustainable diameter distributions. Twelve of 22 species populations were sustainable with a mean score of 33.2 (median = 27.6). Ten species populations were unsustainable with a mean score of 142.6 (median = 130.1). Among them, the unsustainability of Fagus grandifolia, Pinus lambertiana, P. ponderosa, and Nothofagus solandri populations was attributable to known disturbances, whereas that of Abies balsamea, Acer rubrum, Calocedrus decurrens, Picea engelmannii, P. rubens, and Prunus serotina populations was not. This approach provides the ecological framework for rational management decisions using routine inventory data to objectively: determine the scope and direction of change in structure and composition, assess excessive or insufficient mortality, compare disturbance impacts in time and space, and prioritize management needs and the allocation of scarce resources.

  11. Quantitative Ultrasound in the assessment of Osteoporosis

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, Giuseppe [Department of Radiology, University of Foggia, Viale L. Pinto, 71100 Foggia (Italy); Department of Radiology, Scientific Institute Hospital, San Giovanni Rotondo (Italy)], E-mail: g.guglielmi@unifg.it; Terlizzi, Francesca de [IGEA srl, Via Parmenide 10/A 41012 Carpi, MO (Italy)], E-mail: f.deterlizzi@igeamedical.com

    2009-09-15

    Quantitative ultrasound (QUS) is used in the clinical setting to identify changes in bone tissue connected with menopause, osteoporosis and bone fragility. The versatility of the technique, its low cost and lack of ionizing radiation have led to the use of this method worldwide. Furthermore, with increased interest among clinicians, QUS has been applied to several fields of investigation of bone, in various pathologies of bone metabolism, in paediatrics, neonatology, genetics and other fields. Several studies have been carried out in recent years to investigate the potential of QUS, with important positive results. The technique is able to predict osteoporotic fractures; some evidence of the ability to monitor therapies has been reported; the usefulness in the management of secondary osteoporosis has been confirmed; studies in paediatrics have reported reference curves for some QUS devices, and there have been relevant studies in conditions involving metabolic bone disorders. This article is an overview of the most relevant developments in the field of QUS, both in the clinical and in the experimental settings. The advantages and limitations of the present technique have been outlined, together with suggestions for its use in clinical practice.

  12. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ) and the rotating frame transverse relaxation rate constant (R2ρ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ. The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
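
    As a concrete illustration of how R1ρ is commonly extracted in practice (a generic sketch, not a sequence or analysis from this review): the signal is measured at several spin-lock durations TSL and fitted to S(TSL) = S0*exp(-TSL*R1ρ).

```python
import numpy as np

def r1rho_fit(tsl_ms, signal):
    """R1rho (1/s) from a mono-exponential fit S(TSL) = S0 * exp(-TSL * R1rho)."""
    slope, _ = np.polyfit(np.asarray(tsl_ms) / 1000.0, np.log(signal), 1)
    return -slope

tsl = [10.0, 20.0, 40.0, 60.0, 80.0]                   # spin-lock times in ms
sig = 1000.0 * np.exp(-np.array(tsl) / 1000.0 * 12.0)  # synthetic data, R1rho = 12 1/s
print(r1rho_fit(tsl, sig))                             # -> ~12.0
```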

  13. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
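
    A minimal sketch of one simple aggregation consistent with the log-normal assumption (equal-weight moment matching in log space); this is an illustration only, not the classical or Bayesian model developed in the report. Each expert supplies a median and an error factor (95th/50th percentile), both hypothetical here:

```python
import math

def pool_lognormal_experts(medians, error_factors):
    """Equal-weight pooling of expert log-normal estimates (illustrative only)."""
    mus = [math.log(m) for m in medians]
    sigmas = [math.log(ef) / 1.645 for ef in error_factors]   # EF = exp(1.645 * sigma)
    mu = sum(mus) / len(mus)
    var = sum(s ** 2 + (m - mu) ** 2 for m, s in zip(mus, sigmas)) / len(mus)
    return math.exp(mu), math.exp(1.645 * math.sqrt(var))     # pooled median, error factor

# Three hypothetical experts estimating an initiator frequency (1/yr)
print(pool_lognormal_experts([1e-4, 3e-4, 5e-5], [3.0, 10.0, 5.0]))
```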

  14. Quantitative Analysis of Retrieved Glenoid Liners

    Directory of Open Access Journals (Sweden)

    Katelyn Childs

    2016-02-01

    Full Text Available Revision of orthopedic surgeries is often expensive and involves higher risk from complications. Since most total joint replacement devices use a polyethylene bearing, which serves as a weak link, the assessment of damage to the liner due to in vivo exposure is very important. The failures often are due to excessive polyethylene wear. The glenoid liners are complex and hemispherical in shape and present challenges while assessing the damage. Therefore, the study on the analysis of glenoid liners retrieved from revision surgery may lend insight into common wear patterns and improve future product designs. The purpose of this pilot study is to further develop the methods of segmenting a liner into four quadrants to quantify the damage in the liner. Different damage modes are identified and statistically analyzed. Multiple analysts were recruited to conduct the damage assessments. In this paper, four analysts evaluated nine glenoid liners, retrieved from revision surgery, two of whom had an engineering background and two of whom had a non-engineering background. Associated human factor mechanisms are reported in this paper. The wear patterns were quantified using the Hood/Gunther, Wasielewski, Brandt, and Lombardi methods. The quantitative assessments made by several observers were analyzed. A new, composite damage parameter was developed and applied to assess damage. Inter-observer reliability was assessed using a paired t-test. Data reported by four analysts showed a high standard deviation; however, only two analysts performed the tests in a significantly similar way and they had engineering backgrounds.

  15. Quantitative magnetotail characteristics of different magnetospheric states

    Directory of Open Access Journals (Sweden)

    M. A. Shukhtina

    2004-03-01

    Full Text Available Quantitative relationships allowing one to compute the lobe magnetic field, flaring angle and tail radius, and to evaluate magnetic flux based on solar wind/IMF parameters and spacecraft position are obtained for the middle magnetotail, X = (–15, –35) RE, using 3.5 years of simultaneous Geotail and Wind spacecraft observations. For the first time it was done separately for different states of the magnetotail, including the substorm onset (SO) epoch, the steady magnetospheric convection (SMC) and quiet periods (Q). In the explored distance range the magnetotail parameters appeared to be similar (within the error bar) for the Q and SMC states, whereas at SO their values are considerably larger. In particular, the tail radius is larger by 1–3 RE at substorm onset than during the Q and SMC states, for which the radius value is close to previous magnetopause model values. The calculated lobe magnetic flux value at substorm onset is ~1 GWb, exceeding that in the Q (SMC) states by ~50%. The model magnetic flux values at substorm onset and SMC show little dependence on the solar wind dynamic pressure and distance in the tail, so the magnetic flux value can serve as an important discriminator of the state of the middle magnetotail. Key words. Magnetospheric physics (solar wind–magnetosphere interactions, magnetotail, storms and substorms)

  17. Quantitative assessment of integrated phrenic nerve activity.

    Science.gov (United States)

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Location of airports - selected quantitative methods

    Directory of Open Access Journals (Sweden)

    Agnieszka Merkisz-Guranowska

    2016-09-01

    Full Text Available Background: The role of air transport in the economic development of a country and its regions cannot be overestimated. The decision concerning an airport's location must be in line with the expectations of all the stakeholders involved. This article deals with the issues related to the choice of sites where airports should be located. Methods: Two main quantitative approaches to the airport location problem are presented in this article, i.e. optimizing the choice of site, and selecting a location from a predefined set. The former involves mathematical programming and formulating the problem as an optimization task, while the latter involves ranking the possible variants. Given their different methodological backgrounds, the authors present the advantages and disadvantages of both approaches and point to the one which currently finds practical application. Results: Based on real-life examples, the authors present a multi-stage procedure which makes it possible to solve the airport location problem. Conclusions: Based on the review of the literature on the subject, the authors point to three types of approach to the issue of airport location which could enable further development of currently applied methods.
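
    As a toy illustration of the "selection from a predefined set" family of methods mentioned in the record above, the sketch below ranks hypothetical candidate sites with a simple weighted-sum score over normalised criteria. Real studies typically use richer multi-criteria methods (e.g. AHP or TOPSIS); the site names, criteria, values and weights here are all invented.

```python
# Weighted-sum ranking of candidate airport sites (toy example).
# Criteria values and weights are invented for illustration only.
candidates = {
    "Site A": {"catchment": 2.1, "land_cost": 320, "access": 7, "noise": 4},
    "Site B": {"catchment": 1.6, "land_cost": 180, "access": 5, "noise": 2},
    "Site C": {"catchment": 2.8, "land_cost": 450, "access": 8, "noise": 6},
}
weights = {"catchment": 0.4, "land_cost": 0.2, "access": 0.25, "noise": 0.15}
benefit = {"catchment": True, "land_cost": False, "access": True, "noise": False}

def normalise(criterion):
    """Return a 0-1 scorer for one criterion; cost-type criteria are inverted."""
    values = [c[criterion] for c in candidates.values()]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    def score(v):
        s = (v - lo) / span
        return s if benefit[criterion] else 1.0 - s
    return score

scorers = {k: normalise(k) for k in weights}
ranking = sorted(
    ((sum(weights[k] * scorers[k](vals[k]) for k in weights), name)
     for name, vals in candidates.items()),
    reverse=True)

for total, name in ranking:
    print(f"{name}: {total:.3f}")
```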

  19. Quantitative biological measurement in Transmission Electron Tomography

    International Nuclear Information System (INIS)

    Mantell, Judith M; Verkade, Paul; Arkill, Kenton P

    2012-01-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood. In addition anisotropic area shrinkage has largely been ignored. The intention of this study is to explore the shrinkage on a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials used in electron tomography. In brief: Using low dose philosophy on the section, a focus area was used prior to a separate virgin study area for a series of known exposures on a tilted sample. The shrinkage was determined by measurements on the gold beads from both sides of the section as determined by a confirmatory tomogram. It was found that the shrinkage in area (to approximately 90-95% of the original) and the thickness (approximately 65% of the original at most) agreed with previous authors, but that almost all the shrinkage was in the first minute and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven the end result was consistent. It was observed, in general, that thinner samples showed more percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required then the protocol described should be used for all areas studied.
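
    A minimal sketch of the shrinkage bookkeeping described above: given gold-bead coordinates extracted from tomograms before and after exposure, it reports thickness and in-plane (x, y) extent as a percentage of the original. The coordinates and the top/bottom bead labelling are invented; in the actual protocol they come from the confirmatory tomogram.

```python
import numpy as np

def shrinkage_report(beads_before, beads_after):
    """Percentage of original thickness and in-plane extent retained.

    beads_* are dicts with 'top' and 'bottom' arrays of (x, y, z) fiducial
    coordinates (same beads, same order, in nm). Thickness is taken as the
    mean z-separation of the two bead layers; in-plane extent as the spread
    of all bead positions in x and y.
    """
    def thickness(b):
        return np.mean(b["top"][:, 2]) - np.mean(b["bottom"][:, 2])

    def extent(b, axis):
        coords = np.vstack([b["top"], b["bottom"]])[:, axis]
        return coords.max() - coords.min()

    return {
        "thickness_%": 100 * thickness(beads_after) / thickness(beads_before),
        "x_%": 100 * extent(beads_after, 0) / extent(beads_before, 0),
        "y_%": 100 * extent(beads_after, 1) / extent(beads_before, 1),
    }

# Invented example: a 300 nm section that thins to ~65% and shrinks slightly in-plane
rng = np.random.default_rng(0)
xy = rng.uniform(0, 2000, size=(20, 2))
before = {"top": np.column_stack([xy, np.full(20, 300.0)]),
          "bottom": np.column_stack([xy, np.zeros(20)])}
after = {"top": np.column_stack([xy * 0.95, np.full(20, 195.0)]),
         "bottom": np.column_stack([xy * 0.95, np.zeros(20)])}
print(shrinkage_report(before, after))
```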

  20. Quantitative biological measurement in Transmission Electron Tomography

    Science.gov (United States)

    Mantell, Judith M.; Verkade, Paul; Arkill, Kenton P.

    2012-07-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood. In addition anisotropic area shrinkage has largely been ignored. The intention of this study is to explore the shrinkage on a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials used in electron tomography. In brief: Using low dose philosophy on the section, a focus area was used prior to a separate virgin study area for a series of known exposures on a tilted sample. The shrinkage was determined by measurements on the gold beads from both sides of the section as determined by a confirmatory tomogram. It was found that the shrinkage in area (approximately to 90-95% of the original) and the thickness (approximately 65% of the original at most) agreed with pervious authors, but that a lmost all the shrinkage was in the first minute and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven the end result was consistent. It was observed, in general, that thinner samples showed more percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required then the protocol described should be used for all areas studied.

  1. Expert judgement models in quantitative risk assessment

    International Nuclear Information System (INIS)

    Rosqvist, T.; Tuominen, R.

    1999-01-01

    Expert judgement is a valuable source of information in risk management. In particular, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. These data are seldom found in databases and have to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model are presented and applied to real-case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real-case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed
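
    The report's classical and Bayesian models are not reproduced here; the sketch below only illustrates the kind of log-normal degree-of-belief handling the abstract alludes to, pooling hypothetical expert estimates of an initiator frequency in log-space with equal weights and no bias, dispersion or dependency corrections (which the report's models do address).

```python
import math
import statistics

# Hypothetical expert estimates of an initiating-event frequency (per year).
expert_medians = [1e-4, 3e-4, 5e-5, 2e-4]

# Work in log-space: a log-normal degree-of-belief distribution is normal in ln(x).
logs = [math.log(x) for x in expert_medians]
mu = statistics.mean(logs)                  # pooled log-mean (equal weights)
sigma = statistics.stdev(logs)              # spread between experts

pooled_median = math.exp(mu)                # geometric mean of the estimates
pooled_mean = math.exp(mu + 0.5 * sigma**2) # mean of the fitted log-normal
error_factor = math.exp(1.645 * sigma)      # 95th/50th percentile ratio

print(f"pooled median : {pooled_median:.2e} /yr")
print(f"pooled mean   : {pooled_mean:.2e} /yr")
print(f"error factor  : {error_factor:.1f}")
```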

  2. Quantitative myocardial perfusion by O-15-water PET

    DEFF Research Database (Denmark)

    Thomassen, Anders; Petersen, Henrik; Johansen, Allan

    2015-01-01

    AIMS: Reporting of quantitative myocardial blood flow (MBF) is typically performed in standard coronary territories. However, coronary anatomy and myocardial vascular territories vary among individuals, and a coronary artery may erroneously be deemed stenosed or not if territorial demarcation...... disease (CAD). METHODS AND RESULTS: Forty-four patients with suspected CAD were included prospectively and underwent coronary CT-angiography and quantitative MBF assessment with O-15-water PET followed by invasive, quantitative coronary angiography, which served as reference. MBF was calculated...

  3. Quantitative valuation of platform technology based intangibles companies

    OpenAIRE

    Achleitner, Ann-Kristin; Nathusius, Eva; Schraml, Stephanie

    2007-01-01

    In the course of raising external equity, e.g. from venture capitalists, a quantitative valuation is usually required for entrepreneurial ventures. This paper examines the challenges of quantitatively valuing platform technology based entrepreneurial ventures. The distinct characteristics of such companies pose specific requirements on the applicability of quantitative valuation methods. The entrepreneur can choose from a wide range of potential commercialization strategies to pursue in the c...

  4. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

    Full Text Available Abstract Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
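
    A minimal sketch of the APEX-style computation summarised above: observed spectral counts are divided by each protein's expected-detectability factor Oi and normalised into relative abundances, optionally scaled by an assumed total. The counts, Oi values and protein names are invented, and the real tool additionally handles identification probabilities and Z-score comparisons.

```python
def apex_abundances(spectral_counts, oi, total_molecules=None):
    """Relative (or absolute, if total_molecules is given) APEX-style abundances.

    spectral_counts: observed MS/MS spectral count per protein.
    oi: expected number of detectable tryptic peptides ("Oi") per protein,
        e.g. from a peptide-detectability classifier.
    """
    corrected = {p: spectral_counts[p] / oi[p] for p in spectral_counts}
    norm = sum(corrected.values())
    fractions = {p: v / norm for p, v in corrected.items()}
    if total_molecules is None:
        return fractions
    return {p: f * total_molecules for p, f in fractions.items()}

# Invented example: protein B has many spectra mainly because it is easy to detect.
counts = {"protA": 40, "protB": 120, "protC": 15}
oi_values = {"protA": 8.0, "protB": 40.0, "protC": 3.0}
for prot, frac in apex_abundances(counts, oi_values).items():
    print(f"{prot}: {frac:.2%} of the quantified proteome")
```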

  5. Quantitative Measurement of Oxygen in Microgravity Combustion

    Science.gov (United States)

    Silver, Joel A.

    1997-01-01

    A low-gravity environment, in space or in ground-based facilities such as drop towers, provides a unique setting for studying combustion mechanisms. Understanding the physical phenomena controlling the ignition and spread of flames in microgravity has importance for space safety as well as for better characterization of dynamical and chemical combustion processes which are normally masked by buoyancy and other gravity-related effects. Due to restrictions associated with performing measurements in reduced gravity, diagnostic methods which have been applied to microgravity combustion studies have generally been limited to capture of flame emissions on film or video, laser Schlieren imaging and (intrusive) temperature measurements using thermocouples. Given the development of detailed theoretical models, more sophisticated diagnostic methods are needed to provide the kind of quantitative data necessary to characterize the properties of microgravity combustion processes as well as provide accurate feedback to improve the predictive capabilities of the models. When the demands of space flight are considered, the need for improved diagnostic systems which are rugged, compact, reliable, and operate at low power becomes apparent. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in both microgravity combustion research and as a sensor on-board Spacelab as either an air quality monitor or as part of a fire detection system. In our prior microgravity work, an eight line-of-sight fiber optic system measured
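
    The instrument described above relies on diode-laser absorption with considerably more sophistication (wavelength modulation, multiple lines of sight); the sketch below only shows the underlying Beer-Lambert bookkeeping for a single path, with an invented absorption cross-section, path length and conditions, to indicate how a transmission measurement maps to an O2 number density and mole fraction.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def o2_from_transmission(transmitted_over_incident, sigma_cm2, path_cm,
                         pressure_pa, temperature_k):
    """Beer-Lambert inversion for a single absorption line (illustrative only).

    I/I0 = exp(-sigma * N * L)  ->  N = -ln(I/I0) / (sigma * L)
    Returns (number density in cm^-3, O2 mole fraction).
    """
    number_density = -math.log(transmitted_over_incident) / (sigma_cm2 * path_cm)
    total_density = pressure_pa / (K_B * temperature_k) * 1e-6  # molecules per cm^3
    return number_density, number_density / total_density

# Invented values: 2% absorption over a 10 cm path at room conditions.
n_o2, x_o2 = o2_from_transmission(0.98, sigma_cm2=4.0e-22, path_cm=10.0,
                                  pressure_pa=101325.0, temperature_k=295.0)
print(f"O2 number density ~ {n_o2:.2e} cm^-3, mole fraction ~ {x_o2:.2f}")
```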

  6. Quantitative imaging of coronary blood flow

    Directory of Open Access Journals (Sweden)

    Adam M. Alessio

    2010-04-01

    Full Text Available Adam M. Alessio received his PhD in Electrical Engineering from the University of Notre Dame in 2003. During his graduate studies he developed tomographic reconstruction methods for correlated data and helped construct a high-resolution PET system. He is currently a Research Assistant Professor in Radiology at the University of Washington. His research interests focus on improved data processing and reconstruction algorithms for PET/CT systems with an emphasis on quantitative imaging. Erik Butterworth received the BA degree in Mathematics from the University of Chicago in 1977. Between 1977 and 1987 he worked as a computer programmer/analyst for several small commercial software firms. Since 1988, he has worked as a software engineer on various research projects at the University of Washington. Between 1988 and 1993 he developed a real-time data acquisition system for the analysis of estuarine sediment transport in the department of Geophysics. Between 1988 and 2002 he developed I4, a system for the display and analysis of cardiac PET images in the department of Cardiology. Since 1993 he has worked on physiological simulation systems (XSIM from 1993 to 1999, JSim since 1999) at the National Simulation Resource Facility in Circulatory Mass Transport and Exchange, in the Department of Bioengineering. His research interests include simulation systems and medical imaging. James H. Caldwell, MD, University of Missouri-Columbia 1970, is Professor of Medicine (Cardiology and Radiology) and Adjunct Professor of Bioengineering at the University of Washington School of Medicine and Acting Head, Division of Cardiology and Director of Nuclear Cardiology for the University of Washington Hospitals, Seattle WA, USA. James B. Bassingthwaighte, MD, Toronto 1955, PhD Mayo Grad Sch Med 1964, was Professor of Physiology and of Medicine at Mayo Clinic until 1975 when he moved to the University of Washington to chair Bioengineering. He is Professor of Bioengineering and

  7. Computer code for quantitative ALARA evaluations

    International Nuclear Information System (INIS)

    Voilleque, P.G.

    1984-01-01

    A FORTRAN computer code has been developed to simplify the determination of whether dose reduction actions meet the as low as is reasonably achievable (ALARA) criterion. The calculations are based on the methodology developed for the Atomic Industrial Forum. The code is used for analyses of eight types of dose reduction actions, characterized as follows: reduce dose rate, reduce job frequency, reduce productive working time, reduce crew size, increase administrative dose limit for the task, and increase the workers' time utilization and dose utilization through (a) improved working conditions, (b) basic skill training, or (c) refresher training for special skills. For each type of action, two analysis modes are available. The first is a generic analysis in which the program computes potential benefits (in dollars) for a range of possible improvements, e.g., for a range of lower dose rates. Generic analyses are most useful in the planning stage and for evaluating the general feasibility of alternative approaches. The second is a specific analysis in which the potential annual benefits of a specific level of improvement and the annual implementation cost are compared. The potential benefits reflect savings in operational and societal costs that can be realized if occupational radiation doses are reduced. Because the potential benefits depend upon many variables which characterize the job, the workplace, and the workers, there is no unique relationship between the potential dollar savings and the dose savings. The computer code permits rapid quantitative analyses of alternatives and is a tool that supplements the health physicist's professional judgment. The program output provides a rational basis for decision-making and a record of the assumptions employed
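
    A hedged sketch of the "specific analysis" mode described above: the annual monetary benefit of a dose-reduction action (collective dose averted times an assumed monetary value per person-sievert, plus any avoided operational cost) is compared with its annualised implementation cost. The alpha value and job parameters are invented, and the actual code weighs many more variables.

```python
def alara_specific_analysis(dose_before_person_msv, dose_after_person_msv,
                            alpha_per_person_sv=2000.0,
                            avoided_operational_cost=0.0,
                            annual_implementation_cost=0.0):
    """Net annual benefit (in currency units) of a dose-reduction action.

    alpha_per_person_sv is an assumed monetary value assigned to one
    person-sievert of collective dose averted; adjust to local policy.
    """
    averted_person_sv = (dose_before_person_msv - dose_after_person_msv) / 1000.0
    benefit = averted_person_sv * alpha_per_person_sv + avoided_operational_cost
    return benefit - annual_implementation_cost

# Invented example: shielding that cuts a job's collective dose from 150 to 90 person-mSv/yr
net = alara_specific_analysis(150.0, 90.0,
                              alpha_per_person_sv=2000.0,
                              avoided_operational_cost=500.0,
                              annual_implementation_cost=400.0)
print(f"Net annual benefit: {net:+.0f} (positive favours the action under ALARA)")
```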

  8. Quantitative histological models suggest endothermy in plesiosaurs

    Directory of Open Access Journals (Sweden)

    Corinna V. Fleischle

    2018-06-01

    Full Text Available Background Plesiosaurs are marine reptiles that arose in the Late Triassic and survived to the Late Cretaceous. They have a unique and uniform bauplan and are known for their very long neck and hydrofoil-like flippers. Plesiosaurs are among the most successful vertebrate clades in Earth’s history. Based on bone mass decrease and cosmopolitan distribution, both of which affect lifestyle, indications of parental care, and oxygen isotope analyses, evidence for endothermy in plesiosaurs has accumulated. Recent bone histological investigations also provide evidence of fast growth and elevated metabolic rates. However, quantitative estimations of metabolic rates and bone growth rates in plesiosaurs have not been attempted before. Methods Phylogenetic eigenvector maps (PEM) is a method for estimating trait values from a predictor variable while taking into account phylogenetic relationships. As predictor variable, this study employs vascular density, measured in bone histological sections of fossil eosauropterygians and extant comparative taxa. We quantified vascular density as primary osteon density, thus, the proportion of vascular area (including lamellar infillings of primary osteons) to total bone area. Our response variables are bone growth rate (expressed as local bone apposition rate) and resting metabolic rate (RMR). Results Our models reveal bone growth rates and RMRs for plesiosaurs that are in the range of birds, suggesting that plesiosaurs were endothermic. Even for basal eosauropterygians we estimate values in the range of mammals or higher. Discussion Our models are influenced by the availability of comparative data, which are lacking for large marine amniotes, potentially skewing our results. However, our statistically robust inference of fast growth and fast metabolism is in accordance with other evidence for plesiosaurian endothermy. Endothermy may explain the success of plesiosaurs, including their survival of the end-Triassic extinction

  9. Rapid quantitative assay for chloramphenicol acetyltransferase

    International Nuclear Information System (INIS)

    Neumann, J.R.; Morency, C.A.; Russian, K.O.

    1987-01-01

    Measuring the expression of exogenous genetic material in mammalian cells is commonly done by fusing the DNA of interest to a gene encoding an easily-detected enzyme. Chloramphenicol acetyltransferase (CAT) is a convenient marker because it is not normally found in eukaryotes. CAT activity has usually been detected using a thin-layer chromatographic separation followed by autoradiography. An organic solvent extraction-based method for CAT detection has also been described, as well as a procedure utilizing HPLC analysis. Building on the extraction technique, the authors developed a rapid, sensitive kinetic method for measuring CAT activity in cell homogenates. The method exploits the differential organic solubility of the substrate ([³H]- or [¹⁴C]acetyl-CoA) and the product (labeled acetylchloramphenicol). The assay is a simple one-vial, two-phase procedure and requires no tedious manipulations after the initial setup. Briefly, a 0.25 ml reaction with 100 mM Tris-HCl, 1 mM chloramphenicol, 0.1 mM [¹⁴C]acetyl-CoA and variable amounts of cell homogenate is pipetted into a miniscintillation vial, overlaid with 5 ml of a water-immiscible fluor, and incubated at 37 °C. At suitable intervals the vial is counted and the CAT level is quantitatively determined as the rate of increase in counts/min of the labeled product as it diffuses into the fluor phase, compared to a standard curve. When used to measure CAT in transfected Balb 3T3 cells the method correlated well with the other techniques
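
    A minimal sketch of the kinetic readout: the CAT level is taken as the least-squares slope of counts per minute against time, converted to enzyme units against a standard-curve factor. The time points, counts and calibration factor below are invented.

```python
import numpy as np

def cat_rate_cpm_per_min(times_min, counts_cpm):
    """Slope of the counts-vs-time line (cpm/min) by least squares."""
    slope, _intercept = np.polyfit(times_min, counts_cpm, 1)
    return slope

# Invented vial readings for one transfected-cell homogenate
times = np.array([0, 15, 30, 45, 60], dtype=float)            # minutes
counts = np.array([210, 980, 1730, 2490, 3260], dtype=float)  # cpm in the fluor phase

rate = cat_rate_cpm_per_min(times, counts)
cpm_per_min_per_unit = 25.0   # hypothetical standard-curve calibration factor
print(f"rate = {rate:.1f} cpm/min  ->  ~{rate / cpm_per_min_per_unit:.1f} CAT units")
```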

  10. Quantitative indicators of fruit and vegetable consumption

    Directory of Open Access Journals (Sweden)

    Dagmar Kozelová

    2015-12-01

    Full Text Available Quantitative market research is often based on surveys and questionnaires that probe the behavior of customers in the areas of interest. Before purchasing, consumers consider where they will buy fruit and vegetables, what kind to choose and in what quantity. Consumer behavior is affected by factors such as regional gastronomic traditions, price, product appearance, aroma, place of purchase, personal experience and knowledge, taste preferences, as well as specific health issues of consumers, among others. The consumption of fruit and vegetables brings biologically active substances into the human body that favorably affect consumers' health. In the presented study we were interested in differences in consumer behavior in the consumption of fruit and vegetables according to place of residence and gender. In the survey, 200 respondents participated; their place of residence was either a city or a village. The existence of dependencies and their statistical significance were examined by selected statistical testing methods. First, we applied an F-test to check whether the observed random samples have the same variance. Then we applied a two-sample unpaired t-test assuming equal variances and the χ²-test of statistical independence. Statistical significance was assessed by the corresponding p-values, and associations were quantified with Cramér's V coefficient. We found that place of residence has no impact on the respondents' consumption of fruit. The gender of respondents does not affect their consumption of fruit, and equally, gender does not affect the respondents' consumption of vegetables. In only one case did significant differences show that the place of residence has an impact on the consumption of vegetables. Higher consumption of vegetables is due to the fact that the majority of citizens who live in villages have the possibility to grow their own vegetables and, thus, the demand for it in village
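
    The survey data themselves are not available here; the sketch below merely chains together the tests named in the abstract (variance-ratio F-test, two-sample t-test with equal variances, χ²-test of independence and Cramér's V) on invented responses, using scipy.stats.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Invented weekly fruit portions for 100 city and 100 village respondents
city = rng.normal(9.0, 3.0, 100)
village = rng.normal(9.5, 3.1, 100)

# 1) F-test for equality of variances (two-sided p from the F distribution)
f_stat = np.var(city, ddof=1) / np.var(village, ddof=1)
p_f = 2 * min(stats.f.cdf(f_stat, 99, 99), stats.f.sf(f_stat, 99, 99))

# 2) Two-sample unpaired t-test assuming equal variances
t_stat, p_t = stats.ttest_ind(city, village, equal_var=True)

# 3) Chi-squared test of independence (residence x "grows own vegetables"), plus Cramer's V
table = np.array([[30, 70],    # city: yes / no  (invented counts)
                  [72, 28]])   # village: yes / no
chi2, p_chi2, dof, _expected = stats.chi2_contingency(table)
cramers_v = np.sqrt(chi2 / (table.sum() * (min(table.shape) - 1)))

print(f"F={f_stat:.2f} (p={p_f:.2f}); t={t_stat:.2f} (p={p_t:.2f}); "
      f"chi2={chi2:.1f} (p={p_chi2:.3g}); Cramer's V={cramers_v:.2f}")
```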

  11. Quantitative tomography simulations and reconstruction algorithms

    International Nuclear Information System (INIS)

    Martz, H.E.; Aufderheide, M.B.; Goodman, D.; Schach von Wittenau, A.; Logan, C.; Hall, J.; Jackson, J.; Slone, D.

    2000-01-01

    X-ray, neutron and proton transmission radiography and computed tomography (CT) are important diagnostic tools that are at the heart of LLNL's effort to meet the goals of the DOE's Advanced Radiography Campaign. This campaign seeks to improve radiographic simulation and analysis so that radiography can be a useful quantitative diagnostic tool for stockpile stewardship. Current radiographic accuracy does not allow satisfactory separation of experimental effects from the true features of an object's tomographically reconstructed image. This can lead to difficult and sometimes incorrect interpretation of the results. By improving our ability to simulate the whole radiographic and CT system, it will be possible to examine the contribution of system components to various experimental effects, with the goal of removing or reducing them. In this project, we are merging this simulation capability with a maximum-likelihood (constrained conjugate gradient, CCG) reconstruction technique yielding a physics-based, forward-model image-reconstruction code. In addition, we seek to improve the accuracy of computed tomography from transmission radiographs by studying what physics is needed in the forward model. During FY 2000, an improved version of the LLNL ray-tracing code called HADES has been coupled with a recently developed LLNL CT algorithm known as CCG. The problem of image reconstruction is expressed as a large matrix equation relating a model for the object being reconstructed to its projections (radiographs). Using a constrained-conjugate-gradient search algorithm, a maximum likelihood solution is sought. This search continues until the difference between the input measured radiographs or projections and the simulated or calculated projections is satisfactorily small
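
    The HADES and CCG codes are not reproduced here; the sketch below is a deliberately simplified stand-in that poses reconstruction as the matrix equation described above (projections = system matrix × object) and solves it by non-negativity-constrained gradient descent on a least-squares objective, rather than the actual constrained-conjugate-gradient maximum-likelihood solver.

```python
import numpy as np

def reconstruct(system_matrix, projections, n_iter=500, step=None):
    """Solve min ||A x - b||^2 with x >= 0 by projected gradient descent.

    A toy stand-in for forward-model-based iterative reconstruction;
    system_matrix plays the role of the ray-tracing forward model.
    """
    A, b = system_matrix, projections
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        gradient = A.T @ (A @ x - b)
        x = np.maximum(x - step * gradient, 0.0)  # enforce non-negativity
    return x

# Tiny invented example: 3-pixel object, 4 rays
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([0.2, 0.7, 0.4])
b = A @ x_true
print(np.round(reconstruct(A, b), 3))   # approaches [0.2, 0.7, 0.4]
```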

  12. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that, although the introduction of risk quantification to enhance the degree of objectivity in finance, for instance, quite paralleled its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that, to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  13. The National Benchmark Test of quantitative literacy: Does it ...

    African Journals Online (AJOL)

    This article explores the relationship between these two standardised assessments in the domain of mathematical/quantitative literacy. This is accomplished through a Pearson correlation analysis of 6,363 test scores obtained by Grade 12 learners on the NSC Mathematical Literacy examination and the Quantitative ...

  14. PCA-based groupwise image registration for quantitative MRI

    NARCIS (Netherlands)

    Huizinga, W.; Poot, D. H. J.; Guyader, J.-M.; Klaassen, R.; Coolen, B. F.; van Kranenburg, M.; van Geuns, R. J. M.; Uitterdijk, A.; Polfliet, M.; Vandemeulebroucke, J.; Leemans, A.; Niessen, W. J.; Klein, S.

    2016-01-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different

  15. Resources on quantitative/statistical research for applied linguists

    OpenAIRE

    Brown , James Dean

    2004-01-01

    Abstract The purpose of this review article is to survey and evaluate existing books on quantitative/statistical research in applied linguistics. The article begins by explaining the types of texts that will not be reviewed, then it briefly describes nine books that address how to do quantitative/statistical applied linguistics research. The review then compares (in prose and tables) the general characteris...

  16. Resources on Quantitative/Statistical Research for Applied Linguists

    Science.gov (United States)

    Brown, James Dean

    2004-01-01

    The purpose of this review article is to survey and evaluate existing books on quantitative/statistical research in applied linguistics. The article begins by explaining the types of texts that will not be reviewed, then it briefly describes nine books that address how to do quantitative/statistical applied linguistics research. The review then…

  17. Quantitative trait loci (QTL) mapping for inflorescence length traits in ...

    African Journals Online (AJOL)

    Lablab purpureus (L.) sweet is an ancient legume species whose immature pods serve as a vegetable in south and south-east Asia. The objective of this study is to identify quantitative trait loci (QTLs) associated with quantitative traits such as inflorescence length, peduncle length from branch to axil, peduncle length from ...

  18. Using the Blended Learning Approach in a Quantitative Literacy Course

    Science.gov (United States)

    Botts, Ryan T.; Carter, Lori; Crockett, Catherine

    2018-01-01

    The efforts to improve the quantitative reasoning (quantitative literacy) skills of college students in the United States have been gaining momentum in recent years. At the same time, the blended learning approach to course delivery has gained in popularity, promising better learning with flexible modalities and pace. This paper presents the…

  19. Quantitative modelling of the biomechanics of the avian syrinx

    DEFF Research Database (Denmark)

    Elemans, Coen P. H.; Larsen, Ole Næsbye; Hoffmann, Marc R.

    2003-01-01

    We review current quantitative models of the biomechanics of bird sound production. A quantitative model of the vocal apparatus was proposed by Fletcher (1988). He represented the syrinx (i.e. the portions of the trachea and bronchi with labia and membranes) as a single membrane. This membrane acts...

  20. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    Science.gov (United States)

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  1. Current status and assessment of quantitative and qualitative one leg ...

    African Journals Online (AJOL)

    ... of only a quantitative assessment. These findings indicate that, when evaluating the one leg balance in children aged 3-6 years, a quantitative and qualitative assessment should be used in combination together to assure a more accurate assessment. (S. African J. for Research in Sport, Physical Ed. and Recreation: 2001 ...

  2. Quantitative Theoretical and Conceptual Framework Use in Agricultural Education Research

    Science.gov (United States)

    Kitchel, Tracy; Ball, Anna L.

    2014-01-01

    The purpose of this philosophical paper was to articulate the disciplinary tenets for consideration when using theory in agricultural education quantitative research. The paper clarified terminology around the concept of theory in social sciences and introduced inaccuracies of theory use in agricultural education quantitative research. Finally,…

  3. Quantitative Approaches to Group Research: Suggestions for Best Practices

    Science.gov (United States)

    McCarthy, Christopher J.; Whittaker, Tiffany A.; Boyle, Lauren H.; Eyal, Maytal

    2017-01-01

    Rigorous scholarship is essential to the continued growth of group work, yet the unique nature of this counseling specialty poses challenges for quantitative researchers. The purpose of this proposal is to overview unique challenges to quantitative research with groups in the counseling field, including difficulty in obtaining large sample sizes…

  4. Quantitative Auger analysis of Nb-Ge superconducting alloys

    International Nuclear Information System (INIS)

    Buitrago, R.H.

    1980-01-01

    The feasibility of using Auger electron analysis for quantitative analysis was investigated by studying Nb3Ge thin-film Auger data with different approaches. A method based on elemental standards gave quantitative values consistent with reported Nb-Ge data. Alloy sputter yields were also calculated and results were consistent with those for pure elements
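
    The paper's elemental-standards procedure is not reproduced here; the sketch below shows the generic relative-sensitivity-factor formula commonly used for Auger (and XPS) quantification, C_i = (I_i/S_i) / Σ_j (I_j/S_j), with invented peak intensities and sensitivity factors for Nb and Ge.

```python
def auger_atomic_fractions(intensities, sensitivity_factors):
    """Atomic fractions from peak-to-peak intensities and relative sensitivity factors.

    C_i = (I_i / S_i) / sum_j (I_j / S_j)   (standard RSF quantification)
    """
    corrected = {el: intensities[el] / sensitivity_factors[el] for el in intensities}
    total = sum(corrected.values())
    return {el: value / total for el, value in corrected.items()}

# Invented intensities and sensitivity factors for a Nb-Ge film
peaks = {"Nb": 125.0, "Ge": 38.0}
rsf = {"Nb": 0.59, "Ge": 0.48}   # hypothetical handbook-style values
for element, fraction in auger_atomic_fractions(peaks, rsf).items():
    print(f"{element}: {fraction:.1%}")
```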

  5. Undergraduate Students' Quantitative Reasoning in Economic Contexts

    Science.gov (United States)

    Mkhatshwa, Thembinkosi Peter; Doerr, Helen M.

    2018-01-01

    Contributing to a growing body of research on undergraduate students' quantitative reasoning, the study reported in this article used task-based interviews to investigate business calculus students' quantitative reasoning when solving two optimization tasks situated in the context of revenue and profit maximization. Analysis of verbal responses…

  6. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  7. Exploring Phytoplankton Population Investigation Growth to Enhance Quantitative Literacy

    Science.gov (United States)

    Baumgartner, Erin; Biga, Lindsay; Bledsoe, Karen; Dawson, James; Grammer, Julie; Howard, Ava; Snyder, Jeffrey

    2015-01-01

    Quantitative literacy is essential to biological literacy (and is one of the core concepts in "Vision and Change in Undergraduate Biology Education: A Call to Action"; AAAS 2009). Building quantitative literacy is a challenging endeavor for biology instructors. Integrating mathematical skills into biological investigations can help build…

  8. Genetic variability, heritability and genetic advance of quantitative ...

    African Journals Online (AJOL)

    Genetic variation has led to an increase in the quantitative traits of crops. The variability on genome is induced by mutation, which enhances the productivity. We evaluated variability on quantitative characters such as, plant height, number of branches/plant, number of leaves/plant, number of fruit clusters/plant, number of ...

  9. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study

  10. Quantitative Methods Intervention: What Do the Students Want?

    Science.gov (United States)

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the competitive UK economy, public policy making effectiveness and the status the UK has as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  11. Mapcurves: a quantitative method for comparing categorical maps.

    Science.gov (United States)

    William W. Hargrove; M. Hoffman Forrest; Paul F. Hessburg

    2006-01-01

    We present Mapcurves, a quantitative goodness-of-fit (GOF) method that unambiguously shows the degree of spatial concordance between two or more categorical maps. Mapcurves graphically and quantitatively evaluate the degree of fit among any number of maps and quantify a GOF for each polygon, as well as the entire map. The Mapcurve method indicates a perfect fit even if...

  12. Statistical mechanics and the evolution of polygenic quantitative traits

    NARCIS (Netherlands)

    Barton, N.H.; De Vladar, H.P.

    The evolution of quantitative characters depends on the frequencies of the alleles involved, yet these frequencies cannot usually be measured. Previous groups have proposed an approximation to the dynamics of quantitative traits, based on an analogy with statistical mechanics. We present a modified

  13. Quantitative Phase Determination by Using a Michelson Interferometer

    Science.gov (United States)

    Pomarico, Juan A.; Molina, Pablo F.; D'Angelo, Cristian

    2007-01-01

    The Michelson interferometer is one of the best established tools for quantitative interferometric measurements. It has been, and is still successfully used, not only for scientific purposes, but it is also introduced in undergraduate courses for qualitative demonstrations as well as for quantitative determination of several properties such as…

  14. Unraveling possible association between quantitative trait loci (QTL ...

    African Journals Online (AJOL)

    Unraveling possible association between quantitative trait loci (QTL) for partial resistance and nonhost resistance in food barley ( Hordeum vulgaris L.) ... Abstract. Many quantitative trait loci (QTLs) in different barley populations were discovered for resistance to Puccinia hordei and heterologous rust species. Partial ...

  15. Quantitative Modeling of Landscape Evolution, Treatise on Geomorphology

    NARCIS (Netherlands)

    Temme, A.J.A.M.; Schoorl, J.M.; Claessens, L.F.G.; Veldkamp, A.; Shroder, F.S.

    2013-01-01

    This chapter reviews quantitative modeling of landscape evolution – which means that not just model studies but also modeling concepts are discussed. Quantitative modeling is contrasted with conceptual or physical modeling, and four categories of model studies are presented. Procedural studies focus

  16. Optimization method for quantitative calculation of clay minerals in soil

    Indian Academy of Sciences (India)

    However, no reliable method for quantitative analysis of clay minerals has been established so far. In this study, an attempt was made to propose an optimization method for the quantitative ... 2. Basic principles. The mineralogical constitution of soil is rather complex. ... K2O, MgO, and TFe as variables for the calculation.

  17. Quantitation of Proteinuria in Women With Pregnancy Induced ...

    African Journals Online (AJOL)

    This creates the need for a more accurate method for early detection and quantitation of proteinuria. Objective: To compare the accuracy of the Spot urine Protein to Creatinine ratio with that of Dipstick Tests in the quantitation of proteinuria in Nigerian women with Pregnancy Induced Hypertension. Methods: A cross-sectional ...

  18. Quantitative image of bone mineral content

    International Nuclear Information System (INIS)

    Katoh, Tsuguhisa

    1990-01-01

    A dual energy subtraction system was constructed on an experimental basis for the quantitative image of bone mineral content. The system consists of a radiographing system and an image processor. Two radiograms were taken with dual x-ray energy in a single exposure using an x-ray beam dichromized by a tin filter. In this system, a film cassette was used where a low speed film-screen system, a copper filter and a high speed film-screen system were layered on top of each other. The images were read by a microdensitometer and processed by a personal computer. The image processing included the corrections of the film characteristics and heterogeneity in the x-ray field, and the dual energy subtraction in which the effect of the high energy component of the dichromized beam on the tube side image was corrected. In order to determine the accuracy of the system, experiments using wedge phantoms made of mixtures of epoxy resin and bone mineral-equivalent materials in various fractions were performed for various tube potentials and film processing conditions. The results indicated that the relative precision of the system was within ±4% and that the propagation of the film noise was within ±11 mg/cm² for the 0.2 mm pixels. The results also indicated that the system response was independent of the tube potential and the film processing condition. The bone mineral weight in each phalanx of the freshly dissected hand of a rhesus monkey was measured by this system and compared with the ash weight. The results showed an error of ±10%, slightly larger than that of phantom experiments, which is probably due to the effect of fat and the variation of focus-object distance. The air kerma in free air at the object was approximately 0.5 mGy for one exposure. The results indicate that this system is applicable to clinical use and provides useful information for evaluating a time-course of localized bone disease. (author)
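
    The film-based system above includes corrections (film characteristics, field heterogeneity, beam dichromization) that the sketch below ignores; it shows only the core dual-energy step, in which two log-attenuation measurements are modelled as a 2x2 linear system in bone-mineral and soft-tissue areal densities and inverted. The attenuation coefficients and measurements are invented placeholders, not the paper's calibration values.

```python
import numpy as np

def bone_mineral_area_density(log_atten_low, log_atten_high, mu):
    """Solve the 2x2 dual-energy system for (bone, soft) areal densities (g/cm^2).

    log_atten_* = ln(I0/I) at the low- and high-energy acquisitions.
    mu is a 2x2 matrix of mass-attenuation coefficients (cm^2/g):
        rows = [low energy, high energy], columns = [bone mineral, soft tissue].
    """
    b = np.array([log_atten_low, log_atten_high])
    bone, soft = np.linalg.solve(mu, b)
    return bone, soft

# Invented coefficients and measurements (not calibration values from the paper)
mu = np.array([[3.0, 0.60],    # low energy:  bone, soft
               [0.9, 0.25]])   # high energy: bone, soft
bone_gcm2, soft_gcm2 = bone_mineral_area_density(2.70, 0.95, mu)
print(f"bone mineral ~ {bone_gcm2:.2f} g/cm^2, soft tissue ~ {soft_gcm2:.2f} g/cm^2")
```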

  19. An overview of quantitative approaches in Gestalt perception.

    Science.gov (United States)

    Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H

    2016-09-01

    Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis) allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation) which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combines qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  1. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016) held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities in fields such as quantitative logic and soft computing worldwide as follows: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  2. From themes to hypotheses: following up with quantitative methods.

    Science.gov (United States)

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  3. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.

  4. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  5. Prospective Middle-School Mathematics Teachers' Quantitative Reasoning and Their Support for Students' Quantitative Reasoning

    Science.gov (United States)

    Kabael, Tangul; Akin, Ayca

    2018-01-01

    The aim of this research is to examine prospective mathematics teachers' quantitative reasoning, their support for students' quantitative reasoning and the relationship between them, if any. The teaching experiment was used as the research method in this qualitatively designed study. The data of the study were collected through a series of…

  6. Quantitative assessment of 201TlCl myocardial SPECT

    International Nuclear Information System (INIS)

    Uehara, Toshiisa

    1987-01-01

    Clinical evaluation of the quantitative analysis of Tl-201 myocardial tomography by SPECT (Single Photon Emission Computed Tomography) was performed in comparison with visual evaluation. The method of quantitative analysis has been already reported in our previous paper. In this study, the program of re-standardization in the case of lateral myocardial infarction was added. This program was useful mainly for the evaluation of lesions in the left circumflex coronary artery. Regarding the degree of diagnostic accuracy of myocardial infarction in general, quantitative evaluation of myocardial SPECT images was highest followed by visual evaluation of myocardial SPECT images, and visual evaluation of myocardial planar images. However, in the case of anterior myocardial infarction, visual evaluation of myocardial SPECT images has almost the same detectability as quantitative evaluation of myocardial SPECT images. In the case of infero-posterior myocardial infarction, quantitative evaluation was superior to visual evaluation. As for specificity, quantitative evaluation of SPECT images was slightly inferior to visual evaluation of SPECT images. An infarction map was made by quantitative analysis and this enabled us to determine the infarction site, extent and degree according to easily recognizable patterns. As a result, the responsible coronary artery lesion could be inferred correctly and the calculated infarction score could be correlated with the residual left ventricular function after myocardial infarction. (author)

  7. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies

    DEFF Research Database (Denmark)

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm

    2016-01-01

    OBJECTIVE: To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. METHODS: A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting...... quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. RESULTS: We provide a 9-point checklist encompassing aspects deemed...... relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. CONCLUSIONS...

  8. Development of Quantitative Framework for Event Significance Evaluation

    International Nuclear Information System (INIS)

    Lee, Durk Hun; Kim, Min Chull; Kim, Inn Seock

    2010-01-01

    There is an increasing trend in quantitative evaluation of the safety significance of operational events using Probabilistic Safety Assessment (PSA) technique. An integrated framework for evaluation of event significance has been developed by Korea Institute of Nuclear Safety (KINS), which consists of an assessment hierarchy and a number of matrices. The safety significance of various events, e.g., internal or external initiating events that occurred during at-power or shutdown conditions, can be quantitatively analyzed using this framework, and then, the events rated according to their significance. This paper briefly describes the basic concept of the integrated quantitative framework for evaluation of event significance, focusing on the assessment hierarchy

  9. The value and limitation of quantitative safety goals

    International Nuclear Information System (INIS)

    Dunster, H.J.

    1982-01-01

    Some of the philosophical and practical complexities of quantitative safety goals are reviewed with examples of how the problems have been dealt with in current safety objectives in Britain and by the International Commission on Radiological Protection. Where possible, quantitative comparisons are shown. It is concluded that progress towards quantitative safety goals should be deliberate rather than rapid and that attention should be paid to the possible implications for industries other than the nuclear power industry and countries other than the United States of America

  10. Problems of standardized handling and quantitative evaluation of autoradiograms

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1985-01-01

    In the last years autoradiography has gained increasing importance as a quantitative method of measuring radioactivity or element concentration. Mostly relative measurements are carried out. The optical density of the photographic emulsion produced by a calibrated radiation source is compared with that produced by a sample. The influences of different parameters, such as beta particle energy, backscattering, fading of the latent image, developing conditions, matrix effects and others on the results are described and the errors of the quantitative evaluation of autoradiograms are assessed. The performance of the method is demonstrated taking the quantitative determination of gold in silicon as an example

  11. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  12. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    The importance of data analysis in quantitative assessment of natural resources .... Data collection design is an important process in complex forest statistical ... Ideally, the sample size should be equal among groups and sufficiently large.

  13. Collecting data for quantitative research on pluvial flooding

    NARCIS (Netherlands)

    Spekkers, M.H.; Ten Veldhuis, J.A.E.; Clemens, F.H.L.R.

    2011-01-01

    Urban pluvial flood management requires detailed spatial and temporal information on flood characteristics and damaging consequences. There is lack of quantitative field data on pluvial flooding resulting in large uncertainties in urban flood model calculations and ensuing decisions for investments

  14. Quantitative stem cell biology: the threat and the glory.

    Science.gov (United States)

    Pollard, Steven M

    2016-11-15

    Major technological innovations over the past decade have transformed our ability to extract quantitative data from biological systems at an unprecedented scale and resolution. These quantitative methods and associated large datasets should lead to an exciting new phase of discovery across many areas of biology. However, there is a clear threat: will we drown in these rivers of data? On 18th July 2016, stem cell biologists gathered in Cambridge for the 5th annual Cambridge Stem Cell Symposium to discuss 'Quantitative stem cell biology: from molecules to models'. This Meeting Review provides a summary of the data presented by each speaker, with a focus on quantitative techniques and the new biological insights that are emerging. © 2016. Published by The Company of Biologists Ltd.

  15. Quantitative aspects of oxygen and carbon dioxide exchange ...

    African Journals Online (AJOL)

    Quantitative aspects of oxygen and carbon dioxide exchange through the ... ceratophthalmus (Crustacea: Decapoda) during rest and exercise in water and ... intersects zero time on the x-axis, indicating rapid gas exchange at the lung surface.

  16. Variation in quantitative characters of faba bean after seed ...

    African Journals Online (AJOL)

    Variation in quantitative characters of faba bean after seed irradiation and associated molecular changes. Sonia Mejri, Yassine Mabrouk, Marie Voisin, Philippe Delavault, Philippe Simier, Mouldi Saidi, Omrane Belhadj ...

  17. Quantitative Market Research Regarding Funding of District 8 Construction Projects

    Science.gov (United States)

    1995-05-01

    The primary objective of this quantitative research is to provide information : for more effective decision making regarding the level of investment in various : transportation systems in District 8. : This objective was accomplished by establishing ...

  18. A quantitative assessment of Arctic shipping in 2010–2014

    KAUST Repository

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant

  19. Quantitative Structure-Activity Relationship Analysis of the ...

    African Journals Online (AJOL)

    Quantitative Structure-Activity Relationship Analysis of the Anticonvulsant ... Two types of molecular descriptors, including the 2D autocorrelation ..... It is based on the simulation of natural .... clustering anticonvulsant, antidepressant, and.

  20. The National Benchmark Test of quantitative literacy: Does it ...

    African Journals Online (AJOL)

    determine whether Grade 12 learners have mastered subject knowledge at the ... the NSC Mathematical Literacy examination and the Quantitative Literacy test of the ..... Method. Sample. The sample for this study consisted of 6,363 Grade. 12 ...

  1. Quantitative determination of grain sizes by means of scattered ultrasound

    International Nuclear Information System (INIS)

    Goebbels, K.; Hoeller, P.

    1976-01-01

    The scattering of ultrasounds makes possible the quantitative determination of grain sizes in metallic materials. Examples of measurements on steels with grain sizes between ASTM 1 and ASTM 12 are given

  2. Quantitative reconstruction from a single diffraction-enhanced image

    International Nuclear Information System (INIS)

    Paganin, D.M.; Lewis, R.A.; Kitchen, M.

    2003-01-01

    Full text: We develop an algorithm for using a single diffraction-enhanced image (DEI) to obtain a quantitative reconstruction of the projected thickness of a single-material sample which is embedded within a substrate of approximately constant thickness. This algorithm is used to quantitatively map inclusions in a breast phantom, from a single synchrotron DEI image. In particular, the reconstructed images quantitatively represent the projected thickness in the bulk of the sample, in contrast to DEI images which greatly emphasise sharp edges (high spatial frequencies). In the context of an ultimate aim of improved methods for breast cancer detection, the reconstructions are potentially of greater diagnostic value compared to the DEI data. Lastly, we point out that the methods of analysis presented here are also applicable to the quantitative analysis of differential interference contrast (DIC) images

  3. Quantitative Assays for RAS Pathway Proteins and Phosphorylation States

    Science.gov (United States)

    The NCI CPTAC program is applying its expertise in quantitative proteomics to develop assays for RAS pathway proteins. Targets include key phosphopeptides that should increase our understanding of how the RAS pathway is regulated.

  4. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    Science.gov (United States)

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)

  5. Quantitative-genetic analysis of wing form and bilateral asymmetry ...

    Indian Academy of Sciences (India)

    lines; Procrustes analysis; wing shape; wing size. ... Models of stochastic gene expression predict that intrinsic noise ... Quantitative parameters of wing size and shape asymmetries ..... the residuals of a regression on centroid size produced.

  6. Normal values for quantitative muscle ultrasonography in adults.

    NARCIS (Netherlands)

    Arts, I.M.P.; Pillen, S.; Schelhaas, H.J.; Overeem, S.; Zwarts, M.J.

    2010-01-01

    Ultrasonography can detect structural muscle changes caused by neuromuscular disease. Quantitative analysis is the preferred method to determine if ultrasound findings are within normal limits, but normative data are incomplete. The purpose of this study was to provide normative muscle

  7. Quantile-Based Permutation Thresholds for Quantitative Trait Loci Hotspots

    NARCIS (Netherlands)

    Neto, Elias Chaibub; Keller, Mark P.; Broman, Andrew F.; Attie, Alan D.; Jansen, Ritsert C.; Broman, Karl W.; Yandell, Brian S.; Borevitz, J.

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key

  8. Mapping of quantitative trait loci controlling Orobanche foetida Poir ...

    African Journals Online (AJOL)

    Mapping of quantitative trait loci controlling Orobanche foetida Poir. resistance in faba bean (Vicia faba L.) R Díaz-Ruiz, A Torres, MV Gutierrez, D Rubiales, JI Cubero, M Kharrat, Z Satovic, B Román ...

  9. Cloning and semi-quantitative expression of endochitinase ( ech42 ...

    African Journals Online (AJOL)

    Cloning and semi-quantitative expression of endochitinase (ech42) gene from Trichoderma spp. Pratibha Sharma, K Saravanan, R Ramesh, P Vignesh Kumar, Dinesh Singh, Manika Sharma, Monica S. Henry, Swati Deep ...

  10. A quantitative framework for estimating water resources in India

    Digital Repository Service at National Institute of Oceanography (India)

    Shankar, D.; Kotamraju, V.; Shetye, S.R

    of information on the variables associated with hydrology, and second, the absence of an easily accessible quantitative framework to put these variables in perspective. In this paper, we discuss a framework that has been assembled to address both these issues...

  11. Qualitative and Quantitative Analysis for US Army Recruiting Input Allocation

    National Research Council Canada - National Science Library

    Brence, John

    2004-01-01

    .... An objective study of the quantitative and qualitative aspects of recruiting is necessary to meet the future needs of the Army, in light of strong possibilities of recruiting resource reduction...

  12. Book Review: Qualitative-Quantitative Analyses of Dutch and ...

    African Journals Online (AJOL)

    Abstract. Book Title: Qualitative-Quantitative Analyses of Dutch and Afrikaans Grammar and Lexicon. Book Author: Robert S. Kirsner. 2014. John Benjamins Publishing Company ISBN 9789027215772, price ZAR481.00. 239 pages ...

  13. Quantitative assessment of target dependence of pion fluctuation in ...

    Indian Academy of Sciences (India)

    journal of. December 2012 physics pp. 1395–1405. Quantitative assessment ... The analysis reveals the erratic behaviour of the produced pions signifying ..... authors (Sitaram Pal) gratefully acknowledges the financial help from the University.

  14. Quantitative trait loci mapping for stomatal traits in interspecific ...

    Indian Academy of Sciences (India)

    M. Sumathi

    2018-02-23

    Feb 23, 2018 ... Journal of Genetics, Vol. ... QTL analysis was carried out to identify the chromosomal regions affecting ... Keywords. linkage map; quantitative trait loci; stomata; stress ..... of India for providing financial support for the project.

  15. Analysis association of milk fat and protein percent in quantitative ...

    African Journals Online (AJOL)

    Analysis association of milk fat and protein percent in quantitative trait locus ... African Journal of Biotechnology ... Protein and fat percentages, as milk contents, are high-priority criteria for financial aims and selection programs in dairy cattle.

  16. Integrated quantitative pharmacology for treatment optimization in oncology

    NARCIS (Netherlands)

    Hasselt, J.G.C. van

    2014-01-01

    This thesis describes the development and application of quantitative pharmacological models in oncology for treatment optimization and for the design and analysis of clinical trials with respect to pharmacokinetics, toxicity, efficacy and cost-effectiveness. A recurring theme throughout this

  17. [Rapid analysis of suppositories by quantitative 1H NMR spectroscopy].

    Science.gov (United States)

    Abramovich, R A; Kovaleva, S A; Goriainov, S V; Vorob'ev, A N; Kalabin, G A

    2012-01-01

    Rapid analysis of suppositories with ibuprofen and arbidol by quantitative 1H NMR spectroscopy was performed. Optimal conditions for the analysis were developed. The results are useful for design of rapid methods for quality control of suppositories with different components
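
    A quantitative 1H NMR assay like the one summarized above typically relies on the internal-standard relation, in which the analyte mass follows from the ratio of per-proton signal integrals, the molar-mass ratio, and the weighed-in mass of the standard. The sketch below illustrates that textbook relation only; the standard compound, signal assignments, and numbers are hypothetical and are not taken from the record.

        def qnmr_mass(integral_analyte, integral_std, n_h_analyte, n_h_std,
                      molar_mass_analyte, molar_mass_std, mass_std_mg, purity_std=1.0):
            """Internal-standard qNMR relation: the analyte mass follows from the
            ratio of per-proton signal integrals, the molar-mass ratio, and the
            weighed-in mass (and purity) of the internal standard."""
            return (integral_analyte / integral_std) * (n_h_std / n_h_analyte) \
                   * (molar_mass_analyte / molar_mass_std) * mass_std_mg * purity_std

        # Hypothetical ibuprofen assay against maleic acid as internal standard.
        print(qnmr_mass(integral_analyte=1.85, integral_std=1.00,
                        n_h_analyte=1, n_h_std=2,      # one CH proton vs the 2H singlet of maleic acid
                        molar_mass_analyte=206.28, molar_mass_std=116.07,
                        mass_std_mg=10.0), "mg")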

  18. Lesion detection and quantitation of positron emission mammography

    International Nuclear Information System (INIS)

    Qi, Jinyi; Huesman, Ronald H.

    2001-01-01

    A Positron Emission Mammography (PEM) scanner dedicated to breast imaging is being developed at our laboratory. We have developed a list mode likelihood reconstruction algorithm for this scanner. Here we theoretically study the lesion detection and quantitation. The lesion detectability is studied theoretically using computer observers. We found that for the zero-order quadratic prior, the region of interest observer can achieve the performance of the prewhitening observer with a properly selected smoothing parameter. We also study the lesion quantitation using the test statistic of the region of interest observer. The theoretical expressions for the bias, variance, and ensemble mean squared error of the quantitation are derived. Computer simulations show that the theoretical predictions are in good agreement with the Monte Carlo results for both lesion detection and quantitation
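
    The record compares theoretical expressions for the bias, variance, and ensemble mean squared error of the quantitation against Monte Carlo results. As a reminder of how those three quantities are estimated empirically, here is a minimal sketch; the toy estimator, noise model, and all numbers are hypothetical and unrelated to the PEM reconstruction itself.

        import numpy as np

        def ensemble_statistics(estimates, true_value):
            """Bias, variance, and ensemble MSE of a set of ROI quantitation estimates.

            `estimates` holds the ROI test statistic computed on independently
            simulated noisy reconstructions; `true_value` is the known activity
            placed in the phantom lesion.
            """
            estimates = np.asarray(estimates, dtype=float)
            bias = estimates.mean() - true_value
            variance = estimates.var(ddof=1)
            mse = np.mean((estimates - true_value) ** 2)  # ~ bias**2 + variance
            return bias, variance, mse

        # Toy Monte Carlo: a hypothetical estimator contaminated by Gaussian noise.
        rng = np.random.default_rng(0)
        true_uptake = 10.0
        mc_estimates = true_uptake + rng.normal(0.0, 1.5, size=5000)
        print(ensemble_statistics(mc_estimates, true_uptake))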

  19. A quantitative exploration of the effects of workplace bullying on ...

    African Journals Online (AJOL)

    types of hostile communication and behaviour are used (Tracy, .... Qualitative and quantitative studies explored the effects of WPB on educators. Whereas ... An array of research methods has thus been used to investigate the effects of WPB on.

  20. Doing Quantitative Grounded Theory: A theory of trapped travel consumption

    Directory of Open Access Journals (Sweden)

    Mark S. Rosenbaum, Ph.D.

    2008-11-01

    Full Text Available All is data. Grounded theorists employ this sentence in their quest to create original theoretical frameworks. Yet researchers typically interpret the word "data" to mean qualitative data or, more specifically, interview data collected from respondents. This is not to say that qualitative data is deficient; however, grounded theorists may be missing vast opportunities to create pioneering theories from quantitative data. Indeed, Glaser and Strauss (1967) argued that researchers would use qualitative and/or quantitative data to fashion original frameworks and related hypotheses, and Glaser's (2008) recently published book, titled Doing Quantitative Grounded Theory, is an attempt to help researchers understand how to use quantitative data for grounded theory (GT).

  1. Quantitative analysis of some brands of chloroquine tablets ...

    African Journals Online (AJOL)

    Quantitative analysis of some brands of chloroquine tablets marketed in Maiduguri using spectrophotometric ... and compared with that of the standard, wavelength of maximum absorbance at 331nm for chloroquine. ... HOW TO USE AJOL.

  2. Effects of feed forms, levels of quantitative feed restriction on ...

    African Journals Online (AJOL)

    Nigerian Journal of Animal Production ... Data were collected on growth performance, carcass characteristics, and cost benefits were calculated. Data were subjected to ... Keywords: Broilers, carcass, performance, quantitative feed restriction ...

  3. Methods and instrumentation for quantitative microchip capillary electrophoresis

    NARCIS (Netherlands)

    Revermann, T.

    2007-01-01

    The development of novel instrumentation and analytical methodology for quantitative microchip capillary electrophoresis (MCE) is described in this thesis. Demanding only small quantities of reagents and samples, microfluidic instrumentation is highly advantageous. Fast separations at high voltages

  4. Smile line assessment comparing quantitative measurement and visual estimation

    NARCIS (Netherlands)

    Geld, P. Van der; Oosterveld, P.; Schols, J.; Kuijpers-Jagtman, A.M.

    2011-01-01

    INTRODUCTION: Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation

  5. Quantitation of glial fibrillary acidic protein in human brain tumours

    DEFF Research Database (Denmark)

    Rasmussen, S; Bock, E; Warecka, K

    1980-01-01

    The glial fibrillary acidic protein (GFA) content of 58 human brain tumours was determined by quantitative immunoelectrophoresis, using monospecific antibody against GFA. Astrocytomas, glioblastomas, oligodendrogliomas, spongioblastomas, ependymomas and medulloblastomas contained relatively high...

  6. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  7. A quantitative comparison of corrective and perfective maintenance

    Science.gov (United States)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  8. Spectroscopic Tools for Quantitative Studies of DNA Structure and Dynamics

    DEFF Research Database (Denmark)

    Preus, Søren

    The main objective of this thesis is to develop quantitative fluorescence-based spectroscopic tools for probing the 3D structure and dynamics of DNA and RNA. The thesis is founded on six peer-reviewed papers covering mainly the development, characterization and use of fluorescent nucleobase ... analogues. In addition, four software packages are presented for the simulation and quantitative analysis of time-resolved and steady-state UV-Vis absorption and fluorescence experiments....

  9. Quantitative analysis of psychological personality for NPP operators

    International Nuclear Information System (INIS)

    Gao Jia; Huang Xiangrui

    1998-01-01

    The author introduces the quantitative psychological personality research carried out by the 'Prognoz' Laboratory and in Taiwan, presents the primary results of psychological personality assessment of Chinese nuclear power plant (NPP) operators, based on the MMPI survey, and outlines the main contents of quantitative psychological personality research in Chinese NPPs, emphasizing the need to carry out psychological selection and training in the nuclear industry.

  10. Quantitative Information Flow as Safety and Liveness Hyperproperties

    Directory of Open Access Journals (Sweden)

    Hirotoshi Yasuoka

    2012-07-01

    Full Text Available We employ Clarkson and Schneider's "hyperproperties" to classify various verification problems of quantitative information flow. The results of this paper unify and extend the previous results on the hardness of checking and inferring quantitative information flow. In particular, we identify a subclass of liveness hyperproperties, which we call "k-observable hyperproperties", that can be checked relative to a reachability oracle via self composition.
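
    Self-composition, mentioned in the abstract as the reduction used for checking k-observable hyperproperties, pairs two runs of the same program and compares their observable outputs. A minimal, hedged sketch of the idea follows: a brute-force non-interference check over small finite domains, not the paper's verification procedure, with all program names and domains hypothetical.

        def noninterferent(program, low_inputs, high_inputs):
            """Brute-force check in the spirit of self-composition: pair runs of the
            program that agree on the low (public) input but differ in the high
            (secret) input, and require identical observable outputs."""
            for low in low_inputs:
                observed = {program(low, high) for high in high_inputs}
                if len(observed) > 1:   # the secret influences the observable output
                    return False
            return True

        # Toy programs over a small finite domain (hypothetical examples).
        leaky = lambda low, high: low + (high % 2)   # leaks the parity of the secret
        secure = lambda low, high: low * 2

        domain = range(4)
        print(noninterferent(leaky, domain, domain))    # False
        print(noninterferent(secure, domain, domain))   # True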

  11. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    OpenAIRE

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-01-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connecti...

  12. Quantitative method for determination of body inorganic iodine

    International Nuclear Information System (INIS)

    Filatov, A.A.; Tatsievskij, V.A.

    1991-01-01

    An original method of quantitation of body inorganic iodine, based upon simultaneous administration of a known dose of stable and radioactive iodine with subsequent radiometry of the thyroid, was proposed. The calculation is based upon the principle of the dilution of radioactive iodine in the human inorganic iodine space. The method permits quantitation of the amount of inorganic iodine with regard to individual features of the inorganic iodine space. The method is characterized by simplicity and is not invasive for the patient.
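
    The calculation described in the record follows the general indicator-dilution principle: the administered tracer is diluted by the endogenous inorganic iodine pool plus the carrier given with it, so the pool size follows from the measured specific activity. The sketch below shows only this textbook relation, not the authors' exact protocol; how the specific activity is inferred from thyroid radiometry is omitted, and all numbers are hypothetical.

        def inorganic_iodine_pool(tracer_activity_bq, carrier_iodine_ug,
                                  specific_activity_bq_per_ug):
            """Indicator-dilution estimate of the endogenous inorganic iodine pool.

            The tracer is diluted in (endogenous pool + carrier), so
                specific_activity = tracer_activity / (pool + carrier)
            and the pool follows by rearrangement.
            """
            total_iodine_ug = tracer_activity_bq / specific_activity_bq_per_ug
            return total_iodine_ug - carrier_iodine_ug

        # Hypothetical numbers, for illustration only.
        print(inorganic_iodine_pool(tracer_activity_bq=1.0e5,
                                    carrier_iodine_ug=100.0,
                                    specific_activity_bq_per_ug=250.0), "ug")  # -> 300 ug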

  13. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    OpenAIRE

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2007-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often ...

  14. Sexual Harassment Prevention Initiatives: Quantitative and Qualitative Approaches

    Science.gov (United States)

    2010-10-28

    Quantitative Approach: The Survey The quantitative approach appears to be the dominant form of mainstream psychological research today, and Gelo et al. (2008...that viewpoint and remark that the characteristics of today's psychological research demonstrate realities that can be replicated through studies...2000). The right of passage? The experiences of female pilots in commercial aviation. Feminism & Psychology, 10, 195-225. Davis, A., & Bremner, G

  15. QUALITATIVE AND QUANTITATIVE METHODS OF SUICIDE RESEARCH IN OLD AGE

    OpenAIRE

    Ojagbemi, A.

    2017-01-01

    This paper examines the merits of the qualitative and quantitative methods of suicide research in the elderly using two studies identified through a free search of the Pubmed database for articles that might have direct bearing on suicidality in the elderly. The studies have been purposively selected for critical appraisal because they meaningfully reflect the quantitative and qualitative divide as well as the social, economic, and cultural boundaries between the elderly living in sub-Saharan...

  16. Quantitative image quality evaluation of pixel-binning in a flat-panel detector for x-ray fluoroscopy

    International Nuclear Information System (INIS)

    Srinivas, Yogesh; Wilson, David L.

    2004-01-01

    X-ray fluoroscopy places stringent design requirements on new flat-panel (FP) detectors, requiring both low-noise electronics and high data transfer rates. Pixel binning, wherein data from more than one detector pixel are collected simultaneously, not only lowers the data transfer rate but also increases x-ray counts and pixel signal-to-noise ratio (SNR). In this study, we quantitatively assessed the image quality of image sequences from four acquisition methods (no binning and three types of binning) in synthetic images, using a clinically relevant task of detecting an extended guidewire in a four-alternative forced-choice paradigm. Binning methods were conventional data-line (D) and gate-line (G) binning, and a novel method in which alternate frames in an image sequence used D and G binning. Two detector orientations placed the data lines either parallel or perpendicular to the guide wire. At a low exposure of 0.6 μR (1.548x10^-10 C/kg) per frame, irrespective of detector orientation, D binning with its reduced electronic noise was significantly (p<0.1) better. At a higher exposure per frame, with data lines parallel to the guidewire, detection with D binning was significantly (p<0.1) better than G binning. However, with data lines perpendicular to the guidewire, G binning was significantly (p<0.1) better than D binning because the partial area effect was reduced. Alternate binning was the best binning method when results were averaged over both orientations, and it was as good as the best binning method at either orientation. In addition, at low and high exposures, alternate binning gave a temporally fused image with a smooth guidewire, an important image quality feature not assessed in a detection experiment. While at high exposure detection with no binning was as good as, or better than, the best binning method, it might be impractical at fluoroscopy imaging rates. A computational observer model based on signal detection theory successfully fit the data and was used to predict effects of ...
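
    The claimed SNR benefit of binning at fluoroscopic exposures can be illustrated with a toy simulation: when charge from several pixels is summed before a single read-out, the quantum signal grows while the electronic noise is added only once. The sketch below is a generic illustration with hypothetical count and noise levels, not the study's detector model or observer analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        mean_counts = 4.0   # x-ray quanta per native pixel at a fluoroscopic dose (hypothetical)
        read_noise = 2.0    # electronic noise per read-out, in quanta (hypothetical)

        # Native read-out: every pixel is read individually.
        native = rng.poisson(mean_counts, (512, 512)) + rng.normal(0, read_noise, (512, 512))

        # On-chip 2x2 binning: charge from four pixels is summed *before* a single
        # read-out, so the quantum signal quadruples while the electronic noise is
        # added only once per binned pixel.
        quanta = rng.poisson(mean_counts, (512, 512))
        binned = quanta.reshape(256, 2, 256, 2).sum(axis=(1, 3)) + rng.normal(0, read_noise, (256, 256))

        print("native SNR :", native.mean() / native.std())
        print("binned SNR :", binned.mean() / binned.std())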

  17. Identification of ginseng root using quantitative X-ray microtomography.

    Science.gov (United States)

    Ye, Linlin; Xue, Yanling; Wang, Yudan; Qi, Juncheng; Xiao, Tiqiao

    2017-07-01

    The use of X-ray phase-contrast microtomography for the investigation of Chinese medicinal materials is advantageous for its nondestructive, in situ, and three-dimensional quantitative imaging properties. The X-ray phase-contrast microtomography quantitative imaging method was used to investigate the microstructure of ginseng, and the phase-retrieval method was also employed to process the experimental data. Four different ginseng samples were collected and investigated; these were classified according to their species, production area, and sample growth pattern. The quantitative internal characteristic microstructures of ginseng were extracted successfully. The size and position distributions of the calcium oxalate cluster crystals (COCCs), important secondary metabolites that accumulate in ginseng, are revealed by the three-dimensional quantitative imaging method. The volume and amount of the COCCs in different species of ginseng were obtained by a quantitative analysis of the three-dimensional microstructures, which shows obvious differences among the four species of ginseng. This study is the first to provide evidence of the distribution characteristics of COCCs to identify four types of ginseng, with regard to species authentication and age identification, by X-ray phase-contrast microtomography quantitative imaging. This method is also expected to reveal important relationships between COCCs and the occurrence of the effective medicinal components of ginseng.
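
    The volume and count of cluster crystals reported in the record come from a quantitative analysis of the segmented 3-D microstructure. A generic way to obtain such numbers from a reconstructed tomogram is connected-component labelling of a thresholded volume, sketched below; the threshold, voxel size and toy volume are hypothetical, and this is not the authors' processing pipeline.

        import numpy as np
        from scipy import ndimage

        def quantify_crystals(volume, threshold, voxel_size_um):
            """Segment bright inclusions in a reconstructed tomogram and return the
            count and per-inclusion volumes (in um^3).

            `volume` is the 3-D reconstructed grey-level array; `threshold`
            separates dense inclusions from the surrounding tissue.
            """
            mask = volume > threshold
            labels, n = ndimage.label(mask)                         # 3-D connected components
            voxels = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
            return n, voxels * voxel_size_um**3

        # Toy volume with two bright blobs, purely for illustration.
        vol = np.zeros((64, 64, 64))
        vol[10:14, 10:14, 10:14] = 1.0
        vol[40:45, 40:45, 40:45] = 1.0
        print(quantify_crystals(vol, threshold=0.5, voxel_size_um=3.25))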

  18. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    Directory of Open Access Journals (Sweden)

    Thomas Jensen

    2016-01-01

    Full Text Available Background: The automated histological analysis offered by whole-slide scanners implies an ever increasing importance of digital pathology. To go beyond conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. These data may provide a basic histological starting point from which further digital analysis, including staining, may benefit. Methods: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole-slide scanner and an image-based quantitation algorithm are presented. Results: It is shown that the autofluorescence intensity of unstained microsections at two different wavelengths is a suitable starting point for automated digital analysis of myocytes, fibrous tissue, lipofuscin, and the extracellular compartment. The output of the method is absolute quantitation along with accurate outlines of the above-mentioned components. The digital quantitations are verified by comparison to point-grid quantitations performed on the microsections after Van Gieson staining. Conclusion: The presented method is aptly described as a prestain multicomponent quantitation and outlining tool for histological sections of cardiac tissue. The main perspective is the opportunity for combination with digital analysis of stained microsections, for which the method may provide an accurate digital framework.
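
    The record describes classifying tissue components from autofluorescence intensity at two wavelengths. A deliberately simplified sketch of that kind of per-pixel, threshold-based quantitation is given below; the channel names, thresholds, and class rules are hypothetical simplifications (myocytes and fibrous tissue are lumped together) and do not reproduce the published algorithm.

        import numpy as np

        def area_fractions(ch_green, ch_red, t_green, t_red):
            """Crude per-pixel classification of a two-wavelength autofluorescence image.

            Pixels bright in both channels are counted as lipofuscin, bright only in
            the green channel as myocytes/fibrous tissue, and dark in both as the
            extracellular compartment. Thresholds are hypothetical.
            """
            bright_g = ch_green > t_green
            bright_r = ch_red > t_red
            classes = {
                "lipofuscin":    bright_g & bright_r,
                "tissue":        bright_g & ~bright_r,
                "extracellular": ~bright_g & ~bright_r,
            }
            n = ch_green.size
            return {k: v.sum() / n for k, v in classes.items()}

        rng = np.random.default_rng(2)
        g, r = rng.random((2, 256, 256))   # stand-ins for the two autofluorescence channels
        print(area_fractions(g, r, t_green=0.5, t_red=0.8))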

  19. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
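
    One of the characteristics mentioned in the record, the Zipf's law coefficient of the n-gram distribution, can be illustrated with a short standalone sketch. This is a generic illustration that deliberately does not use Quantiprot's own interface; the example sequence is arbitrary.

        import math
        from collections import Counter

        def zipf_coefficient(sequence, n=2):
            """Fit the Zipf exponent of the n-gram rank-frequency distribution of a
            protein sequence by least squares in log-log space."""
            ngrams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
            freqs = sorted(Counter(ngrams).values(), reverse=True)
            xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
            ys = [math.log(f) for f in freqs]
            x_mean, y_mean = sum(xs) / len(xs), sum(ys) / len(ys)
            slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
                    sum((x - x_mean) ** 2 for x in xs)
            return -slope   # Zipf exponent s, where frequency ~ rank**(-s)

        # Arbitrary example sequence, for illustration only.
        print(zipf_coefficient("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"))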

  20. Quantitative nuclear medicine imaging: application of computers to the gamma camera and whole-body scanner

    International Nuclear Information System (INIS)

    Budinger, T.F.

    1974-01-01

    The following topics are reviewed: properties of computer systems for nuclear medicine quantitation; quantitative information concerning the relation between organ isotope concentration and detected projections of the isotope distribution; quantitation using two conjugate views; three-dimensional reconstruction from projections; quantitative cardiac radioangiography; and recent advances leading to quantitative nuclear medicine of clinical importance. (U.S.)
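
    Among the topics listed, quantitation using two conjugate views is the classic geometric-mean method: with opposed anterior and posterior acquisitions, the source-depth dependence of attenuation cancels and only the total patient thickness remains. A minimal sketch of that textbook relation follows; scatter and source self-attenuation corrections are omitted, and all numbers are hypothetical.

        import math

        def conjugate_view_activity(counts_anterior, counts_posterior,
                                    patient_thickness_cm, mu_per_cm, calib_cps_per_mbq):
            """Geometric-mean (conjugate-view) activity estimate:
                A = sqrt(I_ant * I_post) * exp(mu * d / 2) / C
            where d is the total patient thickness, mu the linear attenuation
            coefficient, and C the system calibration factor (cps per MBq)."""
            geometric_mean = math.sqrt(counts_anterior * counts_posterior)
            return geometric_mean * math.exp(mu_per_cm * patient_thickness_cm / 2.0) / calib_cps_per_mbq

        # Hypothetical numbers: 140 keV photons in soft tissue (mu ~ 0.15 /cm), 20 cm thick patient.
        print(conjugate_view_activity(1200.0, 800.0, 20.0, 0.15, calib_cps_per_mbq=50.0), "MBq")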