WorldWideScience

Sample records for making quantitative measurements

  1. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
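
    The arithmetic described here is simple enough to sketch directly: each measured voltage difference, divided by the known antenna separation, approximates the field component between that pair. The Python below illustrates this with invented positions and voltages (the patented system obtains the voltages from hardware).

```python
# Sketch of the arithmetic described in the abstract (invented values;
# the patented system measures antenna potentials in hardware).
positions = [0.00, 0.10, 0.25, 0.45]   # antenna positions along one axis (m)
voltages  = [5.00, 4.82, 4.55, 4.19]   # measured antenna potentials (V)

# Each voltage difference over a known separation approximates the field
# between that antenna pair (V/m), per the method described above.
for x0, v0, x1, v1 in zip(positions, voltages, positions[1:], voltages[1:]):
    e = (v0 - v1) / (x1 - x0)
    print(f"pair at {x0:.2f}-{x1:.2f} m: E ~ {e:.2f} V/m")
```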

  2. Using measurement uncertainty in decision-making and conformity assessment

    Science.gov (United States)

    Pendrill, L. R.

    2014-08-01

    Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular—the handling of qualitative observations and of impact—are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis—referred to in statistical acceptance sampling as by ‘variable’ or by ‘attribute’ (i.e. go/no-go decisions)—depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such …

  3. A simple, semi-quantitative method for measuring pulsed soft x-rays

    International Nuclear Information System (INIS)

    Takahama, Y.; Du, J.; Yanagidaira, T.; Hirano, K.

    1993-01-01

    A simple semi-quantitative measurement and image processing system for pulsed soft X-rays with time and spatial resolution is proposed. The performance of the system is examined using a cylindrical soft X-ray source generated with a plasma device. The system consists of easily obtained commercial components: a microchannel plate-phosphor screen combination, a CCD camera, an image memory board and a personal computer. To make quantitative measurement possible, image processing and observation of the phosphor screen current are used in conjunction. (author)

  4. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    Science.gov (United States)

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of Evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  5. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in the strict sense in vivo. Quantitative measurement with the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The movement model of mapping tracers for central nervous receptors is discussed. The quantitative analysis using a steady-state movement model, the measurement of dopamine receptors by the reference method, the measurement of D2 receptors using 11C-raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)

  6. Advanced quantitative measurement methodology in physics education research

    Science.gov (United States)

    Wang, Jing

    The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods and the interpretation of the results obtained by these methods should be connected to the educational background. In this connecting process, issues of educational models are often raised. Many widely used statistical methods do not make assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are also other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and the pure mathematical methods. The comparison is based on the performance of the two types of methods under physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three …

  7. Quantitative Ultrasound Measurements at the Heel

    DEFF Research Database (Denmark)

    Daugschies, M.; Brixen, K.; Hermann, P.

    2015-01-01

    Calcaneal quantitative ultrasound can be used to predict osteoporotic fracture risk, but its ability to monitor therapy is unclear, possibly because of its limited precision. We developed a quantitative ultrasound device (foot ultrasound scanner) that measures the speed of sound at the heel … with the foot ultrasound scanner reduced precision errors by half (p …) quantitative ultrasound measurements is feasible. (E-mail: m.daugschies@rad.uni-kiel.de) (C) 2015 World Federation for Ultrasound in Medicine & Biology.

  8. Quantitative autoradiography - a method of radioactivity measurement

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1988-01-01

    In recent years, autoradiography has been developed into a quantitative method of radioactivity measurement. Operating techniques of quantitative autoradiography are demonstrated using special standard objects. Influences of radiation quality, of backscattering in sample and detector materials, and of sensitivity and fading of the detectors are considered. Furthermore, questions of the quantitative evaluation of autoradiograms are dealt with, and measuring errors are discussed. Finally, some practical uses of quantitative autoradiography are demonstrated by means of the estimation of activity distribution in radioactive foil samples. (author)

  9. A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems

    Directory of Open Access Journals (Sweden)

    Sangmin Shin

    2018-02-01

    Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.

  10. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    Science.gov (United States)

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  11. Quantitative measures of healthy aging and biological age

    Science.gov (United States)

    Kim, Sangkyu; Jazwinski, S. Michal

    2015-01-01

    Numerous genetic and non-genetic factors contribute to aging. To facilitate the study of these factors, various descriptors of biological aging, including ‘successful aging’ and ‘frailty’, have been put forth as integrative functional measures of aging. A separate but related quantitative approach is the ‘frailty index’, which has been operationalized and frequently used. Various frailty indices have been constructed. Although based on different numbers and types of health variables, frailty indices possess several common properties that make them useful across different studies. We have been using a frailty index termed FI34 based on 34 health variables. Like other frailty indices, FI34 increases non-linearly with advancing age and is a better indicator of biological aging than chronological age. FI34 has a substantial genetic basis. Using FI34, we found elevated levels of resting metabolic rate linked to declining health in nonagenarians. Using FI34 as a quantitative phenotype, we have also found a genomic region on chromosome 12 that is associated with healthy aging and longevity. PMID:26005669
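
    A frailty index of this kind is conventionally computed as the fraction of considered health deficits that are present. The sketch below shows that convention in Python with invented binary deficit data; it is not the authors' FI34 code, and real indices often allow graded (0 to 1) deficit values.

```python
import random

# Conventional frailty-index arithmetic: FI = deficits present / deficits
# considered. The 34-item count follows the abstract; the deficit values
# below are invented for illustration (1 = deficit present, 0 = absent).
random.seed(0)
deficits = [random.randint(0, 1) for _ in range(34)]

fi34 = sum(deficits) / len(deficits)
print(f"FI34 = {fi34:.3f}")  # higher values indicate poorer health
```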

  12. Cross-method validation as a solution to the problem of excessive simplification of measurement in quantitative IR research

    DEFF Research Database (Denmark)

    Beach, Derek

    2007-01-01

    The purpose of this article is to make IR scholars more aware of the costs of choosing quantitative methods. The article first shows that quantification can have analytical ‘costs’ when the measures created are too simple to capture the essence of the systematized concept that was supposed … detail based upon a review of the democratic peace literature. I then offer two positive suggestions for a way forward. First, I argue that quantitative scholars should spend more time validating their measures, and in particular should engage in multi-method partnerships with qualitative scholars … that have a deep understanding of particular cases in order to exploit the comparative advantages of qualitative methodology, using the more accurate qualitative measures to validate their own quantitative measures. Secondly, quantitative scholars should lower their level of ambition given the often poor …

  13. On the usability of quantitative modelling in operations strategy decision making

    NARCIS (Netherlands)

    Akkermans, H.A.; Bertrand, J.W.M.

    1997-01-01

    Quantitative modelling seems admirably suited to help managers in their strategic decision making on operations management issues, but in practice models are rarely used for this purpose. Investigates the reasons why, based on a detailed cross-case analysis of six cases of modelling-supported …

  14. SU-E-J-155: Automatic Quantitative Decision Making Metric for 4DCT Image Quality

    Energy Technology Data Exchange (ETDEWEB)

    Kiely, J Blanco; Olszanski, A; Both, S; White, B [University of Pennsylvania, Philadelphia, PA (United States); Low, D [Deparment of Radiation Oncology, University of California Los Angeles, Los Angeles, CA (United States)

    2015-06-15

    Purpose: To develop a quantitative decision making metric for automatically detecting irregular breathing using a large patient population that received phase-sorted 4DCT. Methods: This study employed two patient cohorts. Cohort#1 contained 256 patients who received a phase-sorted 4DCT. Cohort#2 contained 86 patients who received three weekly phase-sorted 4DCT scans. A previously published technique used a single abdominal surrogate to calculate the ratio of extreme inhalation tidal volume to normal inhalation tidal volume, referred to as the κ metric. Since a single surrogate is standard for phase-sorted 4DCT in radiation oncology clinical practice, tidal volume was not quantified. Without tidal volume, the absolute κ metric could not be determined, so a relative κ (κrel) metric was defined based on the measured surrogate amplitude instead of tidal volume. Receiver operating characteristic (ROC) curves were used to quantitatively determine the optimal cutoff value (jk) and efficiency cutoff value (τk) of κrel to automatically identify irregular breathing that would reduce the image quality of phase-sorted 4DCT. Discriminatory accuracy (area under the ROC curve) of κrel was calculated by a trapezoidal numeric integration technique. Results: The discriminatory accuracy of κrel was found to be 0.746. The key values of jk and τk were calculated to be 1.45 and 1.72 respectively. For values of κrel such that jk ≤ κrel ≤ τk, the decision to reacquire the 4DCT would be at the discretion of the physician. This accounted for only 11.9% of the patients in this study. The magnitude of κrel held consistent over 3 weeks for 73% of the patients in cohort#2. Conclusion: The decision making metric, κrel, was shown to be an accurate classifier of irregular breathing patients in a large patient population. This work provided an automatic quantitative decision making metric to quickly and accurately assess the extent to which irregular breathing is occurring during phase-sorted 4DCT.
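
    For readers unfamiliar with the ROC machinery used above, the following sketch computes the area under an ROC curve by trapezoidal integration, as the abstract describes, for an invented set of κrel values and breathing labels; the study's data and cutoff values are not reproduced.

```python
import numpy as np

# Invented k_rel values for "regular" (0) and "irregular" (1) breathers;
# the study's data are not reproduced here.
k_rel  = np.array([1.1, 1.2, 1.3, 1.5, 1.6, 1.8, 2.0, 2.3])
labels = np.array([0,   0,   0,   1,   0,   1,   1,   1])

# Sweep thresholds from high to low, collecting ROC points.
thresholds = np.sort(np.unique(k_rel))[::-1]
tpr = [0.0] + [np.mean(k_rel[labels == 1] >= t) for t in thresholds]
fpr = [0.0] + [np.mean(k_rel[labels == 0] >= t) for t in thresholds]

# Discriminatory accuracy = area under the ROC curve, integrated with
# the trapezoidal rule as named in the abstract.
auc = np.trapz(tpr, fpr)
print(f"AUC = {auc:.3f}")
```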

  15. SU-E-J-155: Automatic Quantitative Decision Making Metric for 4DCT Image Quality

    International Nuclear Information System (INIS)

    Kiely, J Blanco; Olszanski, A; Both, S; White, B; Low, D

    2015-01-01

    Purpose: To develop a quantitative decision making metric for automatically detecting irregular breathing using a large patient population that received phase-sorted 4DCT. Methods: This study employed two patient cohorts. Cohort#1 contained 256 patients who received a phase-sorted 4DCT. Cohort#2 contained 86 patients who received three weekly phase-sorted 4DCT scans. A previously published technique used a single abdominal surrogate to calculate the ratio of extreme inhalation tidal volume to normal inhalation tidal volume, referred to as the κ metric. Since a single surrogate is standard for phase-sorted 4DCT in radiation oncology clinical practice, tidal volume was not quantified. Without tidal volume, the absolute κ metric could not be determined, so a relative κ (κrel) metric was defined based on the measured surrogate amplitude instead of tidal volume. Receiver operating characteristic (ROC) curves were used to quantitatively determine the optimal cutoff value (jk) and efficiency cutoff value (τk) of κrel to automatically identify irregular breathing that would reduce the image quality of phase-sorted 4DCT. Discriminatory accuracy (area under the ROC curve) of κrel was calculated by a trapezoidal numeric integration technique. Results: The discriminatory accuracy of κrel was found to be 0.746. The key values of jk and τk were calculated to be 1.45 and 1.72 respectively. For values of κrel such that jk ≤ κrel ≤ τk, the decision to reacquire the 4DCT would be at the discretion of the physician. This accounted for only 11.9% of the patients in this study. The magnitude of κrel held consistent over 3 weeks for 73% of the patients in cohort#2. Conclusion: The decision making metric, κrel, was shown to be an accurate classifier of irregular breathing patients in a large patient population. This work provided an automatic quantitative decision making metric to quickly and accurately assess the extent to which irregular breathing is occurring during phase-sorted 4DCT.

  16. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    The formation of cognitive schemes of plant anatomy concepts is performed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. The participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a complex-thinking-in-plant-anatomy test and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test scored with the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was assessed with a test designed according to Marzano and a questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  17. A quantitative method for evaluating alternatives. [aid to decision making

    Science.gov (United States)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
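
    The 'hierarchical weighted average' is only named here, not specified, so the sketch below shows one plausible reading in Python: criterion scores roll up through weighted sums at each level of a criteria hierarchy. The criteria, weights, and scores are invented for illustration.

```python
# One plausible reading of a hierarchical weighted average: each internal
# node is a dict of (weight, child) pairs, each leaf a raw score (0-10).
# Criteria, weights, and scores are invented for illustration.
tree = {
    "performance": (0.5, {"throughput": (0.6, 8.0), "latency": (0.4, 6.5)}),
    "cost":        (0.3, {"hardware":   (0.7, 5.0), "support": (0.3, 7.0)}),
    "risk":        (0.2, {"maturity":   (1.0, 9.0)}),
}

def score(node):
    # Leaf nodes hold a raw score; internal nodes aggregate their
    # children as a weighted sum (weights at each level sum to 1).
    if isinstance(node, (int, float)):
        return node
    return sum(w * score(child) for w, child in node.values())

print(f"alternative score = {score(tree):.2f}")  # higher is better
```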

  18. Quantitative measurement of solvation shells using frequency modulated atomic force microscopy

    Science.gov (United States)

    Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.

    2005-03-01

    The nanoscale specificity of interaction measurements and additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition and durability of the tip during the measurements, and problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both these issues by the combined use of carbon nanotube high aspect ratio probes and quantifying the highly sensitive frequency modulation (FM) detection technique using a recently developed analytical method. Due to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.

  19. An Introduction to Quantitative Measures for Software Maintenance of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jo, Hyun Jun; Seong, Poong Hyun

    2007-01-01

    The instrumentation and control (I and C) systems of NPPs have changed from analog systems to digital-based systems using microcontrollers and software. Thus, software has become very important for NPP control systems. The software life cycle is divided broadly into a development phase and a maintenance phase. Because poor software maintenance work introduces new errors and makes software more complex, we have to consider effective maintenance methods for the reliability and maintainability of NPP software. Function Block Diagram (FBD) is a standard application programming language for the Programmable Logic Controller (PLC) and is currently being used in the development of a fully digitalized reactor protection system (RPS) under the KNICS project. Therefore, the maintenance work will be of great importance in a few years. This paper studies measures which give quantitative information to the software maintainer and manager before and after a modification. The remainder of this paper is organized as follows. Section 2 briefly describes software maintenance types and models. Sections 3-5 introduce the quantitative measures for software maintenance and the characteristics of FBD programs. A conclusion is provided in Section 6

  20. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  1. Age-related quantitative and qualitative changes in decision making ability.

    Science.gov (United States)

    Isella, Valeria; Mapelli, Cristina; Morielli, Nadia; Pelati, Oriana; Franceschi, Massimo; Appollonio, Ildebrando Marco

    2008-01-01

    The "frontal aging hypothesis" predicts that brain senescence affects predominantly the prefrontal regions. Preliminary evidence has recently been gathered in favour of an age-related change in a typically frontal process, i.e. decision making, using the Iowa Gambling Task (IGT), but overall findings have been conflicting. Following the traditional scoring method, coupled with a qualitative analysis, in the present study we compared IGT performance of 40 young (mean age: 27.9+/-4.7) and 40 old (mean age: 65.4+/-8.6) healthy adults and of 18 patients affected by frontal lobe dementia of mild severity (mean age: 65.1+/-7.4, mean MMSE score: 24.1+/-3.9). Quantitative findings support the notion that decision making ability declines with age; moreover, it approximates the impairment observed in executive dysfunction due to neurodegeneration. Results of the qualitative analysis did not reach statistical significance for the motivational and learning decision making components considered, but approached significance for the attentional component for elderly versus young normals, suggesting a possible decrease in the ability to maintain sustained attention during complex and prolonged tasks as the putative deficit underlying impaired decision making in normal aging.

  2. The Relationship between Quantitative and Qualitative Measures of Writing Skills.

    Science.gov (United States)

    Howerton, Mary Lou P.; And Others

    The relationships of quantitative measures of writing skills to overall writing quality as measured by the E.T.S. Composition Evaluation Scale (CES) were examined. Quantitative measures included indices of language productivity, vocabulary diversity, spelling, and syntactic maturity. Power of specific indices to account for variation in overall…

  3. Don't bet on it! Wagering as a measure of awareness in decision making under uncertainty.

    Science.gov (United States)

    Konstantinidis, Emmanouil; Shanks, David R

    2014-12-01

    Can our decisions be guided by unconscious or implicit influences? According to the somatic marker hypothesis, emotion-based signals can guide our decisions in uncertain environments outside awareness. Postdecision wagering, in which participants make wagers on the outcomes of their decisions, has been recently proposed as an objective and sensitive measure of conscious content. In 5 experiments we employed variations of a classic decision-making assessment, the Iowa Gambling Task, in combination with wagering in order to investigate the role played by unconscious influences. We examined the validity of postdecision wagering by comparing it with alternative measures of conscious knowledge, specifically confidence ratings and quantitative questions. Consistent with a putative role for unconscious influences, in Experiments 2 and 3 we observed a lag between choice accuracy and the onset of advantageous wagering. However, the lag was eliminated by a change in the wagering payoff matrix (Experiment 2) and by a switch from a binary wager response to either a binary or a 4-point confidence response (Experiment 3), and wagering underestimated awareness compared to explicit quantitative questions (Experiments 1 and 4). Our results demonstrate the insensitivity of postdecision wagering as a direct measure of conscious knowledge and challenge the claim that implicit processes influence decision making under uncertainty. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  4. Evaluation of quantitative PCR measurement of bacterial colonization of epithelial cells.

    Science.gov (United States)

    Schmidt, Marcin T; Olejnik-Schmidt, Agnieszka K; Myszka, Kamila; Borkowska, Monika; Grajek, Włodzimierz

    2010-01-01

    Microbial colonization is an important step in establishing pathogenic or probiotic relations to host cells and in biofilm formation on industrial or medical devices. The aim of this work was to verify the applicability of quantitative PCR (real-time PCR) to measuring bacterial colonization of epithelial cells. Salmonella enterica and the Caco-2 intestinal epithelial cell line were used as a model. To verify the sensitivity of the assay, competition between the pathogen and a probiotic microorganism was tested. The qPCR method was compared to plate count and radiolabel approaches, which are well-established techniques in this area of research. The three methods returned similar results. The radiolabel method had the best quantification accuracy, followed by qPCR. The plate count results showed a coefficient of variation twice as high as that of qPCR. Quantitative PCR proved to be a reliable method for the enumeration of microbes in colonization assays. It has several advantages that make it very useful for analyzing mixed populations, where several different species or even strains can be monitored at the same time.
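
    As background to the enumeration step, qPCR quantification is typically read off a log-linear standard curve relating threshold cycle (Ct) to starting copy number. The sketch below fits and then inverts such a curve using invented calibration points; it is a generic illustration, not the authors' protocol.

```python
import numpy as np

# Invented calibration data: serial dilutions with known copy numbers.
copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
ct     = np.array([30.1, 26.8, 23.4, 20.1, 16.7])  # threshold cycles

# Ct is linear in log10(copies): Ct = m*log10(N) + b.
m, b = np.polyfit(np.log10(copies), ct, 1)
print(f"slope = {m:.2f}  (about -3.32 at 100% PCR efficiency)")

# Invert the standard curve to estimate an unknown sample's load.
ct_unknown = 24.9
n_unknown = 10 ** ((ct_unknown - b) / m)
print(f"estimated load: {n_unknown:.2e} copies")
```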

  5. Smile line assessment comparing quantitative measurement and visual estimation.

    Science.gov (United States)

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
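
    The kappa statistics reported here measure agreement beyond chance between two categorical ratings. The sketch below computes Cohen's kappa for invented smile-line grades from the two methods; the study's data are not reproduced.

```python
from collections import Counter

# Invented example: smile-line grades (3-grade scale) for the same
# subjects from quantitative measurement vs. visual estimation.
quant  = [1, 2, 2, 3, 1, 2, 3, 3, 2, 1, 2, 3]
visual = [1, 2, 3, 3, 1, 2, 3, 2, 2, 1, 2, 3]

n = len(quant)
p_obs = sum(a == b for a, b in zip(quant, visual)) / n

# Chance agreement from the marginal grade frequencies of each rater.
cq, cv = Counter(quant), Counter(visual)
p_chance = sum(cq[g] * cv[g] for g in set(quant) | set(visual)) / n**2

kappa = (p_obs - p_chance) / (1 - p_chance)
print(f"kappa = {kappa:.2f}")  # 1.0 = perfect agreement beyond chance
```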

  6. Quantitative tomographic measurements of opaque multiphase flows

    Energy Technology Data Exchange (ETDEWEB)

    GEORGE,DARIN L.; TORCZYNSKI,JOHN R.; SHOLLENBERGER,KIM ANN; O' HERN,TIMOTHY J.; CECCIO,STEVEN L.

    2000-03-01

    An electrical-impedance tomography (EIT) system has been developed for quantitative measurements of radial phase distribution profiles in two-phase and three-phase vertical column flows. The EIT system is described along with the computer algorithm used for reconstructing phase volume fraction profiles. EIT measurements were validated by comparison with a gamma-densitometry tomography (GDT) system. The EIT system was used to accurately measure average solid volume fractions up to 0.05 in solid-liquid flows, and radial gas volume fraction profiles in gas-liquid flows with gas volume fractions up to 0.15. In both flows, average phase volume fractions and radial volume fraction profiles from GDT and EIT were in good agreement. A minor modification to the formula used to relate conductivity data to phase volume fractions was found to improve agreement between the methods. GDT and EIT were then applied together to simultaneously measure the solid, liquid, and gas radial distributions within several vertical three-phase flows. For average solid volume fractions up to 0.30, the gas distribution for each gas flow rate was approximately independent of the amount of solids in the column. Measurements made with this EIT system demonstrate that EIT may be used successfully for noninvasive, quantitative measurements of dispersed multiphase flows.
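
    The abstract mentions, without giving, a formula relating mixture conductivity to phase volume fraction. A common assumption in EIT two-phase work is the Maxwell relation for a non-conducting dispersed phase, sketched below with invented conductivities; the paper's modified formula may differ.

```python
def maxwell_gas_fraction(sigma_mix, sigma_liquid):
    """Maxwell relation for a non-conducting dispersed phase:
    alpha = 2*(s_l - s_m) / (2*s_l + s_m). Assumed form; the paper
    reports a minor modification not detailed in the abstract."""
    return 2 * (sigma_liquid - sigma_mix) / (2 * sigma_liquid + sigma_mix)

# Invented conductivities (S/m): pure liquid vs. bubbly mixture.
print(f"gas fraction ~ {maxwell_gas_fraction(0.95, 1.10):.3f}")
```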

  7. Safeguards decision making in the public and regulatory environment, and the potential role of quantitative approaches

    International Nuclear Information System (INIS)

    Sherr, T.S.

    1981-01-01

    This paper briefly examines the nature of the safeguards program's objectives and constraints, and the inherent limitations on comprehensive quantification. It discusses the nature of the public and regulatory processes employed in safeguards decision making, and examines their implications regarding the potential role of quantitative approaches to safeguards policy and operational decision making

  8. Calibration of quantitative neutron radiography method for moisture measurement

    International Nuclear Information System (INIS)

    Nemec, T.; Jeraj, R.

    1999-01-01

    Quantitative measurements of moisture and hydrogenous matter in building materials by neutron radiography (NR) are regularly performed at the TRIGA Mark II research reactor of the 'Jozef Stefan' Institute in Ljubljana. Calibration of the quantitative method is performed using standard brick samples with known moisture content and also with a secondary standard, a plexiglas step wedge. In general, the contribution of scattered neutrons to the neutron image is not determined explicitly, which introduces an error into the measured signal. The influence of scattered neutrons is significant in regions with high gradients of moisture concentration, where the build-up of scattered neutrons distorts the moisture concentration profile. In this paper a detailed analysis of the validity of our calibration method for different geometrical parameters is presented. The error in the measured hydrogen concentration is evaluated by experiment and compared with results obtained by Monte Carlo calculation with the computer code MCNP 4B. Optimal conditions are determined for quantitative moisture measurements in order to minimize the error due to scattered neutrons. The method is tested on concrete samples with high moisture content. (author)

  9. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  10. Lesion measurement in non-radioactive DNA by quantitative gel electrophoresis

    International Nuclear Information System (INIS)

    Sutherland, J.C.; Chen, Chun Zhang; Emrick, A.; Hacham, H; Monteleone, D.; Ribeiro, E.; Trunk, J.; Sutherland, B.M.

    1989-01-01

    The gel electrophoresis method developed during the past ten years in our laboratories makes possible the quantitation of UV-induced pyrimidine dimers, gamma-ray-induced single- and double-strand breaks and many other types of lesions in nanogram quantities of DNA. The DNA does not have to be labeled with radionuclides or be of a particular conformation, thus facilitating the use of the method in measuring damage levels and repair rates in the DNA of intact organisms -- including man. The gel method can quantitate any lesion in DNA that either is, or can be converted to, a single- or double-strand break. The formation of a strand break produces two shorter DNA molecules for each molecule that existed before the treatment that produced the break. Determining the number of breaks, and hence the number of lesions, becomes a matter of comparing the average lengths of molecules in samples differing only in lesion-induced breaks. This requires that we determine the distribution of mass of DNA on a gel as a function of its distance of migration and also the dispersion function (the relationship between molecular length and distance of migration) in the gel electrophoresis system. 40 refs., 5 figs
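
    The counting step reduces to simple arithmetic: each break turns one molecule into two, so the break frequency follows from the number-average lengths before and after treatment. A short sketch with invented lengths:

```python
# Break frequency from number-average molecular lengths (invented values).
# Each break turns one molecule into two, so if <L> is the number-average
# length, breaks per unit length = 1/<L>_treated - 1/<L>_control.
L_control = 48.5e3   # number-average length, untreated sample (base pairs)
L_treated = 21.0e3   # number-average length after treatment (base pairs)

breaks_per_bp = 1.0 / L_treated - 1.0 / L_control
breaks_per_molecule = L_control / L_treated - 1.0

print(f"{breaks_per_bp * 1e6:.2f} breaks per Mbp")
print(f"{breaks_per_molecule:.2f} breaks per original molecule")
```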

  11. Application of quantitative autoradiography to the measurement of biochemical processes in vivo

    International Nuclear Information System (INIS)

    Sokoloff, L.

    1985-01-01

    Quantitative autoradiography makes it possible to measure the concentrations of isotopes in tissues of animals labeled in vivo. In a few cases, the administration of a judiciously selected labeled chemical compound and a properly designed procedure has made it possible to use this capability to measure the rate of a chemical process in animals in vivo. Emission tomography, and particularly positron emission tomography, provides a means to extend this capability to man and to assay the rates of biochemical processes in human tissues in vivo. It does not, however, obviate the need to adhere to established principles of chemical and enzyme kinetics and tracer theory. Generally, all such methods, whether to be used in man with positron emission tomography or in animals with autoradiography, must first be developed by research in animals with autoradiography, because it is only in animals that the measurements needed to validate the basic assumptions of the methods can be tested and evaluated

  12. Insights into the concept and measurement of health literacy from a study of shared decision-making in a low literacy population.

    Science.gov (United States)

    Smith, Sian K; Nutbeam, Don; McCaffery, Kirsten J

    2013-08-01

    This article explores the concept and measurement of health literacy in the context of shared health decision-making. It draws upon a series of qualitative and quantitative studies undertaken in the development and evaluation of a bowel cancer screening decision aid for low literacy populations. The findings indicate that different types of health literacy (functional, interactive and critical) are required in decision-making and present a set of instruments to assess and discriminate between higher level health literacy skills required for engagement in decision-making. It concludes that greater sophistication in both the definition and measurement of health literacy in research is needed.

  13. Quantitative angle-insensitive flow measurement using relative standard deviation OCT.

    Science.gov (United States)

    Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping

    2017-10-30

    Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information on flow velocities, and velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information on flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.
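
    The RSD statistic itself is just the standard deviation of repeated intensity samples at a voxel divided by their mean. A minimal sketch on synthetic data (not the authors' OCT pipeline) shows why flow raises the RSD:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic repeated OCT intensity samples at one voxel: static tissue
# fluctuates little between frames; flowing blood fluctuates strongly.
static_voxel = 100 + 2.0 * rng.standard_normal(64)
flow_voxel   = 100 + 25.0 * rng.standard_normal(64)

def rsd(samples):
    # Relative standard deviation: std/mean over repeated frames.
    return samples.std() / samples.mean()

print(f"static tissue RSD ~ {rsd(static_voxel):.3f}")
print(f"flow voxel RSD    ~ {rsd(flow_voxel):.3f}")
```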

  14. Quantitative 99mTc diphosphonate uptake measurements

    International Nuclear Information System (INIS)

    Smith, M.L.

    1987-01-01

    There are several different techniques currently in use for quantifying diphosphonate uptake by the skeleton. These can be considered in two main categories: local bone or whole-body uptake measurements. The choice of technique depends on the clinical problem being investigated and also on available equipment and expertise. The wide variety of approaches to diphosphonate quantitation ensures that these measurements can be obtained in almost any nuclear medicine department. This chapter discusses the general factors which may influence diphosphonate uptake measurements and outlines the techniques most relevant to current clinical practice

  15. Smile line assessment comparing quantitative measurement and visual estimation

    NARCIS (Netherlands)

    Geld, P. Van der; Oosterveld, P.; Schols, J.; Kuijpers-Jagtman, A.M.

    2011-01-01

    INTRODUCTION: Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation …

  16. Decision-Making in Multiple Sclerosis Patients: A Systematic Review

    Directory of Open Access Journals (Sweden)

    Mireille Neuhaus

    2018-01-01

    Background. Multiple sclerosis (MS) is frequently associated with cognitive and behavioural deficits. A growing number of studies suggest an impact of MS on decision-making abilities. The aim of this systematic review was to assess if (1) performance of MS patients in decision-making tasks was consistently different from controls and (2) whether this modification was associated with cognitive dysfunction and emotional alterations. Methods. The search was conducted on the Pubmed/Medline database. 12 studies evaluating the difference between MS patients and healthy controls using validated decision-making tasks were included. Outcomes considered were quantitative (net scores) and qualitative measurements (deliberation time and learning from feedback). Results. Quantitative and qualitative decision-making impairment in MS was present in 64.7% of measurements. Patients were equally impaired in tasks for decision-making under risk and ambiguity. A correlation to other cognitive functions was present in 50% of cases, with the highest associations in the domains of processing speed and attentional capacity. Conclusions. In MS patients, qualitative and quantitative modifications may be present in any kind of decision-making task and can appear independently of other cognitive measures. Since decision-making abilities have a significant impact on everyday life, this cognitive aspect has an influential importance in various MS-related treatment settings.

  17. Decision-Making in Multiple Sclerosis Patients: A Systematic Review.

    Science.gov (United States)

    Neuhaus, Mireille; Calabrese, Pasquale; Annoni, Jean-Marie

    2018-01-01

    Multiple sclerosis (MS) is frequently associated with cognitive and behavioural deficits. A growing number of studies suggest an impact of MS on decision-making abilities. The aim of this systematic review was to assess if (1) performance of MS patients in decision-making tasks was consistently different from controls and (2) whether this modification was associated with cognitive dysfunction and emotional alterations. The search was conducted on Pubmed/Medline database. 12 studies evaluating the difference between MS patients and healthy controls using validated decision-making tasks were included. Outcomes considered were quantitative (net scores) and qualitative measurements (deliberation time and learning from feedback). Quantitative and qualitative decision-making impairment in MS was present in 64.7% of measurements. Patients were equally impaired in tasks for decision-making under risk and ambiguity. A correlation to other cognitive functions was present in 50% of cases, with the highest associations in the domains of processing speed and attentional capacity. In MS patients, qualitative and quantitative modifications may be present in any kind of decision-making task and can appear independently of other cognitive measures. Since decision-making abilities have a significant impact on everyday life, this cognitive aspect has an influential importance in various MS-related treatment settings.

  18. Family involvement in decision making for people with dementia in residential aged care: a systematic review of quantitative literature.

    Science.gov (United States)

    Petriwskyj, Andrea; Gibson, Alexandra; Parker, Deborah; Banks, Susan; Andrews, Sharon; Robinson, Andrew

    2014-06-01

    Ensuring older adults' involvement in their care is accepted as good practice and is vital, particularly for people with dementia, whose care and treatment needs change considerably over the course of the illness. However, involving family members in decision making on people's behalf is still practically difficult for staff and family. The aim of this review was to identify and appraise the existing quantitative evidence about family involvement in decision making for people with dementia living in residential aged care. The present Joanna Briggs Institute (JBI) metasynthesis assessed studies that investigated involvement of family members in decision making for people with dementia in residential aged care settings. While quantitative and qualitative studies were included in the review, this paper presents the quantitative findings. A comprehensive search of 15 electronic databases was performed. The search was limited to papers published in English, from 1990 to 2013. Twenty-six studies were identified as being relevant; 10 were quantitative, with 1 mixed method study. Two independent reviewers assessed the studies for methodological validity and extracted the data using the JBI Meta Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). The findings were synthesized and presented in narrative form. The findings related to decisions encountered and made by family surrogates, variables associated with decisions, surrogates' perceptions of, and preferences for, their roles, as well as outcomes for people with dementia and their families. The results identified patterns within, and variables associated with, surrogate decision making, all of which highlight the complexity and variation regarding family involvement. Attention needs to be paid to supporting family members in decision making in collaboration with staff.

  19. Quantitative Decision Making Model for Carbon Reduction in Road Construction Projects Using Green Technologies

    Directory of Open Access Journals (Sweden)

    Woosik Jang

    2015-08-01

    Numerous countries have established policies for reducing greenhouse gas emissions and have suggested goals pertaining to these reductions. To reach the target reduction amounts, studies on the reduction of carbon emissions have been conducted with regard to all stages and processes in construction projects. According to a study on carbon emissions, the carbon emissions generated during the construction stage of road projects account for approximately 76 to 86% of the total carbon emissions, far exceeding the other stages, such as maintenance or demolition. Therefore, this study aims to develop a quantitative decision making model that supports the application of green technologies (GTs) to reduce carbon emissions during the construction stage of road construction projects. First, the authors selected environmental soundness, economic feasibility and constructability as the key assessment indices for evaluating 20 GTs. Second, a fuzzy set/qualitative comparative analysis (FS/QCA) was used to establish an objective decision-making model for the assessment of both the quantitative and qualitative characteristics of the key indices. To support the developed model, an expert survey was performed to assess the applicability of each GT from a practical perspective, which was verified with a case study using two additional GTs. The proposed model is expected to support practitioners in the application of suitable GTs to road projects and reduce carbon emissions, resulting in better decision making during road construction projects.

  20. Quantitative measurements of shear displacement using atomic force microscopy

    International Nuclear Information System (INIS)

    Wang, Wenbo; Wu, Weida; Sun, Ying; Zhao, Yonggang

    2016-01-01

    We report a method to quantitatively measure local shear deformation with high sensitivity using atomic force microscopy. The key point is to simultaneously detect both torsional and buckling motions of atomic force microscopy (AFM) cantilevers induced by the lateral piezoelectric response of the sample. This requires the quantitative calibration of torsional and buckling response of AFM. This method is validated by measuring the angular dependence of the in-plane piezoelectric response of a piece of piezoelectric α-quartz. The accurate determination of the amplitude and orientation of the in-plane piezoelectric response, without rotation, would greatly enhance the efficiency of lateral piezoelectric force microscopy.

  1. A quantitative measure of myelination development in infants, using MR images

    International Nuclear Information System (INIS)

    Carmody, Dennis P.; Dunn, Stanley M.; Boddie-Willis, Akiza S.; DeMarco, J. Kevin; Lewis, Michael

    2004-01-01

    The objective of this study was to measure myelination of frontal lobe changes in infants and young children. Twenty-four cases of infants and children (age range 12-121 months) were evaluated by a quantitative assessment of T2-weighted MR image features. Reliable quantitative changes between white and gray matter correlated with developmental age in a group of children with no neurological findings. Myelination appears to be an increasing exponential function with the greatest rate of change occurring over the first 3 years of life. The quantitative changes observed were in accordance with previous qualitative judgments of myelination development. Children with periventricular leukomalacia (PVL) showed delays in achieving levels of myelination when compared to normal children and adjusted for chronological age. The quantitative measure of myelination development may prove to be useful in assessing the stages of development and helpful in the quantitative descriptions of white matter disorders such as PVL. (orig.)
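
    One plausible reading of the 'increasing exponential function' is a saturating-growth model of the myelination measure against age. The sketch below fits such a model to invented data points with SciPy; both the data and the model form are assumptions here, not the study's results.

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented (age, myelination-index) pairs; saturating exponential model
# M(t) = A * (1 - exp(-t / tau)) as one plausible reading of the abstract.
age = np.array([12, 18, 24, 36, 48, 72, 96, 120])           # months
m_index = np.array([0.42, 0.55, 0.63, 0.74, 0.80, 0.86, 0.88, 0.90])

def model(t, A, tau):
    return A * (1.0 - np.exp(-t / tau))

(A, tau), _ = curve_fit(model, age, m_index, p0=(1.0, 24.0))
print(f"asymptote A = {A:.2f}, time constant tau = {tau:.1f} months")
```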

  2. A quantitative measure of myelination development in infants, using MR images

    Energy Technology Data Exchange (ETDEWEB)

    Carmody, Dennis P. [Robert Wood Johnson Medical School, New Brunswick, NJ (United States); Dunn, Stanley M.; Boddie-Willis, Akiza S. [The State University of New Jersey, Rutgers, New Brunswick, NJ (United States); DeMarco, J. Kevin [Laurie Imaging Center, New Brunswick, NJ (United States); Lewis, Michael [Robert Wood Johnson Medical School, New Brunswick, NJ (United States); Robert Wood Johnson Medical School, University of Medicine and Dentistry of New Jersey, Institute for the Study of Child Development, New Brunswick (United States)

    2004-09-01

    The objective of this study was to measure myelination of frontal lobe changes in infants and young children. Twenty-four cases of infants and children (age range 12-121 months) were evaluated by a quantitative assessment of T2-weighted MR image features. Reliable quantitative changes between white and gray matter correlated with developmental age in a group of children with no neurological findings. Myelination appears to be an increasing exponential function with the greatest rate of change occurring over the first 3 years of life. The quantitative changes observed were in accordance with previous qualitative judgments of myelination development. Children with periventricular leukomalacia (PVL) showed delays in achieving levels of myelination when compared to normal children and adjusted for chronological age. The quantitative measure of myelination development may prove to be useful in assessing the stages of development and helpful in the quantitative descriptions of white matter disorders such as PVL. (orig.)

  3. Quantitative measurement of cerebral blood flow on patients with early syphilis

    International Nuclear Information System (INIS)

    Zhong Jijun; Wu Jinchang; Yang Yi; Tang Jun; Liu Zengli; Shi Xin

    2005-01-01

    To study quantitative changes of cerebral blood flow (CBF) in patients with early syphilis, the authors established a method for absolute measurement of rCBF using SPECT with Ethyl Cysteinate Dimer (ECD) as the imaging agent, and applied it to measure rCBF in patients with early syphilis. The rCBF values measured by this method are highly consistent with those measured by other classical methods such as SPECT ( 123 I-IMP) and PET ( 15 O-H 2 O). The rCBF values for early syphilis patients and normal controls show statistically significant differences. A routine quantitative absolute measurement of rCBF with simple procedures is therefore approaching maturity. (authors)

  4. A new Lowry technique for quantitative measurement of protein

    International Nuclear Information System (INIS)

    Chen Ge; Zou Wenquan; Sun Jianzhong; Zhang Yanggang; Shu Bohua; Liu Shenpei; Gong Xiaoliang

    1990-01-01

    According to the quenching principle in beta-ray measurement, liquid scintillation counters (LSC) are used for the quantitative measurement of protein. The results show a linear relationship between the concentration of colored protein samples and the LSC counting rate. The LSC method is shown to have smaller error and a larger measurement range than traditional photoelectric colorimetry, and the analysis is easily automated

  5. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  6. Conjugate whole-body scanning system for quantitative measurement of organ distribution in vivo

    International Nuclear Information System (INIS)

    Tsui, B.M.W.; Chen, C.T.; Yasillo, N.J.; Ortega, C.J.; Charleston, D.B.; Lathrop, K.A.

    1979-01-01

    The determination of accurate, quantitative, biokinetic distribution of an internally dispersed radionuclide in humans is important in making realistic radiation absorbed dose estimates, studying biochemical transformations in health and disease, and developing clinical procedures indicative of abnormal functions. In order to collect these data, a whole-body imaging system is required which provides both adequate spatial resolution and some means of absolute quantitation. Based on these considerations, a new whole-body scanning system has been designed and constructed that employs the conjugate counting technique. The conjugate whole-body scanning system provides an efficient and accurate means of collecting absolute quantitative organ distribution data of radioactivity in vivo
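
    The conjugate counting technique rests on a geometric-mean identity that removes the unknown source depth. A minimal sketch, with hypothetical count rates, attenuation coefficient and detector sensitivity, is:

        import numpy as np

        def conjugate_view_activity(counts_ant, counts_post,
                                    mu_per_cm, thickness_cm, cps_per_MBq):
            # Anterior counts fall off as exp(-mu*d) and posterior counts as
            # exp(-mu*(L-d)); the geometric mean goes as exp(-mu*L/2), so the
            # unknown source depth d cancels out.
            geometric_mean = np.sqrt(counts_ant * counts_post)
            attenuation_correction = np.exp(mu_per_cm * thickness_cm / 2.0)
            return geometric_mean * attenuation_correction / cps_per_MBq

        # Hypothetical values: 1200/800 cps opposed views, mu = 0.12 /cm,
        # 20 cm body thickness, detector sensitivity 500 cps/MBq
        print(f"{conjugate_view_activity(1200, 800, 0.12, 20.0, 500.0):.2f} MBq")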

  7. Convergent validity of two decision making style measures

    Directory of Open Access Journals (Sweden)

    Berisha, Gentrit

    2018-04-01

    Full Text Available Decision making research has witnessed a growing number of studies on individual differences and decision making styles, yet the lack of comprehensive frameworks and widely accepted measures has hindered research for a long time. There is an ongoing debate on whether individuals’ styles dynamically change across time and situations according to circumstances. Furthermore, it is an open question whether these styles are mutually exclusive. Decision style measures seek to determine one’s dominant style as well as less used styles. To our knowledge this is the first study of the convergent validity of two widely used decision making style measures: the Decision Style Inventory (DSI) and the General Decision Making Style (GDMS). The direction and strength of correlation between directive, analytical, conceptual and behavioral styles as measured by the DSI and rational, intuitive, dependent, avoidant and spontaneous styles as measured by the GDMS have been tested. Results of the current study are compared with previous studies that have used one or both of the instruments. Correlations between styles are consistent with findings from other studies using one of the decision style measures, but the strength of correlations indicates that there is no convergent validity between the DSI and the GDMS.

  8. Semi-automatic quantitative measurements of intracranial internal carotid artery stenosis and calcification using CT angiography

    International Nuclear Information System (INIS)

    Bleeker, Leslie; Berg, Rene van den; Majoie, Charles B.; Marquering, Henk A.; Nederkoorn, Paul J.

    2012-01-01

    Intracranial carotid artery atherosclerotic disease is an independent predictor of recurrent stroke. However, its quantitative assessment is not routinely performed in clinical practice. In this diagnostic study, we present and evaluate a novel semi-automatic application to quantitatively measure intracranial internal carotid artery (ICA) degree of stenosis and calcium volume in CT angiography (CTA) images. In this retrospective study involving CTA images of 88 consecutive patients, intracranial ICA stenosis was quantitatively measured by two independent observers. Stenoses were categorized with cutoff values of 30% and 50%. The calcification in the intracranial ICA was qualitatively categorized as absent, mild, moderate, or severe and quantitatively measured using the semi-automatic application. Linear weighted kappa values were calculated to assess the interobserver agreement of the stenosis and calcium categorization. The average and the standard deviation of the quantitative calcium volume were calculated for the calcium categories. For the stenosis measurements, the CTA images of 162 arteries yielded an interobserver correlation of 0.78 (P < 0.001). Kappa values of the categorized stenosis measurements were moderate: 0.45 and 0.58 for cutoff values of 30% and 50%, respectively. The kappa value for the calcium categorization was 0.62, with good agreement between the qualitative and quantitative calcium assessment. Quantitative degree-of-stenosis measurement of the intracranial ICA on CTA is feasible with good interobserver agreement. Qualitative calcium categorization agrees well with quantitative measurements. (orig.)
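
    The linear weighted kappa used for the categorized measurements is a standard agreement statistic; a minimal sketch with invented observer ratings is:

        import numpy as np

        def linear_weighted_kappa(rater_a, rater_b, n_categories):
            # Cross-tabulate the two raters, then compare weighted observed
            # agreement with the agreement expected from the marginals.
            k = n_categories
            observed = np.zeros((k, k))
            for a, b in zip(rater_a, rater_b):
                observed[a, b] += 1
            observed /= observed.sum()
            expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
            i, j = np.indices((k, k))
            weights = 1.0 - np.abs(i - j) / (k - 1)   # linear agreement weights
            po = (weights * observed).sum()
            pe = (weights * expected).sum()
            return (po - pe) / (1.0 - pe)

        # Hypothetical stenosis categories (<30%, 30-50%, >50%) from two observers
        a = [0, 0, 1, 2, 1, 0, 2, 1, 0, 2]
        b = [0, 1, 1, 2, 1, 0, 1, 1, 0, 2]
        print(f"kappa = {linear_weighted_kappa(a, b, 3):.2f}")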

  9. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    Science.gov (United States)

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  10. Prediction of Coronal Mass Ejections from Vector Magnetograms: Quantitative Measures as Predictors

    Science.gov (United States)

    Falconer, D. A.; Moore, R. L.; Gary, G. A.

    2001-05-01

    In a pilot study of 4 active regions (Falconer, D.A. 2001, JGR, in press), we derived two quantitative measures of an active region's global nonpotentiality from the region's vector magnetogram, 1) the net current (IN), and 2) the length of the strong-shear, strong-field main neutral line (LSS), and used these two measures to gauge the CME productivity of the active regions. We compared the global nonpotentiality measures to the active regions' CME productivity determined from GOES and Yohkoh/SXT observations. We found that two of the active regions were highly globally nonpotential and were CME productive, while the other two active regions had little global nonpotentiality and produced no CMEs. At the Fall 2000 AGU (Falconer, Moore, & Gary, 2000, EOS 81, 48 F998), we reported on an expanded study (12 active regions and 17 magnetograms) in which we evaluated four quantitative global measures of an active region's magnetic field and compared these measures with the CME productivity. The four global measures (all derived from MSFC vector magnetograms) included our two previous measures (IN and LSS) as well as two new ones, the total magnetic flux (Φ ) (a measure of an active region's size), and the normalized twist (α =μ IN/Φ ). We found that the three measures of global nonpotentiality (IN, LSS, α ) were all well correlated (>99% confidence level) with an active region's CME productivity within ±2 days of the day of the magnetogram. We will now report on our findings of how good our quantitative measures are as predictors of active-region CME productivity, using only CMEs that occurred after the magnetogram. We report the preliminary skill test of these quantitative measures as predictors. We compare the CME prediction success of our quantitative measures to the CME prediction success based on an active region's past CME productivity. We examine the cases of the handful of false positives and false negatives to look for improvements to our predictors. This work is
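
    The abstract does not name the skill score used, so as one plausible choice the sketch below computes the standard Heidke skill score from a 2x2 forecast contingency table with invented counts:

        def heidke_skill_score(hits, false_alarms, misses, correct_nulls):
            # Fraction of correct forecasts beyond what random forecasts
            # with the same marginals would achieve (1 = perfect, 0 = chance).
            a, b, c, d = hits, false_alarms, misses, correct_nulls
            n = a + b + c + d
            expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
            return (a + d - expected) / (n - expected)

        # Invented counts: 5 hits, 1 false alarm, 2 missed CMEs,
        # 9 correct "no CME" forecasts
        print(f"HSS = {heidke_skill_score(5, 1, 2, 9):.2f}")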

  11. An Evaluation Model of Quantitative and Qualitative Fuzzy Multi-Criteria Decision-Making Approach for Location Selection of Transshipment Ports

    Directory of Open Access Journals (Sweden)

    Ji-Feng Ding

    2013-01-01

    Full Text Available The role of container logistics centres as home bases for merchandise transportation has become increasingly important. Container carriers need to select a suitable transshipment port location to meet the requirements of container shipping logistics. In the light of this, the main purpose of this paper is to develop a fuzzy multi-criteria decision-making (MCDM) model to evaluate the best selection of transshipment ports for container carriers. At first, some concepts and methods used to develop the proposed model are briefly introduced. The performance values of quantitative and qualitative subcriteria are discussed to evaluate the fuzzy ratings. Then, the ideal and anti-ideal concepts and the modified distance measure method are used in the proposed model. Finally, a step-by-step example is illustrated to study the computational process of the quantitative and qualitative fuzzy MCDM model. The proposed approach has successfully accomplished our goal. In addition, the proposed fuzzy MCDM model can be empirically employed to select the best location of transshipment port for container carriers in the future study.
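
    The ideal/anti-ideal distance idea at the core of such models can be sketched in crisp (non-fuzzy) form, as below; this simplification with hypothetical port scores omits the fuzzy ratings of the actual model:

        import numpy as np

        def closeness_to_ideal(scores):
            # `scores`: (alternatives x criteria) matrix of weighted,
            # normalized benefit-type values. 1 = ideal, 0 = anti-ideal.
            ideal = scores.max(axis=0)
            anti_ideal = scores.min(axis=0)
            d_plus = np.linalg.norm(scores - ideal, axis=1)
            d_minus = np.linalg.norm(scores - anti_ideal, axis=1)
            return d_minus / (d_plus + d_minus)

        # Hypothetical scores for three candidate transshipment ports
        ports = np.array([[0.7, 0.5, 0.9, 0.6],
                          [0.8, 0.6, 0.4, 0.7],
                          [0.5, 0.9, 0.7, 0.8]])
        print(closeness_to_ideal(ports))   # highest value = best location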

  12. Radioisotope studies for quantitative measurement of manganese absorption

    International Nuclear Information System (INIS)

    Helbig, U.

    1981-01-01

    The purpose of the present study was to quantitatively determine manganese absorption in growing rats by means of radioisotopes. First, the following factors significant for this determination had to be investigated: measurability of stable and radioactive Mn in rat tissues; labelling of stable Mn and distribution of stable and radioactive Mn in the organism; verification of the isotope dilution method and of the comparative balance method with regard to their applicability for determining the true Mn absorption. We used male and female Sprague-Dawley rats. The most important results are summarized in the following: in some tissues the measurement of stable Mn was difficult, whereas the measurement of radioactive Mn posed no problems. 10 d after i.m. injection of 54 Mn only 17% of the administered Mn was still detectable in the organism. However, tissue labelling was not uniform, so quantitative conclusions on the content of stable Mn are possible only to a limited extent. A high percentage of stable and radioactive Mn was found above all in the liver. The isotope dilution method permits, by feces analysis, differentiation between unabsorbed Mn coming from the food and endogenous Mn coming from the organism itself. The effective Mn absorption was also determined by means of the comparative balance method. By means of the isotope dilution method we determined the quantitative Mn absorption with staged Mn administration and the contribution of absorption and excretion to the homeostatic regulation mechanisms of Mn. We found that absorption and excretion help the organism to keep an almost constant Mn concentration even with a differing Mn supply. (orig./MG) [de

  13. Radioimmunoassay to quantitatively measure cell surface immunoglobulins

    International Nuclear Information System (INIS)

    Krishman, E.C.; Jewell, W.R.

    1975-01-01

    A radioimmunoassay technique developed to quantitatively measure immunoglobulins on the surface of cells is described. The amount of immunoglobulins found on different tumor cells varied from 200 to 1140 ng/10 6 cells. The immunoglobulins determined on peripheral lymphocytes obtained from different cancer patients varied between 340 and 1040 ng/10 6 cells. Cultured tumor cells, on the other hand, were found to contain negligible quantities of human IgG [pt

  14. Developing model-making and model-breaking skills using direct measurement video-based activities

    Science.gov (United States)

    Vonk, Matthew; Bohacek, Peter; Militello, Cheryl; Iverson, Ellen

    2017-12-01

    This study focuses on student development of two important laboratory skills in the context of introductory college-level physics. The first skill, which we call model making, is the ability to analyze a phenomenon in a way that produces a quantitative multimodal model. The second skill, which we call model breaking, is the ability to critically evaluate if the behavior of a system is consistent with a given model. This study involved 116 introductory physics students in four different sections, each taught by a different instructor. All of the students within a given class section participated in the same instruction (including labs) with the exception of five activities performed throughout the semester. For those five activities, each class section was split into two groups; one group was scaffolded to focus on model-making skills and the other was scaffolded to focus on model-breaking skills. Both conditions involved direct measurement videos. In some cases, students could vary important experimental parameters within the video like mass, frequency, and tension. Data collected at the end of the semester indicate that students in the model-making treatment group significantly outperformed the other group on the model-making skill despite the fact that both groups shared a common physical lab experience. Likewise, the model-breaking treatment group significantly outperformed the other group on the model-breaking skill. This is important because it shows that direct measurement video-based instruction can help students acquire science-process skills, which are critical for scientists, and which are a key part of current science education approaches such as the Next Generation Science Standards and the Advanced Placement Physics 1 course.

  15. Quantitative computed tomography for measuring bone mineral content

    International Nuclear Information System (INIS)

    Felsenberg, D.; Kalender, W.A.; Banzer, D.; Schmilinsky, G.; Heyse, M.; Fischer, E.; Schneider, U.; Siemens A.G., Erlangen; Krankenhaus Zehlendorf, Berlin

    1988-01-01

    Quantitative computed tomography (QCT) for measuring bone mineral content of lumbar vertebrae is increasingly used internationally. The effect of using conventional CT (single energy CT, SE-CT) and dual energy CT (DE-CT) on reproducibility has been examined. We defined a standard measurement protocol, which automatically evaluates a calibration phantom. This should ensure an in vivo reproducibility of 1 to 2%. Reference data obtained with this protocol from 113 normal subjects, using SE-CT and DE-CT, are presented. (orig.) [de
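
    The calibration-phantom step can be sketched as a linear HU-to-density fit; the rod values and the vertebral reading below are hypothetical:

        import numpy as np

        def calibrate_qct(phantom_hu, phantom_density, vertebra_hu):
            # Linear fit of CT numbers against the known equivalent densities
            # of the phantom rods, applied to the vertebral ROI value.
            slope, intercept = np.polyfit(phantom_hu, phantom_density, 1)
            return slope * vertebra_hu + intercept

        # Hypothetical phantom rods (HU vs mg/cm^3) and a vertebral ROI of 95 HU
        hu = np.array([0.0, 40.0, 85.0, 130.0])
        rho = np.array([0.0, 50.0, 100.0, 150.0])
        print(f"BMD = {calibrate_qct(hu, rho, 95.0):.1f} mg/cm^3")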

  16. Computer controlled scanning systems for quantitative track measurements

    International Nuclear Information System (INIS)

    Gold, R.; Roberts, J.H.; Preston, C.C.; Ruddy, F.H.

    1982-01-01

    The status of three computer controlled systems for quantitative track measurements is described. Two systems, an automated optical track scanner (AOTS) and an automated scanning electron microscope (ASEM), are used for scanning solid state track recorders (SSTR). The third system, the emulsion scanning processor (ESP), is an interactive system used to measure the length of proton tracks in nuclear research emulsions (NRE). Recent advances achieved with these systems are presented, with emphasis placed upon the current limitations of these systems for reactor neutron dosimetry

  17. The four principles: can they be measured and do they predict ethical decision making?

    Science.gov (United States)

    Page, Katie

    2012-05-20

    The four principles of Beauchamp and Childress--autonomy, non-maleficence, beneficence and justice--have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed.
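
    The Analytic Hierarchy Process derives priority weights from a pairwise comparison matrix via its principal eigenvector; the sketch below uses an invented comparison matrix for the four principles:

        import numpy as np

        def ahp_weights(pairwise):
            # Priority weights = principal eigenvector of the pairwise matrix;
            # the consistency ratio checks how coherent the judgments are.
            pairwise = np.asarray(pairwise, dtype=float)
            n = pairwise.shape[0]
            eigvals, eigvecs = np.linalg.eig(pairwise)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            ci = (eigvals[k].real - n) / (n - 1)
            random_index = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's RI values
            return w, ci / random_index

        # Invented pairwise preferences among the four principles
        m = [[1,   3, 2,   2],
             [1/3, 1, 1/2, 1],
             [1/2, 2, 1,   2],
             [1/2, 1, 1/2, 1]]
        w, cr = ahp_weights(m)
        print(w, f"CR = {cr:.2f}")   # CR < 0.10 is conventionally acceptable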

  18. The four principles: Can they be measured and do they predict ethical decision making?

    Directory of Open Access Journals (Sweden)

    Page Katie

    2012-05-01

    Full Text Available Abstract Background The four principles of Beauchamp and Childress - autonomy, non-maleficence, beneficence and justice - have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. Methods The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Results Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. Conclusions People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed.

  19. The four principles: Can they be measured and do they predict ethical decision making?

    Science.gov (United States)

    2012-01-01

    Background The four principles of Beauchamp and Childress - autonomy, non-maleficence, beneficence and justice - have been extremely influential in the field of medical ethics, and are fundamental for understanding the current approach to ethical assessment in health care. This study tests whether these principles can be quantitatively measured on an individual level, and then subsequently if they are used in the decision making process when individuals are faced with ethical dilemmas. Methods The Analytic Hierarchy Process was used as a tool for the measurement of the principles. Four scenarios, which involved conflicts between the medical ethical principles, were presented to participants who then made judgments about the ethicality of the action in the scenario, and their intentions to act in the same manner if they were in the situation. Results Individual preferences for these medical ethical principles can be measured using the Analytic Hierarchy Process. This technique provides a useful tool with which to highlight individual medical ethical values. On average, individuals have a significant preference for non-maleficence over the other principles; however, and perhaps counter-intuitively, this preference does not seem to relate to applied ethical judgements in specific ethical dilemmas. Conclusions People state they value these medical ethical principles but they do not actually seem to use them directly in the decision making process. The reasons for this are explained through the lack of a behavioural model to account for the relevant situational factors not captured by the principles. The limitations of the principles in predicting ethical decision making are discussed. PMID:22606995

  20. Making coarse grained polymer simulations quantitatively predictive for statics and dynamics

    Science.gov (United States)

    Kremer, Kurt

    2010-03-01

    By combining input from short simulation runs of rather small systems with all atomistic details together with properly adapted coarse grained models, we are able to quantitatively predict static and especially dynamical properties both of pure melts of long, fully entangled polymers and of systems with low molecular weight additives. Comparisons to rather different experiments, such as diffusion constant measurements or NMR relaxation experiments, show a remarkable quantitative agreement without any adjustable parameter. Reintroduction of chemical details into the coarse grained trajectories allows the study of long time trajectories in all atomistic detail, providing the opportunity for rather different means of data analysis. References: V. Harmandaris, K. Kremer, Macromolecules, in press (2009) V. Harmandaris et al, Macromolecules, 40, 7026 (2007) B. Hess, S. Leon, N. van der Vegt, K. Kremer, Soft Matter 2, 409 (2006) D. Fritz et al, Soft Matter 5, 4556 (2009)

  1. Quantitative measurement of the cerebral blood flow

    International Nuclear Information System (INIS)

    Houdart, R.; Mamo, H.; Meric, P.; Seylaz, J.

    1976-01-01

    The value of cerebral blood flow (CBF) measurement is outlined, its limits are defined and some future prospects are discussed. The xenon-133 brain clearance study is at present the most accurate quantitative method to evaluate CBF in different regions of the brain simultaneously. The method and the progress it has led to in the physiological, physiopathological and therapeutic fields are described. The major disadvantage of the method is shown to be the need to puncture the internal carotid artery for each measurement. Prospects are discussed concerning methods derived from the same general principle but using a simpler, non-traumatic way to introduce the radiotracer, either by inhalation or intravenously [fr
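
    A common way to quantify such clearance curves is the initial-slope method; the sketch below uses synthetic washout data and assumes a unit brain-blood partition coefficient, both of which are illustrative assumptions:

        import numpy as np

        def cbf_initial_slope(t_min, counts, partition_coeff=1.0):
            # Fit ln(counts) over the early washout; the slope magnitude is
            # the clearance rate constant k (1/min), and flow = lambda * k.
            k = -np.polyfit(t_min, np.log(counts), 1)[0]
            return partition_coeff * k * 100.0        # ml per 100 g per min

        # Synthetic first-minute clearance samples from one detector
        t = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
        c = 1000.0 * np.exp(-0.8 * t)
        print(f"rCBF ~ {cbf_initial_slope(t, c):.0f} ml/100g/min")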

  2. A magneto-optical microscope for quantitative measurement of magnetic microstructures.

    Science.gov (United States)

    Patterson, W C; Garraud, N; Shorman, E E; Arnold, D P

    2015-09-01

    An optical system is presented to quantitatively map the stray magnetic fields of microscale magnetic structures, with field resolution down to 50 μT and spatial resolution down to 4 μm. The system uses a magneto-optical indicator film (MOIF) in conjunction with an upright reflective polarizing light microscope to generate optical images of the magnetic field perpendicular to the image plane. A novel single light path construction and discrete multi-image polarimetry processing method are used to extract quantitative areal field measurements from the optical images. The integrated system, including the equipment, image analysis software, and experimental methods, is described. MOIFs with three different magnetic field ranges are calibrated, and the entire system is validated by measurement of the field patterns from two calibration samples.

  3. Quantitative Reasoning in Environmental Science: Rasch Measurement to Support QR Assessment

    Directory of Open Access Journals (Sweden)

    Robert L. Mayes

    2015-07-01

    Full Text Available The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression, with associated QR assessments in the content areas of biodiversity, water, and carbon, was developed based on three QR progress variables: quantification act, quantitative interpretation, and quantitative modeling. Diagnostic instruments were developed specifically for the progress variable quantitative interpretation (QI), each consisting of 96 Likert-scale items. Each content version of the instrument focused on three scale levels (macro scale, micro scale, and landscape scale) and four elements of QI identified in prior research (trend, translation, prediction, and revision). The QI assessments were completed by 362 students in grades 6 to 12 in three U.S. states. Rasch (1960/1980) measurement was used to determine item and person measures for the QI instruments, both to examine validity and reliability characteristics of the instrument administration and to inform the evolution of the learning progression. Rasch methods allowed identification of several QI instrument revisions, including modification of specific items, reducing the number of items to avoid cognitive fatigue, reconsidering proposed item difficulty levels, and reducing the Likert scale to 4 levels. Rasch diagnostics also indicated favorable levels of instrument reliability and appropriate targeting of item abilities to student abilities for the majority of participants. A revised QI instrument is available for STEM researchers and educators.

  4. Initial Description of a Quantitative, Cross-Species (Chimpanzee-Human) Social Responsiveness Measure

    Science.gov (United States)

    Marrus, Natasha; Faughn, Carley; Shuman, Jeremy; Petersen, Steve E.; Constantino, John N.; Povinelli, Daniel J.; Pruett, John R., Jr.

    2011-01-01

    Objective: Comparative studies of social responsiveness, an ability that is impaired in autism spectrum disorders, can inform our understanding of both autism and the cognitive architecture of social behavior. Because there is no existing quantitative measure of social responsiveness in chimpanzees, we generated a quantitative, cross-species…

  5. Portable instrumentation for quantitatively measuring radioactive surface contaminations, including 90Sr

    International Nuclear Information System (INIS)

    Brodzinski, R.L.

    1983-10-01

    In order to measure the effectiveness of decontamination efforts, a quantitative analysis of the radiocontamination is necessary, both before and after decontamination. Since it is desirable to release the decontaminated material for unrestricted use or disposal, the assay equipment must provide adequate sensitivity to measure the radioactivity at or below the release limit. In addition, the instrumentation must be capable of measuring all kinds of radiocontaminants including fission products, activation products, and transuranic materials. Finally, the survey instrumentation must be extremely versatile in order to assay the wide variety of contaminated surfaces in many environments, some of which may be extremely hostile or remote. This communication describes the development and application of portable instrumentation capable of quantitatively measuring most transuranics, activation products, and fission products, including 90 Sr, on almost any contaminated surface in nearly any location

  6. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancer problems (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  7. Rethinking the Numerate Citizen: Quantitative Literacy and Public Issues

    Directory of Open Access Journals (Sweden)

    Ander W. Erickson

    2016-07-01

    Full Text Available Does a citizen need to possess quantitative literacy in order to make responsible decisions on behalf of the public good? If so, how much is enough? This paper presents an analysis of the quantitative claims made on behalf of ballot measures in order to better delineate the role of quantitative literacy for the citizen. I argue that this role is surprisingly limited due to the contextualized nature of quantitative claims that are encountered outside of a school setting. Instead, rational dependence, or the reasoned dependence on the knowledge of others, is proposed as an educational goal that can supplement quantitative literacy and, in so doing, provide a more realistic plan for informed evaluations of quantitative claims.

  8. Measurements in quantitative research: how to select and report on research instruments.

    Science.gov (United States)

    Hagan, Teresa L

    2014-07-01

    Measures exist to numerically represent degrees of attributes. Quantitative research is based on measurement and is conducted in a systematic, controlled manner. These measures enable researchers to perform statistical tests, analyze differences between groups, and determine the effectiveness of treatments. If something is not measurable, it cannot be tested.

  9. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    International Nuclear Information System (INIS)

    Hwang, Ji Young; Lee, Sun Wha; Park, Youn Soo

    2006-01-01

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen by using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis and quantitative analysis of the specimen showed moderate correlation (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis and quantitative analysis of the specimen demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis using MRI for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option
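
    The 3D analysis amounts to estimating two volumes from serial coronal slices and taking their ratio; a minimal sketch, with hypothetical segmented areas, is:

        import numpy as np

        def necrotic_fraction(necrosis_areas_mm2, head_areas_mm2, spacing_mm):
            # Cavalieri-style volumes: summed segmented area per slice
            # times the slice spacing; the fraction is their ratio.
            v_necrosis = np.sum(necrosis_areas_mm2) * spacing_mm
            v_head = np.sum(head_areas_mm2) * spacing_mm
            return v_necrosis / v_head

        # Hypothetical segmented areas on five consecutive 3 mm coronal slices
        necrosis = [120.0, 210.0, 260.0, 190.0, 80.0]
        head = [950.0, 1210.0, 1380.0, 1190.0, 900.0]
        print(f"necrotic fraction = {necrotic_fraction(necrosis, head, 3.0):.1%}")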

  10. Quantitative measurement of intervertebral disc signal using MRI

    International Nuclear Information System (INIS)

    Niemelaeinen, R.; Videman, T.; Dhillon, S.S.; Battie, M.C.

    2008-01-01

    Aim: To investigate the spinal cord as an alternative intra-body reference to cerebrospinal fluid (CSF) in evaluating thoracic disc signal intensity. Materials and methods: T2-weighted magnetic resonance imaging (MRI) images of T6-T12 were obtained using 1.5 T machines for a population-based sample of 523 men aged 35-70 years. Quantitative data on the signal intensities were acquired using an image analysis program (SpEx©). A random sample of 30 subjects and intraclass correlation coefficients (ICC) were used to examine the repeatability of the spinal cord measurements. The validity of using the spinal cord as a reference was examined by correlating cord and CSF samples. Finally, thoracic disc signal was validated by correlating it with age without adjustment and adjusting for either cord or CSF. Pearson's r was used for correlational analyses. Results: The repeatability of the spinal cord signal measurements was extremely high (≥0.99). The correlations between the signals of spinal cord and CSF by level were all above 0.9. The spinal cord-adjusted disc signal and age correlated similarly with CSF-adjusted disc signal and age (r = -0.30 to -0.40 versus r = -0.26 to -0.36). Conclusion: Adjacent spinal cord is a good alternative reference to the current reference standard, CSF, for quantitative measurements of disc signal intensity. Clearly fewer levels were excluded when using spinal cord as compared to CSF due to missing reference samples

  11. Quantitative measurement of intervertebral disc signal using MRI

    Energy Technology Data Exchange (ETDEWEB)

    Niemelaeinen, R. [Faculty of Rehabilitation Medicine, University of Alberta, Edmonton (Canada)], E-mail: riikka.niemelainen@ualberta.ca; Videman, T. [Faculty of Rehabilitation Medicine, University of Alberta, Edmonton (Canada); Dhillon, S.S. [Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton (Canada); Battie, M.C. [Faculty of Rehabilitation Medicine, University of Alberta, Edmonton (Canada)

    2008-03-15

    Aim: To investigate the spinal cord as an alternative intra-body reference to cerebrospinal fluid (CSF) in evaluating thoracic disc signal intensity. Materials and methods: T2-weighted magnetic resonance imaging (MRI) images of T6-T12 were obtained using 1.5 T machines for a population-based sample of 523 men aged 35-70 years. Quantitative data on the signal intensities were acquired using an image analysis program (SpEx©). A random sample of 30 subjects and intraclass correlation coefficients (ICC) were used to examine the repeatability of the spinal cord measurements. The validity of using the spinal cord as a reference was examined by correlating cord and CSF samples. Finally, thoracic disc signal was validated by correlating it with age without adjustment and adjusting for either cord or CSF. Pearson's r was used for correlational analyses. Results: The repeatability of the spinal cord signal measurements was extremely high (≥0.99). The correlations between the signals of spinal cord and CSF by level were all above 0.9. The spinal cord-adjusted disc signal and age correlated similarly with CSF-adjusted disc signal and age (r = -0.30 to -0.40 versus r = -0.26 to -0.36). Conclusion: Adjacent spinal cord is a good alternative reference to the current reference standard, CSF, for quantitative measurements of disc signal intensity. Clearly fewer levels were excluded when using spinal cord as compared to CSF due to missing reference samples.

  12. Quantitative Model for Economic Analyses of Information Security Investment in an Enterprise Information System

    Directory of Open Access Journals (Sweden)

    Bojanc Rok

    2012-11-01

    Full Text Available The paper presents a mathematical model for the optimal security-technology investment evaluation and decision-making processes based on the quantitative analysis of security risks and digital asset assessments in an enterprise. The model makes use of the quantitative analysis of different security measures that counteract individual risks by identifying the information system processes in an enterprise and the potential threats. The model comprises the target security levels for all identified business processes and the probability of a security accident together with the possible loss the enterprise may suffer. The selection of security technology is based on the efficiency of selected security measures. Economic metrics are applied for the efficiency assessment and comparative analysis of different protection technologies. Unlike the existing models for evaluation of the security investment, the proposed model allows direct comparison and quantitative assessment of different security measures. The model allows deep analyses and computations providing quantitative assessments of different options for investments, which translate into recommendations facilitating the selection of the best solution and the decision-making thereof. The model was tested using empirical examples with data from real business environment.
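
    The abstract does not reproduce the model's formulas; as one common quantitative starting point for such comparisons, the sketch below computes an annualized loss expectancy (ALE) and a return on security investment (ROSI) from invented figures:

        def rosi(asset_value, exposure_factor, annual_rate,
                 mitigation_effectiveness, annual_control_cost):
            # ALE = asset value x exposure factor x annual rate of occurrence;
            # ROSI compares the ALE reduction with the control's annual cost.
            ale_before = asset_value * exposure_factor * annual_rate
            ale_after = ale_before * (1.0 - mitigation_effectiveness)
            return (ale_before - ale_after - annual_control_cost) / annual_control_cost

        # Invented figures: 500k asset, 40% loss per incident, 0.5 incidents/yr,
        # a control that blocks 80% of incidents and costs 30k per year
        print(f"ROSI = {rosi(500_000, 0.4, 0.5, 0.8, 30_000):.0%}")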

  13. Tracer-based laser-induced fluorescence measurement technique for quantitative fuel/air-ratio measurements in a hydrogen internal combustion engine.

    Science.gov (United States)

    Blotevogel, Thomas; Hartmann, Matthias; Rottengruber, Hermann; Leipertz, Alfred

    2008-12-10

    A measurement technique for the quantitative investigation of mixture formation processes in hydrogen internal combustion engines (ICEs) has been developed using tracer-based laser-induced fluorescence (TLIF). This technique can be employed in both fired and motored engine operation. The quantitative TLIF fuel/air-ratio results have been verified by means of linear Raman scattering measurements. Exemplary results of the simultaneous investigation of mixture formation and combustion obtained at an optically accessible hydrogen ICE are shown.

  14. Quantitative mixture fraction measurements in combustion system via laser induced breakdown spectroscopy

    KAUST Repository

    Mansour, Mohy S.

    2015-01-01

    The laser induced breakdown spectroscopy (LIBS) technique has been applied to quantitative mixture fraction measurements in flames. The measured spectra of different mixtures of natural gas and air are used to obtain the calibration parameters for local elemental mass fraction measurements and hence to calculate the mixture fraction. The results are compared with mixture fraction calculations based on the ratios of the spectral lines of the H/N, H/O, and C/(N+O) elements, and they show good agreement within the reaction zone of the flames. Some deviations are observed outside the reaction zone. The feasibility of the LIBS technique as a tool for quantitative mixture fraction as well as elemental fraction measurements in reacting and non-reacting regions of turbulent flames is demonstrated. © 2014 Elsevier Ltd. All rights reserved.
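
    Mixture fraction is conventionally obtained from elemental mass fractions through Bilger's coupling function; the sketch below assumes approximate CH4/air boundary compositions and is an illustration, not the paper's calibration procedure:

        def bilger_mixture_fraction(z_c, z_h, z_o,
                                    fuel=(0.75, 0.25, 0.0),       # ~CH4 (C, H, O mass fractions)
                                    oxidizer=(0.0, 0.0, 0.233)):  # air (O mass fraction)
            # beta = 2 Z_C/W_C + Z_H/(2 W_H) - Z_O/W_O, normalized between
            # its pure-oxidizer and pure-fuel values.
            w_c, w_h, w_o = 12.011, 1.008, 15.999

            def beta(zc, zh, zo):
                return 2.0 * zc / w_c + zh / (2.0 * w_h) - zo / w_o

            b, b_ox, b_fu = beta(z_c, z_h, z_o), beta(*oxidizer), beta(*fuel)
            return (b - b_ox) / (b_fu - b_ox)

        print(bilger_mixture_fraction(0.75, 0.25, 0.0))   # pure fuel -> 1.0
        print(bilger_mixture_fraction(0.0, 0.0, 0.233))   # pure air  -> 0.0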

  15. Quantitative measures of walking and strength provide insight into brain corticospinal tract pathology in multiple sclerosis

    Directory of Open Access Journals (Sweden)

    Nora E Fritz

    2017-01-01

    Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.

  16. Quantitative mixture fraction measurements in combustion system via laser induced breakdown spectroscopy

    KAUST Repository

    Mansour, Mohy S.; Imam, Hisham; Elsayed, Khaled A.; Elbaz, Ayman M.; Abbass, Wafaa

    2015-01-01

    Laser induced breakdown spectroscopy (LIBS) technique has been applied to quantitative mixture fraction measurements in flames. The measured spectra of different mixtures of natural gas and air are used to obtain the calibration parameters for local

  17. Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure

    Directory of Open Access Journals (Sweden)

    Jaeyong Yee

    2015-01-01

    Full Text Available A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly without categorization. Information gain based on entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric evaluation method of conditional entropy of a quantitative phenotype associated with a given genotype. Hence, the information gain can be obtained for any phenotype distribution. Because any functional form, such as Gaussian, is not assumed for the entire distribution of a trait or a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait.
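
    The canonical m-spacing entropy estimator is Vasicek's; the sketch below implements it together with the genotype-conditional information gain, using synthetic genotype/trait data for illustration:

        import numpy as np

        def m_spacing_entropy(x, m=3):
            # Vasicek estimator: H ~ mean of log( n/(2m) * (x_(i+m) - x_(i-m)) ),
            # with order-statistic indices clamped at the sample boundaries.
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            hi = np.minimum(np.arange(n) + m, n - 1)
            lo = np.maximum(np.arange(n) - m, 0)
            spacings = np.maximum(x[hi] - x[lo], 1e-12)
            return np.mean(np.log(n / (2.0 * m) * spacings))

        def information_gain(trait, genotypes, m=3):
            # H(trait) minus the genotype-frequency-weighted conditional entropy.
            h_total = m_spacing_entropy(trait, m)
            h_cond = sum((genotypes == g).mean() * m_spacing_entropy(trait[genotypes == g], m)
                         for g in np.unique(genotypes))
            return h_total - h_cond

        rng = np.random.default_rng(0)
        geno = rng.integers(0, 3, size=300)
        trait = 0.8 * geno + rng.normal(size=300)   # trait shifted by genotype
        print(f"gain = {information_gain(trait, geno):.3f} nats")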

  18. Quantitative computed tomography measures of emphysema and airway wall thickness are related to respiratory symptoms

    DEFF Research Database (Denmark)

    Grydeland, Thomas B; Dirksen, Asger; Coxson, Harvey O

    2010-01-01

    There is limited knowledge about the relationship between respiratory symptoms and quantitative high-resolution computed tomography measures of emphysema and airway wall thickness.

  19. [Cholinesterases in total blood measured with a semiquantitative technique, and plasma or erythrocyte cholinesterases measured with quantitative techniques].

    Science.gov (United States)

    Carmona-Fonseca, Jaime

    2007-06-01

    An equivalence model is required which allows comparison of blood cholinesterase values measured by Lovibond (a semiquantitative technique) with Michel, EQM, and Monotest (erythrocyte and plasma cholinesterase) values measured by quantitative techniques. The performance of Lovibond (Edson tintometric and Limperos & Ranta techniques) was compared with the quantitative techniques. The experimental design was descriptive, cross-sectional, and prospective. From a working population (18-59 years) in Valle de Aburrá and the Near East of Antioquia, 827 representative samples were chosen for their lack of exposure to cholinesterase-inhibiting pesticides and their affiliation to the Social Security System. (1) The 827 workers were classified by Lovibond in four categories: 821 values with 75% of cholinesterase activity or greater (categories 75, 87.5 and 100%) and 6 with cholinesterase activity smaller than 75%. (2) With each quantitative method, the mean values of erythrocyte and plasma cholinesterase corresponding to the four values obtained with Lovibond were statistically different from each other. (3) The mean values of each quantitative technique increased as the tintometric method value increased. (4) Lovibond classified low enzymatic erythrocyte activity poorly (61-73%), and its classification of low enzymatic plasma activity was almost completely in error (94-96%). The values of erythrocyte or plasma cholinesterase were adequately estimated by both the quantitative techniques of Michel and EQM and by Lovibond, but only when the enzymatic activity is normal. Lovibond, however, had a poor capacity to designate as "low" the values that were low according to the quantitative tests.

  20. Quantitative determination of grain sizes by means of scattered ultrasound

    International Nuclear Information System (INIS)

    Goebbels, K.; Hoeller, P.

    1976-01-01

    The scattering of ultrasound makes possible the quantitative determination of grain sizes in metallic materials. Examples of measurements on steels with grain sizes between ASTM 1 and ASTM 12 are given

  1. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, generally, these five: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location-apply to most programs. Evaluation of other programs may require other specific dose equivalent based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain nondose equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs

  2. Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research

    Science.gov (United States)

    Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah

    2013-01-01

    Statistical analysis is one component that cannot be avoided in a quantitative research. Initial observations noted that students in higher education institution faced difficulty analysing quantitative data which were attributed to the confusions of various variable measurements. This paper aims to compare the outcomes of two approaches applied in…

  3. Quantitative lymphoscintigraphy in post-mastectomy lymphedema: correlation with circumferential measurements

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Joon Young; Lee, Kyung Han; Kim, Sang Eun; Kim, Byung Tae; Hwang, Jee Hea; Lee, Byung Boong [Samsung Medical Center, Seoul (Korea, Republic of)

    1997-07-01

    An objective measure of severity and progression is important for the management of lymphedema. To evaluate the usefulness of lymphoscintigraphy in this regard, we compared various quantitative indices from upper extremity lymphoscintigraphy with circumferential measurements, before and after physiotherapy. Upper extremity lymphoscintigraphy was performed in 38 patients with unilateral postmastectomy lymphedema. Tc-99m antimony sulfide colloid (37 MBq) was injected s.c. into the second and third interdigital spaces. The injection sites were imaged immediately after injection. After standardized exercise for 15 min, upper extremity images were acquired 30 min, 1 hr and 2 hr after injection. The clearance of the injection site (CL), and the % uptake in regional lymph nodes (%LN) and in soft tissue of the extremity (i.e., the degree of dermal backflow) (%EXT) relative to the initial injection site were calculated. The circumference of each extremity was measured at 7 levels; the severity of lymphedema was expressed as the total circumferential difference (TCD) between the healthy and edematous extremities as a percentage of the total circumference of the healthy extremity (%TCD). In 19 patients who received physiotherapy, the therapeutic effect was measured by the % decrease of TCD (%DTCD) before and after therapy (Raines et al., 1977). The quantitative indices calculated from the image at 2 hr p.i. correlated better with %TCD and %DTCD than those from earlier images. The CL, %LN and %EXT of the edematous extremity had a significant correlation with TCD. The %EXT correlated best with both TCD and %DTCD. The results suggest that the %EXT, which corresponds to the degree of dermal backflow, may be a simple and useful quantitative index for evaluating the severity and progression of lymphedema and predicting the effect of therapy.
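
    The circumferential indices defined above translate directly into code; the sketch below uses hypothetical circumference measurements at the 7 levels:

        import numpy as np

        def tcd(healthy_cm, edema_cm):
            # Total circumferential difference over the 7 measurement levels.
            return np.sum(np.asarray(edema_cm) - np.asarray(healthy_cm))

        def percent_tcd(healthy_cm, edema_cm):
            # Severity: TCD as a percentage of the healthy arm's total circumference.
            return 100.0 * tcd(healthy_cm, edema_cm) / np.sum(healthy_cm)

        def percent_dtcd(tcd_before, tcd_after):
            # Therapy response: percentage decrease of TCD after physiotherapy.
            return 100.0 * (tcd_before - tcd_after) / tcd_before

        # Hypothetical circumferences (cm) at the 7 levels of each arm
        healthy = np.array([18.0, 22.0, 25.0, 27.0, 29.0, 31.0, 33.0])
        edema = np.array([20.0, 25.0, 28.5, 30.5, 32.0, 34.0, 35.5])
        print(f"%TCD = {percent_tcd(healthy, edema):.1f}")
        print(f"%DTCD = {percent_dtcd(tcd(healthy, edema), 12.0):.1f}")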

  4. Quantitative lymphoscintigraphy in post-mastectomy lymphedema: correlation with circumferential measurements

    International Nuclear Information System (INIS)

    Choi, Joon Young; Lee, Kyung Han; Kim, Sang Eun; Kim, Byung Tae; Hwang, Jee Hea; Lee, Byung Boong

    1997-01-01

    An objective measure of severity and progression is important for the management of lymphedema. To evaluate the usefulness of lymphoscintigraphy in this regard, we compared various quantitative indices from upper extremity lymphoscintigraphy with circumferential measurements, before and after physiotherapy. Upper extremity lymphoscintigraphy was performed in 38 patients with unilateral postmastectomy lymphedema. Tc-99m antimony sulfide colloid (37 MBq) was injected s.c. into the second and third interdigital spaces. The injection sites were imaged immediately after injection. After standardized exercise for 15 min, upper extremity images were acquired 30 min, 1 hr and 2 hr after injection. The clearance of the injection site (CL), and the % uptake in regional lymph nodes (%LN) and in soft tissue of the extremity (i.e., the degree of dermal backflow) (%EXT) relative to the initial injection site were calculated. The circumference of each extremity was measured at 7 levels; the severity of lymphedema was expressed as the total circumferential difference (TCD) between the healthy and edematous extremities as a percentage of the total circumference of the healthy extremity (%TCD). In 19 patients who received physiotherapy, the therapeutic effect was measured by the % decrease of TCD (%DTCD) before and after therapy (Raines et al., 1977). The quantitative indices calculated from the image at 2 hr p.i. correlated better with %TCD and %DTCD than those from earlier images. The CL, %LN and %EXT of the edematous extremity had a significant correlation with TCD. The %EXT correlated best with both TCD and %DTCD. The results suggest that the %EXT, which corresponds to the degree of dermal backflow, may be a simple and useful quantitative index for evaluating the severity and progression of lymphedema and predicting the effect of therapy

  5. An improved fast neutron radiography quantitative measurement method

    International Nuclear Information System (INIS)

    Matsubayashi, Masahito; Hibiki, Takashi; Mishima, Kaichiro; Yoshii, Koji; Okamoto, Koji

    2004-01-01

    The validity of a fast neutron radiography quantification method, the Σ-scaling method, which was originally proposed for thermal neutron radiography, was examined with Monte Carlo calculations and experiments conducted at the YAYOI fast neutron source reactor. Water and copper were selected as comparative samples, for a thermal neutron radiography case and for a dense object, respectively. Although the simulation implied different characteristics of the effective macroscopic cross-sections, the Σ-scaled experimental results with the fission neutron spectrum cross-sections fitted the measurements well for both the water and copper samples. This indicates that the Σ-scaling method can be successfully adopted for quantitative measurements in fast neutron radiography
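
    Quantification in this setting amounts to inverting the exponential attenuation law with an effective (Σ-scaled) macroscopic cross-section; the cross-section values in the sketch below are invented for illustration:

        import numpy as np

        def sigma_scaled_thickness(i_transmitted, i_open_beam, sigma_per_cm):
            # Invert I = I0 * exp(-Sigma * t) with an effective macroscopic
            # cross-section Sigma for the fission neutron spectrum.
            transmission = i_transmitted / i_open_beam
            return -np.log(transmission) / sigma_per_cm   # thickness in cm

        # Hypothetical effective fast-neutron cross-sections (1/cm)
        SIGMA_FAST = {"water": 0.6, "copper": 0.35}
        print(f"{sigma_scaled_thickness(0.55, 1.0, SIGMA_FAST['water']):.2f} cm of water")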

  6. New possibilities for quantitative measurements of regional cerebral blood flow with Au-195 m

    International Nuclear Information System (INIS)

    Lindner, P.; Nickel, O.

    1984-01-01

    A previously reported theory for quantitative cerebral blood flow measurement with nondiffusible radiotracers has been applied to patients after stroke and to volunteers undergoing a mental stimulation exercise. Quantitative measurements of cerebral blood flow patterns not only in p.a. but also in lateral views of the brain are possible by the use of the recently developed generator for the short-lived (30 s) isotope Au-195m. The energy spectrum of the eluate of the generator shows two strong photon peaks, one at an energy level of 68 keV and a second at an energy level of 262 keV. The low-energy peak is suitable for perfusion studies in lateral views of the hemispheres; no 'look-through' effect is seen. The high-energy peak is good for studies in p.a. positions. The studies last less than 1 minute and can be repeated after 3 minutes. Parametric images for quantitative regional cerebral blood flow can be generated. The area of occluded vessels in the case of stroke can be detected. Quantitative activation patterns of cerebral blood flow during mental stimulation can be generated. The results prove that not only with freely diffusible indicators like xenon but also with nondiffusible indicators it is possible to measure cerebral blood flow patterns quantitatively. (orig.)

  7. Quantitative measurement of maritime sediment movement using radioactive tracers

    International Nuclear Information System (INIS)

    Makovski, E.; Grissener, G.

    1967-01-01

    The quantitative method described in the paper involves burying appropriate detectors over a given area of the sea bottom, the detectors being connected to recording equipment which is itself buried in the sediment or situated on the shore. Detectors arranged in this way are covered by a certain layer of radioactive sediment whose activity is proportional to its mass. Before the labelled sediments are removed, their initial activity is measured, and then, as the covering is removed, measurements are made of the gradual decrease in activity corresponding to loss of the surface layer of the bottom deposit in the area under investigation, expressed in g/cm 2 . The tracers used in the investigations discussed were natural ones, such as sea sediment containing 31 Si, and artificial ones, such as activated fragments of sodium glass (with a 6.5% admixture of Fe 2 O 3 ) containing 24 Na. The proportional dependence of activity on mass has been confirmed for both tracers; this is an essential point for a tracer intended for quantitative measurements. This proportionality is very well maintained if a sample of highly active sediment is introduced into a large mass of inactive sediments (10 -2 - 10 -3 ). The concluding section describes the advantages of this method as a possible way of using radioisotopes with a short half-life and a low total activity of the order of a few millicuries. (author)

  8. Measuring teamwork in primary care: Triangulation of qualitative and quantitative data.

    Science.gov (United States)

    Brown, Judith Belle; Ryan, Bridget L; Thorpe, Cathy; Markle, Emma K R; Hutchison, Brian; Glazier, Richard H

    2015-09-01

    This article describes the triangulation of qualitative dimensions, reflecting high functioning teams, with the results of standardized teamwork measures. The study used a mixed methods design with qualitative and quantitative approaches to assess teamwork in 19 Family Health Teams in Ontario, Canada. This article describes dimensions from the qualitative phase, which used grounded theory to explore the issues and challenges to teamwork. Two quantitative measures were used in the study, the Team Climate Inventory (TCI) and the Providing Effective Resources and Knowledge (PERK) scale. For the triangulation analysis, the mean scores of these measures were compared with the qualitatively derived ratings for the dimensions. The final sample for the qualitative component was 107 participants. The qualitative analysis identified 9 dimensions related to high team functioning, such as common philosophy, scope of practice, conflict resolution, change management, leadership, and team evolution. From these dimensions, teams were categorized numerically as high, moderate, or low functioning. Three hundred seventeen team members completed the survey measures. Mean site scores for the TCI and PERK were 3.87 and 3.88, respectively (out of 5). The TCI was associated with all dimensions except for team location, space allocation, and executive director leadership. The PERK was associated with all dimensions except team location. Data triangulation provided qualitative and quantitative evidence of what constitutes teamwork. Leadership was pivotal in forging a common philosophy and encouraging team collaboration. Teams used conflict resolution strategies and adapted to the changes they encountered. These dimensions advanced the team's evolution toward a high functioning team.

  9. Quantitative Thermochemical Measurements in High-Pressure Gaseous Combustion

    Science.gov (United States)

    Kojima, Jun J.; Fischer, David G.

    2012-01-01

    We present a strategic experiment and thermochemical analyses of combustion flow using subframe burst gating (SBG) Raman spectroscopy. This unconventional laser diagnostic technique has a promising ability to enhance the accuracy of quantitative scalar measurements in a point-wise single-shot fashion. In the presentation, we briefly describe an experimental methodology that generates a transferable calibration standard for routine implementation of the diagnostics in hydrocarbon flames. The diagnostic technology was applied to simultaneous measurements of temperature and chemical species in a swirl-stabilized turbulent flame with gaseous methane fuel at elevated pressure (17 atm). Statistical analyses of the space-/time-resolved thermochemical data provide insights into the nature of the mixing process and its impact on the subsequent combustion process in the model combustor.

  10. Miniature rainbow schlieren deflectometry system for quantitative measurements in microjets and flames

    International Nuclear Information System (INIS)

    Satti, Rajani P.; Kolhe, Pankaj S.; Olcmen, Semih; Agrawal, Ajay K.

    2007-01-01

    Recent interest in small-scale flow devices has created the need for miniature instruments capable of measuring scalar flow properties with high spatial resolution. We present a miniature rainbow schlieren deflectometry system to nonintrusively obtain quantitative species concentration and temperature data across the whole field. The optical layout of the miniature system is similar to that of a macroscale system, although the field of view is smaller by an order of magnitude. Employing achromatic lenses and a CCD array together with a camera lens and extension tubes, we achieved spatial resolution down to 4 μm. Quantitative measurements required a careful evaluation of the optical components. The capability of the system is demonstrated by obtaining concentration measurements in a helium microjet (diameter d = 650 μm) and temperature and concentration measurements in a hydrogen jet diffusion flame from a microinjector (d = 50 μm). Further, the flow field of underexpanded nitrogen jets is visualized to reveal details of the shock structures existing downstream of the jet exit.

  11. A quantitative method to measure and evaluate the peelability of shrimps (Pandalus borealis)

    DEFF Research Database (Denmark)

    Gringer, Nina; Dang, Tem Thi; Orlien, Vibeke

    2018-01-01

    A novel, standardized method has been developed in order to provide a quantitative description of shrimp peelability. The peeling process was characterised by measuring the strength of the shell-muscle attachment of the shrimp using a texture analyzer, and this was calculated into the peeling work. The self-consistent method, insensitive to shrimp size, was proven valid for assessment of ice maturation of shrimps. The quantitative peeling efficiency (peeling work) and performance (degree of shell removal) showed that the decrease in peeling work correlated with the amount of satisfactorily peeled shrimps, indicating an effective weakening of the shell-muscle attachment. The developed method provides the industry with a quantitative analysis for measurement of peeling efficiency and peeling performance of shrimps. It may be used for comparing different maturation conditions in relation to optimization of shrimp peeling.

  12. A quantitative method for measuring the quality of history matches

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, T.S. [Kerr-McGee Corp., Oklahoma City, OK (United States); Knapp, R.M. [Univ. of Oklahoma, Norman, OK (United States)

    1997-08-01

    History matching can be an efficient tool for reservoir characterization. A "good" history matching job can generate reliable reservoir parameters. However, reservoir engineers are often frustrated when they try to select a "better" match from a series of history matching runs. Without a quantitative measurement, it is always difficult to tell the difference between a "good" and a "better" match. For this reason, we need a quantitative method for testing the quality of matches. This paper presents a method for such a purpose. The method uses three statistical indices to (1) test shape conformity, (2) examine bias errors, and (3) measure the magnitude of deviation. The shape conformity test ensures that the shape of a simulated curve matches that of a historical curve. Examining bias errors assures that model reservoir parameters have been calibrated to those of a real reservoir. Measuring the magnitude of deviation assures that the difference between the model and the real reservoir parameters is minimized. The method was first tested on a hypothetical model and then applied to published field studies. The results showed that the method can efficiently measure the quality of matches. It also showed that the method can serve as a diagnostic tool for calibrating reservoir parameters during history matching.
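
    The abstract does not give the three indices in closed form, so the sketch below substitutes common stand-ins: Pearson correlation for shape conformity, mean signed error for bias, and root-mean-square error for magnitude of deviation. Only the three-part structure comes from the paper; the particular formulas are assumptions.

        import numpy as np

        def match_quality(simulated, historical):
            # Three illustrative indices comparing a simulated curve to a historical one.
            sim = np.asarray(simulated, dtype=float)
            hist = np.asarray(historical, dtype=float)
            shape = np.corrcoef(sim, hist)[0, 1]         # shape conformity (stand-in)
            bias = np.mean(sim - hist)                   # bias error
            rmse = np.sqrt(np.mean((sim - hist) ** 2))   # magnitude of deviation
            return shape, bias, rmse

        print(match_quality([1.0, 2.1, 2.9, 4.2], [1.0, 2.0, 3.0, 4.0]))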

  13. Weighing evidence: quantitative measures of the importance of bitemark evidence.

    Science.gov (United States)

    Kittelson, J M; Kieser, J A; Buckingham, D M; Herbison, G P

    2002-12-01

    Quantitative measures of the importance of evidence, such as the "likelihood ratio", have become increasingly popular in the courtroom. These measures have been used formally by expert witnesses to describe their certainty about a piece of evidence. They are commonly interpreted as the amount by which the evidence should revise the opinion of guilt, and thereby summarize the importance of a particular piece of evidence. Unlike DNA evidence, quantitative measures have not been widely used by forensic dentists to describe their certainty when testifying about bitemark evidence. There is, however, no inherent reason why they should not be used to evaluate bitemarks. The purpose of this paper is to describe the likelihood ratio as it might be applied to bitemark evidence. We use a simple bitemark example to define the likelihood ratio, its application, and its interpretation. In particular we describe how the jury, from a Bayesian perspective, interprets the likelihood ratio when evaluating the impact of the evidence on the odds that the accused is guilty. We describe how the dentist would calculate the likelihood ratio based on frequentist interpretations. We also illustrate some of the limitations of the likelihood ratio, and show how those limitations apply to bitemark evidence. We conclude that the quality of bitemark evidence cannot be adequately summarized by the likelihood ratio, and argue that its application in this setting may be more misleading than helpful.
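
    The Bayesian updating step described here is compact enough to state directly: posterior odds = likelihood ratio × prior odds. A minimal sketch, with invented numbers purely for illustration:

        def update_odds(prior_odds, likelihood_ratio):
            # Bayesian updating of the odds of guilt by one piece of evidence.
            return prior_odds * likelihood_ratio

        # Hypothetical: prior odds of 1:100, bitemark evidence with LR = 50.
        posterior_odds = update_odds(1 / 100, 50.0)
        print(posterior_odds)                          # 0.5, i.e. posterior odds of 1:2
        print(posterior_odds / (1 + posterior_odds))   # posterior probability ~ 0.33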

  14. Quantitative CT measures of emphysema and airway wall thickness are related to D(L)CO

    DEFF Research Database (Denmark)

    Grydeland, Thomas B; Thorsen, Einar; Dirksen, Asger

    2011-01-01

    There is limited knowledge on the relationship between diffusing capacity of the lung for carbon monoxide (D(L)CO) and quantitative computed tomography (CT) measures of emphysema and airway wall thickness.

  15. A quantitative impact analysis of sensor failures on human operator's decision making in nuclear power plants

    International Nuclear Information System (INIS)

    Seong, Poong Hyun

    2004-01-01

    In emergency or accident situations in nuclear power plants, human operators play an important role in generating appropriate control signals to mitigate the accident situation. In human reliability analysis (HRA) in the framework of probabilistic safety assessment (PSA), the failure probabilities of such appropriate actions are estimated and used for the safety analysis of nuclear power plants. Even though understanding the status of the plant is basically a process of information seeking and processing by human operators, conventional HRA methods such as THERP, HCR, and ASEP pay little attention to the possibility that wrong information is provided to human operators. In this paper, a quantitative impact analysis of providing wrong information to human operators due to instrument faults or sensor failures is performed. The quantitative impact analysis is based on a quantitative situation assessment model. By comparing situations with and without sensor failures, the impact of sensor failures can be evaluated quantitatively. It is concluded that the impact of sensor failures is quite significant at the initial stages, but that it is gradually reduced as human operators make more and more observations. Even though the impact analysis is highly dependent on the situation assessment model, it is expected that conclusions based on other situation assessment models will be consistent with the conclusion made in this paper. (author)
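
    The situation assessment model itself is not reproduced in the abstract, but its qualitative conclusion (a large initial impact of sensor failures that washes out with repeated observations) can be illustrated with a generic two-state Bayesian update; the probabilities below are invented for illustration:

        def posterior_after_n(prior, p_correct, n):
            # Posterior that the true state holds after n independent observations,
            # each reporting the true state with probability p_correct.
            num = prior * p_correct ** n
            den = num + (1 - prior) * (1 - p_correct) ** n
            return num / den

        for n in (1, 3, 10):
            healthy = posterior_after_n(0.5, 0.9, n)    # reliable sensor
            degraded = posterior_after_n(0.5, 0.6, n)   # failed/noisy sensor
            print(n, round(healthy, 3), round(degraded, 3))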

  16. Quantitative Measurements using Ultrasound Vector Flow Imaging

    DEFF Research Database (Denmark)

    Jensen, Jørgen Arendt

    2016-01-01

    …scanner for pulsating flow mimicking the femoral artery from a CompuFlow 1000 pump (Shelley Medical). Data were used in four estimators based on directional transverse oscillation for velocity, flow angle, volume flow, and turbulence estimation and their respective precisions. An adaptive lag scheme gave the ability to estimate a large velocity range, or alternatively to measure at two sites to find e.g. the stenosis degree in a vessel. The mean angle at the vessel center was estimated to 90.9° ± 8.2°, indicating a laminar flow with a turbulence index close to zero (0.1 ± 0.1). Volume flow was 1.29 ± 0.26 mL/stroke (true: 1.15 mL/stroke, bias: 12.2%). Measurements down to 160 mm were obtained with a relative standard deviation and bias of less than 10% for the lateral component for stationary, parabolic flow. The method can, thus, find quantitative velocities, angles, and volume flows at sites currently…

  17. New possibilities for quantitative measurements of regional cerebral blood flow with gold-195m

    International Nuclear Information System (INIS)

    Lindner, P.; Nickel, O.

    1985-01-01

    A previously reported theory for quantitative cerebral blood flow measurement with nondiffusible radiotracers has been applied to patients after stroke and to volunteers undergoing a mental stimulation exercise. The energy spectrum of gold-195m shows two strong photon peaks, one at an energy level of 68 keV and a second at an energy level of 262 keV. The low energy peak is suitable for perfusion studies in lateral views of the hemispheres; no look-through effect is seen. The high energy level is good for studies in posterior-anterior positions. Parametric images for quantitative regional cerebral blood flow can be generated. The area of occluded vessels in the case of stroke can be detected. Quantitative activation patterns of cerebral blood flow during mental stimulation can be generated. The results prove that, not only with freely diffusible indicators like xenon but also with nondiffusible indicators, it is possible to measure cerebral blood flow patterns quantitatively.

  18. Quantitation of chemical exchange rates using pulsed-field-gradient diffusion measurements

    International Nuclear Information System (INIS)

    Andrec, Michael; Prestegard, James H.

    1997-01-01

    A new approach to the quantitation of chemical exchange rates is presented, and its utility is illustrated with application to the exchange of protein amide protons with bulk water. The approach consists of a selective-inversion exchange HMQC experiment in which a short spin echo diffusion filter has been inserted into the exchange period. In this way, the kinetics of exchange are encoded directly in an apparent diffusion coefficient which is a function of the position of the diffusion filter in the pulse sequence. A detailed theoretical analysis of this experiment indicates that, in addition to the measurement of simple exchange rates, the experiment is capable of measuring the effect of mediated exchange, e.g. the transfer of magnetization from bulk water to an amide site mediated by an internal bound water molecule or a labile protein side-chain proton in fast exchange with bulk water. Experimental results for rapid water/amide exchange in acyl carrier protein are shown to be quantitatively consistent with the exchange rates measured using a selective-inversion exchange experiment.

  19. Measurement of shared decision making - a review of instruments

    NARCIS (Netherlands)

    Scholl, I.; Koelewijn-van Loon, M.; Sepucha, K.; Elwyn, G.; Legare, F.; Harter, M.; Dirmaier, J.

    2011-01-01

    Recent years have seen a clear move towards shared decision making (SDM) and increased patient involvement in many countries. However, as the field of SDM research is still relatively young, new instruments for the measurement of (shared) decision making (process, outcome and surrounding elements)…

  20. Qualitative pattern classification of shear wave elastography for breast masses: how it correlates to quantitative measurements.

    Science.gov (United States)

    Yoon, Jung Hyun; Ko, Kyung Hee; Jung, Hae Kyoung; Lee, Jong Tae

    2013-12-01

    To determine the correlation of qualitative shear wave elastography (SWE) pattern classification to quantitative SWE measurements and whether it is representative of quantitative SWE values with similar performances. From October 2012 to January 2013, 267 breast masses of 236 women (mean age: 45.12 ± 10.54 years, range: 21-88 years) who had undergone ultrasonography (US), SWE, and subsequent biopsy were included. US BI-RADS final assessment and qualitative and quantitative SWE measurements were recorded. Correlations between pattern classification and mean elasticity, maximum elasticity, elasticity ratio and standard deviation were evaluated. Diagnostic performances of grayscale US, SWE parameters, and US combined with SWE values were calculated and compared. Of the 267 breast masses, 208 (77.9%) were benign and 59 (22.1%) were malignant. Pattern classifications significantly correlated with all quantitative SWE measurements, showing highest correlation with maximum elasticity, r = 0.721 (P < 0.001). Sensitivity was significantly decreased in US combined with SWE measurements compared to grayscale US (69.5-89.8% vs. 100.0%), while specificity was significantly improved (62.5-81.7% vs. 13.9%; P < 0.001). The area under the ROC curve (Az) did not show significant differences between grayscale US and US combined with SWE (P > 0.05). Pattern classification shows high correlation to maximum stiffness and may be representative of quantitative SWE values. When combined with grayscale US, SWE improves the specificity of US.

  1. Quantitative nanoscale surface voltage measurement on organic semiconductor blends

    International Nuclear Information System (INIS)

    Cuenat, Alexandre; Muñiz-Piniella, Andrés; Muñoz-Rojo, Miguel; Murphy, Craig E; Tsoi, Wing C

    2012-01-01

    We report on the validation of a method based on Kelvin probe force microscopy (KPFM) able to measure the different phases and the relative work function of polymer blend heterojunctions at the nanoscale. The method does not necessitate a complex ultra-high-vacuum setup. The quantitative information that can be extracted from the topography and the Kelvin probe measurements is critically analysed. Surface voltage differences can be observed at the nanoscale on poly(3-hexyl-thiophene):[6,6]-phenyl-C61-butyric acid methyl ester (P3HT:PCBM) blends, and a dependence on the annealing conditions and the regio-regularity of P3HT is observed. (paper)

  2. An improved in situ measurement of offset phase shift towards quantitative damping-measurement with AFM

    International Nuclear Information System (INIS)

    Minary-Jolandan, Majid; Yu Minfeng

    2008-01-01

    An improved approach is introduced for damping measurement with the atomic force microscope (AFM): the in situ measurement of the offset phase shift needed for determining the intrinsic mechanical damping in nanoscale materials. The offset phase shift is defined and measured at a point of zero contact force according to the deflection part of the AFM force plot. It is shown that the offset phase shift so defined is independent of the type of sample material, which varied from hard to relatively soft in this study. This improved approach allows self-calibrated, quantitative damping measurement with the AFM. The capability of dynamic mechanical analysis for the measurement of damping in isolated one-dimensional nanostructures, e.g. individual multiwalled carbon nanotubes, was demonstrated.

  3. Quantitative analysis of real-time radiographic systems

    International Nuclear Information System (INIS)

    Barker, M.D.; Condon, P.E.; Barry, R.C.; Betz, R.A.; Klynn, L.M.

    1988-01-01

    A method was developed which yields quantitative information on the spatial resolution, contrast sensitivity, image noise, and focal spot size from real time radiographic images. The method uses simple image quality indicators and computer programs which make it possible to readily obtain quantitative performance measurements of single or multiple radiographic systems. It was used for x-ray and optical images to determine which component of the system was not operating up to standard. Focal spot size was monitored by imaging a bar pattern. This paper constitutes the second progress report on the development of the camera and radiation image quality indicators.

  4. Methods for quantitative infrared directional-hemispherical and diffuse reflectance measurements using an FTIR and a commercial integrating sphere

    Energy Technology Data Exchange (ETDEWEB)

    Blake, Thomas A.; Johnson, Timothy J.; Tonkyn, Russell G.; Forland, Brenda M.; Myers, Tanya L.; Brauer, Carolyn S.; Su, Yin-Fong; Bernacki, Bruce E.; Hanssen, Leonard; Gonzalez, Gerardo

    2018-01-01

    Infrared integrating sphere measurements of solid samples are important in providing reference data for contact, standoff and remote sensing applications. At the Pacific Northwest National Laboratory (PNNL) we have developed protocols to measure both the directional-hemispherical (ρ) and diffuse (ρd) reflectances of powders, liquids, and disks of powders and solid materials using a commercially available, matte gold-coated integrating sphere and a Fourier transform infrared spectrometer. Detailed descriptions of the sphere alignment and its use for making these reflectance measurements are given. Diffuse reflectance values were found to be dependent on the bidirectional reflectance distribution function (BRDF) of the sample and the solid angle intercepted by the sphere's specular exclusion port. To determine how well the sphere and protocols produce quantitative reflectance data, measurements were made of three diffuse and two specular standards prepared by the National Institute of Standards and Technology (NIST, USA), LabSphere Infragold and Spectralon standards, hand-loaded sulfur and talc powder samples, and water. The five NIST standards behaved as expected: the three diffuse standards had a high degree of "diffuseness", ρd/ρ = D > 0.9, whereas the two specular standards had D ≤ 0.03. The average absolute differences between the NIST and PNNL measurements of the NIST standards for both directional-hemispherical and diffuse reflectances are on the order of 0.01 reflectance units. Other quantitative differences between the PNNL-measured and calibration (where available) or literature reflectance values for these standards and materials are given, and the possible origins of discrepancies are discussed. Random uncertainties and estimates of systematic uncertainties are presented. Corrections necessary to provide better agreement between the PNNL reflectance values as measured for the NIST standards and the NIST reflectance values for these same standards are also presented.
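
    The diffuseness ratio used above is a one-line computation; a trivial sketch (the threshold values come from the abstract, the sample numbers are invented):

        def diffuseness(rho_total, rho_diffuse):
            # D = rho_d / rho; D > 0.9 indicates a diffuse standard,
            # D <= 0.03 a specular one (per the NIST standards above).
            return rho_diffuse / rho_total

        print(diffuseness(0.95, 0.90))   # ~0.947 -> behaves as a diffuse standard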

  5. Quantitative Measurement of Oxygen in Microgravity Combustion

    Science.gov (United States)

    Silver, Joel A.

    1997-01-01

    A low-gravity environment, in space or in ground-based facilities such as drop towers, provides a unique setting for studying combustion mechanisms. Understanding the physical phenomena controlling the ignition and spread of flames in microgravity has importance for space safety as well as for better characterization of dynamical and chemical combustion processes which are normally masked by buoyancy and other gravity-related effects. Due to restrictions associated with performing measurements in reduced gravity, diagnostic methods which have been applied to microgravity combustion studies have generally been limited to capture of flame emissions on film or video, laser Schlieren imaging and (intrusive) temperature measurements using thermocouples. Given the development of detailed theoretical models, more sophisticated diagnostic methods are needed to provide the kind of quantitative data necessary to characterize the properties of microgravity combustion processes as well as provide accurate feedback to improve the predictive capabilities of the models. When the demands of space flight are considered, the need for improved diagnostic systems which are rugged, compact, reliable, and operate at low power becomes apparent. The objective of this research is twofold. First, we want to develop a better understanding of the relative roles of diffusion and reaction of oxygen in microgravity combustion. As the primary oxidizer species, oxygen plays a major role in controlling the observed properties of flames, including flame front speed (in solid or liquid flames), extinguishment characteristics, flame size and flame temperature. The second objective is to develop better diagnostics based on diode laser absorption which can be of real value in both microgravity combustion research and as a sensor on-board Spacelab as either an air quality monitor or as part of a fire detection system. In our prior microgravity work, an eight line-of-sight fiber optic system measured

  6. Design of a Michelson Interferometer for Quantitative Refraction Index Profile Measurements

    NARCIS (Netherlands)

    Nijholt, J.L.M.

    1998-01-01

    This book describes the theoretical design of a three-camera Michelson interferometer set-up for quantitative refractive index measurements. Although a two-camera system is easier to align and less expensive, a three-camera interferometer is preferred because the expected measuring accuracy is much higher.

  7. Quantitative density measurements from a real-time neutron radiography system

    International Nuclear Information System (INIS)

    McRae, D.D.; Jenkins, R.W. Jr.; Brenizer, J.S.; Tobin, K.W.; Hosticka, B.; Sulcoski, M.F.

    1986-01-01

    An advanced video system has been assembled from commercially available equipment to support the real-time neutron radiography facility established jointly by the University of Virginia Department of Nuclear Engineering and Engineering Physics, and the Philip Morris Research Center. A schematic diagram of the equipment used for real-time neutron radiography is presented. To obtain quantitative density measurements with this system, several modifications of both hardware and image processing software were required. After implementation of these changes, the system was capable of determining material densities by measuring the degree of neutron attenuation.

  8. Measuring local autonomy: A decision-making approach

    NARCIS (Netherlands)

    Fleurke, F.; Willemse, R.

    2006-01-01

    In studies on central-local relations it is common to assess local autonomy in a deductive way. The extent of local autonomy is determined by measuring the central legal and financial competence, after which the remaining room for local decision-making is determined. The outcome of this indirect…

  9. Quantitation without Calibration: Response Profile as an Indicator of Target Amount.

    Science.gov (United States)

    Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V

    2018-06-21

    Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration correlating a measured signal to a target amount. This step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of a response profile rather than on an absolute signal value for assessment of a target's amount. In order to enable this capability we develop a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable to a variety of biomarkers such as nucleic acids, proteins, peptides, and others.

  10. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives, solar in particular, that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative, especially with nonscience students or the general public, is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.

  11. Quantitative phase determination by using a Michelson interferometer

    International Nuclear Information System (INIS)

    Pomarico, Juan A; Molina, Pablo F; Angelo, Cristian D'

    2007-01-01

    The Michelson interferometer is one of the best established tools for quantitative interferometric measurements. It has been, and still is, successfully used not only for scientific purposes; it is also introduced in undergraduate courses for qualitative demonstrations as well as for quantitative determination of several properties such as refractive index, wavelength, optical thickness, etc. Generally speaking, most of the measurements are carried out by determining phase distortions through changes in the location and/or shape of the interference fringes. However, the extreme sensitivity of this tool, for which minimal deviations in the conditions of its branches can cause very large modifications of the fringe pattern, makes phase changes difficult to follow and measure. The purpose of this communication is to show that, under certain conditions, the sensitivity of the Michelson interferometer can be 'turned down', allowing the quantitative measurement of phase changes with relative ease. As an example we show how the angle (or, optionally, the refractive index) of a transparent standard optical wedge can be determined. Experimental results are shown and compared with the data provided by the manufacturer, showing very good agreement.
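
    For the wedge example, a working relation can be written down directly: the abstract gives no formulas, so the following is the textbook fringe-spacing relation assumed to underlie the measurement, for a wedge of angle α and refractive index n traversed twice in one branch, with fringe spacing Δx and wavelength λ:

        % One fringe per change of one wavelength in the (double-pass) optical path:
        2\,(n-1)\,\alpha\,\Delta x = \lambda
        \quad\Longrightarrow\quad
        \alpha = \frac{\lambda}{2\,(n-1)\,\Delta x}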

  12. Make-or-buy decisions and the manipulability of performance measures

    OpenAIRE

    Andersson, Fredrik

    2009-01-01

    The make-or-buy decision is analyzed in a simple framework combining contractual incompleteness with the existence of an imperfect but contractible performance measure. Contractual incompleteness gives rise to two regimes, identified with make and buy. The performance measure on which comprehensive contracts can be written is imperfect in the sense of being subject to manipulation. The performance incentives faced by the agent are stronger in the "buy" regime. A posit...

  13. Residual DNA analysis in biologics development: review of measurement and quantitation technologies and future directions.

    Science.gov (United States)

    Wang, Xing; Morgan, Donna M; Wang, Gan; Mozier, Ned M

    2012-02-01

    Residual DNA (rDNA) comprises deoxyribonucleic acid (DNA) fragments and longer length molecules originating from the host organism that may be present in samples from recombinant biological processes. Although similar in basic structural base pair units, rDNA may exist in different sizes and physical forms. Interest in measuring rDNA in recombinant products is based primarily on demonstration of effective purification during manufacturing, but also on some hypothetical concerns that, in rare cases, depending on the host expression system, some DNA sequences may be potentially infectious or oncogenic (e.g., HIV virus and the Ras oncogene, respectively). Recent studies suggest that a sequence known as long interspersed nucleotide element-1 (LINE-1), widely distributed in the mammalian genome, is active as a retrotransposon that can be transcribed to RNA, reverse-transcribed into DNA and inserted into a new site in the genome. This integration process could potentially disrupt critical gene functions or induce tumorigenesis in mammals. Genomic DNA from microbial sources, on the other hand, could add to the risk of immunogenicity to the target recombinant protein being expressed, due to the high CpG content and unmethylated DNA sequence. For these and other reasons, it is necessary for manufacturers to show clearance of DNA throughout production processes and to confirm low levels in the final drug substance using an appropriately specific and quantitative analytical method. The heterogeneity of potential rDNA sequences that might be present makes the testing of all potential analytes challenging. The most common methodology for rDNA quantitation used currently is real-time polymerase chain reaction (RT-PCR), a robust and proven technology. Like most rDNA quantitation methods, the specificity of RT-PCR is limited by the sequences to which the primers are directed. To address this, primase-based whole genome amplification is introduced herein. This paper will review the recent…

  14. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    Science.gov (United States)

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  15. Single-case synthesis tools II: Comparing quantitative outcome measures.

    Science.gov (United States)

    Zimmerman, Kathleen N; Pustejovsky, James E; Ledford, Jennifer R; Barton, Erin E; Severini, Katherine E; Lloyd, Blair P

    2018-03-07

    Varying methods for evaluating the outcomes of single case research designs (SCD) are currently used in reviews and meta-analyses of interventions. Quantitative effect size measures are often presented alongside visual analysis conclusions. Six measures across two classes, overlap measures (percentage non-overlapping data, improvement rate difference, and Tau) and parametric within-case effect sizes (standardized mean difference and log response ratio [increasing and decreasing]), were compared to determine whether the choice of synthesis method within and across classes impacts conclusions regarding effectiveness. The effectiveness of sensory-based interventions (SBI), a commonly used class of treatments for young children, was evaluated. Separately from evaluations of rigor and quality, the authors evaluated behavior change between baseline and SBI conditions. SBI were unlikely to result in positive behavior change across all measures except IRD. However, subgroup analyses resulted in variable conclusions, indicating that the choice of measures for SCD meta-analyses can impact conclusions. Suggestions for using the log response ratio in SCD meta-analyses and considerations for understanding variability in SCD meta-analysis conclusions are discussed.
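
    Two of the six measures are simple enough to compute directly. The sketch below implements percentage non-overlapping data (PND) and an increasing-outcome log response ratio from baseline and intervention phases; the toy data are invented, and the exact estimator variants used in the review may differ:

        import numpy as np

        def pnd(baseline, treatment):
            # Percentage of treatment points exceeding the highest baseline point.
            return 100.0 * np.mean(np.asarray(treatment) > np.max(baseline))

        def log_response_ratio(baseline, treatment):
            # Log of the ratio of phase means (increasing-behavior version).
            return np.log(np.mean(treatment) / np.mean(baseline))

        base, treat = [2, 3, 2, 4], [5, 6, 4, 7]
        print(pnd(base, treat), log_response_ratio(base, treat))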

  16. Quantitative remote visual inspection in nuclear power industry

    International Nuclear Information System (INIS)

    Stone, M.C.

    1992-01-01

    A borescope is an instrument that is used within the power industry to visually inspect remote locations. It is typically used for inspections of heat exchangers, condensers, boiler tubes, and steam generators and in many general inspection applications. The optical system of a borescope, like the human eye, does not have a fixed magnification. When viewing an object close up, it appears large; when the same object is viewed from afar, it appears small. Humans, though, have two separate eyes and a brain that process information to calculate the size of an object. These attributes are considered secondary information. Until now, making a measurement using a borescope has been an educated guess. There has always been a need to make accurate measurements from borescope images. The realization of this capability would make remote visual inspection a quantitative nondestructive testing method versus a qualitative one. For nuclear power plants, it is an excellent technique for maintaining radiation levels as low as reasonably achievable. Remote visual measurement provides distance and limits the exposure time needed to make accurate measurements. The design problem, therefore, was to develop the capability to make accurate and repeatable measurements of objects or physical defects with a borescope-type instrument. The solution was achieved by designing a borescope with a novel shadow projection mechanism, integrated with an electronics module containing the video display circuitry and a measurement computer.

  17. Quantitative liquid and vapor distribution measurements in evaporating fuel sprays using laser-induced exciplex fluorescence

    International Nuclear Information System (INIS)

    Fansler, Todd D; Drake, Michael C; Gajdeczko, Boguslaw; Düwel, Isabell; Koban, Wieland; Zimmermann, Frank P; Schulz, Christof

    2009-01-01

    Fully quantitative two-dimensional measurements of liquid- and vapor-phase fuel distributions (mass per unit volume) from high-pressure direct-injection gasoline injectors are reported for conditions of both slow and rapid vaporization in a heated, high-pressure spray chamber. The measurements employ the coevaporative gasoline-like fluorobenzene (FB)/diethylmethylamine (DEMA)/hexane exciplex tracer/fuel system. In contrast to most previous laser-induced exciplex-fluorescence (LIEF) experiments, the quantitative results here include regions in which liquid and vapor fuel coexist (e.g. near the injector exit). A unique aspect is evaluation of both vapor- and liquid-phase distributions at varying temperature and pressure using only in situ vapor-phase fluorescence calibration measurements at room temperature and atmospheric pressure. This approach draws on recent extensive measurements of the temperature-dependent spectroscopic properties of the FB-DEMA exciplex system, in particular on knowledge of the quantum efficiencies of the vapor-phase and liquid-phase (exciplex) fluorescence. In addition to procedures necessary for quantitative measurements, we discuss corrections for liquid-vapor crosstalk (liquid fluorescence that overlaps the vapor-fluorescence bandpass), the unknown local temperature due to vaporization-induced cooling, and laser-sheet attenuation by scattering and absorption.

  18. Dual energy quantitative computed tomography (QCT). Precision of the mineral density measurements

    International Nuclear Information System (INIS)

    Braillon, P.; Bochu, M.

    1989-01-01

    The improvement that could be obtained in quantitative bone mineral measurements by dual energy computed tomography was tested in vitro. From the results of 15 mineral density measurements (in mg Ca/cm³) done on a precise lumbar spine phantom (Hologic) and referred to the values obtained on the same slices on a Siemens Osteo-CT phantom, the precision found was 0.8%, six times better than the precision calculated from the uncorrected measured values.

  19. PRISM, a Novel Visual Metaphor Measuring Personally Salient Appraisals, Attitudes and Decision-Making: Qualitative Evidence Synthesis.

    Directory of Open Access Journals (Sweden)

    Tom Sensky

    PRISM (the Pictorial Representation of Illness and Self Measure) is a novel, simple visual instrument. Its utility was initially discovered serendipitously, but it has been validated as a quantitative measure of suffering. Recently, new applications for different purposes, even in non-health settings, have encouraged further exploration of how PRISM works and how it might be applied. This review will summarise the results to date from applications of PRISM and propose a generic conceptualisation of how PRISM works which is consistent with all these applications. A systematic review, in the form of a qualitative evidence synthesis, was carried out of all available published data on PRISM. Fifty-two publications were identified, with a total of 8254 participants. Facilitated by simple instructions, PRISM has been used with patient groups in a variety of settings and cultures. As a measure of suffering, PRISM has, with few exceptions, behaved as expected according to Eric Cassell's seminal conceptualisation of suffering. PRISM has also been used to assess beliefs about or attitudes to stressful working conditions, interpersonal relations, alcohol consumption, and suicide, amongst others. This review supports PRISM behaving as a visual metaphor of the relationship of objects (e.g. 'my illness') to a subject (e.g. 'myself') in a defined context (e.g. 'my life at the moment'). As a visual metaphor, it is quick to complete and yields personally salient information. PRISM is likely to have wide applications in assessing beliefs, attitudes, and decision-making, because of its properties, and because it yields both quantitative and qualitative data. In medicine, it can serve as a generic patient-reported outcome measure. It can serve as a tool for representational guidance, can be applied to developing strategies visually, and is likely to have applications in coaching, psychological assessment and therapeutic interventions.

  20. Longitudinal change in quantitative meniscus measurements in knee osteoarthritis - data from the Osteoarthritis Initiative

    International Nuclear Information System (INIS)

    Bloecker, Katja; Wirth, W.; Eckstein, F.; Guermazi, A.; Hitzl, W.; Hunter, D.J.

    2015-01-01

    We aimed to apply 3D MRI-based measurement technology to studying 2-year change in quantitative measurements of meniscus size and position. Forty-seven knees from the Osteoarthritis Initiative with medial radiographic joint space narrowing had baseline and 2-year follow-up MRIs. Quantitative measures were obtained from manual segmentation of the menisci and tibia using coronal DESSwe images. The standardized response mean (SRM = mean change / SD of change) was used as the measure of sensitivity to longitudinal change. Medial tibial plateau coverage decreased from 34.8 % to 29.9 % (SRM -0.82; p < 0.001). Change in medial meniscus extrusion in a central image (SRM 0.18) and in the central five slices (SRM 0.22) did not reach significance, but change in extrusion across the entire meniscus (SRM 0.32; p = 0.03) and in the relative area of meniscus extrusion (SRM 0.56; p < 0.001) did. There was a reduction in medial meniscus volume (10 %; p < 0.001), width (7 %; p < 0.001), and height (2 %; p = 0.08); meniscus substance loss was strongest in the posterior horn (SRM -0.51; p = 0.001) and weakest in the anterior horn (SRM -0.15; p = 0.31). This pilot study reports, for the first time, longitudinal change in quantitative 3D meniscus measurements in knee osteoarthritis. It provides evidence of improved sensitivity to change of 3D measurements compared with single-slice analysis. (orig.)
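
    The sensitivity metric used throughout this record is compact enough to show directly: the standardized response mean is the mean of the within-knee change scores divided by their standard deviation. A minimal sketch with invented change scores:

        import numpy as np

        def standardized_response_mean(changes):
            # SRM = mean(change) / SD(change); larger |SRM| = more sensitive measure.
            c = np.asarray(changes, dtype=float)
            return c.mean() / c.std(ddof=1)

        # Hypothetical 2-year changes in tibial plateau coverage (percentage points):
        print(standardized_response_mean([-4.2, -5.6, -3.1, -6.0, -4.9]))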

  1. Longitudinal change in quantitative meniscus measurements in knee osteoarthritis - data from the Osteoarthritis Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Bloecker, Katja [Paracelsus Medical University Salzburg and Nuremberg (Austria); Salzburg, Institute of Anatomy, Salzburg (Austria); BHS Linz, Department of Orthopaedics, Linz (Austria); Wirth, W.; Eckstein, F. [Paracelsus Medical University Salzburg and Nuremberg (Austria); Salzburg, Institute of Anatomy, Salzburg (Austria); Chondrometrics GmbH, Ainring (Germany); Guermazi, A. [Boston University School of Medicine, Boston, MA (United States); Boston Imaging Core Lab (BICL), Boston, MA (United States); Hitzl, W. [Paracelsus Medical University Salzburg and Nuremberg, Research Office, Salzburg (Austria); Hunter, D.J. [University of Sydney, Royal North Shore Hospital and Institute of Bone and Joint Research, Kolling Institute, Sydney (Australia)

    2015-10-15

    We aimed to apply 3D MRI-based measurement technology to studying 2-year change in quantitative measurements of meniscus size and position. Forty-seven knees from the Osteoarthritis Initiative with medial radiographic joint space narrowing had baseline and 2-year follow-up MRIs. Quantitative measures were obtained from manual segmentation of the menisci and tibia using coronal DESSwe images. The standardized response mean (SRM = mean change / SD of change) was used as the measure of sensitivity to longitudinal change. Medial tibial plateau coverage decreased from 34.8 % to 29.9 % (SRM -0.82; p < 0.001). Change in medial meniscus extrusion in a central image (SRM 0.18) and in the central five slices (SRM 0.22) did not reach significance, but change in extrusion across the entire meniscus (SRM 0.32; p = 0.03) and in the relative area of meniscus extrusion (SRM 0.56; p < 0.001) did. There was a reduction in medial meniscus volume (10 %; p < 0.001), width (7 %; p < 0.001), and height (2 %; p = 0.08); meniscus substance loss was strongest in the posterior horn (SRM -0.51; p = 0.001) and weakest in the anterior horn (SRM -0.15; p = 0.31). This pilot study reports, for the first time, longitudinal change in quantitative 3D meniscus measurements in knee osteoarthritis. It provides evidence of improved sensitivity to change of 3D measurements compared with single-slice analysis. (orig.)

  2. High temperature and high pressure gas cell for quantitative spectroscopic measurements

    DEFF Research Database (Denmark)

    Christiansen, Caspar; Stolberg-Rohr, Thomine; Fateev, Alexander

    2016-01-01

    A high temperature and high pressure gas cell (HTPGC) has been manufactured for quantitative spectroscopic measurements in the pressure range 1-200 bar and temperature range 300-1300 K. In the present work the cell was employed at up to 100 bar and 1000 K, and measured absorption coefficients of a CO2-N2 mixture at 100 bar and 1000 K are revealed for the first time, exceeding the temperature and pressure combinations previously reported. This paper discusses the design considerations involved in the construction of the cell and presents validation measurements compared against simulated…

  3. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    Science.gov (United States)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on a ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provided a physical explanation for the relationship between ML intensity and its stress condition. The proposed model was applicable to various SrAl2O4:Eu2+, Dy3+-based ML measurements in elastic deformation, and could provide a useful reference for quantitative stress measurement using the ML sensor in general.

  4. Qualitative pattern classification of shear wave elastography for breast masses: How it correlates to quantitative measurements

    International Nuclear Information System (INIS)

    Yoon, Jung Hyun; Ko, Kyung Hee; Jung, Hae Kyoung; Lee, Jong Tae

    2013-01-01

    Objective: To determine the correlation of qualitative shear wave elastography (SWE) pattern classification to quantitative SWE measurements and whether it is representative of quantitative SWE values with similar performances. Methods: From October 2012 to January 2013, 267 breast masses of 236 women (mean age: 45.12 ± 10.54 years, range: 21–88 years) who had undergone ultrasonography (US), SWE, and subsequent biopsy were included. US BI-RADS final assessment and qualitative and quantitative SWE measurements were recorded. Correlations between pattern classification and mean elasticity, maximum elasticity, elasticity ratio and standard deviation were evaluated. Diagnostic performances of grayscale US, SWE parameters, and US combined with SWE values were calculated and compared. Results: Of the 267 breast masses, 208 (77.9%) were benign and 59 (22.1%) were malignant. Pattern classifications significantly correlated with all quantitative SWE measurements, showing highest correlation with maximum elasticity, r = 0.721 (P < 0.001). Sensitivity was significantly decreased in US combined with SWE measurements compared to grayscale US (69.5–89.8% vs. 100.0%), while specificity was significantly improved (62.5–81.7% vs. 13.9%; P < 0.001). The area under the ROC curve (Az) did not show significant differences between grayscale US and US combined with SWE (P > 0.05). Conclusion: Pattern classification shows high correlation to maximum stiffness and may be representative of quantitative SWE values. When combined with grayscale US, SWE improves the specificity of US.

  5. Qualitative pattern classification of shear wave elastography for breast masses: How it correlates to quantitative measurements

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Jung Hyun, E-mail: lvjenny0417@gmail.com [Department of Radiology, CHA Bundang Medical Center, CHA University, School of Medicine (Korea, Republic of); Department of Radiology, Research Institute of Radiological Science, Yonsei University, College of Medicine (Korea, Republic of); Ko, Kyung Hee, E-mail: yourheeya@cha.ac.kr [Department of Radiology, CHA Bundang Medical Center, CHA University, School of Medicine (Korea, Republic of); Jung, Hae Kyoung, E-mail: AA40501@cha.ac.kr [Department of Radiology, CHA Bundang Medical Center, CHA University, School of Medicine (Korea, Republic of); Lee, Jong Tae, E-mail: jtlee@cha.ac.kr [Department of Radiology, CHA Bundang Medical Center, CHA University, School of Medicine (Korea, Republic of)

    2013-12-01

    Objective: To determine the correlation of qualitative shear wave elastography (SWE) pattern classification to quantitative SWE measurements and whether it is representative of quantitative SWE values with similar performances. Methods: From October 2012 to January 2013, 267 breast masses of 236 women (mean age: 45.12 ± 10.54 years, range: 21–88 years) who had undergone ultrasonography (US), SWE, and subsequent biopsy were included. US BI-RADS final assessment and qualitative and quantitative SWE measurements were recorded. Correlations between pattern classification and mean elasticity, maximum elasticity, elasticity ratio and standard deviation were evaluated. Diagnostic performances of grayscale US, SWE parameters, and US combined with SWE values were calculated and compared. Results: Of the 267 breast masses, 208 (77.9%) were benign and 59 (22.1%) were malignant. Pattern classifications significantly correlated with all quantitative SWE measurements, showing highest correlation with maximum elasticity, r = 0.721 (P < 0.001). Sensitivity was significantly decreased in US combined with SWE measurements compared to grayscale US (69.5–89.8% vs. 100.0%), while specificity was significantly improved (62.5–81.7% vs. 13.9%; P < 0.001). The area under the ROC curve (Az) did not show significant differences between grayscale US and US combined with SWE (P > 0.05). Conclusion: Pattern classification shows high correlation to maximum stiffness and may be representative of quantitative SWE values. When combined with grayscale US, SWE improves the specificity of US.

  6. Making better sense of the mosaic of environmental measurement networks: a system-of-systems approach and quantitative assessment

    Directory of Open Access Journals (Sweden)

    P. W. Thorne

    2017-11-01

    There are numerous networks and initiatives concerned with the non-satellite-observing segment of Earth observation. These are owned and operated by various entities and organisations, often with different practices, norms, data policies, etc. The Horizon 2020 project GAIA–CLIM is working to improve our collective ability to use an appropriate subset of these observations to rigorously characterise satellite observations. The first fundamental question is which observations from the mosaic of non-satellite observational capabilities are appropriate for such an application. This requires an assessment of the relevant, quantifiable aspects of the measurement series which are available. While fundamentally poor or incorrect measurements can be relatively easily identified, it is metrologically impossible to be sure that a measurement series is correct. Certain assessable aspects of a measurement series can, however, build confidence in its scientific maturity and appropriateness for given applications: that it is well documented, well understood, representative, updated, publicly available, and maintains rich metadata. Entities such as the Global Climate Observing System have suggested a hierarchy of networks whereby different subsets of the observational capabilities are assigned to different layers based on such assessable aspects. Herein, we make a first attempt to formalise both such a system-of-systems networks concept and a means by which to assess, as objectively as possible, where in this framework different networks may reside. In this study, we concentrate on networks measuring primarily a subset of the atmospheric Essential Climate Variables of interest to GAIA–CLIM activities. We show assessment results from our application of the guidance and how we plan to use this in downstream example applications of the GAIA–CLIM project. However, the approach laid out should be more widely applicable across…

  7. Quantitative shearography: error reduction by using more than three measurement channels

    International Nuclear Information System (INIS)

    Charrett, Tom O. H.; Francis, Daniel; Tatam, Ralph P.

    2011-01-01

    Shearography is a noncontact optical technique used to measure surface displacement derivatives. Full surface strain characterization can be achieved using shearography configurations employing at least three measurement channels. Each measurement channel is sensitive to a single displacement gradient component defined by its sensitivity vector. A matrix transformation is then required to convert the measured components to the orthogonal displacement gradients required for quantitative strain measurement. This transformation, conventionally performed using three measurement channels, amplifies any errors present in the measurement. This paper investigates the use of additional measurement channels using the results of a computer model and an experimental shearography system. Results are presented showing that the addition of a fourth channel can reduce the errors in the computed orthogonal components by up to 33% and that, by using 10 channels, reductions of around 45% should be possible.
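
    With more than three channels, the conversion from measured components to orthogonal displacement gradients becomes an overdetermined linear system that can be solved in the least-squares sense, which is how the extra channels average down error. A minimal numpy sketch with invented sensitivity vectors:

        import numpy as np

        # Rows are per-channel sensitivity vectors (assumed values); four channels
        # make the system g_measured = S @ g_orthogonal overdetermined (4 x 3).
        S = np.array([[1.0, 0.1, 0.2],
                      [0.1, 1.0, 0.2],
                      [0.1, 0.1, 1.0],
                      [0.6, 0.6, 0.3]])
        g_measured = np.array([0.011, 0.009, 0.020, 0.018])

        # Least-squares (pseudo-inverse) solution for the orthogonal gradients.
        g_orthogonal, *_ = np.linalg.lstsq(S, g_measured, rcond=None)
        print(g_orthogonal)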

  8. Measurement error of a simplified protocol for quantitative sensory tests in chronic pain patients

    DEFF Research Database (Denmark)

    Müller, Monika; Biurrun Manresa, José; Limacher, Andreas

    2017-01-01

    BACKGROUND AND OBJECTIVES: Large-scale application of Quantitative Sensory Tests (QST) is impaired by lacking standardized testing protocols. One unclear methodological aspect is the number of records needed to minimize measurement error. Traditionally, measurements are repeated 3 to 5 times...

  9. Investigation of the genetic association between quantitative measures of psychosis and schizophrenia

    DEFF Research Database (Denmark)

    Derks, Eske M; Vorstman, Jacob A S; Ripke, Stephan

    2012-01-01

    The presence of subclinical levels of psychosis in the general population may imply that schizophrenia is the extreme expression of more or less continuously distributed traits in the population. In a previous study, we identified five quantitative measures of schizophrenia (positive, negative, d...

  10. Quantitative laser-induced fluorescence measurements of nitric oxide in a heavy-duty Diesel engine

    NARCIS (Netherlands)

    Verbiezen, K.; Klein-Douwel, R. J. H.; van Viet, A. P.; Donkerbroek, A. J.; Meerts, W. L.; Dam, N. J.; ter Meulen, J. J.

    2007-01-01

    We present quantitative, in-cylinder, UV-laser-induced fluorescence measurements of nitric oxide in a heavy-duty Diesel engine. Processing of the raw fluorescence signals includes a detailed correction, based on additional measurements, for the effect of laser beam and fluorescence attenuation, and

  11. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    Energy Technology Data Exchange (ETDEWEB)

    Labuda, Aleksander; Proksch, Roger [Asylum Research an Oxford Instruments Company, Santa Barbara, California 93117 (United States)

    2015-06-22

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity—a longstanding goal in the electromechanical community.
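
    One concrete use of such a combined instrument is cross-calibrating the OBD channel against the traceable LDV displacement. The sketch below is a hypothetical illustration of that step only; the signal amplitudes and the 62 nm/V sensitivity are invented, and this is not the authors' detailed procedure.

    ```python
    import numpy as np

    # Hedged sketch: estimating the OBD inverse optical lever sensitivity
    # (invOLS, nm per volt) as the least-squares amplitude ratio between a
    # simultaneous LDV displacement record and the OBD voltage record.
    def invols_nm_per_volt(ldv_displacement_nm, obd_signal_v):
        ldv = np.asarray(ldv_displacement_nm)
        obd = np.asarray(obd_signal_v)
        return float(np.dot(obd, ldv) / np.dot(obd, obd))

    t = np.linspace(0.0, 1e-3, 2000)
    obd = 0.05 * np.sin(2 * np.pi * 70e3 * t)  # OBD signal (V), synthetic
    ldv = 62.0 * obd                           # LDV displacement (nm); true invOLS = 62 nm/V
    print(f"invOLS ~= {invols_nm_per_volt(ldv, obd):.1f} nm/V")
    ```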

  12. Progress towards in vitro quantitative imaging of human femur using compound quantitative ultrasonic tomography

    International Nuclear Information System (INIS)

    Lasaygues, Philippe; Ouedraogo, Edgard; Lefebvre, Jean-Pierre; Gindre, Marcel; Talmant, Marilyne; Laugier, Pascal

    2005-01-01

    The objective of this study is to make cross-sectional ultrasonic quantitative tomography of the diaphysis of long bones. Ultrasonic propagation in bones is affected by the severe mismatch between the acoustic properties of this biological solid and those of the surrounding soft medium, namely, the soft tissues in vivo or water in vitro. Bone imaging is then a nonlinear inverse-scattering problem. In this paper, we showed that in vitro quantitative images of sound velocities in a human femur cross section could be reconstructed by combining ultrasonic reflection tomography (URT), which provides images of the macroscopic structure of the bone, and ultrasonic transmission tomography (UTT), which provides quantitative images of the sound velocity. For the shape, we developed an image-processing tool to extract the external and internal boundaries and to measure cortical thickness. For velocity mapping, we used a wavelet analysis tool adapted to ultrasound, which allowed us to detect precisely the time of flight from the transmitted signals. A brief review of the ultrasonic tomography methods that we developed, using wavepath-correction algorithms and compensation procedures, is presented, together with the first results of our analyses on models and specimens of long bone using our new iterative quantitative protocol.

  13. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    Science.gov (United States)

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional, slightly oblique (non-orthogonal) positioning of the re-embedded sample section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation between the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether) sections and of 1–3 μm thick plastic sections (glycolmethacrylate/methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could be measured precisely within a few seconds. Compared to the section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability
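
    The "basic geometry" of the foil correction is compact enough to state in code. The function below is our reading of the method, not the authors' published script: an oblique cut inflates every apparent thickness by 1/cos(θ), so the calibration foil's known thickness reveals cos(θ) and hence the correction factor.

    ```python
    import math

    def corrected_thickness(measured_section_um, measured_foil_um, true_foil_um):
        """Correct an apparent section thickness for oblique re-sectioning.
        cos(theta) = true foil thickness / apparent (measured) foil thickness."""
        cos_theta = true_foil_um / measured_foil_um
        if not 0.0 < cos_theta <= 1.0:
            raise ValueError("apparent foil thickness cannot be below its true thickness")
        theta_deg = math.degrees(math.acos(cos_theta))
        return measured_section_um * cos_theta, theta_deg

    # Illustrative values only: a 12.0 um foil that measures 12.5 um implies an
    # oblique cut of ~16 degrees, shrinking a 1.05 um apparent section thickness.
    thickness, angle = corrected_thickness(1.05, 12.5, 12.0)
    print(f"corrected thickness {thickness:.3f} um at section angle {angle:.1f} deg")
    ```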

  14. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  15. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    Science.gov (United States)

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

    Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5–9.5), ionic strength (0.001–0.1 M) and dissolved organic matter (0–20 mg L−1), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that DGT-accumulated chemical masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over deployments of up to 18 days, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
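
    For readers unfamiliar with DGT, the TWA concentration follows from the standard DGT equation C = M·Δg/(D·A·t), where M is the mass accumulated on the binding gel, Δg the diffusion layer thickness, D the analyte's diffusion coefficient in the gel, A the exposure area, and t the deployment time. The snippet below encodes that relation with illustrative parameter values (none taken from the paper):

    ```python
    # Hedged sketch of the standard DGT calculation; all numbers are invented.
    def dgt_twa_concentration(mass_ng, delta_g_cm, D_cm2_s, area_cm2, time_s):
        """TWA concentration C = M * dg / (D * A * t), in ng/cm^3 (= ng/mL)."""
        return mass_ng * delta_g_cm / (D_cm2_s * area_cm2 * time_s)

    c = dgt_twa_concentration(
        mass_ng=150.0,          # analyte mass accumulated on the binding gel
        delta_g_cm=0.094,       # diffusive gel + filter thickness
        D_cm2_s=4.5e-6,         # analyte diffusion coefficient in the gel
        area_cm2=3.14,          # exposure window area
        time_s=14 * 24 * 3600,  # 14-day deployment
    )
    print(f"TWA concentration: {c * 1000:.0f} ng/L")
    ```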

  16. Quantitative Assay of Pu-239 and Pu-240 by Neutron Transmission Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, E

    1971-04-15

    A method for quantitative assay of 239Pu and 240Pu has been tested at the reactor R1 in Stockholm. The method makes use of a fast chopper to measure the neutron transmission through a sample around the main resonances of these two isotopes - at 0.296 eV in 239Pu and at 1.056 eV in 240Pu. The transmission data measured are then combined with the known resonance cross sections to give the content of the isotopes. The method is nondestructive, i.e., one can use fuel pins as samples, even highly irradiated ones. A time-of-flight spectrometer of moderate capacity, like our fast chopper, is sufficient as the resonances are located at low energy. Altogether five samples have been used in the tests of the method. The results have been compared with mass spectrometer values. This comparison came out quite well for 239Pu, whereas the chopper results for 240Pu were more than 10 per cent higher than the mass spectrometer results. This large deviation might be due to errors in the resonance cross section for 240Pu used in the analysis of the transmission data from the chopper. The best possible accuracy for a 15-hour run with our equipment is ±1 per cent for 239Pu and ±2 per cent for 240Pu, obtained for thick samples - about 3 × 10²⁰ atoms per cm² for each isotope. The accuracy corresponds to a 68 per cent confidence level and does not include any contribution from the uncertainty in the resonance cross section.
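
    The heart of such an analysis is the transmission relation T = exp(−nσ): measuring the transmission T around a resonance of known cross-section σ yields the areal density n of the isotope. A deliberately simplified sketch follows (it ignores the resolution broadening, background, and Doppler corrections a real analysis needs, and the numbers are illustrative, not from the report):

    ```python
    import math

    # Hedged, simplified sketch of resonance-transmission assay: areal
    # density n (atoms/cm^2) from measured transmission T via T = exp(-n*sigma).
    def areal_density(transmission, sigma_barn):
        sigma_cm2 = sigma_barn * 1e-24        # 1 barn = 1e-24 cm^2
        return -math.log(transmission) / sigma_cm2

    # Illustrative only: transmission 0.35 around the 0.296 eV resonance,
    # with an assumed effective cross-section of 5000 barns.
    print(f"{areal_density(0.35, 5000.0):.2e} atoms/cm^2")
    ```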

  17. A Major Locus for Quantitatively Measured Shank Skin Color Traits in Korean Native Chicken

    Directory of Open Access Journals (Sweden)

    S. Jin

    2016-11-01

    Full Text Available Shank skin color of Korean native chicken (KNC) shows large color variations. It varies from white, yellow, green, bluish or grey to black, whilst in the majority of European breeds the shanks are typically yellow-colored. Three shank skin color-related traits (i.e., lightness [L*], redness [a*], and yellowness [b*]) were measured by a spectrophotometer in 585 progeny from 68 nuclear families in the KNC resource population. We performed genome scan linkage analysis to identify loci that affect quantitatively measured shank skin color traits in KNC. All these birds were genotyped with 167 DNA markers located throughout the 26 autosomes. The SOLAR program was used to conduct multipoint variance-component quantitative trait locus (QTL) analyses. We detected a major QTL that affects b* value (logarithm of odds [LOD] = 47.5, p = 1.60×10⁻⁴⁹) on GGA24 (GGA for Gallus gallus). At the same location, we also detected a QTL that influences a* value (LOD = 14.2, p = 6.14×10⁻¹⁶). Additionally, beta-carotene dioxygenase 2 (BCDO2), the obvious positional candidate gene under the linkage peaks on GGA24, was investigated by two association tests: measured genotype association (MGA) and the quantitative transmission disequilibrium test (QTDT). Significant associations were detected between BCDO2 g.9367 A>C and a* (PMGA = 1.69×10⁻²⁸; PQTDT = 2.40×10⁻²⁵). The strongest associations were between BCDO2 g.9367 A>C and b* (PMGA = 3.56×10⁻⁶⁶; PQTDT = 1.68×10⁻⁶⁵). However, linkage analyses conditional on the single nucleotide polymorphism indicated that other functional variants should exist. Taken together, we demonstrate for the first time the linkage and association between the BCDO2 locus on GGA24 and quantitatively measured shank skin color traits in KNC.

  18. Quantitative Method to Measure Thermal Conductivity of One-Dimensional Nanostructures Based on Scanning Thermal Wave Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Park, Kyung Bae; Chung, Jae Hun; Hwang, Gwang Seok; Jung, Eui Han; Kwon, Oh Myoung [Korea University, Seoul (Korea, Republic of)

    2014-12-15

    We present a method to quantitatively measure the thermal conductivity of one-dimensional nanostructures by utilizing scanning thermal wave microscopy (STWM) at a nanoscale spatial resolution. In this paper, we explain the principle for measuring the thermal diffusivity of one-dimensional nanostructures using STWM and the theoretical analysis procedure for quantifying the thermal diffusivity. The STWM method obtains the thermal conductivity by measuring the thermal diffusivity, using only the phase lag of the transferred thermal wave as a function of distance. It is not affected by the thermal contact resistances between the heat source and the nanostructure or between the nanostructure and the probe; thus, the heat flux applied to the nanostructure is accurately obtained. The proposed method provides a very simple and quantitative measurement relative to conventional measurement techniques.
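
    In the standard 1-D thermal-wave model that this description matches, the phase lag grows linearly with distance as φ(x) = x·sqrt(πf/α), so a linear fit of phase against position yields the diffusivity α and, with a known density ρ and specific heat c_p, the conductivity k = α·ρ·c_p. The sketch below illustrates that fit on synthetic data; it is our construction under those assumptions, not the authors' code, and the material values are silicon-like placeholders.

    ```python
    import numpy as np

    f = 1.0e3                                  # modulation frequency (Hz), illustrative
    x = np.linspace(0.0, 5e-6, 20)             # probe positions along the structure (m)
    phase = x * np.sqrt(np.pi * f / 1.0e-5)    # synthetic data with alpha = 1e-5 m^2/s
    phase += np.random.default_rng(1).normal(0.0, 0.002, x.size)  # phase noise (rad)

    slope = np.polyfit(x, phase, 1)[0]         # d(phase)/dx = sqrt(pi * f / alpha)
    alpha = np.pi * f / slope**2               # thermal diffusivity (m^2/s)
    k = alpha * 2330.0 * 700.0                 # k = alpha * rho * cp (placeholder values)
    print(f"alpha = {alpha:.2e} m^2/s, k = {k:.1f} W/m K")
    ```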

  19. Visual characterization and quantitative measurement of artemisinin-induced DNA breakage

    Energy Technology Data Exchange (ETDEWEB)

    Cai Huaihong [Bionanotechnology Lab, and Department of Chemistry, Jinan University, Guangzhou 510632 (China); Yang Peihui [Bionanotechnology Lab, and Department of Chemistry, Jinan University, Guangzhou 510632 (China)], E-mail: typh@jnu.edu.cn; Chen Jianan [Bionanotechnology Lab, and Department of Chemistry, Jinan University, Guangzhou 510632 (China); Liang Zhihong [Experiment and Technology Center, Jinan University, Guangzhou 510632 (China); Chen Qiongyu [Institute of Genetic Engineering, Jinan University, Guangzhou 510632 (China); Cai Jiye [Bionanotechnology Lab, and Department of Chemistry, Jinan University, Guangzhou 510632 (China)], E-mail: tjycai@jnu.edu.cn

    2009-05-01

    DNA conformational change and breakage induced by artemisinin, a traditional Chinese herbal medicine, have been visually characterized and quantitatively measured with multiple tools: electrochemistry, UV-vis absorption spectroscopy, atomic force microscopy (AFM), and DNA electrophoresis. Electrochemical and spectroscopic results confirm that artemisinin can intercalate into the DNA double helix, which causes DNA conformational changes. AFM imaging vividly demonstrates uneven DNA strand breaking induced by artemisinin (QHS) interaction. To assess these breakages, quantitative analysis of the extent of DNA breakage has been performed by analyzing AFM images. Based on the statistical analysis, the occurrence of DNA breaks is found to depend on the concentration of artemisinin. DNA electrophoresis further validates that the intact DNA molecules are unwound due to breakages occurring at single strands. A reliable scheme is proposed to explain the process of artemisinin-induced DNA cleavage. These results provide further information for a better understanding of the anticancer activity of artemisinin.

  20. Developing quantitative tools for measuring aspects of prisonization

    DEFF Research Database (Denmark)

    Kjær Minke, Linda

    2013-01-01

    The article describes and discusses the preparation and completion of a quantitative study among prison officers and prisoners.

  1. Integration of Quantitative Positron Emission Tomography Absolute Myocardial Blood Flow Measurements in the Clinical Management of Coronary Artery Disease.

    Science.gov (United States)

    Gewirtz, Henry; Dilsizian, Vasken

    2016-05-31

    In the >40 years since planar myocardial imaging with ⁴³K-potassium was introduced into clinical research and management of patients with coronary artery disease (CAD), diagnosis and treatment have undergone profound scientific and technological changes. One such innovation is the current state-of-the-art hardware and software for positron emission tomography myocardial perfusion imaging, which has advanced it from a strictly research-oriented modality to a clinically valuable tool. This review traces the evolving role of quantitative positron emission tomography measurements of myocardial blood flow in the evaluation and management of patients with CAD. It presents methodology, currently or soon to be available, that offers a paradigm shift in CAD management. Heretofore, radionuclide myocardial perfusion imaging has been primarily qualitative or at best semiquantitative in nature, assessing regional perfusion in relative terms. Thus, unlike so many facets of modern cardiovascular practice and CAD management, which depend, for example, on absolute values of key parameters such as arterial and left ventricular pressures, serum lipoprotein, and other biomarker levels, the absolute levels of rest and maximal myocardial blood flow have yet to be incorporated into routine clinical practice even in most positron emission tomography centers where the potential to do so exists. Accordingly, this review focuses on potential value added for improving clinical CAD practice by measuring the absolute level of rest and maximal myocardial blood flow. Physiological principles and imaging fundamentals necessary to understand how positron emission tomography makes robust, quantitative measurements of myocardial blood flow possible are highlighted. © 2016 American Heart Association, Inc.

  2. Quantitative prediction of drug side effects based on drug-related features.

    Science.gov (United States)

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative powers for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
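
    The scoring idea reduces to a weighted sum over a binary side-effect profile, with per-feature predictions merged by score averaging. A minimal sketch with invented numbers follows (the paper's weights are also randomly generated, but nothing below reproduces its data or models):

    ```python
    import numpy as np

    # Hedged sketch: collapse a 0/1 side-effect profile to a weighted sum,
    # then merge feature-specific predictions by simple score averaging.
    def quantitative_score(profile, weights):
        return float(np.dot(profile, weights))

    rng = np.random.default_rng(2)
    weights = rng.random(50)                 # randomly generated weights, as in the study
    profile = rng.integers(0, 2, 50)         # illustrative binary side-effect profile
    print(f"score: {quantitative_score(profile, weights):.2f}")

    # Average-scoring ensemble over hypothetical per-feature predictions
    # (chemical substructures, targets, treatment indications):
    predictions = {"substructures": 12.3, "targets": 10.8, "indications": 11.6}
    print(f"ensemble score: {np.mean(list(predictions.values())):.2f}")
    ```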

  3. Quantitative and Isolated Measurement of Far-Field Light Scattering by a Single Nanostructure

    Science.gov (United States)

    Kim, Donghyeong; Jeong, Kwang-Yong; Kim, Jinhyung; Ee, Ho-Seok; Kang, Ju-Hyung; Park, Hong-Gyu; Seo, Min-Kyo

    2017-11-01

    Light scattering by nanostructures has facilitated research on various optical phenomena and applications by interfacing the near fields and free-propagating radiation. However, direct quantitative measurement of far-field scattering by a single nanostructure on the wavelength scale or less is highly challenging. Conventional back-focal-plane imaging covers only a limited solid angle determined by the numerical aperture of the objectives and suffers from optical aberration and distortion. Here, we present a quantitative measurement of the differential far-field scattering cross section of a single nanostructure over the full hemisphere. In goniometer-based far-field scanning with a high signal-to-noise ratio of approximately 27.4 dB, weak scattering signals are efficiently isolated and detected under total-internal-reflection illumination. Systematic measurements reveal that the total and differential scattering cross sections of a Au nanorod are determined by the plasmonic Fabry-Perot resonances and the phase-matching conditions to the free-propagating radiation, respectively. We believe that our angle-resolved far-field measurement scheme provides a way to investigate and evaluate the physical properties and performance of nano-optical materials and phenomena.

  4. Quantitative measurement and analysis for detection and treatment planning of developmental dysplasia of the hip

    Science.gov (United States)

    Liu, Xin; Lu, Hongbing; Chen, Hanyong; Zhao, Li; Shi, Zhengxing; Liang, Zhengrong

    2009-02-01

    Developmental dysplasia of the hip is a congenital hip joint malformation affecting the proximal femurs and acetabulum, which become subluxatable, dislocatable, or dislocated. Conventionally, physicians made diagnoses and planned treatments based only on findings from two-dimensional (2D) images, manually calculating clinical parameters. However, the anatomical complexity of the disease and the limitations of current standard procedures make accurate diagnosis quite difficult. In this study, we developed a system that provides quantitative measurement of 3D clinical indexes based on computed tomography (CT) images. To extract bone structure from surrounding tissues more accurately, the system first segments the bone using a knowledge-based fuzzy clustering method, which is formulated by modifying the objective function of the standard fuzzy c-means algorithm with an additive adaptation penalty. The second part of the system automatically calculates the clinical indexes, which are extended from 2D to 3D for accurate description of the spatial relationship between the femurs and the acetabulum. To evaluate system performance, an experimental study based on 22 patients with unilaterally or bilaterally affected hips was performed. The 3D acetabular index (AI) results automatically provided by the system were validated by comparison with 2D results measured manually by surgeons. The correlation between the two results was found to be 0.622 (p<0.01).

  5. Quantitative facial asymmetry: using three-dimensional photogrammetry to measure baseline facial surface symmetry.

    Science.gov (United States)

    Taylor, Helena O; Morrison, Clinton S; Linden, Olivia; Phillips, Benjamin; Chang, Johnny; Byrne, Margaret E; Sullivan, Stephen R; Forrest, Christopher R

    2014-01-01

    Although symmetry is hailed as a fundamental goal of aesthetic and reconstructive surgery, our tools for measuring this outcome have been limited and subjective. With the advent of three-dimensional photogrammetry, surface geometry can be captured, manipulated, and measured quantitatively. Until now, few normative data existed with regard to facial surface symmetry. Here, we present a method for reproducibly calculating overall facial symmetry and present normative data on 100 subjects. We enrolled 100 volunteers who underwent three-dimensional photogrammetry of their faces in repose. We collected demographic data on age, sex, and race and subjectively scored facial symmetry. We calculated the root mean square deviation (RMSD) between the native and reflected faces, reflecting about a plane of maximum symmetry. We analyzed the interobserver reliability of the subjective assessment of facial asymmetry and of the quantitative measurements, and compared the subjective and objective values. We also classified areas of greatest asymmetry as localized to the upper, middle, or lower facial thirds. This cluster of normative data was compared with a group of patients with subtle but increasing amounts of facial asymmetry. We imaged 100 subjects by three-dimensional photogrammetry. There was a poor interobserver correlation between subjective assessments of asymmetry (r = 0.56) and a high interobserver reliability for quantitative RMSD measurements of facial symmetry (r = 0.91–0.95). The mean RMSD for this normative population was found to be 0.80 ± 0.24 mm. Areas of greatest asymmetry were distributed as follows: 10% upper facial third, 49% central facial third, and 41% lower facial third. Precise measurement permitted discrimination of subtle facial asymmetry within this normative group and distinguished norms from patients with subtle facial asymmetry, with placement of RMSDs along an asymmetry ruler. Facial surface symmetry, which is poorly assessed
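
    A minimal version of the RMSD computation is shown below, assuming the point cloud has already been aligned so that the plane of maximum symmetry is x = 0 and rows correspond across the reflection; finding that plane and establishing correspondence is the harder part of the real pipeline, which this sketch does not attempt.

    ```python
    import numpy as np

    # Hedged sketch: RMSD between a facial surface point cloud and its
    # mirror image about the symmetry plane x = 0 (units assumed mm).
    def facial_rmsd(points_mm):
        reflected = points_mm * np.array([-1.0, 1.0, 1.0])
        return float(np.sqrt(np.mean(np.sum((points_mm - reflected) ** 2, axis=1))))

    pts = np.random.default_rng(3).normal(size=(1000, 3))  # synthetic "face"
    pts[:, 0] *= 0.4                                       # mildly asymmetric about x = 0
    print(f"RMSD = {facial_rmsd(pts):.2f} mm")
    ```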

  6. Individual v. community-level measures of women's decision-making ...

    African Journals Online (AJOL)

    Individual v. community-level measures of women's decision-making involvement and ... participation for child survival in sub-Saharan Africa is limited. ... Multilevel discrete-time hazard models were employed to investigate the net effect of ...

  7. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  8. Semi-quantitative myocardial perfusion measured by computed tomography in patients with refractory angina

    DEFF Research Database (Denmark)

    Qayyum, Abbas Ali; Kühl, Jørgen Tobias; Kjaer, Andreas

    2017-01-01

    INTRODUCTION: Computed tomography (CT) is a novel method for assessment of myocardial perfusion and has not yet been compared to rubidium-82 positron emission tomography (PET). We aimed to compare CT measured semi-quantitative myocardial perfusion with absolute quantified myocardial perfusion usi...

  9. Feasibility of Quantitative Ultrasound Measurement of the Heel Bone in People with Intellectual Disabilities

    Science.gov (United States)

    Mergler, S.; Lobker, B.; Evenhuis, H. M.; Penning, C.

    2010-01-01

    Low bone mineral density (BMD) and fractures are common in people with intellectual disabilities (ID). Reduced mobility in case of motor impairment and the use of anti-epileptic drugs contribute to the development of low BMD. Quantitative ultrasound (QUS) measurement of the heel bone is a non-invasive and radiation-free method for measuring bone…

  10. Methods for quantitative measurement of tooth wear using the area and volume of virtual model cusps.

    Science.gov (United States)

    Kim, Soo-Hyun; Park, Young-Seok; Kim, Min-Kyoung; Kim, Sulhee; Lee, Seung-Pyo

    2018-04-01

    Clinicians must examine tooth wear to make a proper diagnosis. However, qualitative methods of measuring tooth wear have many disadvantages. Therefore, this study aimed to develop and evaluate quantitative parameters using the cusp area and volume of virtual dental models. The subjects of this study were the same virtual models that were used in our former study. The same age group classification and new tooth wear index (NTWI) scoring system were also reused. A virtual occlusal plane was generated with the highest cusp points and lowered vertically from 0.2 to 0.8 mm to create offset planes. The area and volume of each cusp was then measured and added together. In addition to the former analysis, the differential features of each cusp were analyzed. The scores of the new parameters differentiated the age and NTWI groups better than those analyzed in the former study. The Spearman ρ coefficients between the total area and the area of each cusp also showed higher scores at the levels of 0.6 mm (0.6A) and 0.8A. The mesiolingual cusp (MLC) showed a statistically significant difference (P<0.01) from the other cusps in the paired t-test. Additionally, the MLC exhibited the highest percentage of change at 0.6A in some age and NTWI groups. Regarding the age groups, the MLC showed the highest score in groups 1 and 2. For the NTWI groups, the MLC was not significantly different in groups 3 and 4. These results support the proposal that the lingual cusp exhibits rapid wear because it serves as a functional cusp. Although this study has limitations due to its cross-sectional nature, it suggests better quantitative parameters and analytical tools for the characteristics of cusp wear.

  11. Reliability and group differences in quantitative cervicothoracic measures among individuals with and without chronic neck pain

    Science.gov (United States)

    2012-01-01

    Background Clinicians frequently rely on subjective categorization of impairments in mobility, strength, and endurance for clinical decision-making; however, these assessments are often unreliable and lack sensitivity to change. The objective of this study was to determine the inter-rater reliability, minimum detectable change (MDC), and group differences in quantitative cervicothoracic measures for individuals with and without chronic neck pain (NP). Methods Nineteen individuals with NP and 20 healthy controls participated in this case-control study. Two physical therapists performed a 30-minute examination on separate days. A handheld dynamometer, gravity inclinometer, ruler, and stopwatch were used to quantify cervical range of motion (ROM), cervical muscle strength and endurance, and scapulothoracic muscle length and strength, respectively. Results Intraclass correlation coefficients for inter-rater reliability were significantly greater than zero for most impairment measures, with point estimates ranging from 0.45 to 0.93. The NP group exhibited reduced cervical ROM (P ≤ 0.012) and muscle strength (P ≤ 0.038) in most movement directions, reduced cervical extensor endurance (P = 0.029), and reduced rhomboid and middle trapezius muscle strength (P ≤ 0.049). Conclusions Results demonstrate the feasibility of obtaining objective cervicothoracic impairment measures with acceptable inter-rater agreement across time. The clinical utility of these measures is supported by evidence of impaired mobility, strength, and endurance among patients with NP, with corresponding MDC values that can help establish benchmarks for clinically significant change. PMID:23114092

  12. 3D OCT imaging in clinical settings: toward quantitative measurements of retinal structures

    Science.gov (United States)

    Zawadzki, Robert J.; Fuller, Alfred R.; Zhao, Mingtao; Wiley, David F.; Choi, Stacey S.; Bower, Bradley A.; Hamann, Bernd; Izatt, Joseph A.; Werner, John S.

    2006-02-01

    The acquisition speed of current FD-OCT (Fourier Domain - Optical Coherence Tomography) instruments allows rapid screening of three-dimensional (3D) volumes of human retinas in clinical settings. To take advantage of this ability requires software used by physicians to be capable of displaying and accessing volumetric data as well as supporting post processing in order to access important quantitative information such as thickness maps and segmented volumes. We describe our clinical FD-OCT system used to acquire 3D data from the human retina over the macula and optic nerve head. B-scans are registered to remove motion artifacts and post-processed with customized 3D visualization and analysis software. Our analysis software includes standard 3D visualization techniques along with a machine learning support vector machine (SVM) algorithm that allows a user to semi-automatically segment different retinal structures and layers. Our program makes possible measurements of the retinal layer thickness as well as volumes of structures of interest, despite the presence of noise and structural deformations associated with retinal pathology. Our software has been tested successfully in clinical settings for its efficacy in assessing 3D retinal structures in healthy as well as diseased cases. Our tool facilitates diagnosis and treatment monitoring of retinal diseases.

  13. Quantitative fundus autofluorescence in mice: correlation with HPLC quantitation of RPE lipofuscin and measurement of retina outer nuclear layer thickness.

    Science.gov (United States)

    Sparrow, Janet R; Blonska, Anna; Flynn, Erin; Duncker, Tobias; Greenberg, Jonathan P; Secondi, Roberta; Ueda, Keiko; Delori, François C

    2013-04-17

    Our study was conducted to establish procedures and protocols for quantitative autofluorescence (qAF) measurements in mice, and to report changes in qAF, A2E bisretinoid concentration, and outer nuclear layer (ONL) thickness in mice of different genotypes and age. Fundus autofluorescence (AF) images (55° lens, 488 nm excitation) were acquired in albino Abca4(-/-), Abca4(+/-), and Abca4(+/+) mice (ages 2-12 months) with a confocal scanning laser ophthalmoscope (cSLO). Gray levels (GLs) in each image were calibrated to an internal fluorescence reference. The bisretinoid A2E was measured by quantitative high performance liquid chromatography (HPLC). Histometric analysis of ONL thicknesses was performed. The Bland-Altman coefficient of repeatability (95% confidence interval) was ±18% for between-session qAF measurements. Mean qAF values increased with age (2-12 months) in all groups of mice. qAF was approximately 2-fold higher in Abca4(-/-) mice than in Abca4(+/+) mice and approximately 20% higher in heterozygous mice. HPLC measurements of the lipofuscin fluorophore A2E also revealed age-associated increases, and the fold difference between Abca4(-/-) and wild-type mice was more pronounced (approximately 3-4-fold) than measurable by qAF. Moreover, A2E levels declined after 8 months of age, a change not observed with qAF. The decline in A2E levels in the Abca4(-/-) mice corresponded to reduced photoreceptor cell viability as reflected in ONL thinning beginning at 8 months of age. The qAF method enables measurement of in vivo lipofuscin and the detection of genotype and age-associated differences. The use of this approach has the potential to aid in understanding retinal disease processes and will facilitate preclinical studies.

  14. Comprehensive Comparison of Self-Administered Questionnaires for Measuring Quantitative Autistic Traits in Adults

    Science.gov (United States)

    Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.

    2014-01-01

    We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…

  15. Quantitative Measurement of Physical Activity in Acute Ischemic Stroke and Transient Ischemic Attack

    DEFF Research Database (Denmark)

    Strømmen, Anna Maria; Christensen, Thomas; Jensen, Kai

    2014-01-01

    BACKGROUND AND PURPOSE: The purpose of this study was to quantitatively measure and describe the amount and pattern of physical activity in patients within the first week after acute ischemic stroke and transient ischemic attack using accelerometers. METHODS: A total of 100 patients with acute is...

  16. A novel semi-quantitative method for measuring tissue bleeding.

    Science.gov (United States)

    Vukcevic, G; Volarevic, V; Raicevic, S; Tanaskovic, I; Milicic, B; Vulovic, T; Arsenijevic, S

    2014-03-01

    In this study, we describe a new semi-quantitative method for measuring the extent of bleeding in pathohistological tissue samples. To test our novel method, we recruited 120 female patients in their first trimester of pregnancy and divided them into three groups of 40. Group I was the control group, in which no dilation was applied. Group II was an experimental group, in which dilation was performed using classical mechanical dilators. Group III was also an experimental group, in which dilation was performed using a hydraulic dilator. Tissue samples were taken from the patients' cervical canals using a Novak's probe via energetic single-step curettage, prior to any dilation in Group I and after dilation in Groups II and III. After the tissue samples were prepared, light microscopy was used to obtain microphotographs at 100× magnification. The surfaces affected by bleeding were measured in the microphotographs using the Autodesk AutoCAD 2009 program and its "polylines" function. The lines were used to mark the area around the entire sample (marked A) and to create "polyline" areas around each bleeding area on the sample (marked B). The percentage of the total area affected by bleeding was calculated using the formula N = Bt × 100 / At, where N is the percentage (%) of the tissue sample surface affected by bleeding, At (A total) is the sum of the surfaces of all of the tissue samples, and Bt (B total) is the sum of all the surfaces affected by bleeding in all of the tissue samples. This novel semi-quantitative method utilizes the Autodesk AutoCAD 2009 program, which is simple to use and widely available, thereby offering a new, objective and precise approach to estimating the extent of bleeding in tissue samples.
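
    Once the polyline areas are exported from AutoCAD, the formula is a one-liner; the sketch below uses invented areas purely to show the arithmetic (units cancel, so AutoCAD drawing units are fine).

    ```python
    # Direct implementation of the paper's formula N = Bt * 100 / At.
    def bleeding_percentage(bleeding_areas, sample_areas):
        bt, at = sum(bleeding_areas), sum(sample_areas)   # Bt and At from the paper
        return bt * 100.0 / at

    # Illustrative polyline areas from three microphotographs:
    print(f"{bleeding_percentage([1.2, 0.8, 2.1], [14.9, 15.3, 15.1]):.1f}% affected")
    ```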

  17. Human figure drawings and house tree person drawings as indicators of self-esteem: a quantitative approach.

    Science.gov (United States)

    Groth-Marnat, G; Roberts, L

    1998-02-01

    This study assessed the concurrent validity of Human Figure Drawings (HFD) and House-Tree-Person (HTP) drawings as measures of self-esteem. Adult subjects were requested to make HFD and HTP drawings and to complete measures of psychological adjustment which included the Coopersmith Self Esteem Inventory and Tennessee Self Concept Scale. The drawings were scored using a quantitative, composite rating scale derived from HFD and HTP empirical and theoretical literature on psychological health. Results indicated that neither the HFD nor the HTP quantitative composite ratings of psychological health related to the formal measures of self-esteem.

  18. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    International Nuclear Information System (INIS)

    Neuland, M B; Riedo, A; Tulej, M; Wurz, P; Grimaudo, V; Moreno-García, P; Mezger, K

    2016-01-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples, USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite, with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standard-less measurement technique for in situ quantitative chemical composition measurements on planetary surfaces. (paper)

  19. Epithelium percentage estimation facilitates epithelial quantitative protein measurement in tissue specimens.

    Science.gov (United States)

    Chen, Jing; Toghi Eshghi, Shadi; Bova, George Steven; Li, Qing Kay; Li, Xingde; Zhang, Hui

    2013-12-01

    The rapid advancement of high-throughput tools for quantitative measurement of proteins has demonstrated the potential for the identification of proteins associated with cancer. However, quantitative results on cancer tissue specimens are usually confounded by tissue heterogeneity, e.g. regions with cancer usually have significantly higher epithelium content yet lower stromal content. It is therefore necessary to develop a tool to facilitate the interpretation of the results of protein measurements in tissue specimens. Epithelial cell adhesion molecule (EpCAM) and cathepsin L (CTSL) are two epithelial proteins whose expression in normal and tumorous prostate tissues was confirmed by measuring staining intensity with immunohistochemical staining (IHC). The expression of these proteins was measured by ELISA in protein extracts from OCT-embedded frozen prostate tissues. To eliminate the influence of tissue heterogeneity on epithelial protein quantification measured by ELISA, a color-based segmentation method was developed in-house for estimation of epithelium content using H&E histology slides from the same prostate tissues, and the estimated epithelium percentage was used to normalize the ELISA results. The epithelium contents of the same slides were also estimated by a pathologist and used to normalize the ELISA results; the computer-based results were compared with the pathologist's reading. We found that both EpCAM and CTSL levels, as measured by the ELISA assays themselves, were greatly affected by the epithelium content of the tissue specimens. Without adjusting for epithelium percentage, both EpCAM and CTSL levels appeared significantly higher in tumor tissues than in normal tissues, with a p value less than 0.001. However, after normalization by the epithelium percentage, the ELISA measurements of both EpCAM and CTSL were in agreement with the IHC staining results, showing a significant increase only in EpCAM, with no difference in CTSL expression in cancer tissues. These results
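
    The normalization step itself is simple division by the epithelium fraction; the illustrative numbers below (not from the paper) show how a large raw tumor/normal difference can shrink once tissue composition is taken into account.

    ```python
    # Hedged sketch of the normalization: divide each ELISA value by the
    # epithelium fraction estimated from the matching H&E slide, so that
    # comparisons reflect per-epithelium expression, not tissue composition.
    def normalize_by_epithelium(elisa_value, epithelium_percent):
        if not 0.0 < epithelium_percent <= 100.0:
            raise ValueError("epithelium percentage must be in (0, 100]")
        return elisa_value / (epithelium_percent / 100.0)

    tumor = normalize_by_epithelium(85.0, 70.0)    # invented: 85 units at 70% epithelium
    normal = normalize_by_epithelium(40.0, 35.0)   # invented: 40 units at 35% epithelium
    print(f"tumor {tumor:.1f} vs normal {normal:.1f} (per unit epithelium)")
    ```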

  20. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    Science.gov (United States)

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by measuring the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219), present in the collagen-α-5(IV) chain precursor and differentially expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis, including in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  1. High temperature and high pressure gas cell for quantitative spectroscopic measurements

    International Nuclear Information System (INIS)

    Christiansen, Caspar; Stolberg-Rohr, Thomine; Fateev, Alexander; Clausen, Sønnik

    2016-01-01

    A high temperature and high pressure gas cell (HTPGC) has been manufactured for quantitative spectroscopic measurements in the pressure range 1–200 bar and temperature range 300–1300 K. In the present work the cell was employed at up to 100 bar and 1000 K, and measured absorption coefficients of a CO₂–N₂ mixture at 100 bar and 1000 K are revealed for the first time, exceeding the high temperature and pressure combinations previously reported. This paper discusses the design considerations involved in the construction of the cell and presents validation measurements compared against simulated spectra, as well as published experimental data. - Highlights: • A ceramic gas cell designed for gas measurements up to 1300 K and 200 bar. • The first recorded absorption spectrum of CO₂ at 1000 K and 101 bar is presented. • Voigt profiles might suffice in the modeling of radiation from CO₂ in combustion.

  2. Validation and measurement uncertainty estimation in food microbiology: differences between quantitative and qualitative methods

    Directory of Open Access Journals (Sweden)

    Vesna Režić Dereani

    2010-09-01

    Full Text Available The aim of this research is to describe quality control procedures, procedures for validation, and measurement uncertainty (MU) determination as important elements of quality assurance in food microbiology laboratories, for both qualitative and quantitative types of analysis. Accreditation is conducted according to the standard ISO 17025:2007, General requirements for the competence of testing and calibration laboratories, which guarantees compliance with standard operating procedures and the technical competence of the staff involved in the tests; such accreditation has recently been widely introduced in food microbiology laboratories in Croatia. In addition to the introduction of a quality manual and many general documents, some of the most demanding procedures in routine microbiology laboratories are measurement uncertainty (MU) procedures and the design of validation experiments. Those procedures are not yet standardized even at the international level, and they require practical microbiological knowledge together with statistical competence. Differences between validation experiment designs for quantitative and qualitative food microbiology analyses are discussed in this research, and practical solutions are described briefly. MU for quantitative determinations is a more demanding issue than qualitative MU calculation. MU calculations are based on external proficiency testing data and internal validation data. In this paper, practical schematic descriptions of both procedures are shown.
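
    For the quantitative case, one widely used recipe (Nordtest-style, working on log10 cfu counts) combines within-laboratory reproducibility with a bias term estimated from proficiency-test data. The sketch below shows that combination under those assumptions; it is a generic illustration, not the specific procedure of this paper.

    ```python
    import math

    # Hedged sketch: combined standard uncertainty u_c = sqrt(s_Rw^2 + u_bias^2),
    # expanded with coverage factor k (k = 2 for roughly 95 % coverage).
    def expanded_uncertainty(s_rw, u_bias, k=2.0):
        return k * math.sqrt(s_rw**2 + u_bias**2)

    # Illustrative values in log10 units: within-lab reproducibility 0.15,
    # bias uncertainty 0.10 from proficiency-test results.
    print(f"U = {expanded_uncertainty(0.15, 0.10):.2f} log10 units")
    ```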

  3. The importance of quantitative measurement methods for uveitis: laser flare photometry endorsed in Europe while neglected in Japan where the technology measuring quantitatively intraocular inflammation was developed.

    Science.gov (United States)

    Herbort, Carl P; Tugal-Tutkun, Ilknur

    2017-06-01

    Laser flare photometry (LFP) is an objective and quantitative method to measure intraocular inflammation. The LFP technology was developed in Japan and has been commercially available since 1990. The aim of this work was to review the application of LFP in uveitis practice in Europe compared to Japan where the technology was born. We reviewed PubMed articles published on LFP and uveitis. Although LFP has been largely integrated in routine uveitis practice in Europe, it has been comparatively neglected in Japan and still has not received FDA approval in the USA. As LFP is the only method that provides a precise measure of intraocular inflammation, it should be used as a gold standard in uveitis centres worldwide.

  4. Quantitative label-free sperm imaging by means of transport of intensity

    Science.gov (United States)

    Poola, Praveen Kumar; Pandiyan, Vimal Prabhu; Jayaraman, Varshini; John, Renu

    2016-03-01

    Most living cells are optically transparent, which makes it difficult to visualize them under bright-field microscopy. The use of contrast agents or markers and staining procedures is therefore common for observing these cells. However, most of these staining agents are toxic and not applicable for live cell imaging. In the last decade, quantitative phase imaging has become an indispensable tool for morphological characterization of phase objects without any markers. In this paper, we report noninterferometric quantitative phase imaging of live sperm cells by solving the transport-of-intensity equation with intensity measurements recorded along the optical axis on a commercial bright-field microscope.
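
    Under the common uniform-intensity simplification, the transport-of-intensity equation −k·∂I/∂z = ∇·(I∇φ) reduces to a Poisson equation, −k·∂I/∂z = I₀∇²φ, which can be inverted with an FFT-based solver. The sketch below shows that textbook inversion; it is our illustration under that assumption, and the authors' exact solver may differ.

    ```python
    import numpy as np

    def tie_phase(dIdz, I0, wavelength_m, pixel_m):
        """Recover phase from the axial intensity derivative via the TIE,
        assuming uniform in-focus intensity I0 (FFT Poisson inversion)."""
        k = 2.0 * np.pi / wavelength_m
        ny, nx = dIdz.shape
        kx, ky = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx, d=pixel_m),
                             2 * np.pi * np.fft.fftfreq(ny, d=pixel_m))
        k2 = kx**2 + ky**2
        k2[0, 0] = 1.0                      # placeholder; DC term zeroed below
        rhs = -k * dIdz / I0                # equals laplacian(phi)
        phi_hat = np.fft.fft2(rhs) / (-k2)
        phi_hat[0, 0] = 0.0                 # mean phase is undetermined
        return np.fft.ifft2(phi_hat).real

    # Usage: estimate dI/dz = (I_plus - I_minus) / (2 * dz) from two frames
    # defocused by +/- dz, then call tie_phase(dIdz, 1.0, 532e-9, 0.1e-6).
    print(tie_phase(np.random.default_rng(7).normal(size=(64, 64)),
                    1.0, 532e-9, 0.1e-6).shape)
    ```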

  5. Reliability of Reagent Strips for Semi-quantitative Measurement of Glucosuria in a Neonatal Intensive Care Setting

    Directory of Open Access Journals (Sweden)

    Jolita Bekhof

    2014-12-01

    Conclusion: The reliability of the semi-quantitative measurement of glucosuria in newborn infants using reagent strips is good, even under the conditions of a NICU. Changes in the rating of reagent strips of more than one category are most likely to be beyond measurement error.

  6. Parents' and Physicians' Perceptions of Children's Participation in Decision-making in Paediatric Oncology: A Quantitative Study.

    Science.gov (United States)

    Rost, Michael; Wangmo, Tenzin; Niggli, Felix; Hartmann, Karin; Hengartner, Heinz; Ansari, Marc; Brazzola, Pierluigi; Rischewski, Johannes; Beck-Popovic, Maja; Kühne, Thomas; Elger, Bernice S

    2017-12-01

    The goal is to present how shared decision-making in paediatric oncology occurs from the viewpoints of parents and physicians. Eight Swiss Pediatric Oncology Group centres participated in this prospective study. The sample comprised a parent and a physician of the minor patient. A patient's age and gender predicted involvement in decision-making: older children and girls were more likely to be involved. In the decision-making process, parents held a less active role than they actually wanted. Physicians should take measures to ensure that the information provided is understood correctly. Furthermore, they should work towards creating awareness of systematic differences between parents and physicians with respect to the perception of the child, the disease, and shared decision-making.

  7. Electrons, Photons, and Force: Quantitative Single-Molecule Measurements from Physics to Biology

    Science.gov (United States)

    2011-01-01

    Single-molecule measurement techniques have illuminated unprecedented details of chemical behavior, including observations of the motion of a single molecule on a surface, and even the vibration of a single bond within a molecule. Such measurements are critical to our understanding of entities ranging from single atoms to the most complex protein assemblies. We provide an overview of the strikingly diverse classes of measurements that can be used to quantify single-molecule properties, including those of single macromolecules and single molecular assemblies, and discuss the quantitative insights they provide. Examples are drawn from across the single-molecule literature, ranging from ultrahigh vacuum scanning tunneling microscopy studies of adsorbate diffusion on surfaces to fluorescence studies of protein conformational changes in solution. PMID:21338175

  8. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    Directory of Open Access Journals (Sweden)

    Philip J Kellman

    Full Text Available Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert

  9. Forensic comparison and matching of fingerprints: using quantitative image measures for estimating error rates through understanding and predicting difficulty.

    Science.gov (United States)

    Kellman, Philip J; Mnookin, Jennifer L; Erlikhman, Gennady; Garrigan, Patrick; Ghose, Tandra; Mettler, Everett; Charlton, David; Dror, Itiel E

    2014-01-01

    Latent fingerprint examination is a complex task that, despite advances in image processing, still fundamentally depends on the visual judgments of highly trained human examiners. Fingerprints collected from crime scenes typically contain less information than fingerprints collected under controlled conditions. Specifically, they are often noisy and distorted and may contain only a portion of the total fingerprint area. Expertise in fingerprint comparison, like other forms of perceptual expertise, such as face recognition or aircraft identification, depends on perceptual learning processes that lead to the discovery of features and relations that matter in comparing prints. Relatively little is known about the perceptual processes involved in making comparisons, and even less is known about what characteristics of fingerprint pairs make particular comparisons easy or difficult. We measured expert examiner performance and judgments of difficulty and confidence on a new fingerprint database. We developed a number of quantitative measures of image characteristics and used multiple regression techniques to discover objective predictors of error as well as perceived difficulty and confidence. A number of useful predictors emerged, and these included variables related to image quality metrics, such as intensity and contrast information, as well as measures of information quantity, such as the total fingerprint area. Also included were configural features that fingerprint experts have noted, such as the presence and clarity of global features and fingerprint ridges. Within the constraints of the overall low error rates of experts, a regression model incorporating the derived predictors demonstrated reasonable success in predicting objective difficulty for print pairs, as shown both in goodness of fit measures to the original data set and in a cross validation test. The results indicate the plausibility of using objective image metrics to predict expert performance and
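
    The modelling strategy described in this record is ordinary multiple regression of difficulty on quantitative image metrics, validated on held-out print pairs. A bare-bones sketch with synthetic data follows; the feature names and coefficients are invented for illustration and are not the study's variables or values.

    ```python
    import numpy as np

    # Hedged sketch: least-squares regression of a difficulty rating on
    # image metrics, with a simple train/test split standing in for
    # cross-validation.
    rng = np.random.default_rng(4)
    n = 200
    X = np.column_stack([
        rng.random(n),            # intensity metric (illustrative)
        rng.random(n),            # contrast metric (illustrative)
        rng.random(n),            # total fingerprint area (illustrative)
    ])
    y = X @ np.array([0.5, 1.2, -0.8]) + 0.1 * rng.normal(size=n)

    X1 = np.column_stack([np.ones(n), X])      # add intercept column
    train, test = slice(0, 150), slice(150, None)
    beta, *_ = np.linalg.lstsq(X1[train], y[train], rcond=None)
    resid = y[test] - X1[test] @ beta
    print("held-out RMSE:", float(np.sqrt(np.mean(resid**2))))
    ```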

  10. Integrating fMRI with psychophysiological measurements in the study of decision-making

    OpenAIRE

    Wong, Savio W.H.; Xue, Gui; Bechara, Antoine

    2011-01-01

    Neuroimaging techniques have recently been used to examine the neural mechanisms of decision-making. Nevertheless, most of the neuroimaging studies overlook the importance of emotion and autonomic responses in modulating the process of decision-making. In this paper, we discussed how to integrate fMRI with psychophysiological measurements in studying decision-making. We suggested that psychophysiological data would complement fMRI findings in providing a more comprehensive understanding ...

  11. Quantitative biological measurement in Transmission Electron Tomography

    International Nuclear Information System (INIS)

    Mantell, Judith M; Verkade, Paul; Arkill, Kenton P

    2012-01-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood. In addition, anisotropic area shrinkage has largely been ignored. The intention of this study is to explore the shrinkage on a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials used in electron tomography. In brief: using a low-dose philosophy on the section, a focus area was used prior to a separate virgin study area for a series of known exposures on a tilted sample. The shrinkage was determined by measurements on the gold beads from both sides of the section as determined by a confirmatory tomogram. It was found that the shrinkage in area (approximately 90-95% of the original) and the thickness (approximately 65% of the original at most) agreed with previous authors, but that almost all the shrinkage occurred in the first minute, and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven, the end result was consistent. It was observed, in general, that thinner samples showed more percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required then the protocol described should be used for all areas studied.
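
    As a rough illustration of the fiducial-based bookkeeping such a protocol involves (a sketch under assumed inputs, not the authors' code), in-plane shrinkage can be estimated from pairwise bead distances and thickness shrinkage from bead z-positions on the two section surfaces:

```python
import numpy as np

def area_shrinkage(xy_before, xy_after):
    """In-plane area ratio estimated from pairwise distances between
    fiducial gold beads; inputs are (N, 2) arrays of bead positions
    before and after beam exposure."""
    d0 = np.linalg.norm(xy_before[:, None] - xy_before[None, :], axis=-1)
    d1 = np.linalg.norm(xy_after[:, None] - xy_after[None, :], axis=-1)
    iu = np.triu_indices(len(xy_before), k=1)      # unique bead pairs
    return (d1[iu] / d0[iu]).mean() ** 2           # linear ratio squared

def thickness_shrinkage(z_top0, z_bot0, z_top1, z_bot1):
    """Thickness ratio from the mean z of beads on the two surfaces,
    as located in a confirmatory tomogram."""
    return (np.mean(z_top1) - np.mean(z_bot1)) / (np.mean(z_top0) - np.mean(z_bot0))
```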

  12. Quantitative biological measurement in Transmission Electron Tomography

    Science.gov (United States)

    Mantell, Judith M.; Verkade, Paul; Arkill, Kenton P.

    2012-07-01

    It has been known for some time that biological sections shrink in the transmission electron microscope from exposure to the electron beam. This phenomenon is especially important in Electron Tomography (ET). The effect on shrinkage of parameters such as embedding medium or sample type is less well understood. In addition, anisotropic area shrinkage has largely been ignored. The intention of this study is to explore the shrinkage on a number of samples ranging in thickness from 200 nm to 500 nm. A protocol was developed to determine the shrinkage in area and thickness using the gold fiducials used in electron tomography. In brief: using a low-dose philosophy on the section, a focus area was used prior to a separate virgin study area for a series of known exposures on a tilted sample. The shrinkage was determined by measurements on the gold beads from both sides of the section as determined by a confirmatory tomogram. It was found that the shrinkage in area (approximately 90-95% of the original) and the thickness (approximately 65% of the original at most) agreed with previous authors, but that almost all the shrinkage occurred in the first minute, and that although the direction of the in-plane shrinkage (in x and y) was sometimes uneven, the end result was consistent. It was observed, in general, that thinner samples showed more percentage shrinkage than thicker ones. In conclusion, if direct quantitative measurements are required then the protocol described should be used for all areas studied.

  13. Reineke’s stand density index: a quantitative and non-unitless measure of stand density

    Science.gov (United States)

    Curtis L. VanderSchaaf

    2013-01-01

    When used as a measure of relative density, Reineke’s stand density index (SDI) can be made unitless by relating the current SDI to a standard density, but when used as a quantitative measure of stand density, SDI is not unitless. Reineke’s SDI relates the current stand density to an equivalent number of trees per unit area in a stand with a quadratic mean diameter (Dq)...
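
    For reference, Reineke's index in its common English-unit form (a standard formulation shown as a sketch; the paper's own notation and units may differ). Note that the result carries units of trees per acre at the 10-inch reference diameter, which is exactly the abstract's point that SDI is not unitless:

```python
def reineke_sdi(trees_per_acre: float, dq_inches: float) -> float:
    """Reineke's stand density index: the equivalent number of trees per
    acre in a stand with a quadratic mean diameter (Dq) of 10 inches,
    using Reineke's original slope of 1.605."""
    return trees_per_acre * (dq_inches / 10.0) ** 1.605

# Example: 400 trees/acre at Dq = 8 in  ->  SDI ~ 280 trees/acre at Dq = 10 in
print(reineke_sdi(400, 8.0))
```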

  14. Damage measurement of structural material by electron backscatter diffraction. Quantification of measurement quality toward standardization of measurement procedure

    International Nuclear Information System (INIS)

    Kamaya, Masayuki

    2011-01-01

    Several attempts have been made to assess the damage induced in materials by crystal orientation distributions identified using electron backscatter diffraction (EBSD). In particular, the local misorientation, which is the misorientation between neighboring measurement points, was shown to correlate well with the degree of material damage such as plastic strain, fatigue and creep. However, the damage assessments conducted using the local misorientations were qualitative rather than quantitative. The local misorientation can be correlated theoretically with physical parameters such as dislocation density. However, the error in crystal orientation measurements makes quantitative evaluation of the local misorientation difficult. Furthermore, the local misorientation depends on distance between the measurement points (step size). For a quantitative assessment of the local misorientation, the error in the crystal orientation measurements must be reduced or the degree of error must be shown quantitatively. In this study, first, the influence of the quality of measurements (accuracy of measurements) and step size on the local misorientation was investigated using stainless steel specimens damaged by tensile deformation or fatigue. By performing the crystal orientation measurements with different conditions, it was shown that the quality of measurement could be represented by the error index, which was previously proposed by the author. Secondly, a filtering process was applied in order to improve the accuracy of crystal orientation measurements and its effect was investigated using the error index. It was revealed that the local misorientations obtained under different measurement conditions could be compared quantitatively only when the error index and the step size were almost or exactly the same. It was also shown that the filtering process could successfully reduce the measurement error and step size dependency of the local misorientations. By applying the filtering
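
    A toy sketch of the local-misorientation quantity the study analyses, assuming orientations are supplied as unit quaternions on a regular grid; crystal-symmetry operators, which a real EBSD analysis must apply, are omitted for brevity:

```python
import numpy as np

def misorientation_deg(q1, q2):
    """Misorientation angle (degrees) between two orientations given as
    unit quaternions; symmetry reduction is omitted in this sketch."""
    dq = abs(np.dot(q1, q2))
    return 2.0 * np.degrees(np.arccos(np.clip(dq, -1.0, 1.0)))

def local_misorientation(qmap):
    """Mean misorientation of each pixel to its 4-connected neighbours on
    an (H, W, 4) quaternion map; the step size of the grid is the 'step
    size' whose influence the paper investigates."""
    H, W, _ = qmap.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            nbrs = [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                    if 0 <= i + di < H and 0 <= j + dj < W]
            out[i, j] = np.mean([misorientation_deg(qmap[i, j], qmap[a, b])
                                 for a, b in nbrs])
    return out
```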

  15. Quantitative measurements of intercellular adhesion between a macrophage and cancer cells using a cup-attached AFM chip.

    Science.gov (United States)

    Kim, Hyonchol; Yamagishi, Ayana; Imaizumi, Miku; Onomura, Yui; Nagasaki, Akira; Miyagi, Yohei; Okada, Tomoko; Nakamura, Chikashi

    2017-07-01

    Intercellular adhesion between a macrophage and cancer cells was quantitatively measured using atomic force microscopy (AFM). Cup-shaped metal hemispheres were fabricated using polystyrene particles as a template, and a cup was attached to the apex of the AFM cantilever. The cup-attached AFM chip (cup-chip) approached a murine macrophage cell (J774.2), the cell was captured on the inner concavity of the cup and picked up by withdrawing the cup-chip from the substrate. The cell-attached chip was advanced towards a murine breast cancer cell (FP10SC2), and intercellular adhesion between the two cells was quantitatively measured. To compare cell adhesion strength, the work required to separate two adhered cells (separation work) was used as a parameter. Separation work was almost 2-fold larger between a J774.2 cell and an FP10SC2 cell than between a J774.2 cell and three other cancer cell lines (4T1E, MAT-LyLu, and U-2OS), between two FP10SC2 cells, or between two J774.2 cells. As FP10SC2 was established from 4T1E as a highly metastatic cell line, this indicates that separation work increased as the malignancy of the cancer cells became higher. One possible explanation of the strong adhesion of macrophages to cancer cells observed in this study is that the measurement condition mimicked the microenvironment of tumor-associated macrophages (TAMs) in vivo, and J774.2 cells strongly expressed CD204, which is a marker of TAMs. The results of the present study, which were obtained by measuring cell adhesion strength quantitatively, indicate that the fabricated cup-chip is a useful tool for measuring intercellular adhesion easily and quantitatively. Copyright © 2017 Elsevier B.V. All rights reserved.
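
    Separation work is essentially the area enclosed by the adhesive part of the retract force-distance curve. A minimal sketch, assuming force in nN and tip-sample separation in nm, so that the trapezoid integral comes out directly in attojoules (1 nN x 1 nm = 1e-18 J = 1 aJ):

```python
import numpy as np

def separation_work_aJ(z_nm, force_nN):
    """Work to separate two adhered cells: integrate the attractive
    (negative) portion of the retract force-distance curve over the
    pulling distance; returns a positive value in attojoules."""
    adhesive = np.minimum(force_nN, 0.0)    # keep only the attractive force
    return -np.trapz(adhesive, z_nm)        # sign flip -> positive work
```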

  16. Quantitative echocardiographic measures in the assessment of single ventricle function post-Fontan: Incorporation into routine clinical practice.

    Science.gov (United States)

    Rios, Rodrigo; Ginde, Salil; Saudek, David; Loomba, Rohit S; Stelter, Jessica; Frommelt, Peter

    2017-01-01

    Quantitative echocardiographic measurements of single ventricular (SV) function have not been incorporated into routine clinical practice. A clinical protocol, which included quantitative measurements of SV deformation (global circumferential and longitudinal strain and strain rate), standard deviation of time to peak systolic strain, myocardial performance index (MPI), dP/dT from an atrioventricular valve regurgitant jet, and superior mesenteric artery resistance index, was instituted for all patients with a history of Fontan procedure undergoing echocardiography. All measures were performed in real time during clinically indicated studies and were included in clinical reports. A total of 100 consecutive patients (mean age = 11.95±6.8 years, range 17 months-31.3 years) completed the protocol between September 1, 2014 and April 29, 2015. Deformation measures were completed in 100% of the studies, MPI in 93%, dP/dT in 55%, and superior mesenteric artery Doppler in 82%. The studies were reviewed to assess for efficiency in completing the protocol. The average time for image acquisition was 27.4±8.8 minutes (range 10-62 minutes). The average time to perform deformation measures was 10.8±5.5 minutes (range 5-35 minutes), and the time from beginning of imaging to report completion was 53.4±13.7 minutes (range 27-107 minutes). There was excellent inter-observer reliability when deformation indices were blindly repeated. Patients with a single left ventricle had significantly higher circumferential strain and strain rate, longitudinal strain and strain rate, and dP/dT compared to a single right ventricle. There were no differences in quantitative indices of ventricular function between patients less than and more than 10 years post-Fontan. Advanced quantitative assessment of SV function post-Fontan can be consistently and efficiently performed in real time during clinically indicated echocardiograms with excellent reliability. © 2016, Wiley Periodicals, Inc.
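
    The dP/dT measurement from a regurgitant jet conventionally uses the simplified Bernoulli relation (dP = 4v²) between jet velocities of 1 and 3 m/s; a worked sketch of that standard calculation (the protocol's exact velocity bounds are an assumption here):

```python
def dp_dt_mmHg_per_s(dt_seconds: float) -> float:
    """dP/dT from an atrioventricular valve regurgitant jet. By the
    simplified Bernoulli relation dP = 4 * v**2, the pressure rise
    between jet velocities of 1 m/s (4 mmHg) and 3 m/s (36 mmHg) is
    32 mmHg; divide by the time taken to traverse that interval."""
    return 32.0 / dt_seconds

# Example: a 25 ms rise time gives dP/dT = 1280 mmHg/s
print(dp_dt_mmHg_per_s(0.025))
```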

  17. Development of a Draft Core Set of Domains for Measuring Shared Decision Making in Osteoarthritis: An OMERACT Working Group on Shared Decision Making.

    Science.gov (United States)

    Toupin-April, Karine; Barton, Jennifer; Fraenkel, Liana; Li, Linda; Grandpierre, Viviane; Guillemin, Francis; Rader, Tamara; Stacey, Dawn; Légaré, France; Jull, Janet; Petkovic, Jennifer; Scholte-Voshaar, Marieke; Welch, Vivian; Lyddiatt, Anne; Hofstetter, Cathie; De Wit, Maarten; March, Lyn; Meade, Tanya; Christensen, Robin; Gaujoux-Viala, Cécile; Suarez-Almazor, Maria E; Boonen, Annelies; Pohl, Christoph; Martin, Richard; Tugwell, Peter S

    2015-12-01

    Despite the importance of shared decision making for delivering patient-centered care in rheumatology, there is no consensus on how to measure its process and outcomes. The aim of this Outcome Measures in Rheumatology (OMERACT) working group is to determine the core set of domains for measuring shared decision making in intervention studies in adults with osteoarthritis (OA), from the perspectives of patients, health professionals, and researchers. We followed the OMERACT Filter 2.0 method to develop a draft core domain set by (1) forming an OMERACT working group; (2) conducting a review of domains of shared decision making; and (3) obtaining opinions of all those involved using a modified nominal group process held at a session activity at the OMERACT 12 meeting. In all, 26 people from Europe, North America, and Australia, including 5 patient research partners, participated in the session activity. Participants identified the following domains for measuring shared decision making to be included as part of the draft core set: (1) identifying the decision, (2) exchanging information, (3) clarifying views, (4) deliberating, (5) making the decision, (6) putting the decision into practice, and (7) assessing the effect of the decision. Contextual factors were also suggested. We proposed a draft core set of shared decision-making domains for OA intervention research studies. Next steps include a workshop at OMERACT 13 to reach consensus on these proposed domains in the wider OMERACT group, as well as to detail subdomains and assess instruments to develop a core outcome measurement set.

  18. CTXA hip--an extension of classical DXA measurements using quantitative CT.

    Science.gov (United States)

    Cann, Christopher E; Adams, Judith E; Brown, J Keenan; Brett, Alan D

    2014-01-01

    Bone mineral density (BMD) estimates for the proximal femur using Dual Energy X-ray Absorptiometry (DXA) are currently considered the standard for making a diagnosis of osteoporosis in an individual patient using BMD alone. We have compared BMD results from a commercial Quantitative CT (QCT) BMD analysis system, "CTXA Hip", which provides clinical data for the proximal femur, to results from DXA. We have also used CTXA Hip to determine cortical and trabecular contributions to total BMD. Sixty-nine patients were scanned using 3D QCT and DXA. CTXA Hip BMD measurements for Total Hip and Femoral Neck were compared to DXA results. Twenty-two women were scanned at 0, 1, 2 years and CTXA Hip and DXA results analyzed for long-term reproducibility. Long-term reproducibility calculated as root-mean-square averages of SDs in vivo was 0.012 g/cm2 (CV = 1.8%) for CTXA Total Hip and 0.011 g/cm2 (CV = 2.0%) for CTXA Femoral Neck compared to 0.014 g/cm2 (CV = 2.0%) and 0.016 g/cm2 (CV = 2.7%), respectively, for DXA. The correlation of Total Hip BMD CTXA vs. DXA was R = 0.97 and for Femoral Neck was R = 0.95 (SEE 0.044 g/cm2 in both cases). Cortical bone comprised 62±5% (mean ± SD) of total hip bone mass in osteoporotic women. CTXA Hip provides substantially the same clinical information as conventional DXA and in addition provides estimates of BMD in separate cortical and trabecular bone compartments, which may be useful in evaluation of bone strength.
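
    The quoted long-term reproducibility figures follow the usual precision arithmetic: per-subject SDs over repeat visits, RMS-averaged, then expressed as a coefficient of variation. A sketch with a placeholder BMD array:

```python
import numpy as np

def long_term_precision(bmd):
    """Root-mean-square average of per-subject SDs and the corresponding
    CV for repeated BMD scans; `bmd` is (subjects x visits) in g/cm^2."""
    sds = bmd.std(axis=1, ddof=1)          # per-subject SD over visits
    rms_sd = np.sqrt(np.mean(sds ** 2))    # RMS average of the SDs
    cv = 100.0 * rms_sd / bmd.mean()       # coefficient of variation, %
    return rms_sd, cv
```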

  19. CTXA hip--an extension of classical DXA measurements using quantitative CT.

    Directory of Open Access Journals (Sweden)

    Christopher E Cann

    Full Text Available Bone mineral density (BMD) estimates for the proximal femur using Dual Energy X-ray Absorptiometry (DXA) are currently considered the standard for making a diagnosis of osteoporosis in an individual patient using BMD alone. We have compared BMD results from a commercial Quantitative CT (QCT) BMD analysis system, "CTXA Hip", which provides clinical data for the proximal femur, to results from DXA. We have also used CTXA Hip to determine cortical and trabecular contributions to total BMD. Sixty-nine patients were scanned using 3D QCT and DXA. CTXA Hip BMD measurements for Total Hip and Femoral Neck were compared to DXA results. Twenty-two women were scanned at 0, 1, 2 years and CTXA Hip and DXA results analyzed for long-term reproducibility. Long-term reproducibility calculated as root-mean-square averages of SDs in vivo was 0.012 g/cm2 (CV = 1.8%) for CTXA Total Hip and 0.011 g/cm2 (CV = 2.0%) for CTXA Femoral Neck compared to 0.014 g/cm2 (CV = 2.0%) and 0.016 g/cm2 (CV = 2.7%), respectively, for DXA. The correlation of Total Hip BMD CTXA vs. DXA was R = 0.97 and for Femoral Neck was R = 0.95 (SEE 0.044 g/cm2 in both cases). Cortical bone comprised 62±5% (mean ± SD) of total hip bone mass in osteoporotic women. CTXA Hip provides substantially the same clinical information as conventional DXA and in addition provides estimates of BMD in separate cortical and trabecular bone compartments, which may be useful in evaluation of bone strength.

  20. Quantitative Measures of Swallowing Deficits in Patients With Parkinson's Disease.

    Science.gov (United States)

    Ellerston, Julia K; Heller, Amanda C; Houtz, Daniel R; Kendall, Katherine A

    2016-05-01

    Dysphagia and associated aspiration pneumonia are commonly reported sequelae of Parkinson's disease (PD). Previous studies of swallowing in patients with PD have described prolonged pharyngeal transit time, delayed onset of pharyngeal transit, cricopharyngeal (CP) achalasia, reduced pharyngeal constriction, and slowed hyolaryngeal elevation. These studies, however, used inconsistent evaluation methodology, relied on qualitative analysis, and lacked a large control group, raising concerns about diagnostic precision. The purpose of this study was to investigate swallowing function in patients with PD using a norm-referenced, quantitative approach. This retrospective study includes 34 patients with a diagnosis of PD referred to a multidisciplinary voice and swallowing clinic. Modified barium swallow studies were performed using quantitative measures of pharyngeal transit time, hyoid displacement, CP sphincter opening, area of the pharynx at maximal constriction, and timing of laryngeal vestibule closure relative to bolus arrival at the CP sphincter. Reduced pharyngeal constriction was found in 30.4% of patients, and a delay in airway closure relative to arrival of the bolus at the CP sphincter was the most common abnormality, present in 62% of patients. Previously reported findings of prolonged pharyngeal transit, poor hyoid elevation, and CP achalasia were not identified as prominent features. © The Author(s) 2015.

  1. A quantitative ELISA procedure for the measurement of membrane-bound platelet-associated IgG (PAIgG).

    Science.gov (United States)

    Lynch, D M; Lynch, J M; Howe, S E

    1985-03-01

    A quantitative ELISA assay for the measurement of in vivo bound platelet-associated IgG (PAIgG) using intact patient platelets is presented. The assay requires quantitation and standardization of the number of platelets bound to microtiter plate wells and an absorbance curve using quantitated IgG standards. Platelet-bound IgG was measured using an F(ab')2 peroxidase labeled anti-human IgG and o-phenylenediamine dihydrochloride (OPD) as the substrate. Using this assay, PAIgG for normal individuals was 2.8 ± 1.6 fg/platelet (mean ± 1 SD; n = 30). Increased levels were found in 28 of 30 patients with clinical autoimmune thrombocytopenia (ATP) with a range of 7.0-80 fg/platelet. Normal PAIgG levels were found in 26 of 30 patients with nonimmune thrombocytopenia. In the sample population studied, the PAIgG assay showed a sensitivity of 93%, specificity of 90%, a positive predictive value of 0.90, and a negative predictive value of 0.93. The procedure is highly reproducible (CV = 6.8%) and useful in evaluating patients with suspected immune mediated thrombocytopenia.
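
    The reported performance figures are the standard quantities from a 2x2 classification table; a sketch with illustrative counts broadly consistent with the quoted values (the exact false-positive count used here is an assumption):

```python
def diagnostic_performance(tp, fn, tn, fp):
    """Sensitivity, specificity and predictive values from 2x2 counts."""
    return (tp / (tp + fn),   # sensitivity
            tn / (tn + fp),   # specificity
            tp / (tp + fp),   # positive predictive value
            tn / (tn + fn))   # negative predictive value

# Illustrative counts: 28/30 immune patients test positive,
# 27/30 non-immune patients test negative.
print(diagnostic_performance(tp=28, fn=2, tn=27, fp=3))
```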

  2. Quantitative measurement of blood circulation in testes of rats using nuclear medical methods

    International Nuclear Information System (INIS)

    Ripke, R.

    1980-01-01

    The experiments show that it is possible to quantitatively assess the blood circulation and, within limits, the germinative function of the testes by measuring the counts of an incorporated radionuclide (99-Tc-pertechnetate) using an uptake measuring instrument. This is a rapid and bloodless method that can be adopted in human medicine. Acutely affected or pre-damaged testes can thus be exactly diagnosed. In the former case the change in blood circulation, and in the latter the evaluation of germinative functional ability, is of main interest. The most important measuring criterion is the 15-minute uptake U; it represents the blood circulation in the testes measured. The germinative functional ability is evaluated on the basis of the accumulation activity Nmax. (orig./MG) [de]

  3. Quantitative measurement of lightning-induced electron precipitation using VLF remote sensing

    Science.gov (United States)

    Peter, William Bolton

    This dissertation examines the detection of lightning-induced energetic electron precipitation via subionospheric Very Low Frequency (VLF) remote sensing. The primary measurement tool used is a distributed set of VLF observing sites, the Holographic Array for Ionospheric/Lightning Research (HAIL), located along the eastern side of the Rocky Mountains in the Central United States. Measurements of the VLF signal perturbations indicate that 90% of the precipitation occurs over a region ˜8 degrees in latitudinal extent, with the peak of the precipitation displaced ˜7 degrees poleward from the causative discharge. A comparison of the VLF signal perturbations recorded on the HAIL array with a comprehensive model of LEP events allows for the quantitative measurement of electron precipitation and ionospheric density enhancement with unprecedented quantitative detail. The model consists of three major components: a test-particle model of gyroresonant whistler-induced electron precipitation; a Monte Carlo simulation of energy deposition into the ionosphere; and a model of VLF subionospheric signal propagation. For the two representative LEP events studied, the model calculates peak VLF amplitude and phase perturbations within a factor of three of those observed, well within the expected variability of radiation belt flux levels. The modeled precipitated energy flux (E > 45 keV) peaks at ˜1 x 10−2 [ergs s−1 cm−2], resulting in a peak loss of ˜0.001% from a single flux tube at L˜2.2, consistent with previous satellite measurements of LEP events. Metrics quantifying the ionospheric density enhancement (NILDE) and the electron precipitation (Γ) are strongly correlated with the VLF signal perturbations calculated by the model. A conversion ratio Ψ relates VLF signal amplitude perturbations (ΔA) to the time-integrated precipitation (100-300 keV) along the VLF path (Ψ = Γ/ΔA). The total precipitation (100-300 keV) induced by one of the representative LEP

  4. Quantitative computed tomography measurements to evaluate airway disease in chronic obstructive pulmonary disease: Relationship to physiological measurements, clinical index and visual assessment of airway disease

    International Nuclear Information System (INIS)

    Nambu, Atsushi; Zach, Jordan; Schroeder, Joyce; Jin, Gongyoung; Kim, Song Soo; Kim, Yu-IL; Schnell, Christina; Bowler, Russell; Lynch, David A.

    2016-01-01

    Purpose: To correlate currently available quantitative CT measurements for airway disease with physiological indices and the body-mass index, airflow obstruction, dyspnea, and exercise capacity (BODE) index in patients with chronic obstructive pulmonary disease (COPD). Materials and methods: This study was approved by our institutional review board (IRB number 2778). Written informed consent was obtained from all subjects. The subjects included 188 current and former cigarette smokers from the COPDGene cohort who underwent inspiratory and expiratory CT and also had physiological measurements for the evaluation of airflow limitation, including FEF25-75%, airway resistance (Raw), and specific airway conductance (sGaw). The BODE index was used as the index of clinical symptoms. Quantitative CT measures included % low attenuation areas [% voxels ≤ −950 Hounsfield units (HU) on inspiratory CT, %LAA−950ins], percent gas trapping (% voxels ≤ −856 HU on expiratory CT, %LAA−856exp), relative inspiratory to expiratory volume change of voxels with attenuation values from −856 to −950 HU [Relative Volume Change, RVC−856 to −950], the expiratory to inspiratory ratio of mean lung density (E/I-ratioMLD), Pi10, and airway wall thickness (WT), luminal diameter (LD) and airway wall area percent (WA%) in the segmental, subsegmental and subsubsegmental bronchi on inspiratory CT. Correlation coefficients were calculated between the QCT measurements and physiological measurements in all subjects and in the subjects with mild emphysema (%LAA−950ins < 10%). Univariate and multiple variable analysis for the BODE index were also performed. Adjustments were made for age, gender, smoking pack years, FEF25-75%, Raw, and sGaw. Results: Quantitative CT measurements had significant correlations with physiological indices. Among them, E/I-ratioMLD had the strongest correlations with FEF25-75% (r = −0.648, p < 0.001) and sGaw (r = −0.624, p < 0.001) while in the subjects with
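
    The density-threshold measures above are simple voxel statistics; a sketch assuming lung-segmented HU arrays as input (array names are placeholders):

```python
import numpy as np

def laa_percent(hu, threshold=-950):
    """Percent of lung voxels at or below an attenuation threshold,
    e.g. -950 HU on inspiratory CT (%LAA-950ins) or -856 HU on
    expiratory CT (%LAA-856exp)."""
    return 100.0 * np.mean(hu <= threshold)

def ei_ratio_mld(hu_exp, hu_ins):
    """Expiratory-to-inspiratory ratio of mean lung density."""
    return hu_exp.mean() / hu_ins.mean()
```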

  5. Quantitative and simultaneous non-invasive measurement of skin hydration and sebum levels

    Science.gov (United States)

    Ezerskaia, Anna; Pereira, S. F.; Urbach, H. Paul; Verhagen, Rieko; Varghese, Babu

    2016-01-01

    We report a method for quantitative and simultaneous non-contact in-vivo hydration and sebum measurements of the skin using an infrared optical spectroscopic set-up. The method utilizes differential detection with three wavelengths 1720, 1750, and 1770 nm, corresponding to the lipid vibrational bands that lie “in between” the prominent water absorption bands. We have used an emulsifier containing hydro- and lipophilic components to mix water and sebum in various volume fractions, which was applied to the skin to mimic different oily-dry skin conditions. We also measured the skin sebum and hydration values on the forehead under natural conditions and their variations in response to external stimuli. Good agreement was found between our experimental results and reference values measured using conventional biophysical methods such as the Corneometer and Sebumeter. PMID:27375946

  6. Piezoelectric tuning fork biosensors for the quantitative measurement of biomolecular interactions

    International Nuclear Information System (INIS)

    Gonzalez, Laura; Maria Benito, Angel; Puig-Vidal, Manel; Otero, Jorge; Rodrigues, Mafalda; Pérez-García, Lluïsa

    2015-01-01

    The quantitative measurement of biomolecular interactions is of great interest in molecular biology. Atomic force microscopy (AFM) has proved its capacity to act as a biosensor and determine the affinity between biomolecules of interest. Nevertheless, the detection scheme presents certain limitations when it comes to developing a compact biosensor. Recently, piezoelectric quartz tuning forks (QTFs) have been used as laser-free detection sensors for AFM. However, only a few studies along these lines have considered soft biological samples, and even fewer constitute quantified molecular recognition experiments. Here, we demonstrate the capacity of QTF probes to perform specific interaction measurements between biotin–streptavidin complexes in buffer solution. We propose in this paper a variant of dynamic force spectroscopy based on representing adhesion energies E (aJ) against pulling rates v (nm s−1). Our results are compared with conventional AFM measurements and show the great potential of these sensors in molecular interaction studies. (paper)

  7. Gold Nanoparticle Labeling Based ICP-MS Detection/Measurement of Bacteria, and Their Quantitative Photothermal Destruction

    Science.gov (United States)

    Lin, Yunfeng

    2015-01-01

    Bacteria such as Salmonella and E. coli present a great challenge in public health care in today’s society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Utilizing Salmonella DT104 as an example bacterial strain for our investigation, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOPs) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay has the ability to measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Based on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay’s photothermal efficiency, in order to reduce the amount of GNPOPs in the assay and ultimately eliminate any potential side effects/toxicity to the surrounding cells in vivo. Results suggest that our assay may serve as a promising candidate for qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447

  8. The Reliability and Validity of Discrete and Continuous Measures of Psychopathology: A Quantitative Review

    Science.gov (United States)

    Markon, Kristian E.; Chmielewski, Michael; Miller, Christopher J.

    2011-01-01

    In 2 meta-analyses involving 58 studies and 59,575 participants, we quantitatively summarized the relative reliability and validity of continuous (i.e., dimensional) and discrete (i.e., categorical) measures of psychopathology. Overall, results suggest an expected 15% increase in reliability and 37% increase in validity through adoption of a…

  9. Sustainable Urban Forestry Potential Based Quantitative And Qualitative Measurement Using Geospatial Technique

    International Nuclear Information System (INIS)

    Rosli, A Z; Reba, M N M; Roslan, N; Room, M H M

    2014-01-01

    In order to maintain the stability of natural ecosystems around urban areas, urban forestry will be the best initiative to maintain and control green space in our country. Integration between remote sensing (RS) and geospatial information systems (GIS) serves as an effective tool for monitoring environmental changes and for planning, managing and developing a sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to identify potential urban forestry sites based on qualitative and quantitative measures, using priority parameter ranking, in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, landuse, soil groups and hydrology, a Digital Elevation Model (DEM) and soil series data were applied to enhance the satellite image in detecting and locating present attributes and features on the ground. The Multi-Criteria Decision Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. This study highlighted that priority ranking with MCDM is a cost-effective tool for decision-making in urban forestry planning and landscaping.
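
    The simplest MCDM instance is a weighted-sum ranking; the sketch below uses invented scores and weights for the paper's six criteria purely for illustration (the study's actual pairwise-comparison weighting scheme may differ):

```python
import numpy as np

# Hypothetical normalized scores (0-1) for three candidate sites against the
# criteria: slope, soil texture, drainage, area, natural resources, vicinity.
scores = np.array([[0.8, 0.6, 0.9, 0.7, 0.5, 0.6],
                   [0.4, 0.9, 0.5, 0.8, 0.7, 0.9],
                   [0.6, 0.5, 0.7, 0.6, 0.9, 0.4]])
# Assumed criterion weights (would come from pairwise comparison in practice).
weights = np.array([0.25, 0.15, 0.20, 0.15, 0.15, 0.10])

ranking = scores @ weights            # weighted-sum MCDM score per site
print(np.argsort(ranking)[::-1])      # site indices ordered best -> worst
```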

  10. Sustainable Urban Forestry Potential Based Quantitative And Qualitative Measurement Using Geospatial Technique

    Science.gov (United States)

    Rosli, A. Z.; Reba, M. N. M.; Roslan, N.; Room, M. H. M.

    2014-02-01

    In order to maintain the stability of natural ecosystems around urban areas, urban forestry will be the best initiative to maintain and control green space in our country. Integration between remote sensing (RS) and geospatial information systems (GIS) serves as an effective tool for monitoring environmental changes and for planning, managing and developing a sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to identify potential urban forestry sites based on qualitative and quantitative measures, using priority parameter ranking, in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, landuse, soil groups and hydrology, a Digital Elevation Model (DEM) and soil series data were applied to enhance the satellite image in detecting and locating present attributes and features on the ground. The Multi-Criteria Decision Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. This study highlighted that priority ranking with MCDM is a cost-effective tool for decision-making in urban forestry planning and landscaping.

  11. Quantitative MRI and strength measurements in the assessment of muscle quality in Duchenne muscular dystrophy.

    Science.gov (United States)

    Wokke, B H; van den Bergen, J C; Versluis, M J; Niks, E H; Milles, J; Webb, A G; van Zwet, E W; Aartsma-Rus, A; Verschuuren, J J; Kan, H E

    2014-05-01

    The purpose of this study was to assess leg muscle quality and give a detailed description of leg muscle involvement in a series of Duchenne muscular dystrophy patients using quantitative MRI and strength measurements. Fatty infiltration, as well as total and contractile (not fatty infiltrated) cross sectional areas of various leg muscles were determined in 16 Duchenne patients and 11 controls (aged 8-15). To determine specific muscle strength, four leg muscle groups (quadriceps femoris, hamstrings, anterior tibialis and triceps surae) were measured and related to the amount of contractile tissue. In patients, the quadriceps femoris showed decreased total and contractile cross sectional area, attributable to muscle atrophy. The total, but not the contractile, cross sectional area of the triceps surae was increased in patients, corresponding to hypertrophy. Specific strength decreased in all four muscle groups of Duchenne patients, indicating reduced muscle quality. This suggests that muscle hypertrophy and fatty infiltration are two distinct pathological processes, differing between muscle groups. Additionally, the quality of remaining muscle fibers is severely reduced in the legs of Duchenne patients. The combination of quantitative MRI and quantitative muscle testing could be a valuable outcome parameter in longitudinal studies and in the follow-up of therapeutic effects. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Quantitative analysis of a wind energy conversion model

    International Nuclear Information System (INIS)

    Zucker, Florian; Gräbner, Anna; Strunz, Andreas; Meyn, Jan-Peter

    2015-01-01

    A rotor of 12 cm diameter is attached to a precision electric motor, used as a generator, to make a model wind turbine. Output power of the generator is measured in a wind tunnel with up to 15 m s−1 air velocity. The maximum power is 3.4 W, and the power conversion factor from kinetic to electric energy is c_p = 0.15. The v³ power law is confirmed. The model illustrates several technically important features of industrial wind turbines quantitatively. (paper)
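
    The conversion factor compares electrical output to the kinetic power of the air passing through the rotor disc; a sketch that reproduces the paper's c_p ≈ 0.15 under an assumed air density of 1.2 kg m−3:

```python
import numpy as np

def power_coefficient(p_electric_W, rotor_diameter_m, v_m_per_s, rho=1.2):
    """c_p = P / (0.5 * rho * A * v**3): electrical output over the
    kinetic power of the air flowing through the rotor disc area A."""
    area = np.pi * (rotor_diameter_m / 2.0) ** 2
    return p_electric_W / (0.5 * rho * area * v_m_per_s ** 3)

# The paper's numbers: 3.4 W at 15 m/s with a 0.12 m rotor -> c_p ~ 0.15
print(power_coefficient(3.4, 0.12, 15.0))
```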

  13. Quantitative measurement of water diffusion lifetimes at a protein/DNA interface by NMR

    International Nuclear Information System (INIS)

    Gruschus, James M.; Ferretti, James A.

    2001-01-01

    Hydration site lifetimes of slowly diffusing water molecules at the protein/DNA interface of the vnd/NK-2 homeodomain DNA complex were determined using novel three-dimensional NMR techniques. The lifetimes were calculated using the ratios of ROE and NOE cross-relaxation rates between the water and the protein backbone and side chain amides. This calculation of the lifetimes is based on a model of the spectral density function of the water-protein interaction consisting of three timescales of motion: fast vibrational/rotational motion, diffusion into/out of the hydration site, and overall macromolecular tumbling. The lifetimes measured ranged from approximately 400 ps to more than 5 ns, and nearly all the slowly diffusing water molecules detected lie at the protein/DNA interface. A quantitative analysis of relayed water cross-relaxation indicated that even at very short mixing times, 5 ms for ROESY and 12 ms for NOESY, relay of magnetization can make a small but detectable contribution to the measured rates. The temperature dependences of the NOE rates were measured to help discriminate direct dipolar cross-relaxation from chemical exchange. Comparison with several X-ray structures of homeodomain/DNA complexes reveals a strong correspondence between water molecules in conserved locations and the slowly diffusing water molecules detected by NMR. A homology model based on the X-ray structures was created to visualize the conserved water molecules detected at the vnd/NK-2 homeodomain DNA interface. Two chains of water molecules are seen at the right and left sides of the major groove, adjacent to the third helix of the homeodomain. Two water-mediated hydrogen bond bridges spanning the protein/DNA interface are present in the model, one between the backbone of Phe8 and a DNA phosphate, and one between the side chain of Asn51 and a DNA phosphate. The hydrogen bond bridge between Asn51 and the DNA might be especially important since the DNA contact made by the invariant

  14. Quantitative sensory testing measures individual pain responses in emergency department patients

    Directory of Open Access Journals (Sweden)

    Duffy KJ

    2017-05-01

    Full Text Available Kevin J Duffy, Katharyn L Flickinger, Jeffrey T Kristan, Melissa J Repine, Alexandro Gianforcaro, Rebecca B Hasley, Saad Feroz, Jessica M Rupp, Jumana Al-Baghli, Maria L Pacella, Brian P Suffoletto, Clifton W Callaway Department of Emergency Medicine, School of Medicine, University of Pittsburgh, Pittsburgh, PA, USA Background: Refining and individualizing treatment of acute pain in the emergency department (ED) is a high priority, given that painful complaints are the most common reasons for ED visits. Few tools exist to objectively measure pain perception in the ED setting. We speculated that variation in perception of fixed painful stimuli would explain individual variation in reported pain and response to treatment among ED patients. Materials and methods: In three studies, we (1) describe performance characteristics of brief quantitative sensory testing (QST) in 50 healthy volunteers, (2) test effects of 10 mg oxycodone versus placebo on QST measures in 18 healthy volunteers, and (3) measure interindividual differences in nociception and treatment responses in 198 ED patients with a painful complaint during ED treatment. QST measures adapted for use in the ED included pressure sensation threshold, pressure pain threshold (PPT), pressure pain response (PPR), and cold pain tolerance (CPT) tests. Results: First, all QST measures had high inter-rater reliability and test–retest reproducibility. Second, 10 mg oxycodone reduced PPR, increased PPT, and prolonged CPT. Third, baseline PPT and PPR revealed hyperalgesia in 31 (16%) ED subjects relative to healthy volunteers. In 173 (88%) ED subjects who completed repeat testing 30 minutes after pain treatment, PPT increased and PPR decreased (Cohen’s dz 0.10–0.19). Verbal pain scores (0–10) for the ED complaint decreased by 2.2 (95% confidence interval [CI]: 1.9, 2.6; Cohen’s dz 0.97) but did not covary with the changes in PPT and PPR (r=0.05–0.13). Treatment effects were greatest in ED subjects
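
    The paired effect sizes reported above (Cohen's dz) are the mean within-subject change divided by the SD of the changes; a minimal sketch:

```python
import numpy as np

def cohens_dz(before, after):
    """Paired-samples effect size: mean within-subject change divided by
    the standard deviation of the changes."""
    diff = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    return diff.mean() / diff.std(ddof=1)
```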

  15. Quantitative Method of Measuring Metastatic Activity

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1999-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane-associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  16. Timed function tests, motor function measure, and quantitative thigh muscle MRI in ambulant children with Duchenne muscular dystrophy: A cross-sectional analysis.

    Science.gov (United States)

    Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike

    2018-01-01

    The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments such as the 6-minute walk test, motor function measure and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients, aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated (p < 0.001). Quantitative muscle MRI values significantly correlated with all clinical assessments, with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion, and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.
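
    For context, the 2-point Dixon fat fraction rests on the in-phase/opposed-phase sum and difference; a sketch assuming ideal magnitude images with no phase errors:

```python
import numpy as np

def dixon_fat_fraction(in_phase, opposed_phase):
    """2-point Dixon: in-phase = W + F, opposed-phase = W - F, so water
    and fat images follow from their sum and difference; the fat
    fraction is F / (W + F)."""
    water = 0.5 * (in_phase + opposed_phase)
    fat = 0.5 * (in_phase - opposed_phase)
    return fat / np.maximum(water + fat, 1e-9)   # guard against divide-by-zero
```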

  17. Development of a Draft Core Set of Domains for Measuring Shared Decision Making in Osteoarthritis: An OMERACT Working Group on Shared Decision Making

    Science.gov (United States)

    Toupin April, Karine; Barton, Jennifer; Fraenkel, Liana; Li, Linda; Grandpierre, Viviane; Guillemin, Francis; Rader, Tamara; Stacey, Dawn; Légaré, France; Jull, Janet; Petkovic, Jennifer; Scholte Voshaar, Marieke; Welch, Vivian; Lyddiatt, Anne; Hofstetter, Cathie; De Wit, Maarten; March, Lyn; Meade, Tanya; Christensen, Robin; Gaujoux-Viala, Cécile; Suarez-Almazor, Maria E.; Boonen, Annelies; Pohl, Christoph; Martin, Richard; Tugwell, Peter

    2015-01-01

    Objective Despite the importance of shared decision making for delivering patient-centred care in rheumatology, there is no consensus on how to measure its process and outcomes. The aim of this OMERACT working group is to determine the core set of domains for measuring shared decision making in intervention studies in adults with osteoarthritis (OA), from the perspective of patients, health professionals and researchers. Methods We followed the OMERACT Filter 2.0 to develop a draft core domain set, which consisted of: (i) forming an OMERACT working group; (ii) conducting a review of domains of shared decision making; and (iii) obtaining the opinions of stakeholders using a modified nominal group process held at a session activity at the OMERACT 2014 meeting. Results 26 stakeholders from Europe, North America and Australia, including 5 patient research partners, participated in the session activity. Participants identified the following domains for measuring shared decision making to be included as part of the Draft Core Set: 1) Identifying the decision; 2) Exchanging Information; 3) Clarifying views; 4) Deliberating; 5) Making the decision; 6) Putting the decision into practice; and 7) Assessing the impact of the decision. Contextual factors were also suggested. Conclusion We propose a Draft Core Set of shared decision making domains for OA intervention research studies. Next steps include a workshop at OMERACT 2016 to reach consensus on these proposed domains in the wider OMERACT group, as well as detail sub-domains and assess instruments to develop a Core Outcome Measurement Set. PMID:25877502

  18. Quantitative computed tomography measurements to evaluate airway disease in chronic obstructive pulmonary disease: Relationship to physiological measurements, clinical index and visual assessment of airway disease

    Energy Technology Data Exchange (ETDEWEB)

    Nambu, Atsushi, E-mail: nambu-a@gray.plala.or.jp [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO, 80206 (United States); Zach, Jordan, E-mail: ZachJ@NJHealth.org [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO, 80206 (United States); Schroeder, Joyce, E-mail: Joyce.schroeder@stanfordalumni.org [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO, 80206 (United States); Jin, Gongyoung, E-mail: gyjin@chonbuk.ac.kr [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO, 80206 (United States); Kim, Song Soo, E-mail: haneul88@hanmail.net [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO, 80206 (United States); Kim, Yu-IL, E-mail: kyionly@chonnam.ac.kr [Department of Medicine, National Jewish Health, Denver, CO (United States); Schnell, Christina, E-mail: SchnellC@NJHealth.org [Department of Medicine, National Jewish Health, Denver, CO (United States); Bowler, Russell, E-mail: BowlerR@NJHealth.org [Division of Pulmonary Medicine, Department of Medicine, National Jewish Health (United States); Lynch, David A., E-mail: LynchD@NJHealth.org [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO, 80206 (United States)

    2016-11-15

    Purpose: To correlate currently available quantitative CT measurements for airway disease with physiological indices and the body-mass index, airflow obstruction, dyspnea, and exercise capacity (BODE) index in patients with chronic obstructive pulmonary disease (COPD). Materials and methods: This study was approved by our institutional review board (IRB number 2778). Written informed consent was obtained from all subjects. The subjects included 188 current and former cigarette smokers from the COPDGene cohort who underwent inspiratory and expiratory CT and also had physiological measurements for the evaluation of airflow limitation, including FEF25-75%, airway resistance (Raw), and specific airway conductance (sGaw). The BODE index was used as the index of clinical symptoms. Quantitative CT measures included % low attenuation areas [% voxels ≤ −950 Hounsfield units (HU) on inspiratory CT, %LAA−950ins], percent gas trapping (% voxels ≤ −856 HU on expiratory CT, %LAA−856exp), relative inspiratory to expiratory volume change of voxels with attenuation values from −856 to −950 HU [Relative Volume Change, RVC−856 to −950], the expiratory to inspiratory ratio of mean lung density (E/I-ratioMLD), Pi10, and airway wall thickness (WT), luminal diameter (LD) and airway wall area percent (WA%) in the segmental, subsegmental and subsubsegmental bronchi on inspiratory CT. Correlation coefficients were calculated between the QCT measurements and physiological measurements in all subjects and in the subjects with mild emphysema (%LAA−950ins < 10%). Univariate and multiple variable analysis for the BODE index were also performed. Adjustments were made for age, gender, smoking pack years, FEF25-75%, Raw, and sGaw. Results: Quantitative CT measurements had significant correlations with physiological indices. Among them, E/I-ratioMLD had the strongest correlations with FEF25-75% (r = −0.648, p < 0.001) and sGaw (r = −0.624, p < 0.001) while in the subjects with

  19. A passive quantitative measurement of airway resistance using depth data.

    Science.gov (United States)

    Ostadabbas, Sarah; Bulach, Christoph; Ku, David N; Anderson, Larry J; Ghovanloo, Maysam

    2014-01-01

    The Respiratory Syncytial Virus (RSV) is the most common cause of serious lower respiratory tract infections in infants and young children. RSV often causes increased airway resistance, clinically detected as wheezing by chest auscultation. In this disease, expiratory flows are significantly reduced due to the high resistance in the patient's airway passages. A quantitative method for measuring resistance can greatly benefit the diagnosis and management of children with RSV infections as well as with other lung diseases. Airway resistance is defined as the lung pressure divided by the airflow. In this paper, we propose a method to quantify resistance through a simple, non-contact measurement of chest volume that can act as a surrogate measure of the lung pressure and volumetric airflow. We used depth data collected by a Microsoft Kinect camera for the measurement of the lung volume over time. In our experimentation, breathing through a number of plastic straws induced different airway resistances. For a standard spirometry test, our volume/flow estimation using Kinect showed strong correlation with the flow data collected by a commercially available spirometer (five subjects, each performing 20 breathing trials; correlation coefficient = 0.88, with 95% confidence interval). As the number of straws decreased, emulating a higher airway obstruction, our algorithm was sufficient to distinguish between several levels of airway resistance.
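
    Since resistance is pressure over flow, and flow is the time derivative of the volume signal recovered from the depth data, the core arithmetic is short. A schematic sketch with a surrogate pressure input (the paper derives its pressure surrogate from chest volume, so treat the inputs here as placeholders):

```python
import numpy as np

def airway_resistance(volume_l, t_s, pressure_cmH2O):
    """Resistance = driving pressure / airflow, in cmH2O/(L/s). The flow
    is the numerical time derivative of the chest-volume signal estimated
    from Kinect depth data; the pressure term is a surrogate input."""
    flow = np.gradient(volume_l, t_s)                       # L/s
    return pressure_cmH2O / np.maximum(np.abs(flow), 1e-6)  # avoid zero flow
```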

  20. Design and performance of a high-resolution frictional force microscope with quantitative three-dimensional force sensitivity

    International Nuclear Information System (INIS)

    Dienwiebel, M.; Kuyper, E. de; Crama, L.; Frenken, J.W.M.; Heimberg, J.A.; Spaanderman, D.-J.; Glatra van Loon, D.; Zijlstra, T.; Drift, E. van der

    2005-01-01

    In this article, the construction and initial tests of a frictional force microscope are described. The instrument makes use of a microfabricated cantilever that allows one to independently measure the lateral forces in X and Y directions as well as the normal force. We use four fiber-optic interferometers to detect the motion of the sensor in three dimensions. The properties of our cantilevers allow easy and accurate normal and lateral force calibration, making it possible to measure the lateral force on a fully quantitative basis. First experiments on highly oriented pyrolytic graphite demonstrate that the microscope is capable of measuring lateral forces with a resolution down to 15 pN

  1. Quantitative method for measuring heat flux emitted from a cryogenic object

    Science.gov (United States)

    Duncan, R.V.

    1993-03-16

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  2. Quantitative method for measuring heat flux emitted from a cryogenic object

    International Nuclear Information System (INIS)

    Duncan, R.V.

    1993-01-01

    The present invention is a quantitative method for measuring the total heat flux, and for deriving the total power dissipation, of a heat-fluxing object which includes the steps of placing an electrical noise-emitting heat-fluxing object in a liquid helium bath and measuring the superfluid transition temperature of the bath. The temperature of the liquid helium bath is thereafter reduced until some measurable parameter, such as the electrical noise, exhibited by the heat-fluxing object or a temperature-dependent resistive thin film in intimate contact with the heat-fluxing object, becomes greatly reduced. The temperature of the liquid helium bath is measured at this point. The difference between the superfluid transition temperature of the liquid helium bath surrounding the heat-fluxing object, and the temperature of the liquid helium bath when the electrical noise emitted by the heat-fluxing object becomes greatly reduced, is determined. The total heat flux from the heat-fluxing object is determined as a function of this difference between these temperatures. In certain applications, the technique can be used to optimize thermal design parameters of cryogenic electronics, for example, Josephson junction and infrared sensing devices.

  3. Quantitative dual energy CT measurements in rabbit VX2 liver tumors: Comparison to perfusion CT measurements and histopathological findings

    International Nuclear Information System (INIS)

    Zhang, Long Jiang; Wu, Shengyong; Wang, Mei; Lu, Li; Chen, Bo; Jin, Lixin; Wang, Jiandong; Larson, Andrew C.; Lu, Guang Ming

    2012-01-01

    Purpose: To evaluate the correlation between quantitative dual energy CT and perfusion CT measurements in rabbit VX2 liver tumors. Materials and methods: This study was approved by the institutional animal care and use committee at our institution. Nine rabbits with VX2 liver tumors underwent contrast-enhanced dual energy CT and perfusion CT. CT attenuation for the tumors and normal liver parenchyma and tumor-to-liver ratio were obtained at the 140 kVp, 80 kVp, average weighted images and dual energy CT iodine maps. Quantitative parameters for the viable tumor and adjacent liver were measured with perfusion CT. The correlation between the enhancement values of the tumor in iodine maps and perfusion CT parameters of each tumor was analyzed. Radiation dose from dual energy CT and perfusion CT was measured. Results: Enhancement values for the tumor were higher than that for normal liver parenchyma at the hepatic arterial phase (P < 0.05). The highest tumor-to-liver ratio was obtained in hepatic arterial phase iodine map. Hepatic blood flow of the tumor was higher than that for adjacent liver (P < 0.05). Enhancement values of hepatic tumors in the iodine maps positively correlated with permeability of capillary vessel surface (r = 0.913, P < 0.001), hepatic blood flow (r = 0.512, P = 0.010), and hepatic blood volume (r = 0.464, P = 0.022) at the hepatic arterial phases. The effective radiation dose from perfusion CT was higher than that from DECT (P < 0.001). Conclusions: The enhancement values for viable tumor tissues measured in iodine maps were well correlated to perfusion CT measurements in rabbit VX2 liver tumors. Compared with perfusion CT, dual energy CT of the liver required a lower radiation dose.

  4. Quantitative measurement of piezoelectric coefficient of thin film using a scanning evanescent microwave microscope.

    Science.gov (United States)

    Zhao, Zhenli; Luo, Zhenlin; Liu, Chihui; Wu, Wenbin; Gao, Chen; Lu, Yalin

    2008-06-01

    This article describes a new approach to quantitatively measure the piezoelectric coefficients of thin films at the microscopic level using a scanning evanescent microwave microscope. This technique can resolve 10 pm of deformation caused by the piezoelectric effect and has the advantages of high scanning speed, large scanning area, submicron spatial resolution, and simultaneous accessibility to many other related properties. Results from the test measurements of the longitudinal piezoelectric coefficient of a PZT thin film agree well with those from other techniques reported in the literature.
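
    The longitudinal coefficient itself is simply the field-induced displacement per applied volt (converse piezoelectric effect); a worked sketch using the instrument's quoted 10 pm resolution as an example input (the 1 V drive is an assumption for illustration):

```python
def d33_pm_per_V(displacement_pm: float, voltage_V: float) -> float:
    """Converse piezoelectric effect: the longitudinal coefficient is the
    measured thickness change divided by the applied voltage."""
    return displacement_pm / voltage_V

# Example: a 10 pm deformation (the quoted resolution) under a 1 V drive
# corresponds to d33 = 10 pm/V.
print(d33_pm_per_V(10.0, 1.0))
```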

  5. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    Science.gov (United States)

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown its high stability, reliability and ease of use.

  6. Measuring Filament Orientation: A New Quantitative, Local Approach

    Energy Technology Data Exchange (ETDEWEB)

    Green, C.-E.; Cunningham, M. R.; Jones, P. A. [School of Physics, University of New South Wales, Sydney, NSW, 2052 (Australia); Dawson, J. R. [CSIRO Astronomy and Space Science, Australia Telescope National Facility, P.O. Box 76, Epping, NSW 1710 (Australia); Novak, G. [Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA) and Department of Physics and Astronomy, Northwestern University, 2145 Sheridan Road, Evanston, IL 60208 (United States); Fissel, L. M. [National Radio Astronomy Observatory (NRAO), 520 Edgemont Road, Charlottesville, VA, 22903 (United States)

    2017-09-01

    The relative orientation between filamentary structures in molecular clouds and the ambient magnetic field provides insight into filament formation and stability. To calculate the relative orientation, a measurement of filament orientation is first required. We propose a new method to calculate the orientation of the one-pixel-wide filament skeleton that is output by filament identification algorithms such as filfinder. We derive the local filament orientation from the direction of the intensity gradient in the skeleton image using the Sobel filter and a few simple post-processing steps. We call this the “Sobel-gradient method.” The resulting filament orientation map can be compared quantitatively on a local scale with the magnetic field orientation map to then find the relative orientation of the filament with respect to the magnetic field at each point along the filament. It can also be used for constructing radial profiles for filament width fitting. The proposed method facilitates automation in analyses of filament skeletons, which is imperative in this era of “big data.”
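
    As an illustration of the orientation step just described, the following minimal Python sketch (not the authors' code; filfinder is assumed only to supply a one-pixel-wide binary skeleton) applies a Sobel filter to the skeleton image and converts the gradient direction into a local filament orientation. The doubled-angle averaging stands in for the paper's unspecified "simple post-processing steps," since the raw gradient vanishes exactly on the ridge crest.

        import numpy as np
        from scipy import ndimage

        def filament_orientation(skeleton, smooth=1.5):
            """Orientation (degrees, in [0, 180)) at each skeleton pixel, NaN elsewhere."""
            img = skeleton.astype(float)
            gx = ndimage.sobel(img, axis=1)   # horizontal intensity gradient
            gy = ndimage.sobel(img, axis=0)   # vertical intensity gradient
            # Average in doubled-angle (structure-tensor) form so that opposite
            # gradients on either side of the thin filament reinforce, not cancel.
            c2 = ndimage.gaussian_filter(gx * gx - gy * gy, smooth)
            s2 = ndimage.gaussian_filter(2.0 * gx * gy, smooth)
            grad_dir = 0.5 * np.degrees(np.arctan2(s2, c2))  # gradient direction
            orient = (grad_dir + 90.0) % 180.0               # filament runs perpendicular
            orient[skeleton == 0] = np.nan
            return orient

        # Toy check: a horizontal one-pixel-wide filament should come out near 0 degrees.
        sk = np.zeros((64, 64))
        sk[32, 10:54] = 1
        print(np.nanmedian(filament_orientation(sk)))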

  7. Measuring Filament Orientation: A New Quantitative, Local Approach

    Science.gov (United States)

    Green, C.-E.; Dawson, J. R.; Cunningham, M. R.; Jones, P. A.; Novak, G.; Fissel, L. M.

    2017-09-01

    The relative orientation between filamentary structures in molecular clouds and the ambient magnetic field provides insight into filament formation and stability. To calculate the relative orientation, a measurement of filament orientation is first required. We propose a new method to calculate the orientation of the one-pixel-wide filament skeleton that is output by filament identification algorithms such as filfinder. We derive the local filament orientation from the direction of the intensity gradient in the skeleton image using the Sobel filter and a few simple post-processing steps. We call this the “Sobel-gradient method.” The resulting filament orientation map can be compared quantitatively on a local scale with the magnetic field orientation map to then find the relative orientation of the filament with respect to the magnetic field at each point along the filament. It can also be used for constructing radial profiles for filament width fitting. The proposed method facilitates automation in analyses of filament skeletons, which is imperative in this era of “big data.”

  8. Measuring Filament Orientation: A New Quantitative, Local Approach

    International Nuclear Information System (INIS)

    Green, C.-E.; Cunningham, M. R.; Jones, P. A.; Dawson, J. R.; Novak, G.; Fissel, L. M.

    2017-01-01

    The relative orientation between filamentary structures in molecular clouds and the ambient magnetic field provides insight into filament formation and stability. To calculate the relative orientation, a measurement of filament orientation is first required. We propose a new method to calculate the orientation of the one-pixel-wide filament skeleton that is output by filament identification algorithms such as filfinder. We derive the local filament orientation from the direction of the intensity gradient in the skeleton image using the Sobel filter and a few simple post-processing steps. We call this the “Sobel-gradient method.” The resulting filament orientation map can be compared quantitatively on a local scale with the magnetic field orientation map to then find the relative orientation of the filament with respect to the magnetic field at each point along the filament. It can also be used for constructing radial profiles for filament width fitting. The proposed method facilitates automation in analyses of filament skeletons, which is imperative in this era of “big data.”

  9. Standardized Approach to Quantitatively Measure Residual Limb Skin Health in Individuals with Lower Limb Amputation.

    Science.gov (United States)

    Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K

    2017-07-01

    Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health, a critical component of prosthesis rehabilitation for individuals with lower limb amputation, is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measuring residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.

  10. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and to collect data in a high-throughput, quantitative manner. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics) and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  11. Quantitative computed tomography in measurement of vertebral trabecular bone mass

    International Nuclear Information System (INIS)

    Nilsson, M.; Johnell, O.; Jonsson, K.; Redlund-Johnell, I.

    1988-01-01

    Measurement of bone mineral concentration (BMC) can be done by several modalities. Quantitative computed tomography (QCT) can be used for measurements at different sites and with different types of bone (trabecular-cortical). This study presents a modified method that reduces the influence of fat. Determination of BMC was made from single-energy computed tomography (CT) measurements of the mean Hounsfield number in the trabecular part of the L1 vertebra. The method takes into account the age-dependent composition of the trabecular part of the vertebra. As the amount of intravertebral fat increases with age, the effective atomic number of these parts decreases. This results in a non-linear calibration curve for single-energy CT. Comparison of BMC values obtained with the non-linear calibration curve or with the traditional linear calibration against those obtained with a theoretically superior pixel-by-pixel electron density calculation method showed results clearly in favor of the non-linear method. The material consisted of 327 patients aged 6 to 91 years, of whom 197 were considered normal. The normal data show a sharp decrease in trabecular bone after the age of 50 in women; in men a slower decrease was found. The vertebrae were larger in men than in women. (orig.)

  12. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that measures patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  13. Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing

    Science.gov (United States)

    Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak

    2012-01-01

    This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed, and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early-stage Osteoporosis and in Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knockout studies. Furthermore, we also envision potential clinical applications, including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
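
    Since the record describes extracting a D-spacing from a Fourier transform, a compact sketch may help. The example below is a 1D analog of the paper's 2D-FFT approach (the sampling step and synthetic profile are illustrative assumptions): the dominant spectral peak of a fibril height profile is converted back into a spatial period.

        import numpy as np

        dx = 2.0                                     # sampling step, nm per pixel (assumed)
        x = np.arange(1024) * dx
        d_true = 67.0                                # nominal collagen D-spacing, nm
        profile = np.sin(2 * np.pi * x / d_true) + 0.2 * np.random.randn(x.size)

        spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
        freqs = np.fft.rfftfreq(profile.size, d=dx)  # spatial frequency, cycles/nm
        peak = np.argmax(spectrum[1:]) + 1           # skip the DC bin
        print(f"estimated D-spacing: {1.0 / freqs[peak]:.1f} nm")  # close to 67 nm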

  14. Measuring patient participation in surgical treatment decision-making from healthcare professionals' perspective.

    Science.gov (United States)

    Heggland, Liv-Helen; Mikkelsen, Aslaug; Øgaard, Torvald; Hausken, Kjell

    2014-02-01

    To develop, empirically test, and validate an instrument measuring patient participation in surgical treatment decision-making from healthcare professionals' perspective. Since the advent of New Public Management in many Western countries, patient participation in healthcare decision-making has been considered a best practice. A common notion is that a well-educated and well-informed public wants to choose its own treatments and providers and wants to ask questions about the quality of its health services. Survey. A self-report measuring instrument was designed and administered to 620 healthcare professionals. Items were developed, validated and tested by 451 nurses and physicians working in six surgical wards in a university hospital in Norway. A 16-item scale with the following four dimensions was developed: information dissemination, formulation of options, integration of information and control. Factor analysis procedures and reliability testing were performed. A one-way, between-groups analysis of variance was conducted to compare doctors' and nurses' opinions on the four dimensions of patient participation in surgical treatment decision-making. This article shows that patient participation in surgical treatment decision-making can be measured by a 16-item scale with four distinct dimensions. The analysis demonstrated a reasonable level of construct validity and reliability. Nurses and physicians have a positive attitude towards patient participation overall, but the two groups differ in the extent to which they accept the idea of patient participation in treatment decision-making. The instrument can be a tool for managers and healthcare professionals in the implementation of patient participation in clinical practice. Data from the instrument can be useful for identifying the health services being provided and the areas in which patient participation could be strengthened. © 2013 Blackwell Publishing Ltd.

  15. Towards a Quantitative Performance Measurement Framework to Assess the Impact of Geographic Information Standards

    Science.gov (United States)

    Vandenbroucke, D.; Van Orshoven, J.; Vancauwenberghe, G.

    2012-12-01

    Over the last decades, the use of Geographic Information (GI) has gained importance, in the public as well as the private sector. But even though much spatial data and related information exist, data sets are scattered over many organizations and departments. In practice it remains difficult to find the spatial data sets needed, and to access, obtain and prepare them for use in applications. Therefore Spatial Data Infrastructures (SDI) have been developed to enhance the access, use and sharing of GI. SDIs consist of a set of technological and non-technological components to reach this goal. Since the 1990s many SDI initiatives have emerged. Ultimately, all these initiatives aim to enhance the flow of spatial data between organizations (users as well as producers) involved in intra- and inter-organizational and even cross-country business processes. However, the flow of information and its re-use in different business processes requires technical and semantic interoperability: the first should guarantee that system components can interoperate and use the data, while the second should guarantee that data content is understood by all users in the same way. GI-standards within the SDI are necessary to make this happen. However, it is not known whether this is realized in practice. The objective of the research is therefore to develop a quantitative framework to assess the impact of GI-standards on the performance of business processes. For that purpose, indicators are defined and tested in several cases throughout Europe. The proposed research will build upon previous work carried out in the SPATIALIST project, which analyzed the impact of different technological and non-technological factors on the SDI-performance of business processes (Dessers et al., 2011). The current research aims to apply quantitative performance measurement techniques - which are frequently used to measure the performance of production processes (Anupindi et al., 2005). Key to reach the research objectives

  16. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods such as perfusion techniques, which measure flow, require the introduction of a tube assembly into the gastrointestinal tract, with possible artifacts in the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux.

  17. "Why am i a volunteer?": building a quantitative scale

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Cavalcante

    Full Text Available This paper aims to analyze the validity of a quantitative instrument to identify what attracts someone to volunteer work, as well as what makes them stay and what makes them quit such an activity. The theoretical framework lists aspects related to volunteer work, followed by a discussion of models for analyzing volunteer motivation. As to its objectives, this research is descriptive, since it presents the analysis of the validity of a quantitative instrument that seeks to understand and describe the reasons for volunteering at the Pastoral da Criança, a Brazilian NGO. This instrument is based on theoretical ideas by Souza, Medeiros and Fernandes (2006). Reliability - Cronbach's Alpha - reached values between 0.7 and 0.8. The Kaiser-Meyer-Olkin measure of sampling adequacy also yielded a good index: 0.74. Despite the good results for reliability and sampling adequacy of the factor analysis, none of the variables resulted in the expected combination, namely indicators versus profile. It is necessary to improve the semantic meaning of certain factors, or even to increase the number of indicators so as to generate additional correlations among them.

  18. OPPORTUNISTIC ASPERGILLUS PATHOGENS MEASURED IN HOME AND HOSPITAL TAP WATER BY MOLD SPECIFIC QUANTITATIVE PCR (MSQPCR)

    Science.gov (United States)

    Opportunistic fungal pathogens are a concern because of the increasing number of immunocompromised patients. The goal of this research was to test a simple extraction method and rapid quantitative PCR (QPCR) measurement of the occurrence of potential pathogens, Aspergillus fumiga...

  19. Quantitative measurement and visualization of biofilm O2 consumption rates in membrane filtration systems

    KAUST Repository

    Prest, Emmanuelle I E C

    2012-03-01

    There is a strong need for techniques enabling direct assessment of the biological activity of biofouling in membrane filtration systems. Here we present a new quantitative and non-destructive method for mapping O2 dynamics in biofilms during biofouling studies in membrane fouling simulators (MFS). Transparent planar O2 optodes in combination with a luminescence lifetime imaging system were used to map the two-dimensional distribution of O2 concentrations and consumption rates inside the MFS. The O2 distribution was indicative of biofilm development. Biofilm activity was characterized by imaging of O2 consumption rates, where low and high activity areas could be clearly distinguished. The spatial development of O2 consumption rates, flow channels and stagnant areas could be determined. This can be used for studies on concentration polarization, i.e. salt accumulation at the membrane surface resulting in increased salt passage and reduced water flux. The new optode-based O2 imaging technique applied to MFS allows non-destructive and spatially resolved quantitative biological activity measurements (BAM) for on-site biofouling diagnosis and laboratory studies. The following set of complementary tools is now available to study development and control of biofouling in membrane systems: (i) MFS, (ii) sensitive pressure drop measurement, (iii) magnetic resonance imaging, (iv) numerical modelling, and (v) biological activity measurement based on the O2 imaging methodology. © 2011 Elsevier B.V.

  20. The Role of Physiotherapy Extended Scope Practitioners in Musculoskeletal care with Focus on Decision Making and Clinical Outcomes: A Systematic Review of Quantitative and Qualitative Research.

    Science.gov (United States)

    Thompson, Jonathan; Yoward, Samantha; Dawson, Pamela

    2017-06-01

    Physiotherapy extended scope practitioner (ESP) roles are widely utilized in the management of musculoskeletal conditions. The present article reviews the current literature, with particular emphasis on the decision-making process, patient/clinician interaction and clinical outcomes. A systematic review of musculoskeletal extended scope practice was carried out. The review focused on the outcomes of interventions, and on the interactions and decision-making processes between ESPs and their patients. A wide search strategy was employed, through multiple databases, grey literature and experts in the field. Qualitative and quantitative studies alike were included and a mixed-methods synthesis approach was undertaken in analysing the findings of included studies. A total of 476 articles were identified, 25 of which (22 quantitative and three qualitative) met the criteria for full quality appraisal and synthesis. It was not possible to conduct a meta-analysis owing to data heterogeneity. The results showed high patient satisfaction with the ESP role, support for ESP staff listing patients for orthopaedic surgery, a high positive correlation in decision making between ESPs and orthopaedic surgeons, and evidence of a positive impact on patient outcomes. Qualitative themes reflected the importance of ESP clinical decision making and interpersonal skills and their role in patient education. There is broad support for the physiotherapy ESP role and evidence of favourable outcomes from ESP intervention. Clinical decisions made by ESPs correlate well with those of medical colleagues, although there is a lack of detail explaining the ESP decision-making process itself and the influences and mechanisms by which it occurs. Copyright © 2016 John Wiley & Sons, Ltd.

  1. Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests

    Science.gov (United States)

    Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.

    2010-01-01

    Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10-50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.

  2. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  3. Quantitative method for measurement of the Goos-Hanchen effect based on source divergence considerations

    International Nuclear Information System (INIS)

    Gray, Jeffrey F.; Puri, Ashok

    2007-01-01

    In this paper we report on a method for quantitative measurement and characterization of the Goos-Hanchen effect based upon the real world performance of optical sources. A numerical model of a nonideal plane wave is developed in terms of uniform divergence properties. This model is applied to the Goos-Hanchen shift equations to determine beam shift displacement characteristics, which provides quantitative estimates of finite shifts near critical angle. As a potential technique for carrying out a meaningful comparison with experiments, a classical method of edge detection is discussed. To this end a line spread Green's function is defined which can be used to determine the effective transfer function of the near critical angle behavior of divergent plane waves. The process yields a distributed (blurred) output with a line spread function characteristic of the inverse square root nature of the Goos-Hanchen shift equation. A parameter of interest for measurement is given by the edge shift function. Modern imaging and image processing methods provide suitable techniques for exploiting the edge shift phenomena to attain refractive index sensitivities of the order of 10⁻⁶, comparable with the recent results reported in the literature

  4. Combination of optically measured coordinates and displacements for quantitative investigation of complex objects

    Science.gov (United States)

    Andrae, Peter; Beeck, Manfred-Andreas; Jueptner, Werner P. O.; Nadeborn, Werner; Osten, Wolfgang

    1996-09-01

    Holographic interferometry makes it possible to measure high-precision displacement data in the range of the wavelength of the laser light used. However, the determination of 3D-displacement vectors of objects with complex surfaces requires the measurement of 3D-object coordinates, not only to account for local sensitivities but also to distinguish between in-plane deformation, i.e. strains, and out-of-plane components, i.e. shears. To this purpose both the surface displacements and coordinates have to be combined, and it is advantageous to make the data available to CAE systems. The object surface has to be approximated analytically from the measured point cloud to generate a surface mesh. The displacement vectors can be assigned to the nodes of this surface mesh for visualization of the deformation of the object under test. They can also be compared to the results of FEM calculations or used as boundary conditions for further numerical investigations. Here the 3D-object coordinates are measured in a separate topometric set-up using a modified fringe projection technique to acquire absolute phase values and a sophisticated geometrical model to map these phase data onto coordinates precisely. The determination of 3D-displacement vectors requires the measurement of several interference phase distributions for at least three independent sensitivity directions, depending on the observation and illumination directions as well as the 3D-position of each measuring point. These geometric quantities have to be transformed into a reference coordinate system of the interferometric set-up in order to calculate the geometric matrix. The necessary transformation can be realized by detecting object features in both data sets and subsequently determining the external camera orientation. This paper presents a consistent solution for the measurement and combination of shape and displacement data, including their transformation into simulation systems.

  5. Prognostic value of quantitative fluorodeoxyglucose measurements in newly diagnosed metastatic breast cancer

    International Nuclear Information System (INIS)

    Ulaner, Gary A; Eaton, Anne; Morris, Patrick G; Lilienstein, Joshua; Jhaveri, Komal; Patil, Sujata; Fazio, Maurizio; Larson, Steven; Hudis, Clifford A; Jochelson, Maxine S

    2013-01-01

    The aim of this study was to determine the prognostic value of quantitative fluorodeoxyglucose (FDG) measurements (maximum standardized uptake value [SUVmax], metabolic tumor volume [MTV], and total lesion glycolysis [TLG]) in patients with newly diagnosed metastatic breast cancer (MBC). An IRB-approved retrospective review was performed of patients who underwent FDG positron emission tomography (PET)/computed tomography (CT) from 1/02 to 12/08 within 60 days of MBC diagnosis. Patients with FDG-avid lesions who had not received chemotherapy in the prior 30 days were included. Target lesions in bone, lymph node (LN), liver, and lung were analyzed for SUVmax, MTV, and TLG. Medical records were reviewed for patient characteristics and overall survival (OS). Cox regression was used to test associations between quantitative FDG measurements and OS. A total of 253 patients were identified, with disease in bone (n = 150), LN (n = 162), liver (n = 48), and lung (n = 66) at the time of metastatic diagnosis. A higher SUVmax tertile was associated with worse OS in bone metastases (highest vs. lowest tertile hazard ratio [HR] = 3.1, P < 0.01), but not in LN, liver or lung (all P > 0.1). A higher MTV tertile was associated with worse OS in LN (HR = 2.4, P < 0.01) and liver (HR = 3.0, P = 0.02) metastases, but not in bone (P = 0.22) or lung (P = 0.14). A higher TLG tertile was associated with worse OS in bone (HR = 2.2, P = 0.02), LN (HR = 2.3, P < 0.01), and liver (HR = 4.9, P < 0.01) metastases, but not in lung (P = 0.19). We conclude that measures of FDG avidity are prognostic biomarkers in newly diagnosed MBC. SUVmax and TLG were both predictors of survival in breast cancer patients with bone metastases. TLG may be a more informative biomarker of OS than SUVmax for patients with LN and liver metastases. Measures of fluorodeoxyglucose (FDG) avidity are prognostic biomarkers in newly diagnosed metastatic breast cancer. Volumetric measurements, such as total lesion glycolysis (TLG

  6. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women

    NARCIS (Netherlands)

    Bruijn, Merel M. C.; Kamphuis, Esme I.; Hoesli, Irene M.; Martinez de Tejada, Begoña; Loccufier, Anne R.; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M.; Oudijk, Martijn A.; Jacquemyn, Yves; Schulzke, Sven M.; Vetter, Grit; Hoste, Griet; Vis, Jolande Y.; Kok, Marjolein; Mol, Ben W. J.; van Baaren, Gert-Jan

    2016-01-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the

  7. Quantitative determination of localized tissue oxygen concentration in vivo by two-photon excitation phosphorescence lifetime measurements

    NARCIS (Netherlands)

    Mik, Egbert G.; van Leeuwen, Ton G.; Raat, Nicolaas J.; Ince, Can

    2004-01-01

    This study describes the use of two-photon excitation phosphorescence lifetime measurements for quantitative oxygen determination in vivo. Doubling the excitation wavelength of Pd-porphyrin from visible light to the infrared allows for deeper tissue penetration and a more precise and confined

  8. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields
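
    The report's actual resputtering model is not reproduced in this abstract; purely as a toy illustration of the kind of dose-dependent loss it describes, one can assume first-order loss in which each incident ion removes previously implanted atoms in proportion to the retained areal density N, i.e. dN/dphi = 1 - k*N. All numbers below are assumed for illustration only.

        import numpy as np

        k = 1e-16                                  # loss constant, cm^2/ion (assumed)
        phi = np.logspace(12, 17, 6)               # implanted dose, ions/cm^2
        retained = (1.0 - np.exp(-k * phi)) / k    # solution of dN/dphi = 1 - k*N
        loss_pct = 100.0 * (1.0 - retained / phi)
        for p, l in zip(phi, loss_pct):
            print(f"dose {p:.1e} ions/cm^2 -> {l:6.2f} % lost to resputtering")

    At small doses the predicted loss is negligible, consistent with the regime the report targets: doses too small to cause significant loss by resputtering.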

  9. Quantitative heartbeat coupling measures in human-horse interaction.

    Science.gov (United States)

    Lanata, Antonio; Guidi, Andrea; Valenza, Gaetano; Baragli, Paolo; Scilingo, Enzo Pasquale

    2016-08-01

    We present a study focused on quantitative estimation of human-horse dynamic interaction. A set of measures based on magnitude and phase coupling between the heartbeat dynamics of humans and horses in three different conditions is reported: no interaction, visual/olfactory interaction, and grooming. Specifically, Magnitude Squared Coherence (MSC), Mean Phase Coherence (MPC) and Dynamic Time Warping (DTW) were used as estimators of the amount of coupling between human and horse through the analysis of their heart rate variability (HRV) time series in a group of eleven human subjects and one horse. The rationale behind this study is that the interaction of two complex biological systems tends towards a coupling process whose dynamical evolution is modulated by the kind and duration of the interaction itself. We achieved a consistent, statistically significant difference for all three indices. Moreover, a Nearest Mean Classifier was able to recognize the three classes of interaction with an accuracy greater than 70%. Although preliminary, these encouraging results allow a discrimination of three distinct phases in a real human-animal interaction, opening the way to characterization of the empirically proven relationship between human and horse.
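
    Of the three estimators named above, magnitude squared coherence is the most direct to reproduce. The sketch below computes MSC between two evenly resampled HRV series with scipy; the surrogate signals, sampling rate, and frequency band are assumptions for illustration, not the study's data.

        import numpy as np
        from scipy.signal import coherence

        fs = 4.0                                   # Hz, a typical HRV resampling rate
        t = np.arange(0, 300, 1.0 / fs)            # a 5-minute window
        shared = np.sin(2 * np.pi * 0.1 * t)       # a common low-frequency rhythm
        hrv_human = shared + 0.5 * np.random.randn(t.size)
        hrv_horse = 0.8 * shared + 0.5 * np.random.randn(t.size)

        f, msc = coherence(hrv_human, hrv_horse, fs=fs, nperseg=256)
        band = (f >= 0.04) & (f <= 0.15)           # LF band often used in HRV analysis
        print(f"mean LF-band coherence: {msc[band].mean():.2f}")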

  10. Selecting quantitative water management measures at the river basin scale in a global change context

    Science.gov (United States)

    Girard, Corentin; Rinaudo, Jean-Daniel; Caballero, Yvan; Pulido-Velazquez, Manuel

    2013-04-01

    One of the main challenges in the implementation of the Water Framework Directive (WFD) in the European Union is the definition of programmes of measures to reach good status of the European water bodies. In areas where water scarcity is an issue, one of these challenges is the selection of water conservation and capacity expansion measures to ensure minimum environmental in-stream flow requirements. At the same time, the WFD calls for the use of economic analysis to identify the most cost-effective combination of measures at the river basin scale to achieve its objective. In this respect, hydro-economic river basin models, by integrating economic, environmental and hydrological aspects at the river basin scale in a consistent framework, represent a promising approach. This article presents a least-cost river basin optimization model (LCRBOM) that selects the combination of quantitative water management measures to meet environmental flows for future scenarios of agricultural and urban demand, taking into account the impact of climate change. The model has been implemented in a case study on a Mediterranean basin in the south of France, the Orb River basin. The basin has been identified as in need of quantitative water management measures in order to reach good status of its water bodies. The LCRBOM has been developed in GAMS, applying Mixed Integer Linear Programming. It is run to select the set of measures that minimizes the total annualized cost of the applied measures while meeting the demands and minimum in-stream flow constraints. For the economic analysis, the programme of measures is composed of water conservation measures for agricultural and urban water demands, compared with measures mobilizing new water resources from groundwater, inter-basin transfers and improved reservoir operating rules. The total annual cost of each measure is calculated for each demand unit considering operation, maintenance and

  11. Fast and quantitative differentiation of single-base mismatched DNA by initial reaction rate of catalytic hairpin assembly.

    Science.gov (United States)

    Li, Chenxi; Li, Yixin; Xu, Xiao; Wang, Xinyi; Chen, Yang; Yang, Xiaoda; Liu, Feng; Li, Na

    2014-10-15

    The widely used catalytic hairpin assembly (CHA) amplification strategy generally needs several hours to accomplish one measurement with the prevailing maximum-intensity detection mode, making it less practical for assays where high throughput or speed is desired. To make the best use of the kinetic specificity of the toehold domain for circuit reaction initiation, we developed a mathematical model and proposed an initial-reaction-rate detection mode to quantitatively differentiate single-base mismatches. Using the kinetic mode, assay time can be reduced substantially, to 10 min per measurement, with sensitivity and single-base mismatch differentiating ability comparable to those obtained with the maximum-intensity detection mode. This initial-reaction-rate approach not only provides fast and quantitative differentiation of single-base mismatches, but also helps in-depth understanding of the CHA system, which will benefit the design of highly sensitive and specific toehold-mediated hybridization reactions. Copyright © 2014 Elsevier B.V. All rights reserved.
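
    The initial-rate readout lends itself to a small worked example. In the sketch below (simulated fluorescence traces; rate constants and the fitting window are chosen only for illustration), the slope of a straight-line fit over the first few minutes separates a matched target from a single-base mismatch, instead of waiting hours for the plateau.

        import numpy as np

        t = np.arange(0, 10.5, 0.5)                              # minutes
        matched  = 120 * (1 - np.exp(-0.30 * t)) + np.random.randn(t.size)
        mismatch = 120 * (1 - np.exp(-0.05 * t)) + np.random.randn(t.size)

        def initial_rate(time, signal, window=4.0):
            early = time <= window                               # quasi-linear early phase
            slope, _ = np.polyfit(time[early], signal[early], 1)
            return slope

        print(f"matched target rate:       {initial_rate(t, matched):5.1f} a.u./min")
        print(f"single-base mismatch rate: {initial_rate(t, mismatch):5.1f} a.u./min")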

  12. Limitations of quantitative photoacoustic measurements of blood oxygenation in small vessels

    International Nuclear Information System (INIS)

    Sivaramakrishnan, Mathangi; Maslov, Konstantin; Zhang, Hao F; Stoica, George; Wang, Lihong V

    2007-01-01

    We investigate the feasibility of obtaining accurate quantitative information, such as local blood oxygenation level (sO2), with a spatial resolution of about 50 μm from spectral photoacoustic (PA) measurements. The optical wavelength dependence of the peak values of the PA signals is utilized to obtain the local blood oxygenation level. In our in vitro experimental models, the PA signal amplitude is found to be linearly proportional to the blood optical absorption coefficient when using ultrasonic transducers with central frequencies high enough such that the ultrasonic wavelengths are shorter than the light penetration depth into the blood vessels. For an optical wavelength in the 578-596 nm region, with a transducer central frequency that is above 25 MHz, the sensitivity and accuracy of sO2 inversion is shown to be better than 4%. The effect of the transducer focal position on the accuracy of quantifying blood oxygenation is found to be negligible. In vivo oxygenation measurements of rat skin microvasculature yield results consistent with those from in vitro studies, although factors specific to in vivo measurements, such as the spectral dependence of tissue optical attenuation, dramatically affect the accuracy of sO2 quantification in vivo
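
    The sO2 inversion described above rests on multi-wavelength spectral unmixing. A minimal two-wavelength sketch follows; the extinction coefficients and amplitudes are placeholder numbers, not values from the paper, since only the ratio of the recovered concentrations matters here.

        import numpy as np

        # PA amplitude p(lambda) ~ eps_HbO2(lambda)*C_HbO2 + eps_Hb(lambda)*C_Hb
        eps = np.array([[1.0, 2.0],     # wavelength 1: [eps_HbO2, eps_Hb] (placeholder)
                        [1.5, 0.7]])    # wavelength 2 (placeholder)
        p = np.array([1.60, 1.48])      # simulated PA peak amplitudes

        c_hbo2, c_hb = np.linalg.solve(eps, p)     # relative concentrations
        so2 = c_hbo2 / (c_hbo2 + c_hb)
        print(f"estimated sO2: {so2:.2f}")         # 0.67 for these numbers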

  13. Neuroeconomic Measures of Social Decision-Making Across the Lifespan

    Directory of Open Access Journals (Sweden)

    Lusha eZhu

    2012-09-01

    Full Text Available Social and decision-making deficits are often the first symptoms of a striking number of neurodegenerative disorders associated with aging. These include not only disorders that directly impact dopamine and the basal ganglia, such as Parkinson's disease, but also degeneration in which multiple neural pathways are affected over the course of normal aging. The impact of such deficits can be dramatic, as in cases of financial fraud, which disproportionately affect the elderly. Unlike memory and motor impairments, however, which are readily recognized as symptoms of more serious underlying neurological conditions, social and decision-making deficits often do not elicit comparable concern in the elderly. Furthermore, few behavioral measures exist to quantify these deficits, due in part to our limited knowledge of the core cognitive components or their neurobiological substrates. Here we probe age-related differences in decision-making using a game theory paradigm previously shown to dissociate contributions of the basal ganglia and prefrontal regions to behavior. Combined with computational modeling, we provide evidence that behavioral deficits in elderly participants are driven primarily by an over-reliance on trial-and-error reinforcement learning that does not take into account the strategic context, which may underlie the elderly's susceptibility to fraud.

  14. Neuroeconomic measures of social decision-making across the lifespan.

    Science.gov (United States)

    Zhu, Lusha; Walsh, Daniel; Hsu, Ming

    2012-01-01

    Social and decision-making deficits are often the first symptoms of a striking number of neurodegenerative disorders associated with aging. These include not only disorders that directly impact dopamine and the basal ganglia, such as Parkinson's disease, but also degeneration in which multiple neural pathways are affected over the course of normal aging. The impact of such deficits can be dramatic, as in cases of financial fraud, which disproportionately affect the elderly. Unlike memory and motor impairments, however, which are readily recognized as symptoms of more serious underlying neurological conditions, social and decision-making deficits often do not elicit comparable concern in the elderly. Furthermore, few behavioral measures exist to quantify these deficits, due in part to our limited knowledge of the core cognitive components or their neurobiological substrates. Here we probe age-related differences in decision-making using a game theory paradigm previously shown to dissociate contributions of the basal ganglia and prefrontal regions to behavior. Combined with computational modeling, we provide evidence that age-related changes in elderly participants are driven primarily by an over-reliance on trial-and-error reinforcement learning that does not take into account the strategic context, which may underlie cognitive deficits that contribute to social vulnerability in elderly individuals.
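
    To make the modeling claim concrete, here is a toy sketch of the kind of trial-and-error reinforcement-learning rule the study contrasts with strategically sophisticated play; the two-option game, payoff scheme, and parameters are illustrative assumptions, not the study's actual task.

        import numpy as np

        rng = np.random.default_rng(0)
        alpha, beta = 0.2, 3.0            # learning rate, softmax inverse temperature
        q = np.zeros(2)                   # learned action values for two options

        def choose(q):
            p = np.exp(beta * q - np.max(beta * q))
            p /= p.sum()
            return rng.choice(2, p=p), p

        for trial in range(200):
            a, p = choose(q)
            # A strategic opponent exploits predictability: the more often an
            # action is chosen, the less often it pays off.
            reward = float(rng.random() < (1.0 - p[a]))
            q[a] += alpha * (reward - q[a])   # Rescorla-Wagner / Q-learning update
        print("final action values:", np.round(q, 2))

    A learner of this kind tracks recent payoffs but never models the opponent, which is the sense in which it "does not take into account the strategic context."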

  15. Capsular Outcomes After Pediatric Cataract Surgery Without Intraocular Lens Implantation: Qualitative Classification and Quantitative Measurement.

    Science.gov (United States)

    Tan, Xuhua; Lin, Haotian; Lin, Zhuoling; Chen, Jingjing; Tang, Xiangchen; Luo, Lixia; Chen, Weirong; Liu, Yizhi

    2016-03-01

    The objective of this study was to investigate capsular outcomes 12 months after pediatric cataract surgery without intraocular lens implantation via qualitative classification and quantitative measurement. This cross-sectional study was approved by the institutional review board of Zhongshan Ophthalmic Center of Sun Yat-sen University in Guangzhou, China. Digital coaxial retro-illumination photographs (DCRPs) of 329 aphakic pediatric eyes were obtained 12 months after pediatric cataract surgery without intraocular lens implantation. Three capsular measures were assessed: anterior capsule opening area (ACOA), posterior capsule opening area (PCOA), and posterior capsule opening opacity (PCOO). Capsular outcomes were qualitatively classified into 3 types based on the PCOO: Type I - capsule with mild opacification but no invasion into the capsule opening; Type II - capsule with moderate opacification accompanied by contraction of the ACOA and invasion into the occluding part of the PCOA; and Type III - capsule with severe opacification accompanied by total occlusion of the PCOA. Software was developed to quantitatively measure the ACOA, PCOA, and PCOO using standardized DCRPs. The relationships between the intraoperative anterior and posterior capsulorhexis sizes and the qualitative capsular types were statistically analyzed. The DCRPs of 315 aphakic eyes (95.8%) of 191 children were included. Capsular outcomes were classified into 3 types: Type I: 120 eyes (38.1%); Type II: 157 eyes (49.8%); Type III: 38 eyes (12.1%). The scores of the capsular outcomes were negatively correlated with intraoperative anterior capsulorhexis size (R = -0.572), the PCOA increased in size from Type I to Type II, and the PCOO increased from Type II to Type III (all P < 0.05). Capsular outcomes after pediatric cataract surgery can be qualitatively classified and quantitatively measured by acquisition, division, definition, and user

  16. Development of a safety decision-making scenario to measure worker safety in agriculture.

    Science.gov (United States)

    Mosher, G A; Keren, N; Freeman, S A; Hurburgh, C R

    2014-04-01

    Human factors play an important role in the management of occupational safety, especially in high-hazard workplaces such as commercial grain-handling facilities. Employee decision-making patterns represent an essential component of the safety system within a work environment. This research describes the process used to create a safety decision-making scenario measuring how grain-handling employees make choices in a safety-related work task. A sample of 160 employees completed safety decision-making simulations based on a hypothetical but realistic scenario in a grain-handling environment. Their choices and the information they used to make their choices were recorded. Although the employees emphasized safety information in their decision-making process, not all of their choices were safe choices. Factors influencing their choices are discussed, and implications for industry, management, and workers are shared.

  17. A Fan-tastic Quantitative Exploration of Ohm's Law

    Science.gov (United States)

    Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William

    2018-02-01

    Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence, which makes them less suitable as a tool for quantitative analysis. Some instructors show that light bulbs do not obey Ohm's law either outright or through inquiry-based laboratory experiments. Others avoid the subject altogether by using bulbs strictly for qualitative purposes and then later switching to resistors for a numerical analysis, or by changing the operating conditions of the bulb so that it is "barely" glowing. It seems incongruous to develop a conceptual basis for the behavior of simple circuits using bulbs only to later reveal that they do not follow Ohm's law. Recently, small computer fans were proposed as a suitable replacement of bulbs for qualitative analysis of simple circuits where the current is related to the rotational speed of the fans. In this contribution, we demonstrate that fans can also be used for quantitative measurements and provide suggestions for successful classroom implementation.
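
    As a concrete version of the quantitative measurement suggested above, the sketch below fits simulated voltage-current readings for an ohmic device to a straight line; the slope recovers R. All numbers are illustrative, not classroom data from the article.

        import numpy as np

        current = np.array([0.05, 0.10, 0.15, 0.20, 0.25])          # A
        voltage = 47.0 * current + np.random.normal(0, 0.05, 5)     # V, "measured"

        r_fit, v_offset = np.polyfit(current, voltage, 1)           # V = R*I + b
        print(f"fitted resistance: {r_fit:.1f} ohm (offset {v_offset:+.3f} V)")

    A near-zero offset and a good linear fit are the quantitative signatures of ohmic behavior that incandescent bulbs, with their temperature-dependent filaments, fail to show.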

  18. Quantitative nitric oxide measurements by means of laser-induced fluorescence in a heavy-duty Diesel engine

    NARCIS (Netherlands)

    Verbiezen, K.; Vliet, van A.P.; Klein-Douwel, R.J.H.; Ganippa, L.C.; Bougie, H.J.T.; Meerts, W.L.; Dam, N.J.; Meulen, ter J.J.

    2005-01-01

    Quantitative in-cylinder laser-induced fluorescence measurements of nitric oxide in a heavy-duty Diesel engine are presented. Special attention is paid to experimental techniques to assess the attenuation of the laser beam and the fluorescence signal by the cylinder contents. This attenuation can be

  19. Quantitative measurements in laser-induced plasmas using optical probing. Final report

    International Nuclear Information System (INIS)

    Sweeney, D.W.

    1981-01-01

    Optical probing of laser-induced plasmas can be used to quantitatively reconstruct electron number densities and magnetic fields. Numerical techniques for extracting quantitative information from the experimental data are described. A computer simulation of optical probing is used to determine the quantitative information that can reasonably be extracted from real experimental interferometric systems to reconstruct electron number density distributions. An example of a reconstructed interferogram shows a steepened electron distribution due to radiation-pressure effects
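
    One quantitative step behind such interferometric probing can be sketched briefly. For an underdense plasma the refractive index is approximately 1 - n_e/(2*n_c), so a fringe shift F at probe wavelength lambda corresponds to a line-integrated electron density of 2*lambda*n_c*F. The wavelength and fringe shift below are assumed example values; a full spatial reconstruction would additionally require, e.g., Abel inversion for an axisymmetric plasma.

        import numpy as np
        from scipy.constants import c, e, epsilon_0, m_e

        lam = 532e-9                                   # probe wavelength, m (assumed)
        omega = 2 * np.pi * c / lam
        n_crit = epsilon_0 * m_e * omega**2 / e**2     # critical density, m^-3

        F = 0.5                                        # measured fringe shift (example)
        line_density = 2 * lam * n_crit * F            # integral of n_e dl, m^-2
        print(f"critical density:    {n_crit:.2e} m^-3")
        print(f"line-integrated n_e: {line_density:.2e} m^-2")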

  20. A new importance measure for risk-informed decision making

    International Nuclear Information System (INIS)

    Borgonovo, E.; Apostolakis, G.E.

    2000-01-01

    Recently, several authors have pointed out that the traditional importance measures have limitations. In this study, the problem was investigated through an analysis at the parameter level, and a new measure was introduced. The measure is based on small parameter variations and is capable of accounting for the importance of a group of components/parameters. The definition, computational steps, and an application of the new importance measure for risk-informed decision making are presented here. Unlike traditional importance measures, the differential importance measure (DIM) deals with changes in the various parameters that determine the unavailability/unreliability of a component, e.g., failure rates, common-cause failure rates, and individual human errors. The importance of the component unavailability/unreliability can be calculated from the importance of the parameters. DIM can be calculated for the frequency of initiating events, while risk achievement worth (RAW) is limited to binary events, e.g., component unavailability. The changes in parameters are 'small', which is more realistic than the drastic assumption in RAW that the component is always down. DIM is additive, which allows evaluation of the impact of changes, such as the relaxation of quality assurance requirements, that affect groups of parameters, e.g., the failure rates of a group of pumps. (M.N.)
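
    A small numerical sketch may make the definition concrete. It assumes the common form DIM_i = (dR/dx_i * dx_i) / sum_j (dR/dx_j * dx_j) with uniform fractional parameter changes; the three-parameter risk model is a toy example, not from the paper.

        import numpy as np

        def risk(x):
            # Toy model: two redundant pumps (x[0], x[1]) in series with a valve (x[2]).
            return x[0] * x[1] + x[2]

        x0 = np.array([1e-2, 1e-2, 1e-3])      # baseline parameter values (assumed)

        def dim(risk_fn, x, rel_step=1e-4):
            contribs = np.empty(x.size)
            for i in range(x.size):
                xp = x.copy()
                xp[i] += rel_step * x[i]       # uniform fractional change
                contribs[i] = risk_fn(xp) - risk_fn(x)   # first-order dR contribution
            return contribs / contribs.sum()

        d = dim(risk, x0)
        print("DIM per parameter:", np.round(d, 3), "| sum =", round(float(d.sum()), 3))
        print("DIM of the pump group:", round(float(d[0] + d[1]), 3))

    The last line illustrates the additivity property stressed in the abstract: the importance of a group of parameters is simply the sum of their individual DIMs.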

  1. Quantitative measurement of ultraviolet-induced damage in cellular DNA by an enzyme immunodot assay

    International Nuclear Information System (INIS)

    Wakizaka, A.; Nishizawa, Y.; Aiba, N.; Okuhara, E.; Takahashi, S.

    1989-01-01

    A simple enzyme immunoassay procedure was developed for the quantitative determination of 254-nm uv-induced DNA damage in cells. With the use of specific antibodies to uv-irradiated DNA and horseradish peroxidase-conjugated antibody to rabbit IgG, the extent of damaged DNA in uv-irradiated rat spleen mononuclear cells was quantitatively measurable. Through the use of this method, the amount of damaged DNA present in 2 × 10⁵ cells irradiated at a dose of 75 J/m² was estimated to be 7 ng equivalents of the standard uv-irradiated DNA. In addition, when the cells, irradiated at 750 J/m², were incubated for 1 h, the antigenic activity of DNA decreased by 40%, suggesting that repair of the damaged sites in DNA had proceeded to some extent in the cells

  2. Making a measurable difference in advanced Huntington disease care.

    Science.gov (United States)

    Moskowitz, Carol Brown; Rao, Ashwini K

    2017-01-01

    Neurologists' role in the care of people with advanced Huntington disease (HD) (total functional capacity speech and language pathology), behavioral and psychiatric professionals for problem-solving strategies, which must be reviewed with direct care staff before implementation; (3) encourage and support qualitative and quantitative interdisciplinary research studies, and randomized controlled studies of nonpharmacologic interventions; and (4) assist in the development of meaningful measures to further document what works to provide a good quality of life for the patient and family and a comfortable thoughtful approach to a good death. Collaborative models of care depend on: (1) clear communication; (2) ongoing education and support programs; with (3) pharmacologic and rehabilitation interventions, always in the context of respect for the person with HD, a preservation of the individuals' dignity, autonomy, and individual preferences. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Quantitative photoacoustic integrating sphere (QPAIS) platform for absorption coefficient and Grüneisen parameter measurements: Demonstration with human blood

    Directory of Open Access Journals (Sweden)

    Yolanda Villanueva-Palero

    2017-06-01

    Full Text Available Quantitative photoacoustic imaging in biomedicine relies on accurate measurements of relevant material properties of target absorbers. Here, we present a method for simultaneous measurement of the absorption coefficient and Grüneisen parameter of small volumes of liquid scattering and absorbing media using a coupled integrating-sphere system, which we refer to as the quantitative photoacoustic integrating sphere (QPAIS) platform. The derived equations do not require absolute magnitudes of optical energy and pressure values; only calibration of the setup using aqueous ink dilutions is necessary. As a demonstration, measurements with blood samples from various human donors were done at room and body temperatures using an incubator. Measured absorption coefficient values are consistent with the known oxygen saturation dependence of blood absorption at 750 nm, whereas measured Grüneisen parameter values indicate variability among the five donors. An increasing Grüneisen parameter value with both hematocrit and temperature is observed. These observations are consistent with those reported in the literature.

  4. A novel method for morphological pleomorphism and heterogeneity quantitative measurement: Named cell feature level co-occurrence matrix.

    Science.gov (United States)

    Saito, Akira; Numata, Yasushi; Hamada, Takuya; Horisawa, Tomoyoshi; Cosatto, Eric; Graf, Hans-Peter; Kuroda, Masahiko; Yamamoto, Yoichiro

    2016-01-01

    Recent developments in molecular pathology and genetic/epigenetic analysis of cancer tissue have resulted in a marked increase in objective and measurable data. In comparison, the traditional morphological analysis approach to pathology diagnosis, which can connect these molecular data and clinical diagnosis, is still mostly subjective. Even though the advent and popularization of digital pathology has provided a boost to computer-aided diagnosis, some important pathological concepts still remain largely non-quantitative and their associated data measurements depend on the pathologist's sense and experience. Such features include pleomorphism and heterogeneity. In this paper, we propose a method for the objective measurement of pleomorphism and heterogeneity, using the cell-level co-occurrence matrix. Our method is based on the widely used gray-level co-occurrence matrix (GLCM), where relations between neighboring pixel intensity levels are captured in a co-occurrence matrix, followed by the application of analysis functions such as Haralick features. In a pathological tissue image, each nucleus can be measured through image processing techniques, and each nucleus has its own measurable features, such as nucleus size, roundness, contour length, and intra-nucleus texture data (GLCM is one such measure). In our approach, each nucleus in the tissue image plays the role that a pixel plays in a GLCM. The most important point in this approach is how to define the neighborhood of each nucleus. We define three types of neighborhoods of a nucleus, then create the co-occurrence matrix and apply Haralick feature functions. In each image, pleomorphism and heterogeneity are then determined quantitatively. Because one nucleus feature takes the place of one pixel, we named our method the Cell Feature Level Co-occurrence Matrix (CFLCM). We tested this method for several nucleus features, and CFLCM proved to be a useful quantitative method for assessing pleomorphism and heterogeneity in histopathological images.
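
    A minimal sketch of the idea, under assumptions the record leaves open (here the neighborhood is taken as the k nearest nuclei by centroid distance, and nucleus area stands in for the quantized feature):

```python
import numpy as np
from scipy.spatial import cKDTree

def cflcm(centroids, feature, n_levels=8, k_neighbors=4):
    """Sketch of a cell-feature-level co-occurrence matrix: quantize one
    per-nucleus feature into levels and count co-occurrences between each
    nucleus and its k nearest neighbors."""
    # Quantize the feature into n_levels bins by quantiles.
    edges = np.quantile(feature, np.linspace(0, 1, n_levels + 1)[1:-1])
    levels = np.digitize(feature, edges)

    # One possible neighborhood definition: k nearest nuclei by centroid.
    _, idx = cKDTree(centroids).query(centroids, k=k_neighbors + 1)

    m = np.zeros((n_levels, n_levels))
    for i, neighbors in enumerate(idx):
        for j in neighbors[1:]:          # first neighbor is the nucleus itself
            m[levels[i], levels[j]] += 1
    m /= m.sum()

    # Two Haralick-style summaries of the matrix.
    li, lj = np.indices(m.shape)
    contrast = float(((li - lj) ** 2 * m).sum())        # grows with heterogeneity
    homogeneity = float((m / (1.0 + np.abs(li - lj))).sum())
    return m, contrast, homogeneity

# Invented nuclei: random centroids and gamma-distributed nucleus areas.
rng = np.random.default_rng(0)
_, c, h = cflcm(rng.random((200, 2)), rng.gamma(4.0, 10.0, 200))
print(f"contrast = {c:.2f}, homogeneity = {h:.2f}")
```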

  5. Quantitative autistic trait measurements index background genetic risk for ASD in Hispanic families.

    Science.gov (United States)

    Page, Joshua; Constantino, John Nicholas; Zambrana, Katherine; Martin, Eden; Tunc, Ilker; Zhang, Yi; Abbacchi, Anna; Messinger, Daniel

    2016-01-01

    Recent studies have indicated that quantitative autistic traits (QATs) of parents reflect inherited liabilities that may index background genetic risk for clinical autism spectrum disorder (ASD) in their offspring. Moreover, preferential mating for QATs has been observed as a potential factor in concentrating autistic liabilities in some families across generations. Heretofore, intergenerational studies of QATs have focused almost exclusively on Caucasian populations; the present study explored these phenomena in a well-characterized Hispanic population. The present study examined QAT scores in siblings and parents of 83 Hispanic probands meeting research diagnostic criteria for ASD, and 64 non-ASD controls, using the Social Responsiveness Scale-2 (SRS-2). Ancestry of the probands was characterized by genotype, using information from 541,929 single nucleotide polymorphic markers. In families of Hispanic children with an ASD diagnosis, the pattern of quantitative trait correlations observed between ASD-affected children and their first-degree relatives (ICCs on the order of 0.20), between unaffected first-degree relatives in ASD-affected families (sibling/mother ICC = 0.36; sibling/father ICC = 0.53), and between spouses (mother/father ICC = 0.48) was in keeping with the influence of transmitted background genetic risk and strong preferential mating for variation in quantitative autistic trait burden. Results from analysis of ancestry-informative genetic markers among probands in this sample were consistent with those from other Hispanic populations. Quantitative autistic traits represent measurable indices of inherited liability to ASD in Hispanic families. The accumulation of autistic traits occurs within generations, between spouses, and across generations, among Hispanic families affected by ASD. The occurrence of preferential mating for QATs, the magnitude of which may vary across cultures, constitutes a mechanism by which background genetic liability for ASD can become concentrated across generations.

  6. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known to be less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  7. Failure to Integrate Quantitative Measurement Methods of Ocular Inflammation Hampers Clinical Practice and Trials on New Therapies for Posterior Uveitis.

    Science.gov (United States)

    Herbort, Carl P; Tugal-Tutkun, Ilknur; Neri, Piergiorgio; Pavésio, Carlos; Onal, Sumru; LeHoang, Phuc

    2017-05-01

    Uveitis is one of the fields in ophthalmology where a tremendous evolution took place in the past 25 years. Not only did we gain access to more efficient, more targeted, and better tolerated therapies, but precise and quantitative measurement methods developed in parallel, allowing the clinician to evaluate these therapies and adjust therapeutic intervention with a high degree of precision. Objective and quantitative measurement of the global level of intraocular inflammation became possible for most inflammatory diseases with direct or spill-over anterior chamber inflammation, thanks to laser flare photometry. The amount of retinal inflammation could be quantified by using fluorescein angiography to score retinal angiographic signs. Indocyanine green angiography gave imaging insight into the hitherto inaccessible choroidal compartment, making it possible to quantify choroiditis by scoring indocyanine green angiographic signs. Optical coherence tomography has enabled measurement and objective monitoring of retinal and choroidal thickness. This multimodal quantitative appraisal of intraocular inflammation represents an exquisite security in monitoring uveitis. What is enigmatic, however, is the slow pace with which these improvements are being integrated in some areas. What is even more difficult to understand is the fact that clinical trials to assess new therapeutic agents still mostly rely on subjective parameters, such as clinical evaluation of vitreous haze, as a main endpoint, whereas a whole array of precise, quantitative, and objective modalities is available for the design of clinical studies. The scope of this work was to review the quantitative investigations that improved the management of uveitis in the past 2-3 decades.

  8. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  9. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitatively…

  10. Quantitative Methods Intervention: What Do the Students Want?

    Science.gov (United States)

    Frankland, Lianne; Harrison, Jacqui

    2016-01-01

    The shortage of social science graduates with competent quantitative skills jeopardises the competitiveness of the UK economy, the effectiveness of public policy making, and the UK's status as a world leader in higher education and research (British Academy for Humanities and Social Sciences, 2012). There is a growing demand for quantitative skills across all…

  11. Risk Metrics and Measures for an Extended PSA

    International Nuclear Information System (INIS)

    Wielenberg, A.; Loeffler, H.; Hasnaoui, C.; Burgazzi, L.; Cazzoli, E.; Jan, P.; La Rovere, S.; Siklossy, T.; Vitazkova, J.; Raimond, E.

    2016-01-01

    This report provides a review of the main risk measures used for Level 1 and Level 2 PSA. It depicts their advantages, limitations and disadvantages, and develops some more precise risk measures relevant for extended PSAs and helpful for decision making. This report does not recommend or suggest any quantitative value for the risk measures, nor does it discuss decision making based on PSA results in detail. The choice of one appropriate risk measure or a set of risk measures depends on the decision-making approach as well as on the issue to be decided. The general approach to decision making is a multi-attribute one, which can include the use of several risk measures as appropriate. Section 5 provides some recommendations on the main risk metrics to be used for an extended PSA. For Level 1 PSA, Fuel Damage Frequency and Radionuclide Mobilization Frequency are recommended. For Level 2 PSA, the characterization of loss of containment function and a total risk measure based on the aggregated activity releases of all sequences weighted by their frequencies are proposed. (authors)

  12. Damage detection using piezoelectric transducers and the Lamb wave approach: II. Robust and quantitative decision making

    International Nuclear Information System (INIS)

    Lu, Y; Wang, X; Tang, J; Ding, Y

    2008-01-01

    The propagation of Lamb waves generated by piezoelectric transducers in a one-dimensional structure has been studied comprehensively in part I of this two-paper series. Using the information embedded in the propagating waveforms, we expect to make a decision on whether damage has occurred; however, environmental and operational variances inevitably complicate the problem. To better detect the damage under these variances, we present in this paper a robust and quantitative decision-making methodology involving advanced signal processing and statistical analysis. In order to statistically evaluate the features in Lamb wave propagation in the presence of noise, we collect multiple time series (baseline signals) from the undamaged beam. A combination of the improved adaptive harmonic wavelet transform (AHWT) and the principal component analysis (PCA) is performed on the baseline signals to highlight the critical features of Lamb wave propagation in the undamaged structure. The detection of damage is facilitated by comparing the features of the test signal collected from the test structure (damaged or undamaged) with the features of the baseline signals. In this process, we employ Hotelling's T² statistical analysis to first purify the baseline dataset and then to quantify the deviation of the test data vector from the baseline dataset. Through experimental and numerical studies, we systematically investigate the proposed methodology in terms of the detectability (capability of detecting damage), the sensitivity (with respect to damage severity and excitation frequency) and the robustness against noises. The parametric studies also validate, from the signal processing standpoint, the guidelines of Lamb-wave-based damage detection developed in part I.
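
    The T² step can be sketched compactly. Assuming the baseline signals have already been reduced to feature vectors (e.g., principal-component scores), Hotelling's T² is the Mahalanobis-type distance of a test vector from the baseline distribution; applying the same statistic to each baseline vector supports the "purification" step mentioned above. A minimal sketch with invented data:

```python
import numpy as np

def hotelling_t2(baseline, x):
    """Hotelling's T^2 distance of feature vector x from a baseline set
    of feature vectors (one row per baseline signal)."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    diff = x - mu
    return float(diff @ np.linalg.solve(cov, diff))

# Illustrative use with invented PCA scores: 50 baseline Lamb-wave signals
# reduced to 3 principal components each.
rng = np.random.default_rng(1)
baseline = rng.normal(size=(50, 3))
healthy = rng.normal(size=3)
damaged = healthy + np.array([3.0, 0.0, 0.0])   # feature shift due to damage

# Baseline purification: drop baseline rows whose own T^2 is an outlier.
t2_self = np.array([hotelling_t2(baseline, row) for row in baseline])
purified = baseline[t2_self < np.percentile(t2_self, 95)]

print(hotelling_t2(purified, healthy), hotelling_t2(purified, damaged))
```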

  13. Eye tracking measures of uncertainty during perceptual decision making.

    Science.gov (United States)

    Brunyé, Tad T; Gardony, Aaron L

    2017-10-01

    Perceptual decision making involves gathering and interpreting sensory information to effectively categorize the world and inform behavior. For instance, a radiologist distinguishing the presence versus absence of a tumor, or a luggage screener categorizing objects as threatening or non-threatening. In many cases, sensory information is not sufficient to reliably disambiguate the nature of a stimulus, and resulting decisions are done under conditions of uncertainty. The present study asked whether several oculomotor metrics might prove sensitive to transient states of uncertainty during perceptual decision making. Participants viewed images with varying visual clarity and were asked to categorize them as faces or houses, and rate the certainty of their decisions, while we used eye tracking to monitor fixations, saccades, blinks, and pupil diameter. Results demonstrated that decision certainty influenced several oculomotor variables, including fixation frequency and duration, the frequency, peak velocity, and amplitude of saccades, and phasic pupil diameter. Whereas most measures tended to change linearly along with decision certainty, pupil diameter revealed more nuanced and dynamic information about the time course of perceptual decision making. Together, results demonstrate robust alterations in eye movement behavior as a function of decision certainty and attention demands, and suggest that monitoring oculomotor variables during applied task performance may prove valuable for identifying and remediating transient states of uncertainty. Published by Elsevier B.V.

  14. Art-making in a family medicine clerkship: how does it affect medical student empathy?

    Science.gov (United States)

    Potash, Jordan S; Chen, Julie Y; Lam, Cindy L K; Chau, Vivian T W

    2014-11-28

    To provide patient-centred holistic care, doctors must possess good interpersonal and empathic skills. Medical schools traditionally adopt a skills-based approach to such training, but creative engagement with the arts has also been effective. A novel arts-based approach may help medical students develop empathic understanding of patients and thus contribute to medical students' transformative process into compassionate doctors. This study aimed to evaluate the impact of an arts-making workshop on medical student empathy. This was a mixed-method quantitative-qualitative study. In the 2011-12 academic year, all 161 third-year medical students at the University of Hong Kong were randomly allocated into either an arts-making workshop or a problem-solving workshop during the Family Medicine clerkship, according to a centrally set timetable. Students in the arts-making workshop wrote a poem, created artwork and completed a reflective essay, while students in the conventional workshop problem-solved clinical cases and wrote a case commentary. All students who agreed to participate in the study completed a measure of empathy for medical students, the Jefferson Scale of Empathy (JSE) (student version), at the start and end of the clerkship. Quantitative data analysis: paired t-tests and repeated-measures ANOVA were used to compare the change within and between groups, respectively. Qualitative data analysis: two researchers independently chose representational narratives based on criteria adapted from art therapy. The final 20 works were agreed upon by consensus and thematically analysed using a grounded theory approach. The level of empathy declined in both groups over time, but with no statistically significant differences between groups. For JSE items relating to emotional influence on medical decision making, participants in the arts-making workshop changed more than those in the problem-solving workshop. From the qualitative data, students perceived benefits in arts-making.

  15. Quantitative Laughter Detection, Measurement, and Classification-A Critical Survey.

    Science.gov (United States)

    Cosentino, Sarah; Sessa, Salvatore; Takanishi, Atsuo

    2016-01-01

    The study of human nonverbal social behaviors has taken a more quantitative and computational approach in recent years due to the development of smart interfaces and virtual agents or robots able to interact socially. One of the most interesting nonverbal social behaviors, producing a characteristic vocal signal, is laughing. Laughter is produced in several different situations: in response to external physical, cognitive, or emotional stimuli; to negotiate social interactions; and also, pathologically, as a consequence of neural damage. For this reason, laughter has attracted researchers from many disciplines. A consequence of this multidisciplinarity is the absence of a holistic vision of this complex behavior: the methods of analysis and classification of laughter, as well as the terminology used, are heterogeneous; the findings sometimes contradictory and poorly documented. This survey aims at collecting and presenting objective measurement methods and results from a variety of different studies in different fields, to contribute to build a unified model and taxonomy of laughter. This could be successfully used for advances in several fields, from artificial intelligence and human-robot interaction to medicine and psychiatry.

  16. Quantitative measure of randomness and order for complete genomes

    Science.gov (United States)

    Kong, Sing-Guan; Fan, Wen-Lang; Chen, Hong-Da; Wigger, Jan; Torda, Andrew E.; Lee, H. C.

    2009-06-01

    We propose an order index, ϕ, which gives a quantitative measure of randomness and order of complete genomic sequences. It maps genomes to a number from 0 (random and of infinite length) to 1 (fully ordered) and applies regardless of sequence length. The 786 complete genomic sequences in GenBank were found to have ϕ values in a very narrow range, ϕ_g = 0.031 (+0.028/−0.015). We show this implies that genomes are halfway toward being completely random, or, at the "edge of chaos." We further show that artificial "genomes" converted from literary classics have ϕ's that almost exactly coincide with ϕ_g, but sequences of low information content do not. We infer that ϕ_g represents a high information-capacity "fixed point" in sequence space, and that genomes are driven to it by the dynamics of a robust growth and evolution process. We show that a growth process characterized by random segmental duplication can robustly drive genomes to the fixed point.

  17. Development of a Draft Core Set of Domains for Measuring Shared Decision Making in Osteoarthritis

    DEFF Research Database (Denmark)

    Toupin-April, Karine; Barton, Jennifer; Fraenkel, Liana

    2015-01-01

    OBJECTIVE: Despite the importance of shared decision making for delivering patient-centered care in rheumatology, there is no consensus on how to measure its process and outcomes. The aim of this Outcome Measures in Rheumatology (OMERACT) working group is to determine the core set of domains for measuring shared decision making in intervention studies in adults with osteoarthritis (OA), from the perspectives of patients, health professionals, and researchers. METHODS: We followed the OMERACT Filter 2.0 method to develop a draft core domain set by (1) forming an OMERACT working group; (2) conducting a review of domains of shared decision making; and (3) obtaining opinions of all those involved using a modified nominal group process held at a session activity at the OMERACT 12 meeting. RESULTS: In all, 26 people from Europe, North America, and Australia, including 5 patient research partners

  18. The Study on the Quantitative Analysis in LPG Tank's Fire and Explosion

    Energy Technology Data Exchange (ETDEWEB)

    Bae, S.J.; Kim, B.J. [Department of chemical Engineering, Soongsil University, Seoul (Korea)

    1999-04-01

    Fire and explosion at a chemical plant cause damage not only to the plant itself but also to people in or near the accident spot and in the neighborhood of the plant. For that reason, chemical process safety management has become important. One of the safety management methods, quantitative analysis, is used to reduce and prevent such accidents. The results of the quantitative analysis can be used to arrange equipment, evaluate the minimum safety distance, and prepare safety equipment. In this study we developed a computer program that makes it easy to carry out a quantitative analysis of such accidents. The output of the program is the magnitude of fire (pool fire and fireball) and explosion (UVCE and BLEVE) effects. Thermal radiation is used as the measure of fire magnitude, and overpressure as the measure of explosion magnitude. In the case of BLEVE, the flight distance of fragments can also be evaluated. Probit analysis was performed in every case. As a case study, the Buchun LPG explosion accident in Korea was analysed with the program. The simulation showed that the permissible distance was 800 m, and probit analysis showed that the 1st-degree burn, 2nd-degree burn, and death distances were 450, 280, and 260 m, respectively. The simulation results showed good agreement with results from the SAFER program made by DuPont. 13 refs., 4 figs., 2 tabs.
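
    Probit analysis of thermal radiation can be illustrated with a commonly cited Eisenberg-type probit for fatality, Y = -14.9 + 2.56·ln(t·I^(4/3)/10^4) with I in W/m² and t in seconds, and P = Φ(Y − 5). This generic textbook form is an assumption of the sketch, not the correlation used by the program described:

```python
import math
from statistics import NormalDist

def fatality_probability(intensity_w_m2, exposure_s):
    """Probit estimate of fatality probability from thermal radiation:
    Y = -14.9 + 2.56 * ln(t * I**(4/3) / 1e4), P = Phi(Y - 5).
    This generic Eisenberg-type form is an assumption of the sketch."""
    dose = exposure_s * intensity_w_m2 ** (4.0 / 3.0) / 1.0e4
    y = -14.9 + 2.56 * math.log(dose)
    return NormalDist().cdf(y - 5.0)

# Example: 20 s exposure at three radiation intensities.
for i in (12.5e3, 25.0e3, 37.5e3):
    print(f"{i / 1000:5.1f} kW/m^2 -> P(death) = {fatality_probability(i, 20):.2f}")
```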

  19. Information Risk Management: Qualitative or Quantitative? Cross industry lessons from medical and financial fields

    Directory of Open Access Journals (Sweden)

    Upasna Saluja

    2012-06-01

    Full Text Available Enterprises across the world are taking a hard look at their risk management practices. A number of qualitative and quantitative models and approaches are employed by risk practitioners to keep risk under check. As a norm, most organizations end up choosing the more flexible, easier-to-deploy and customizable qualitative models of risk assessment. In practice, such models often call upon practitioners to make qualitative judgments on a relative rating scale, which brings in considerable room for errors, biases and subjectivity. On the other hand, under the quantitative risk analysis approach, estimation of risk is connected with the application of numerical measures of some kind. Medical risk management models lend themselves as ideal candidates for deriving lessons for information security risk management: the considerably developed understanding of risk management from the medical field, especially survival analysis, can be applied to handling the risks that information infrastructures face. Similarly, the financial risk management discipline prides itself on perhaps the most quantifiable of models in risk management, those for market risk and credit risk; information security risk management can make risk measurement more objective and quantitative by referring to the approach of credit risk. During the recent financial crisis many investors and financial institutions lost money or went bankrupt because they did not apply the basic principles of risk management. Learning from the financial crisis provides some valuable lessons for information risk management.

  20. Development of quantitative x-ray microtomography

    International Nuclear Information System (INIS)

    Deckman, H.W.; Dunsmuir, J.A.; D'Amico, K.L.; Ferguson, S.R.; Flannery, B.P.

    1990-01-01

    The authors have developed several x-ray microtomography systems which function as quantitative three-dimensional x-ray microscopes. In this paper the authors describe the evolutionary path followed from making the first high-resolution experimental microscopes to later generations which can be routinely used for investigating materials. Developing the instrumentation for reliable quantitative x-ray microscopy using synchrotron and laboratory based x-ray sources has led to other imaging modalities for obtaining temporal and spatial two-dimensional information.

  1. Developing a Tool for Measuring the Decision-Making Competence of Older Adults

    Science.gov (United States)

    Finucane, Melissa L.; Gullion, Christina M.

    2010-01-01

    The authors evaluated the reliability and validity of a tool for measuring older adults’ decision-making competence (DMC). Two-hundred-five younger adults (25-45 years), 208 young-older adults (65-74 years), and 198 old-older adults (75-97 years) made judgments and decisions related to health, finance, and nutrition. Reliable indices of comprehension, dimension weighting, and cognitive reflection were developed. Unlike previous research, the authors were able to compare old-older with young-older adults’ performance. As hypothesized, old-older adults performed more poorly than young-older adults; both groups of older adults performed more poorly than younger adults. Hierarchical regression analyses showed that a large amount of variance in decision performance across age groups (including mean trends) could be accounted for by social variables, health measures, basic cognitive skills, attitudinal measures, and numeracy. Structural equation modeling revealed significant pathways from three exogenous latent factors (crystallized intelligence, other cognitive abilities, and age) to the endogenous DMC latent factor. Further research is needed to validate the meaning of performance on these tasks for real-life decision making. PMID:20545413

  2. Harnessing monitoring measurements in urban environments for decision making after nuclear accidents

    International Nuclear Information System (INIS)

    Kaiser, J.C.; Proehl, G.

    2007-01-01

    This article gives an overview of the conceptual design of the Inhabited Areas Monitoring Module (IAMM), which will be introduced into European decision support systems for nuclear emergencies. It will improve the use of monitoring data of radioactive contamination in urban environments for decision making. IAMM converts the dated gamma dose rate (GDR) measurements from geo-referenced locations into maps of surface contamination with an enhanced spatial resolution. Depending on the availability of the monitoring data, IAMM relies on two modes of operation. If there are only a few measurements, these are taken to improve the maps from a deposition model using data assimilation. If the number of measurements is sufficient to apply spatial interpolation, IAMM will rely entirely on monitoring data. Suitable geo-referenced data points will be interpreted by IAMM with respect to their detector environment using the concept of location factors. The endpoints of IAMM can be used directly for decision making or dose calculations with either simple dose models or the more refined EuRopean Model for INhabited areas (ERMIN). (orig.)
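
    The spatial-interpolation mode can be illustrated with generic inverse-distance weighting. This is a stand-in for whatever interpolation IAMM actually uses, with invented dose-rate values:

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse-distance-weighted interpolation of point measurements onto
    query locations. Generic illustration, not the IAMM algorithm."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)              # guard against zero distance
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Invented geo-referenced gamma dose rates (microSv/h) at four stations.
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
gdr = np.array([0.12, 0.30, 0.08, 0.22])
grid = np.array([[0.5, 0.5], [0.25, 0.75]])
print(idw(stations, gdr, grid))
```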

  3. Quantitative approach for optimizing e-beam condition of photoresist inspection and measurement

    Science.gov (United States)

    Lin, Chia-Jen; Teng, Chia-Hao; Cheng, Po-Chung; Sato, Yoshishige; Huang, Shang-Chieh; Chen, Chu-En; Maruyama, Kotaro; Yamazaki, Yuichiro

    2018-03-01

    Severe process margins in advanced technology nodes of semiconductor devices are controlled by e-beam metrology and e-beam inspection systems based on scanning electron microscopy (SEM) images. With SEM, large-area images of high quality are required to collect massive amounts of data for metrology and to detect defects over a large area for inspection. Although photoresist patterning is one of the critical processes in semiconductor device manufacturing, observing photoresist patterns in SEM images is both crucial and troublesome, especially in the case of large images. The charging effect caused by e-beam irradiation of photoresist patterns degrades image quality, causes CD variation in metrology, and makes it difficult to continue defect inspection over a large area for a long time. In this study, we established a quantitative approach for optimizing the e-beam condition with the "Die to Database" algorithm of NGR3500 on photoresist patterns to minimize the charging effect, and we enhanced the performance of measurement and inspection on photoresist patterns by using the optimized e-beam condition. NGR3500 is a geometry verification system based on a "Die to Database" algorithm which compares SEM images with design data [1]. By comparing SEM images and design data, key performance indicators (KPIs) of the SEM image such as "Sharpness", "S/N", "Gray level variation in FOV", and "Image shift" can be retrieved. These KPIs were analyzed under different e-beam conditions, which consist of "Landing Energy", "Probe Current", "Scanning Speed" and "Scanning Method", and the best e-beam condition could be achieved with maximum image quality, maximum scanning speed and minimum image shift. Through this quantitative approach to optimizing the e-beam condition, we could observe the dependency of the SEM condition on photoresist charging. By using the optimized e-beam condition, measurement could be continued stably on photoresist patterns over 24 hours. KPIs of the SEM image proved image quality during measurement and

  4. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces were developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis for estimating current risk from a social perspective and identifying tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this approach. The main advantage of the methodology presented herein is that it provides a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. This can be of great interest to decision makers, as it provides rational and solid information.
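
    A quantitative risk estimate of the kind the methodology produces can be sketched as an expected-annual-damage integral over the probability-damage curve, before and after a hypothetical non-structural measure. The event set and the assumed 30% damage reduction are illustrative only:

```python
import numpy as np

# Hypothetical event set: annual exceedance probabilities and damages, before
# and after a non-structural measure assumed to cut damages by 30%.
aep = np.array([0.1, 0.02, 0.01, 0.001])           # 10-, 50-, 100-, 1000-yr events
damage_before = np.array([0.5, 5.0, 12.0, 60.0])   # million EUR per event
damage_after = 0.7 * damage_before

def expected_annual_damage(prob, damage):
    """Trapezoid-rule integral of the probability-damage curve."""
    order = np.argsort(prob)
    p, d = prob[order], damage[order]
    return np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))

print(f"EAD before: {expected_annual_damage(aep, damage_before):.2f} M EUR/yr")
print(f"EAD after:  {expected_annual_damage(aep, damage_after):.2f} M EUR/yr")
```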

  5. The vexing problem of defining the meaning, role and measurement of values in treatment decision-making.

    Science.gov (United States)

    Charles, Cathy; Gafni, Amiram

    2014-03-01

    Two international movements, evidence-based medicine (EBM) and shared decision-making (SDM) have grappled for some time with issues related to defining the meaning, role and measurement of values/preferences in their respective models of treatment decision-making. In this article, we identify and describe unresolved problems in the way that each movement addresses these issues. The starting point for this discussion is that at least two essential ingredients are needed for treatment decision-making: research information about treatment options and their potential benefits and risks; and the values/preferences of participants in the decision-making process. Both the EBM and SDM movements have encountered difficulties in defining the meaning, role and measurement of values/preferences in treatment decision-making. In the EBM model of practice, there is no clear and consistent definition of patient values/preferences and no guidance is provided on how to integrate these into an EBM model of practice. Methods advocated to measure patient values are also problematic. Within the SDM movement, patient values/preferences tend to be defined and measured in a restrictive and reductionist way as patient preferences for treatment options or attributes of options, while broader underlying value structures are ignored. In both models of practice, the meaning and expected role of physician values in decision-making are unclear. Values clarification exercises embedded in patient decision aids are suggested by SDM advocates to identify and communicate patient values/preferences for different treatment outcomes. Such exercises have the potential to impose a particular decision-making theory and/or process onto patients, which can change the way they think about and process information, potentially impeding them from making decisions that are consistent with their true values. The tasks of clarifying the meaning, role and measurement of values/preferences in treatment decision-making

  6. Do Quantitative Measures of Research Productivity Correlate with Academic Rank in Oral and Maxillofacial Surgery?

    Science.gov (United States)

    Susarla, Srinivas M; Dodson, Thomas B; Lopez, Joseph; Swanson, Edward W; Calotta, Nicholas; Peacock, Zachary S

    2015-08-01

    Academic promotion is linked to research productivity. The purpose of this study was to assess the correlation between quantitative measures of academic productivity and academic rank among academic oral and maxillofacial surgeons. This was a cross-sectional study of full-time academic oral and maxillofacial surgeons in the United States. The predictor variables were categorized as demographic (gender, medical degree, research doctorate, other advanced degree) and quantitative measures of academic productivity (total number of publications, total number of citations, maximum number of citations for a single article, i10-index [number of publications with ≥ 10 citations], and h-index [number of publications h with ≥ h citations each]). The outcome variable was current academic rank (instructor, assistant professor, associate professor, professor, or endowed professor). Descriptive, bivariate, and multiple regression statistics were computed to evaluate associations between the predictors and academic rank. Receiver operating characteristic curves were computed to identify thresholds for academic promotion. The sample consisted of 324 academic oral and maxillofacial surgeons, of whom 11.7% were female, 40% had medical degrees, and 8% had research doctorates. The h-index was the most strongly correlated with academic rank (ρ = 0.62, p < 0.001), indicating that academic rank is associated with quantitative measures of research activity.
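
    The two citation indices defined above are easy to compute from a list of per-publication citation counts; a small sketch with invented counts:

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def i10_index(citations):
    """Number of publications with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

cites = [42, 18, 12, 9, 7, 4, 1, 0]      # invented citation counts
print(h_index(cites), i10_index(cites))  # -> 5 3
```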

  7. Navigational Traffic Conflict Technique: A Proactive Approach to Quantitative Measurement of Collision Risks in Port Waters

    Science.gov (United States)

    Debnath, Ashim Kumar; Chin, Hoong Chor

    Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated for quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus possesses great potential for managing collision risks in port waters.

  8. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies, as required. (author)

  9. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotics discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they function as support for making sense of and giving meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  10. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.

  11. Quantitative measurement of mixtures by terahertz time–domain ...

    Indian Academy of Sciences (India)

    Administrator

    earth and space science, quality control of food and agricultural products, and global environmental monitoring. In quantitative applications, terahertz technology has been widely used for studying different kinds of mixtures, such as amino acids [8], ternary chemical mixtures [9], pharmaceuticals [10], and racemic compounds [11].

  12. Sooting turbulent jet flame: characterization and quantitative soot measurements

    Science.gov (United States)

    Köhler, M.; Geigle, K. P.; Meier, W.; Crosland, B. M.; Thomson, K. A.; Smallwood, G. J.

    2011-08-01

    Computational fluid dynamics (CFD) modelers require high-quality experimental data sets for validation of their numerical tools. Preferred features for numerical simulations of a sooting, turbulent test case flame are simplicity (no pilot flame), well-defined boundary conditions, and sufficient soot production. This paper proposes a non-premixed C2H4/air turbulent jet flame to fill this role and presents an extensive database for soot model validation. The sooting turbulent jet flame has a total visible flame length of approximately 400 mm and a fuel-jet Reynolds number of 10,000. The flame has a measured lift-off height of 26 mm which acts as a sensitive marker for CFD model validation, while this novel compiled experimental database of soot properties, temperature and velocity maps are useful for the validation of kinetic soot models and numerical flame simulations. Due to the relatively simple burner design which produces a flame with sufficient soot concentration while meeting modelers' needs with respect to boundary conditions and flame specifications as well as the present lack of a sooting "standard flame", this flame is suggested as a new reference turbulent sooting flame. The flame characterization presented here involved a variety of optical diagnostics including quantitative 2D laser-induced incandescence (2D-LII), shifted-vibrational coherent anti-Stokes Raman spectroscopy (SV-CARS), and particle image velocimetry (PIV). Producing an accurate and comprehensive characterization of a transient sooting flame was challenging and required optimization of these diagnostics. In this respect, we present the first simultaneous, instantaneous PIV, and LII measurements in a heavily sooting flame environment. Simultaneous soot and flow field measurements can provide new insights into the interaction between a turbulent vortex and flame chemistry, especially since soot structures in turbulent flames are known to be small and often treated in a statistical manner.

  13. Codon Deviation Coefficient: A novel measure for estimating codon usage bias and its statistical significance

    KAUST Repository

    Zhang, Zhang; Li, Jun; Cui, Peng; Ding, Feng; Li, Ang; Townsend, Jeffrey P; Yu, Jun

    2012-01-01

    measurement of CUB is of fundamental importance to making inferences regarding gene function and genome evolution. However, extant measures of CUB have not fully accounted for the quantitative effect of background nucleotide composition and have

  14. MEASURING ORGANIZATIONAL CULTURE: A QUANTITATIVE-COMPARATIVE ANALYSIS [doi: 10.5329/RECADM.20100902007

    Directory of Open Access Journals (Sweden)

    Valderí de Castro Alcântara

    2010-11-01

    Full Text Available This article analyzes the organizational culture of enterprises located in two towns with distinct quantitative traits, Rio Paranaíba and Araxá. While the surveyed enterprises in Rio Paranaíba are mostly micro and small enterprises (86%), those in Araxá are mostly medium and large companies (53%). The overall objective is to verify whether there are significant differences in organizational culture among these enterprises and whether they can be explained by organization size. The research was quantitative, and the instruments for data collection were a questionnaire and a scale for measuring organizational culture containing four dimensions: Hierarchical Distance Index (IDH), Individualism Index (INDI), Masculinity Index (MASC) and Uncertainty Control Index (CINC). Tabulation and analysis of data were performed using PASW Statistics 18, with descriptive and inferential statistical procedures. Using a Reduction Factor (-21), the achieved indexes were classified into 5 intensity categories (from "very low" to "very high"). Student's t test for two means was performed, revealing significant differences in Hierarchical Distance and Individualism between Araxá and Rio Paranaíba enterprises (p < 0.05).   Keywords Organizational Culture; Dimensions of Organizational Culture; Araxá; Rio Paranaíba.

  15. Association between quantitative measures obtained using fluorescence-based methods and activity status of occlusal caries lesions in primary molars.

    Science.gov (United States)

    Novaes, Tatiane Fernandes; Reyes, Alessandra; Matos, Ronilza; Antunes-Pontes, Laura Regina; Marques, Renata Pereira de Samuel; Braga, Mariana Minatel; Diniz, Michele Baffi; Mendes, Fausto Medeiros

    2017-05-01

    Fluorescence-based methods (FBM) can add objectivity to caries diagnosis strategies. Few studies, however, have focused on the evaluation of caries activity. The aim was to evaluate the association between quantitative measures obtained with FBM, clinical parameters acquired from the patients, caries detection, and assessment of activity status in occlusal surfaces of primary molars. Six hundred and six teeth from 113 children (4-14 years) were evaluated. The presence of biofilm, caries experience, and the number of active lesions were recorded. The teeth were assessed using two FBM: the DIAGNOdent pen (LFpen) and quantitative light-induced fluorescence (QLF). As the reference standard, all teeth were evaluated using the ICDAS (International Caries Detection and Assessment System) associated with clinical activity assessments. Multilevel regressions compared the FBM values and evaluated the association between the FBM measures and clinical variables related to caries activity. The measures from the FBM were higher in cavitated lesions. Only ∆F values distinguished active and inactive lesions. The LFpen measures were higher in active lesions at the cavitated threshold (56.95 ± 29.60). Following regression analyses, only the presence of visible biofilm on occlusal surfaces (adjusted prevalence ratio = 1.43) and ∆R values of the teeth (adjusted prevalence ratio = 1.02) were associated with caries activity. Some quantitative measures from FBM parameters are associated with caries activity evaluation, similarly to the clinical evaluation of the presence of visible biofilm. © 2016 BSPD, IAPD and John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  16. Quantitative computed tomography measurements of emphysema for diagnosing asthma-chronic obstructive pulmonary disease overlap syndrome

    Science.gov (United States)

    Xie, Mengshuang; Wang, Wei; Dou, Shuang; Cui, Liwei; Xiao, Wei

    2016-01-01

    Background The diagnostic criteria of asthma-COPD overlap syndrome (ACOS) are controversial. Emphysema is characteristic of COPD and usually does not exist in typical asthma patients; emphysema in patients with asthma therefore suggests the coexistence of COPD. Quantitative computed tomography (CT) allows repeated, noninvasive evaluation of emphysema. We investigated the value of quantitative CT measurements of emphysema in the diagnosis of ACOS. Methods This study included 404 participants: 151 asthma patients, 125 COPD patients, and 128 normal control subjects. All the participants underwent pulmonary function tests and a high-resolution CT scan. Emphysema measurements were taken with the Airway Inspector software. The asthma patients were divided into high and low emphysema index (EI) groups based on the percentage of low attenuation areas less than −950 Hounsfield units. The characteristics of asthma patients with high EI were compared with those having low EI or COPD. Results The normal value of the percentage of low attenuation areas less than −950 Hounsfield units in Chinese aged >40 years was 2.79%±2.37%. COPD patients showed more severe emphysema and a more upper-zone-predominant distribution of emphysema than asthma patients or controls. Thirty-two (21.2%) of the 151 asthma patients had high EI. Compared with asthma patients with low EI, those with high EI were significantly older, more likely to be male, had more pack-years of smoking, had a more upper-zone-predominant distribution of emphysema, and had greater airflow limitation. There were no significant differences in sex ratios, pack-years of smoking, airflow limitation, or emphysema distribution between asthma patients with high EI and COPD patients. A greater number of acute exacerbations were seen in asthma patients with high EI compared with those with low EI or COPD. Conclusion Asthma patients with high EI fulfill the features of ACOS as described by the Global Initiative for Asthma and the Global Initiative for Chronic Obstructive Lung Disease.
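
    The emphysema index used here, %LAA−950, is simply the fraction of lung voxels below −950 HU. A sketch with a synthetic voxel histogram (the high/low cutoff in the comment is a hypothetical normal-mean-plus-one-SD choice, not the study's grouping rule):

```python
import numpy as np

def emphysema_index(hu_voxels, threshold=-950):
    """%LAA-950: percentage of lung voxels with attenuation below threshold (HU)."""
    hu_voxels = np.asarray(hu_voxels)
    return 100.0 * np.mean(hu_voxels < threshold)

# Synthetic lung histogram: 90% normal parenchyma near -850 HU plus a 10%
# emphysematous component near -970 HU. Values are illustrative only.
rng = np.random.default_rng(2)
lung = np.concatenate([rng.normal(-850, 40, 90_000), rng.normal(-970, 15, 10_000)])
ei = emphysema_index(lung)
# Hypothetical high/low cutoff: normal mean + 1 SD (2.79 + 2.37 = 5.16%);
# the study's actual grouping rule is not reproduced here.
print(f"EI = {ei:.2f}% -> {'high' if ei > 5.16 else 'low'} EI group")
```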

  17. The development of NEdSERV: quantitative instrumentation to measure service quality in nurse education.

    Science.gov (United States)

    Roberts, P

    1999-07-01

    The political climate of health care provision and education for health care in the latter years of the 20th century is evolving from the uncertainty of newly created markets to a more clearly focused culture of collaboration and dissemination of good practice, with an increased emphasis on quality provision and its measurement. The need for provider units to prove and improve efficiency and effectiveness through evidence-based quality strategies, in order to stay firmly in the market place, has never been greater. The measurement of customer expectations and perceptions of delivered service quality is widely utilized as a basis for customer retention and business growth in both commercial and non-profit organizations. This paper describes the methodological development of NEdSERV: quantitative instrumentation designed to measure and respond to ongoing stakeholder expectations and perceptions of delivered service quality within nurse education.

  18. Quantitative measurements of in-cylinder gas composition in a controlled auto-ignition combustion engine

    Science.gov (United States)

    Zhao, H.; Zhang, S.

    2008-01-01

    One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine.

  19. Quantitative measurements of in-cylinder gas composition in a controlled auto-ignition combustion engine

    International Nuclear Information System (INIS)

    Zhao, H; Zhang, S

    2008-01-01

    One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine

  20. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    Science.gov (United States)

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
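
    Approach (ii) can be sketched as follows, treating the scatter of the laboratory's proficiency-test results about the participant consensus means as the uncertainty estimate. Pooling relative differences as an RMS is one simple convention among several, and all values are invented:

```python
import math

# The laboratory's blood-alcohol proficiency-test results vs. the participant
# consensus means (g/100 mL). All values are invented for illustration.
lab       = [0.081, 0.152, 0.118, 0.247, 0.096]
consensus = [0.080, 0.150, 0.121, 0.240, 0.098]

# Relative difference between the laboratory and the consensus for each test.
d = [(l - c) / c for l, c in zip(lab, consensus)]

# Pool the differences as a root-mean-square relative standard uncertainty.
u_rel = math.sqrt(sum(x * x for x in d) / len(d))
print(f"relative standard uncertainty ~ {100 * u_rel:.1f}%")
print(f"expanded uncertainty (k = 2)  ~ {200 * u_rel:.1f}%")
```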

  1. Exhaust Gas Temperature Measurements in Diagnostics of Turbocharged Marine Internal Combustion Engines Part II Dynamic Measurements

    Directory of Open Access Journals (Sweden)

    Korczewski Zbigniew

    2016-01-01

    Full Text Available The second part of the article describes the technology of marine engine diagnostics making use of dynamic measurements of the exhaust gas temperature. Little-known achievements of Prof. S. Rutkowski of the Naval College in Gdynia (now: Polish Naval Academy) in this area are presented. A novel approach is proposed which consists in the use of the measured exhaust gas temperature dynamics for qualitative and quantitative assessment of the enthalpy flux of successive pressure pulses of the exhaust gas supplying the marine engine turbocompressor. General design assumptions are presented for the measuring and diagnostic system, which makes use of a sheathed thermocouple installed in the engine exhaust gas manifold. The corrected thermal inertia of the thermocouple makes it possible to reproduce the real time-history of exhaust gas temperature changes.
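
    The thermal-inertia correction rests on the first-order sensor model T_gas ≈ T_meas + τ·dT_meas/dt. A sketch with an assumed time constant τ, simulating the damped sensor response and then recovering the gas-temperature amplitude:

```python
import numpy as np

def correct_thermocouple(t, t_meas, tau):
    """First-order lag correction: T_gas ~ T_meas + tau * dT_meas/dt."""
    return t_meas + tau * np.gradient(t_meas, t)

# Simulated 80 Hz exhaust-gas temperature pulse, heavily damped by a sheathed
# thermocouple modeled as a first-order system with assumed tau = 50 ms.
t = np.linspace(0.0, 0.1, 2000)
gas = 600.0 + 40.0 * np.sin(2 * np.pi * 80.0 * t)
tau, dt = 0.05, t[1] - t[0]
meas = np.empty_like(gas)
meas[0] = gas[0]
for i in range(1, len(t)):
    meas[i] = meas[i - 1] + dt / tau * (gas[i] - meas[i - 1])

# The raw signal is flattened; the corrected one recovers the ~80 K swing.
print(f"measured swing:  {np.ptp(meas):.1f} K")
print(f"corrected swing: {np.ptp(correct_thermocouple(t, meas, tau)):.1f} K")
```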

  2. Quantitative analysis by computer controlled X-ray fluorescence spectrometer

    International Nuclear Information System (INIS)

    Balasubramanian, T.V.; Angelo, P.C.

    1981-01-01

X-ray fluorescence spectroscopy has become a widely accepted method in the metallurgical field for analysis of both minor and major elements. As encountered in many other analytical techniques, the problem of matrix effects, generally known as interelemental effects, must be dealt with effectively in order to make the analysis accurate. There are several methods by which the effects of the matrix on the analyte are minimised or corrected for, and mathematical correction is one among them. In this method the characteristic secondary X-ray intensities are measured from standard samples, and correction coefficients, if any, for interelemental effects are evaluated by mathematical calculations. This paper describes attempts to evaluate the correction coefficients for interelemental effects by multiple linear regression programmes using a computer for the quantitative analysis of stainless steel and a nickel base cast alloy. The quantitative results obtained using this method for a standard stainless steel sample are compared with the given certified values. (author)
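A minimal sketch of the mathematical-correction idea, assuming a linear model in which an analyte's concentration is regressed on all measured intensities over the standards; the model form, names, and synthetic data are our assumptions, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "standards": certified concentrations (wt%) of 3 elements, and
# intensities distorted by linear interelement (matrix) effects.
conc = rng.uniform(5, 30, size=(12, 3))            # 12 standards, 3 elements
matrix_effects = np.array([[1.00, -0.02, 0.01],
                           [0.03, 1.00, -0.015],
                           [-0.01, 0.02, 1.00]])
intens = conc @ matrix_effects.T                   # measured intensities

def fit_correction(intens, conc, analyte):
    """Least squares fit of C_analyte = b0 + sum_j b_j * I_j over the standards."""
    X = np.column_stack([np.ones(len(intens)), intens])
    coef, *_ = np.linalg.lstsq(X, conc[:, analyte], rcond=None)
    return coef

coef = fit_correction(intens, conc, analyte=0)
unknown = np.array([1.0, 15.0, 12.0, 8.0])         # [1, I1, I2, I3] of an unknown
print("corrected concentration:", unknown @ coef)
```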

  3. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  4. Comparison of Quantitative Cartilage T2 Measurements and Qualitative MR Imaging between Professional Ballet Dancers and Healthy Volunteers.

    Science.gov (United States)

    Cha, Jang Gyu; Yi, Ji Sook; Han, Jong Kyu; Lee, Young Koo

    2015-07-01

To compare qualitative magnetic resonance (MR) images and quantitative T2 measurements of the tibiotalar cartilage between ballerinas and healthy volunteers. Institutional review board approval for this study and informed consent (from all participants) were obtained. MR examinations were performed by using a 3-T MR imaging system with 21 professional female ballet dancers and 20 healthy female volunteers. Two musculoskeletal radiologists quantitatively measured tibiotalar cartilage T2 values in the anterior, middle, and posterior zones of cartilage. MR findings were also qualitatively analyzed in both groups. The tibial cartilage T2 values measured in the anterior and posterior zones and the talar cartilage T2 values measured in all three zones were significantly higher in the ballerina group than in the control group. The posterior zones exhibited the highest T2 values among the three tibiotalar cartilage zones in both groups. Posterior soft-tissue edema (P = .001) and flexor hallucis longus tenosynovitis were also significantly more frequent in the ballerina group. The findings showed a trend toward increasing cartilage T2 values in ballerinas when compared with control subjects, indicating that quantitative T2 measurement may potentially be used as a noninvasive imaging tool for early detection of cartilage lesions in the tibiotalar joint.

  5. A rapid and quantitative assay for measuring antibody-mediated neutralization of West Nile virus infection

    International Nuclear Information System (INIS)

    Pierson, Theodore C.; Sanchez, Melissa D.; Puffer, Bridget A.; Ahmed, Asim A.; Geiss, Brian J.; Valentine, Laura E.; Altamura, Louis A.; Diamond, Michael S.; Doms, Robert W.

    2006-01-01

    West Nile virus (WNV) is a neurotropic flavivirus within the Japanese encephalitis antigenic complex that is responsible for causing West Nile encephalitis in humans. The surface of WNV virions is covered by a highly ordered icosahedral array of envelope proteins that is responsible for mediating attachment and fusion with target cells. These envelope proteins are also primary targets for the generation of neutralizing antibodies in vivo. In this study, we describe a novel approach for measuring antibody-mediated neutralization of WNV infection using virus-like particles that measure infection as a function of reporter gene expression. These reporter virus particles (RVPs) are produced by complementation of a sub-genomic replicon with WNV structural proteins provided in trans using conventional DNA expression vectors. The precision and accuracy of this approach stem from an ability to measure the outcome of the interaction between antibody and viral antigens under conditions that satisfy the assumptions of the law of mass action as applied to virus neutralization. In addition to its quantitative strengths, this approach allows the production of WNV RVPs bearing the prM-E proteins of different WNV strains and mutants, offering considerable flexibility for the study of the humoral immune response to WNV in vitro. WNV RVPs are capable of only a single round of infection, can be used under BSL-2 conditions, and offer a rapid and quantitative approach for detecting virus entry and its inhibition by neutralizing antibody
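The law-of-mass-action framing can be made concrete. A hedged sketch in our notation (a standard single-site mass-action form, not the authors' specific model): if neutralization tracks antibody occupancy of envelope epitopes, the RVP reporter signal falls with antibody concentration along a saturating dose-response.

```latex
% Epitope occupancy at equilibrium (single-site binding, mass action):
%   \theta = \frac{[\mathrm{Ab}]}{[\mathrm{Ab}] + K_D}
% If infection is blocked in proportion to occupancy, relative infectivity is
%   \frac{I([\mathrm{Ab}])}{I(0)} = 1 - \theta = \frac{K_D}{[\mathrm{Ab}] + K_D}
% so fitting reporter readouts across an antibody dilution series yields K_D
% (or an EC50) as the quantitative measure of neutralization potency.
```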

  6. Quantitative predictions from competition theory with incomplete information on model parameters tested against experiments across diverse taxa

    OpenAIRE

    Fort, Hugo

    2017-01-01

We derive an analytical approximation for making quantitative predictions for ecological communities as a function of the mean intensity of inter-specific competition and the species richness. This method, requiring only a fraction of the model parameters (carrying capacities and competition coefficients), is able to predict empirical measurements accurately across a wide variety of taxa (algae, plants, protozoa).
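The underlying model is presumably the generalized Lotka-Volterra competition system; a mean-field reduction of its equilibrium illustrates how mean competition intensity and richness enter (a sketch under our assumptions, not the paper's exact derivation):

```latex
% Generalized Lotka--Volterra competition among S species:
%   \frac{dN_i}{dt} = r_i N_i\!\left(1 - \frac{N_i + \sum_{j\neq i}\alpha_{ij}N_j}{K_i}\right)
% Symmetric mean-field case (K_i = K, \alpha_{ij} = \bar{\alpha}):
% setting dN_i/dt = 0 gives the equilibrium abundance
%   N^{*} = \frac{K}{1 + \bar{\alpha}\,(S-1)}
% showing the joint dependence on mean competition intensity \bar{\alpha}
% and species richness S.
```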

  7. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  8. Quantitative Ultrasound for Measuring Obstructive Severity in Children with Hydronephrosis.

    Science.gov (United States)

    Cerrolaza, Juan J; Peters, Craig A; Martin, Aaron D; Myers, Emmarie; Safdar, Nabile; Linguraru, Marius George

    2016-04-01

We define sonographic biomarkers for hydronephrotic renal units that can predict the necessity of diuretic nuclear renography. We selected a cohort of 50 consecutive patients with hydronephrosis of varying severity in whom 2-dimensional sonography and diuretic mercaptoacetyltriglycine renography had been performed. A total of 131 morphological parameters were computed using quantitative image analysis algorithms. Machine learning techniques were then applied to identify ultrasound-based safety thresholds that agreed with the t½ for washout. A best fit model was then derived for each threshold level of t½ that would be clinically relevant at 20, 30 and 40 minutes. Receiver operating characteristic curve analysis was performed. Sensitivity, specificity and area under the receiver operating characteristic curve were determined. Improvement obtained by the quantitative imaging method compared to the Society for Fetal Urology grading system and the hydronephrosis index was statistically verified. For the 3 thresholds considered and at 100% sensitivity the specificities of the quantitative imaging method were 94%, 70% and 74%, respectively. Corresponding area under the receiver operating characteristic curve values were 0.98, 0.94 and 0.94, respectively. Improvement obtained by the quantitative imaging method over the Society for Fetal Urology grade and hydronephrosis index was statistically significant. Quantitative imaging of hydronephrosis can identify thresholds of clinically significant washout times with 100% sensitivity and decrease the number of diuretic renograms in up to 62% of children. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
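A minimal sketch of the threshold-selection step as we read it, assuming per-kidney quantitative scores and binary labels (washout slower than the t½ cutoff); the most permissive threshold that preserves 100% sensitivity is chosen and the resulting specificity reported (the scoring model itself is not reproduced here, and the data are illustrative):

```python
import numpy as np

def threshold_at_full_sensitivity(scores, labels):
    """Return (threshold, specificity) with sensitivity fixed at 100%.

    scores: higher = more suspicious of obstruction; labels: 1 = delayed washout.
    The safety threshold is the lowest score among true positives, so no
    obstructed unit is missed; specificity then falls where it falls.
    """
    scores, labels = np.asarray(scores), np.asarray(labels)
    thr = scores[labels == 1].min()          # catch every positive
    negatives = scores[labels == 0]
    specificity = np.mean(negatives < thr)   # negatives correctly cleared
    return thr, specificity

# Illustrative data (not the study's): 10 renal units
s = [0.9, 0.8, 0.75, 0.7, 0.3, 0.28, 0.2, 0.15, 0.1, 0.05]
y = [1,   1,   1,    1,   0,   0,    0,   0,    0,   0]
print(threshold_at_full_sensitivity(s, y))  # -> (0.7, 1.0)
```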

  9. Quantitative analysis of impact measurements using dynamic load cells

    Directory of Open Access Journals (Sweden)

    Brent J. Maranzano

    2016-03-01

Full Text Available A mathematical model is used to estimate material properties from a short-duration transient impact force measured by dropping spheres onto rectangular coupons fixed to a dynamic load cell. The contact stress between the dynamic load cell surface and the projectile is modeled using Hertzian contact mechanics. Due to the short impact time relative to the load cell dynamics, an additional Kelvin–Voigt element is included in the model to account for the finite response time of the piezoelectric crystal. Calculations with and without the Kelvin–Voigt element are compared to experimental data collected from combinations of polymeric spheres and polymeric and metallic surfaces. The results illustrate that the inclusion of the Kelvin–Voigt element qualitatively captures the post-impact resonance and non-linear behavior of the load cell signal and quantitatively improves the estimation of the Young's elastic modulus and Poisson's ratio. Mathematically, the additional KV element couples one additional differential equation to the Hertzian spring-dashpot equation. The model can be numerically integrated in seconds using standard numerical techniques, allowing its use as a rapid technique for the estimation of material properties. Keywords: Young's modulus, Poisson's ratio, Dynamic load cell
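A minimal sketch of the forward model under our reading of the abstract: a sphere indenting a surface with Hertzian contact stiffness, integrated with scipy; the Kelvin-Voigt element for the crystal response is reduced here to a first-order filter on the contact force, and all parameter values are illustrative, not the paper's:

```python
import numpy as np
from scipy.integrate import solve_ivp

m, k_h = 2e-3, 5e8        # sphere mass (kg), Hertzian stiffness (N/m^1.5) -- assumed
tau_lc = 2e-5             # load-cell response time (s) -- assumed KV reduction
v0 = 1.0                  # impact speed (m/s)

def rhs(t, y):
    x, v, f_meas = y                      # indentation, velocity, sensed force
    f_contact = k_h * max(x, 0.0) ** 1.5  # Hertz: F = k * delta^(3/2), zero if apart
    return [v, -f_contact / m, (f_contact - f_meas) / tau_lc]

sol = solve_ivp(rhs, [0, 5e-4], [0.0, v0, 0.0], max_step=1e-6)
true_force = k_h * np.clip(sol.y[0], 0, None) ** 1.5
print("peak true force  :", true_force.max())
print("peak sensed force:", sol.y[2].max())   # attenuated/delayed by sensor dynamics
```

Fitting the simulated sensed-force trace to the measured load-cell signal (e.g. by least squares over the stiffness parameter) is then what yields the elastic constants.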

  10. Decision making in inter‐corporate projects : A qualitative and quantitative study of project workers in automobile research and pre‐ development projects in Japan and Germany

    OpenAIRE

    Markkula, Petter

    2009-01-01

This thesis deals with the integration of Japanese and German project workers in automobile inter-corporate research/pre-development projects. The focus is on better understanding the respective decision-making processes. As cultural differences play a big role in the way that people behave, an extra focus was put on investigating this. The methods chosen for this study were quantitative research in the form of a questionnaire and qualitative research in the form of an interview series. For...

  11. Single-cell quantitative HER2 measurement identifies heterogeneity and distinct subgroups within traditionally defined HER2-positive patients.

    Science.gov (United States)

    Onsum, Matthew D; Geretti, Elena; Paragas, Violette; Kudla, Arthur J; Moulis, Sharon P; Luus, Lia; Wickham, Thomas J; McDonagh, Charlotte F; MacBeath, Gavin; Hendriks, Bart S

    2013-11-01

    Human epidermal growth factor receptor 2 (HER2) is an important biomarker for breast and gastric cancer prognosis and patient treatment decisions. HER2 positivity, as defined by IHC or fluorescent in situ hybridization testing, remains an imprecise predictor of patient response to HER2-targeted therapies. Challenges to correct HER2 assessment and patient stratification include intratumoral heterogeneity, lack of quantitative and/or objective assays, and differences between measuring HER2 amplification at the protein versus gene level. We developed a novel immunofluorescence method for quantitation of HER2 protein expression at the single-cell level on FFPE patient samples. Our assay uses automated image analysis to identify and classify tumor versus non-tumor cells, as well as quantitate the HER2 staining for each tumor cell. The HER2 staining level is converted to HER2 protein expression using a standard cell pellet array stained in parallel with the tissue sample. This approach allows assessment of HER2 expression and heterogeneity within a tissue section at the single-cell level. By using this assay, we identified distinct subgroups of HER2 heterogeneity within traditional definitions of HER2 positivity in both breast and gastric cancers. Quantitative assessment of intratumoral HER2 heterogeneity may offer an opportunity to improve the identification of patients likely to respond to HER2-targeted therapies. The broad applicability of the assay was demonstrated by measuring HER2 expression profiles on multiple tumor types, and on normal and diseased heart tissues. Copyright © 2013 American Society for Investigative Pathology. Published by Elsevier Inc. All rights reserved.
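A minimal sketch of the calibration step as we read it: per-cell staining intensities are mapped to HER2 protein levels through a standard curve built from the cell-pellet array stained in parallel (the interpolation scheme, names, and numbers are our assumptions):

```python
import numpy as np

# Standard cell-pellet array: mean staining intensity vs known HER2 expression
pellet_intensity = np.array([120., 340., 900., 2600., 7100.])  # a.u. (illustrative)
pellet_her2 = np.array([4e3, 3e4, 2e5, 1.2e6, 6e6])            # receptors/cell

def her2_per_cell(cell_intensity):
    """Map single-cell staining intensities to HER2 expression levels.

    Interpolates the standard curve in log-log space (our assumption of
    near-linearity there); np.interp clamps outside the calibrated range.
    """
    logi = np.log10(np.asarray(cell_intensity, dtype=float))
    return 10 ** np.interp(logi, np.log10(pellet_intensity), np.log10(pellet_her2))

cells = np.array([250., 1500., 5200.])   # per-tumor-cell intensities from imaging
levels = her2_per_cell(cells)
print(levels)                             # per-cell receptor estimates
print("heterogeneity (CV):", levels.std() / levels.mean())
```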

  12. Simple and fast spectral domain algorithm for quantitative phase imaging of living cells with digital holographic microscopy

    Science.gov (United States)

    Min, Junwei; Yao, Baoli; Ketelhut, Steffi; Kemper, Björn

    2017-02-01

The modular combination of optical microscopes with digital holographic microscopy (DHM) has been proven to be a powerful tool for quantitative live cell imaging. The introduction of a condenser and different microscope objectives (MOs) simplifies the usage of the technique and makes it easier to measure different kinds of specimens at different magnifications. However, the high flexibility of illumination and imaging also causes variable phase aberrations that need to be eliminated for high-resolution quantitative phase imaging. Existing phase aberration compensation methods either require additional elements in the reference arm or need specimen-free reference areas or separate reference holograms to build suitable digital phase masks. These inherent requirements make them impractical for use with highly variable illumination and imaging systems and prevent on-line monitoring of living cells. In this paper, we present a simple numerical method for phase aberration compensation based on the analysis of holograms in the spatial frequency domain, with capabilities for on-line quantitative phase imaging. From a single-shot off-axis hologram, the whole phase aberration can be eliminated automatically without numerical fitting or pre-knowledge of the setup. The capabilities and robustness for quantitative phase imaging of living cancer cells are demonstrated.
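A minimal sketch of spectral-domain phase retrieval for an off-axis hologram, in the spirit of the method described; this is the generic pipeline (isolate the +1 order, recenter it, inverse-transform), not the authors' specific aberration-compensation step, and all names are ours:

```python
import numpy as np

def phase_from_offaxis_hologram(holo):
    """Recover a wrapped phase map from a single off-axis hologram.

    Generic spectral pipeline: FFT, locate the +1 diffraction order away
    from the DC term, shift it to the spectrum center (which removes the
    linear carrier), band-pass, inverse FFT, and take the argument.
    """
    F = np.fft.fftshift(np.fft.fft2(holo))
    h, w = F.shape
    keep = np.ones(F.shape, dtype=bool)              # hide the DC neighborhood
    keep[h//2 - h//8:h//2 + h//8, w//2 - w//8:w//2 + w//8] = False
    py, px = np.unravel_index(np.argmax(np.abs(F) * keep), F.shape)
    F1 = np.roll(F, (h//2 - py, w//2 - px), axis=(0, 1))   # recenter the order
    band = np.zeros(F.shape, dtype=bool)
    band[h//2 - h//8:h//2 + h//8, w//2 - w//8:w//2 + w//8] = True
    field = np.fft.ifft2(np.fft.ifftshift(np.where(band, F1, 0)))
    return np.angle(field)   # wrapped phase; unwrap / detrend residuals as needed
```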

  13. The Importance of Economic Perspective and Quantitative Approaches in Oncology Value Frameworks of Drug Selection and Shared Decision Making.

    Science.gov (United States)

    Waldeck, A Reginald; Botteman, Marc F; White, Richard E; van Hout, Ben A

    2017-06-01

The debate around value in oncology drug selection has been prominent in recent years, and several professional bodies have furthered this debate by advocating for so-called value frameworks. Herein, we provide a viewpoint on these value frameworks, emphasizing the need to consider 4 key aspects: (1) the economic underpinnings of value; (2) the importance of the perspective adopted in the valuation; (3) the importance of the difference between absolute and relative measures of risk, and of measuring patient preferences; and (4) the recognition that aggregating and valuing multiple quality-of-life (QoL) domains through utilities within a multicriteria decision analysis may allow prioritization of QoL above the tallying of safety events, particularly in a value framework focusing on the individual patient. While several frameworks exist, they incorporate different attributes and, importantly, assess value from alternative perspectives, including those of patients, regulators, payers, and society. The various perspectives necessarily lead to potentially different, if not sometimes divergent, conclusions about the valuation. We show that the perspective of the valuation affects the framing of the risk/benefit question and the methodology to measure the individual patient choice, or preference, as opposed to the collective, or population, choice. We focus specifically on the American Society of Clinical Oncology (ASCO) Value Framework. We argue that its laudable intent to assist in shared clinician-patient decision making can be augmented by more formally adopting methodology underpinned by micro- and health economic concepts, as well as application of formal quantitative approaches. Our recommendations for value frameworks focusing on the individual patient, such as the ASCO Value Framework, are 3-fold: (1) ensure that stakeholders understand the importance of the adopted (economic) perspective; (2) consider using exclusively absolute measures of risk; and (3) apply formal quantitative approaches to the measurement of patient preferences.

  14. A Quantitative Tool for Producing DNA-Based Diagnostic Arrays

    Energy Technology Data Exchange (ETDEWEB)

    Tom J. Whitaker

    2008-07-11

ODN. Studies were conducted using this technique, comparing the results of radioactive-label vs SIRIS measurements of Pt as a function of ODN length and distance of the Pt label from the attachment end. The SIRIS signal was not proportional to the amount of oligo attached to the surface as determined by the decay of the 33P label. We intentionally tested conditions under which one might expect the atomization efficiency to change, and we believe this is the problem. Different lengths of oligos and different placement of the label in the oligo affected the final signal. This makes the use of SIRIS as a quantitative tool for oligonucleotides problematic except under highly controlled situations.

  15. Boron concentration measurements by alpha spectrometry and quantitative neutron autoradiography in cells and tissues treated with different boronated formulations and administration protocols

    International Nuclear Information System (INIS)

Bortolussi, Silva; Ciani, Laura; Postuma, Ian; Protti, Nicoletta; Reversi, Luca; Bruschi, Piero; Ferrari, Cinzia; Cansolino, Laura; Panza, Luigi; Ristori, Sandra; Altieri, Saverio

    2014-01-01

The possibility to measure boron concentration with high precision in tissues that will be irradiated represents a fundamental step for a safe and effective BNCT treatment. In Pavia, two techniques have been used for this purpose: a quantitative method based on charged-particle spectrometry and boron biodistribution imaging based on neutron autoradiography. A quantitative method to determine boron concentration by neutron autoradiography has recently been set up and calibrated for the measurement of biological samples, both solid and liquid, in the frame of the feasibility study of BNCT. This technique was calibrated and the obtained results were cross-checked with those of α spectrometry in order to validate them. The comparisons were performed using tissues taken from animals treated with different boron administration protocols. Subsequently, quantitative neutron autoradiography was employed to measure osteosarcoma cell samples treated with BPA and with new boronated formulations. - Highlights: • A method for 10B measurements in samples based on neutron autoradiography was developed. • The results were compared with those of alpha spectrometry applied on tissue and cell samples. • Boronated liposomes were developed and administered to osteosarcoma cell cultures. • Neutron autoradiography was employed to measure boron concentration due to liposomes. • Liposomes were proved to be more effective in concentrating boron in cells than BPA

  16. Decision making in the electricity sector using performance indicators

    Energy Technology Data Exchange (ETDEWEB)

    Domingues, Nuno [ISEL-ADESPA, Lisbon (Portugal); FCT-UNL, Caparica (Portugal); Neves-Silva, Rui; Melo, Joao Joanaz de [FCT-UNL, Caparica (Portugal)

    2017-02-15

Studies of the electricity sector usually focus on the supply side, treating consumers as price-takers, i.e. assuming no demand elasticity. The present paper highlights the role of consumers in the electricity sector, assuming that consumers react to electricity prices and make decisions. Many demand-side studies disaggregate consumers by activity, leading to a highly complex analysis; here, consumers are divided into three main types. The Government makes decisions on the measures to implement to influence production and consumption. To study the impact of these Government decisions, the present paper develops and implements a decision support system. This tool is based on a conceptual model and assists in testing and analysing the electricity sector using scenarios, yielding a set of performance indicators that allow a quantitative balance to be made and unfeasible measures to be eliminated. The performance indicators quantify the technical, environmental, social and economic aspects of the electricity sector and help to understand the effect of consumer practices, production technology and Government measures on the electricity sector. Based on the scenarios produced, it can be concluded that the price signal is important for consumers and is a way to guide their behaviour. It can also be concluded that it is preferable to direct incentives at supporting the implementation of energy-efficiency measures than at reducing the price of electricity sold to consumers. (orig.)

  17. Decision making in the electricity sector using performance indicators

    International Nuclear Information System (INIS)

    Domingues, Nuno; Neves-Silva, Rui; Melo, Joao Joanaz de

    2017-01-01

Studies of the electricity sector usually focus on the supply side, treating consumers as price-takers, i.e. assuming no demand elasticity. The present paper highlights the role of consumers in the electricity sector, assuming that consumers react to electricity prices and make decisions. Many demand-side studies disaggregate consumers by activity, leading to a highly complex analysis; here, consumers are divided into three main types. The Government makes decisions on the measures to implement to influence production and consumption. To study the impact of these Government decisions, the present paper develops and implements a decision support system. This tool is based on a conceptual model and assists in testing and analysing the electricity sector using scenarios, yielding a set of performance indicators that allow a quantitative balance to be made and unfeasible measures to be eliminated. The performance indicators quantify the technical, environmental, social and economic aspects of the electricity sector and help to understand the effect of consumer practices, production technology and Government measures on the electricity sector. Based on the scenarios produced, it can be concluded that the price signal is important for consumers and is a way to guide their behaviour. It can also be concluded that it is preferable to direct incentives at supporting the implementation of energy-efficiency measures than at reducing the price of electricity sold to consumers. (orig.)

  18. Resting quantitative cerebral blood flow in schizophrenia measured by pulsed arterial spin labeling perfusion MRI

    OpenAIRE

    Pinkham, Amy; Loughead, James; Ruparel, Kosha; Wu, Wen-Chau; Overton, Eve; Gur, Raquel; Gur, Ruben

    2011-01-01

    Arterial spin labeling imaging (ASL) perfusion MRI is a relatively novel technique that can allow for quantitative measurement of cerebral blood flow (CBF) by using magnetically labeled arterial blood water as an endogenous tracer. Available data on resting CBF in schizophrenia primarily comes from invasive and expensive nuclear medicine techniques that are often limited to small samples and yield mixed results. The noninvasive nature of ASL offers promise for larger-scale studies. The utilit...

  19. Development of a Quantitative Measure of Holistic Nursing Care.

    Science.gov (United States)

    Kinchen, Elizabeth

    2015-09-01

    Holistic care has long been a defining attribute of nursing practice. From the earliest years of its formal history, nursing has favored a holistic approach in the care of patients, and such an approach has become more important over time. The expansion of nursing's responsibility in delivering comprehensive primary care, the recognition of the importance of relationship-centered care, and the need for evidence-based legitimation of holistic nursing care and practices to insurance companies, policy-makers, health care providers, and patients highlight the need to examine the holistic properties of nursing care. The Holistic Caring Inventory is a theoretically sound, valid, and reliable tool; however, it does not comprehensively address attributes that have come to define holistic nursing care, necessitating the development of a more current instrument to measure the elements of a holistic perspective in nursing care. The development of a current and more comprehensive measure of holistic nursing care may be critical in demonstrating the importance of a holistic approach to patient care that reflects the principles of relationship-based care, shared decision-making, authentic presence, and pattern recognition. © The Author(s) 2014.

  20. Large-scale quantitative analysis of painting arts.

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
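A minimal sketch of one such measure, color variety, computed here as the Shannon entropy of a quantized RGB histogram; the estimator is our choice, and the paper's exact definition may differ:

```python
import numpy as np

def color_variety(image, bins_per_channel=8):
    """Shannon entropy (bits) of the quantized color distribution of an image.

    image: uint8 array of shape (H, W, 3). Higher entropy = more varied palette.
    """
    q = (image.astype(int) * bins_per_channel) // 256          # quantize channels
    codes = (q[..., 0] * bins_per_channel + q[..., 1]) * bins_per_channel + q[..., 2]
    counts = np.bincount(codes.ravel(), minlength=bins_per_channel ** 3)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log2(p)).sum()

# Illustrative: random noise (high variety) vs a flat color field (zero variety)
rng = np.random.default_rng(1)
print(color_variety(rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)))
print(color_variety(np.full((64, 64, 3), 128, dtype=np.uint8)))
```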

  1. Decision Dissonance: Evaluating an Approach to Measuring the Quality of Surgical Decision Making

    Science.gov (United States)

    Fowler, Floyd J.; Gallagher, Patricia M.; Drake, Keith M.; Sepucha, Karen R.

    2013-01-01

    Background Good decision making has been increasingly cited as a core component of good medical care, and shared decision making is one means of achieving high decision quality. If it is to be a standard, good measures and protocols are needed for assessing the quality of decisions. Consistency with patient goals and concerns is one defining characteristic of a good decision. A new method for evaluating decision quality for major surgical decisions was examined, and a methodology for collecting the needed data was developed. Methods For a national probability sample of fee-for-service Medicare beneficiaries who had a coronary artery bypass graft (CABG), a lumpectomy or a mastectomy for breast cancer, or surgery for prostate cancer during the last half of 2008, a mail survey of selected patients was carried out about one year after the procedures. Patients’ goals and concerns, knowledge, key aspects of interactions with clinicians, and feelings about the decisions were assessed. A Decision Dissonance Score was created that measured the extent to which patient ratings of goals ran counter to the treatment received. The construct and predictive validity of the Decision Dissonance Score was then assessed. Results When data were averaged across all four procedures, patients with more knowledge and those who reported more involvement reported significantly lower Decision Dissonance Scores. Patients with lower Decision Dissonance Scores also reported more confidence in their decisions and feeling more positively about how the treatment turned out, and they were more likely to say that they would make the same decision again. Conclusions Surveying discharged surgery patients is a feasible way to evaluate decision making, and Decision Dissonance appears to be a promising approach to validly measuring decision quality. PMID:23516764

  2. Quantitative measurement of lung density with x-ray CT and positron CT, (2)

    International Nuclear Information System (INIS)

    Ito, Kengo; Ito, Masatoshi; Kubota, Kazuo

    1985-01-01

Lung density was quantitatively measured in six diseased patients with X-ray CT (XCT) and positron CT (PCT). The findings are as follows. In the silicosis case, extravascular lung density was remarkably increased compared to normals (0.29 g/cm3), but blood volume was in the normal range. In the post-irradiation lung cancers, extravascular lung density was increased at the irradiated sites compared to the non-irradiated opposite sites, and blood volume varied in each case. In a patient with chronic heart failure, blood volume was decreased (0.11 ml/cm3) with increased extravascular lung density (0.23 g/cm3). In the chronic obstructive pulmonary disease, both extravascular lung density and blood volume were decreased (0.11 g/cm3 and 0.10 ml/cm3, respectively). Lung density measured with XCT was consistently lower than that measured with PCT in all cases, but the changes in the measured lung density values correlated well with each other. In conclusion, the method presented here may clarify the etiology of diffuse pulmonary diseases and be used to differentiate and grade the diseases. (author)

  3. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

4. 99mTc-labeled solid-phase meal: a quantitative clinical measurement of human gastric emptying

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J.L.; Beck, W.J.; McDonald, A.P.; Carlson, G.M.; Mathias, J.R.

    1983-08-01

A solid-phase meal labeled with 99mTc-sulfur colloid provides an improved clinical test for the quantitative evaluation of human gastric emptying. We studied 12 healthy male controls and five male patients with known gastric stasis secondary to a vagotomy and drainage procedure. All subjects were fasted for 8 hours before the study, and each consumed an unbuttered biscuit and a poached egg white containing 1 mCi of 99mTc-sulfur colloid. For 2 hours, 60-second counts were measured every 10 minutes by a Pho Gamma III scintillation camera. The t1/2 for control subjects was 60 minutes, at which time patients with gastric stasis had retained 98% of the test meal. At 120 minutes, control subjects and patients with gastric stasis had 4.7% and 89%, respectively, of the meal remaining in the stomach. The solid-phase test meal labeled with 99mTc-sulfur colloid is easy to perform and can be used clinically to quantitatively measure gastric emptying in humans. This test can discriminate between control subjects and patients with known gastric stasis.

5. 99mTc-labeled solid-phase meal: a quantitative clinical measurement of human gastric emptying

    International Nuclear Information System (INIS)

    Martin, J.L.; Beck, W.J.; McDonald, A.P.; Carlson, G.M.; Mathias, J.R.

    1983-01-01

A solid-phase meal labeled with 99mTc-sulfur colloid provides an improved clinical test for the quantitative evaluation of human gastric emptying. We studied 12 healthy male controls and five male patients with known gastric stasis secondary to a vagotomy and drainage procedure. All subjects were fasted for 8 hours before the study, and each consumed an unbuttered biscuit and a poached egg white containing 1 mCi of 99mTc-sulfur colloid. For 2 hours, 60-second counts were measured every 10 minutes by a Pho Gamma III scintillation camera. The t1/2 for control subjects was 60 minutes, at which time patients with gastric stasis had retained 98% of the test meal. At 120 minutes, control subjects and patients with gastric stasis had 4.7% and 89%, respectively, of the meal remaining in the stomach. The solid-phase test meal labeled with 99mTc-sulfur colloid is easy to perform and can be used clinically to quantitatively measure gastric emptying in humans. This test can discriminate between control subjects and patients with known gastric stasis.

  6. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    Directory of Open Access Journals (Sweden)

    A. Chenarani

    2017-10-01

Full Text Available The effects of uncertainty on a project, and the risk event as the consequence of uncertainty, are analyzed. An uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project, employing entropy as the indicator of system disorder and lack of information. By means of this index, the uncertainty of each activity and its increase due to risk effects, as well as the change in project uncertainty as a function of time, can be assessed. The approach is implemented and analyzed for a small turbojet engine development project as a case study. The results of this study can be useful for project managers and other stakeholders when selecting the most effective risk management and uncertainty controlling method.
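The entropy-based index presumably takes the standard Shannon form; a hedged sketch in our notation of how activity-level uncertainties would aggregate (not the paper's exact construction):

```latex
% Shannon entropy as the disorder/uncertainty indicator for activity i,
% with p_{ik} the probability of outcome k of that activity:
%   H_i = -\sum_k p_{ik} \ln p_{ik}
% A project-level uncertainty index can then be taken as a (possibly
% weighted) sum over activities, tracked as a function of time t:
%   U(t) = \sum_i w_i \, H_i(t)
% Risk events reshape the outcome distributions p_{ik}, raising H_i.
```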

  7. Quantitative measurement of normal and hydrocephalic cerebrospinal fluid flow using phase contrast cine MR imaging

    International Nuclear Information System (INIS)

    Katayama, Shinji; Asari, Shoji; Ohmoto, Takashi

    1993-01-01

Measurements of the cerebrospinal fluid (CSF) flow using phase contrast cine magnetic resonance (MR) imaging were performed on a phantom, 12 normal subjects and 20 patients with normal pressure hydrocephalus (NPH). The phantom study demonstrated the applicability of phase contrast to quantitative measurement of slow flow. The CSF flows of the normal subjects showed a consistent pattern, with a to-and-fro movement of the flow in the anterior subarachnoid space at the C2/3 level, dependent on the cardiac cycle in all subjects. The patients with NPH, however, showed variable patterns of CSF pulsatile flow, which could be divided into four types according to velocity and amplitude. The amplitudes of each type were as follows: type 0 (n=1), 87.6 mm; type I (n=2), 58.2 mm (mean); type II (n=6), 48.0±5.0 mm (mean±SEM); and type III (n=11), 19.9±1.8 mm (mean±SEM). The decrease of the amplitudes correlated with a worsening of the clinical symptoms. After the shunting operation, the amplitude of the to-and-fro movement of the CSF increased again in the patients with NPH who improved clinically. Some of the type III cases were reclassified as type II, I or 0, and one of the type II cases changed to type I after the shunting operation. We conclude that phase contrast cine MR imaging is a practical and clinically applicable technique for the quantitative measurement of CSF flow. (author)

  8. Problems involved in quantitative gamma camera scintigraphy. C. Sensitivity and homogeneity

    International Nuclear Information System (INIS)

    Erbsmann, F.; Paternot, J.; Piepsz, A.; Dobbeleire, A.; Froideville, J.L.

    1976-01-01

A constant sensitivity of the scintillation camera is an important requirement for quantitative digital scintigraphy and must be controlled as much as other factors. The phantom distribution is an excellent test of the camera adjustment but, according to present knowledge, cannot be used to make corrections of any kind. The best way to reduce the effect of spatial sensitivity variations is to use the same part of the detector consistently to measure the standard as well as the two successive kidneys. Users who wish to measure the uptake of both kidneys simultaneously are advised to measure the standard in the approximate positions of the two kidneys and to check that the count rate difference is not more than, for example, 5%, a higher value requiring a camera adjustment. [fr]

  9. Rational quantitative safety goals: a summary

    International Nuclear Information System (INIS)

    Unwin, S.D.; Hayns, M.R.

    1984-08-01

We introduce the notion of a Rational Quantitative Safety Goal. Such a goal reflects the imprecision and vagueness inherent in any reasonable notion of adequate safety and permits such vagueness to be incorporated into the formal regulatory decision-making process. A quantitative goal of the form 'the parameter x, characterizing the safety level of the nuclear plant, shall not exceed the value x0', for example, is of a non-rational nature in that it invokes a strict binary logic in which the parameter space underlying x is cut sharply into two portions: that containing those values of x that comply with the goal and that containing those that do not. Here, we utilize an alternative form of logic which, in accordance with any intuitively reasonable notion of safety, permits a smooth transition of a safety-determining parameter between the adequately safe and inadequately safe domains. Fuzzy set theory provides a suitable mathematical basis for the formulation of rational quantitative safety goals. The decision-making process proposed here is compatible with current risk assessment techniques and produces results in a transparent and useful format. Our methodology is illustrated with reference to the NUS Corporation risk assessment of the Limerick Generating Station.
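A hedged illustration of how fuzzy set theory softens the sharp cut (our example membership function, not the authors'):

```latex
% Crisp goal: "safe" iff x \le x_0 (membership jumps from 1 to 0 at x_0).
% A fuzzy alternative grades the transition over a width parameter w:
%   \mu_{\mathrm{safe}}(x) = \frac{1}{1 + \exp\!\left((x - x_0)/w\right)}
% As w \to 0 this recovers the binary goal; finite w encodes the vagueness
% of "adequately safe" directly in the decision criterion.
```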

  10. A quantitative approach to evolution of music and philosophy

    International Nuclear Information System (INIS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira Jr, Osvaldo N; Costa, Luciano da Fontoura

    2012-01-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master–apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic. (paper)

  11. Physics and Analysis at a Hadron Collider - Making Measurements (3/3)

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    This is the third lecture of three which together discuss the physics of hadron colliders with an emphasis on experimental techniques used for data analysis. This third lecture discusses techniques important for analyses making a measurement (e.g. determining a cross section or a particle property such as its mass or lifetime) using some CDF top-quark analyses as specific examples. The lectures are aimed at graduate students.

  12. Quantitative materials analysis of micro devices using absorption-based thickness measurements

    International Nuclear Information System (INIS)

    Sim, L M; Wog, B S; Spowage, A C

    2006-01-01

Preliminary work in designing an X-ray inspection machine with the capability of providing quantitative thickness analysis based on absorption measurements has been demonstrated. This study attempts to use gray-level data to investigate the nature and thickness of occluded features and materials within devices. The investigation focused on metallic materials essential to semiconductor and MEMS technologies, such as tin, aluminium, copper, silver, iron and zinc. The materials were arranged to simulate different feature thicknesses and sample geometries. The X-ray parameters were varied in order to modify the X-ray energy spectrum, with the aim of optimising the measurement conditions for each sample. The capability of the method to resolve differences in thickness was found to be highly dependent on the material. The thickness resolution with aluminium was the poorest due to its low radiographic density. The thickness resolutions achievable for silver and tin were significantly better, of the order of 0.015 mm and 0.025 mm respectively. From the established linear relationship between X-ray attenuation and sample thickness, the energy-dependent linear attenuation coefficient for each material was determined for a series of specific energy spectra. A decrease in the linear attenuation coefficient was observed as the applied voltage and the thickness of the material increased. The results provide a platform for the development of a novel absorption-based thickness measurement system that can be optimised for a range of industrial applications.
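The absorption-based thickness estimate rests on the Beer-Lambert law; a sketch in our notation (the linearity the study reports is between ln(I0/I) and thickness):

```latex
% Beer--Lambert attenuation of a (quasi-)monochromatic beam:
%   I = I_0 \, e^{-\mu t}  \quad\Longrightarrow\quad  \ln\frac{I_0}{I} = \mu\, t
% With the linear attenuation coefficient \mu calibrated per material and
% energy spectrum, an unknown thickness follows from measured intensities:
%   t = \frac{1}{\mu}\,\ln\!\frac{I_0}{I}
```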

  13. Non-invasive tissue temperature measurements based on quantitative diffuse optical spectroscopy (DOS) of water

    Energy Technology Data Exchange (ETDEWEB)

    Chung, S H [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Cerussi, A E; Tromberg, B J [Beckman Laser Institute and Medical Clinic, University of California, Irvine, 1002 Health Sciences Road, Irvine 92612, CA (United States); Merritt, S I [Masimo Corporation, 40 Parker, Irvine, CA 92618 (United States); Ruth, J, E-mail: bjtrombe@uci.ed [Department of Bioengineering, University of Pennsylvania, 210 S. 33rd Street, Room 240, Skirkanich Hall, Philadelphia, PA 19104 (United States)

    2010-07-07

We describe the development of a non-invasive method for quantitative tissue temperature measurements using broadband diffuse optical spectroscopy (DOS). Our approach is based on well-characterized opposing shifts in near-infrared (NIR) water absorption spectra that appear with temperature and macromolecular binding state. Unlike conventional reflectance methods, DOS is used to generate scattering-corrected tissue water absorption spectra. This allows us to separate the macromolecular bound water contribution from the thermally induced spectral shift using the temperature isosbestic point at 996 nm. The method was validated in intralipid tissue phantoms by correlating DOS with thermistor measurements (R = 0.96) with a difference of 1.1 ± 0.91 °C over a range of 28-48 °C. Once validated, thermal and hemodynamic (i.e. oxy- and deoxy-hemoglobin concentration) changes were measured simultaneously and continuously in human subjects (forearm) during mild cold stress. DOS-measured arm temperatures were consistent with previously reported invasive deep tissue temperature studies. These results suggest that DOS can be used for non-invasive, co-registered measurements of absolute temperature and hemoglobin parameters in thick tissues, a potentially important approach for optimizing thermal diagnostics and therapeutics.

  14. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on their mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  15. Quantitative estimation of the influence of external vibrations on the measurement error of a coriolis mass-flow meter

    NARCIS (Netherlands)

    van de Ridder, Bert; Hakvoort, Wouter; van Dijk, Johannes; Lötters, Joost Conrad; de Boer, Andries; Dimitrovova, Z.; de Almeida, J.R.

    2013-01-01

In this paper the quantitative influence of external vibrations on the measurement value of a Coriolis Mass-Flow Meter for low flows is investigated, with the eventual goal of reducing the influence of vibrations. Model results are compared with experimental results to improve the knowledge on how

  16. Quantitative measurement of phase variation amplitude of ultrasonic diffraction grating based on diffraction spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Meiyan, E-mail: yphantomohive@gmail.com; Zeng, Yingzhi; Huang, Zuohua, E-mail: zuohuah@163.com [Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering, South China Normal University, Guangzhou, Guangdong 510006 (China)

    2014-09-15

A new method based on diffraction spectral analysis is proposed for the quantitative measurement of the phase variation amplitude of an ultrasonic diffraction grating. For a traveling wave, the phase variation amplitude of the grating depends on the intensity of the zeroth- and first-order diffraction waves. By contrast, for a standing wave, this amplitude depends on the intensity of the zeroth-, first-, and second-order diffraction waves. The proposed method is verified experimentally. The measured phase variation amplitude ranges from 0 to 2π, with a relative error of approximately 5%. A nearly linear relation exists between the phase variation amplitude and the driving voltage. Our proposed method can also be applied to ordinary sinusoidal phase gratings.
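For a thin sinusoidal phase grating (Raman-Nath regime), the diffraction-order intensities are Bessel functions of the phase amplitude, which is presumably what links the measured orders to the phase variation; a sketch in our notation:

```latex
% Thin sinusoidal phase grating with phase variation amplitude \varphi:
%   I_n \propto J_n^2(\varphi)
% so \varphi can be recovered from measured order intensities, e.g.
%   \frac{I_1}{I_0} = \frac{J_1^2(\varphi)}{J_0^2(\varphi)}
% inverted numerically for \varphi over the range of interest.
```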

  17. Quantitative measurement of productivity loss due to thermal discomfort

    DEFF Research Database (Denmark)

    Lan, Li; Wargocki, Pawel; Lian, Zhiwei

    2011-01-01

    discomfort caused by elevated air temperature had a negative effect on performance. A quantitative relationship was established between thermal sensation votes and task performance. It can be used for economic calculations pertaining to building design and operation when occupant productivity is considered...

  18. Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, Daniel S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tandon, Lav [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-05

    The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.

  19. Correlations between quantitative cineangiography, coronary flow reserve measured with digital subtraction cineangiography and exercise thallium perfusion scintigraphy

    International Nuclear Information System (INIS)

    Zijlstra, F.; Fioretti, P.; Reiber, J.H.; Serruys, P.W.

    1988-01-01

The goal of this investigation was to establish which anatomical parameters of stenotic lesions correlate best with their functional severity. Thirty-eight patients with single vessel disease underwent coronary cineangiography and exercise/redistribution thallium-201 scintigraphy. Cross-sectional area at the site of obstruction (OA), percentage diameter stenosis (DS), the calculated pressure drop over the stenosis (PD), as well as coronary flow reserve (CFR) derived from myocardial contrast appearance time and density were determined. The relations between CFR and the three anatomical parameters were described by the following equations:
CFR = 4.6 - 0.053 DS, r = 0.82, SEE: 0.79, p < 0.001
CFR = 0.5 + 0.75 OA, r = 0.87, SEE: 0.68, p < 0.001
CFR = 3.6 - 1.5 log PD, r = 0.90, SEE: 0.62, p < 0.001
The calculated pressure drop was highly predictive of the thallium scintigraphic results, with a sensitivity of 94% and a specificity of 90%. The calculated pressure drop is therefore a better anatomical parameter for assessing the functional importance of a stenosis than percentage diameter stenosis or obstruction area. However, the 95% confidence limits of the relation between pressure drop and coronary flow reserve are wide, making measurement of CFR a valuable addition to quantitative angiography, especially when determining the functional importance of moderately severe coronary artery lesions.

  20. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    Science.gov (United States)

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753

  1. Synchrotron radiation as a source for quantitative XPS: advantages and consequences

    International Nuclear Information System (INIS)

    Rosseel, T.M.; Carlson, T.A.; Negri, R.E.; Beall, C.E.; Taylor, J.W.

    1986-01-01

Synchrotron radiation (SR) has a variety of properties which make it an attractive source for quantitative X-ray photoelectron spectroscopy (XPS). Among the most significant are high intensity and tunability; in addition, the intensity of the dispersed radiation is comparable to that of laboratory line sources. Synchrotron radiation is also a clean source, i.e., it will not contaminate the sample, because it operates under ultra-high vacuum conditions. We have used these properties to demonstrate the advantages of SR as a source for quantitative XPS. We have also found several consequences associated with this source which can either limit its use or provide unique opportunities for analysis and research. Using the tunability of SR, we have measured the energy dependence of the 3p photoionization cross sections of Ti, Cr, and Mn from 50 to 150 eV above threshold at the University of Wisconsin's Tantalus electron-storage ring.

  2. Porous Silicon Antibody Microarrays for Quantitative Analysis: Measurement of Free and Total PSA in Clinical Plasma Samples

    Science.gov (United States)

    Tojo, Axel; Malm, Johan; Marko-Varga, György; Lilja, Hans; Laurell, Thomas

    2014-01-01

Antibody microarrays have become widespread, but their use for quantitative analyses in clinical samples has not yet been established. We investigated an immunoassay based on nanoporous silicon antibody microarrays for quantification of total prostate-specific antigen (PSA) in 80 clinical plasma samples, and provide quantitative data from a duplex microarray assay that simultaneously quantifies free and total PSA in plasma. To further develop the assay, the porous silicon chips were placed into a standard 96-well microtiter plate for higher-throughput analysis. The samples analyzed by this quantitative microarray were 80 plasma samples obtained from men undergoing clinical PSA testing (dynamic range: 0.14-44 ng/ml, LOD: 0.14 ng/ml). The second dataset, measuring free PSA (dynamic range: 0.40-74.9 ng/ml, LOD: 0.47 ng/ml) and total PSA (dynamic range: 0.87-295 ng/ml, LOD: 0.76 ng/ml), was also obtained from the clinical routine. The reference for the quantification was a commercially available assay, the ProStatus PSA Free/Total DELFIA. In an analysis of 80 plasma samples the microarray platform performs well across the range of total PSA levels. This assay might have the potential to substitute for the large-scale microtiter plate format in diagnostic applications. The duplex assay paves the way for a future quantitative multiplex assay, which would analyse several prostate cancer biomarkers simultaneously. PMID:22921878

  3. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy

    Science.gov (United States)

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-01-01

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535

  4. Quantitative determination of uranium by SIMS

    International Nuclear Information System (INIS)

    Kuruc, J.; Harvan, D.; Galanda, D.; Matel, L.; Aranyosiova, M.; Velic, D.

    2008-01-01

    The paper presents results of quantitative measurements of uranium-238 by secondary ion mass spectrometry (SIMS), with alpha spectrometry used as a complementary technique. Samples with a specific activity of uranium-238 were prepared by electrodeposition from an aqueous solution of UO2(NO3)2·6H2O. We applied SIMS to quantitative analysis, searching for a correlation between the intensity obtained from SIMS and the activity of uranium-238 as a function of the surface weight, and assessed the possibility of using SIMS in quantitative analysis of environmental samples. The obtained results and correlations, as well as the results of measurements of two real samples, are presented in this paper. (authors)

  5. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    Science.gov (United States)

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  6. Breach Risk Magnitude: A Quantitative Measure of Database Security.

    Science.gov (United States)

    Yasnoff, William A

    2016-01-01

    A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
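
    As a worked illustration of this definition (our reading of the stated figures, not a formula quoted from the paper): with R_u accessible records and A_u authentication steps for user u,

```latex
\mathrm{BRM} = \max_{u}\,\log_{10}\!\left(\frac{R_u}{A_u}\right),
\qquad
\log_{10}\!\left(\frac{10^{6}}{1}\right) = 6,
\qquad
\log_{10}\!\left(\frac{10^{6}}{3}\right) \approx 5.52,
```

    which reproduces the reported 5.52-6 range for a one-million-record database as the number of authentication steps varies from three down to one.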

  7. Use of Quantitative Uncertainty Analysis to Support M&VDecisions in ESPCs

    Energy Technology Data Exchange (ETDEWEB)

    Mathew, Paul A.; Koehling, Erick; Kumar, Satish

    2005-05-11

    Measurement and Verification (M&V) is a critical element of an Energy Savings Performance Contract (ESPC) - without M&V, there is no way to confirm that the projected savings in an ESPC are in fact being realized. For any given energy conservation measure in an ESPC, there are usually several M&V choices, which will vary in terms of measurement uncertainty, cost, and technical feasibility. Typically, M&V decisions are made almost solely based on engineering judgment and experience, with little, if any, quantitative uncertainty analysis (QUA). This paper describes the results of a pilot project initiated by the Department of Energy's Federal Energy Management Program to explore the use of Monte-Carlo simulation to assess savings uncertainty and thereby augment the M&V decision-making process in ESPCs. The intent was to use QUA selectively in combination with heuristic knowledge, in order to obtain quantitative estimates of the savings uncertainty without the burden of a comprehensive "bottoms-up" QUA. This approach was used to analyze the savings uncertainty in an ESPC for a large federal agency. The QUA was seamlessly integrated into the ESPC development process and the incremental effort was relatively small with user-friendly tools that are commercially available. As the case study illustrates, in some cases the QUA simply confirms intuitive or qualitative information, while in other cases, it provides insight that suggests revisiting the M&V plan. The case study also showed that M&V decisions should be informed by the portfolio risk diversification. By providing quantitative uncertainty information, QUA can effectively augment the M&V decision-making process as well as the overall ESPC financial analysis.
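
    A minimal sketch of the kind of Monte-Carlo savings-uncertainty analysis described, in Python; the energy conservation measure, distributions, and parameter values are illustrative assumptions, not data from the paper:

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 100_000  # number of Monte-Carlo draws

# Illustrative lighting-retrofit energy conservation measure; every
# distribution and value below is an assumption for this sketch.
kw_saved = rng.normal(120.0, 10.0, n)        # demand reduction, kW
hours = rng.triangular(3500, 4000, 4500, n)  # annual operating hours
rate = rng.normal(0.11, 0.005, n)            # electricity price, $/kWh

savings = kw_saved * hours * rate            # annual savings, $
lo, hi = np.percentile(savings, [5, 95])
print(f"mean ${savings.mean():,.0f}; 90% interval ${lo:,.0f}-${hi:,.0f}")
```

    The resulting percentile interval is the quantitative savings-uncertainty estimate that would feed into the M&V decision.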

  8. Quantitative optical microscopy: measurement of cellular biophysical features with a standard optical microscope.

    Science.gov (United States)

    Phillips, Kevin G; Baker-Groberg, Sandra M; McCarty, Owen J T

    2014-04-07

    We describe the use of a standard optical microscope to perform quantitative measurements of mass, volume, and density on cellular specimens through a combination of bright field and differential interference contrast imagery. Two primary approaches are presented: noninterferometric quantitative phase microscopy (NIQPM), to perform measurements of total cell mass and subcellular density distribution, and Hilbert transform differential interference contrast microscopy (HTDIC), to determine volume. NIQPM is based on a simplified model of wave propagation, termed the paraxial approximation, with three underlying assumptions: low numerical aperture (NA) illumination, weak scattering, and weak absorption of light by the specimen. Fortunately, unstained cellular specimens satisfy these assumptions, and low NA illumination is easily achieved on commercial microscopes. HTDIC is used to obtain volumetric information from through-focus DIC imagery under high NA illumination conditions. High NA illumination enables enhanced sectioning of the specimen along the optical axis. Hilbert transform processing on the DIC image stacks greatly enhances edge-detection algorithms for localization of the specimen borders in three dimensions by separating the gray values of the specimen intensity from those of the background. The primary advantages of NIQPM and HTDIC lie in their technological accessibility using "off-the-shelf" microscopes. There are two basic limitations of these methods: slow z-stack acquisition time on commercial scopes currently precludes the investigation of phenomena faster than 1 frame/minute, and secondly, diffraction effects restrict the utility of NIQPM and HTDIC to objects from 0.2 up to 10 (NIQPM) and 20 (HTDIC) μm in diameter, respectively. Hence, the specimen and its associated time dynamics of interest must meet certain size and temporal constraints to enable the use of these methods. Excitingly, most fixed cellular specimens are readily investigated with these methods.

  9. Decision-making impairment in obsessive-compulsive disorder as measured by the Iowa Gambling Task

    Directory of Open Access Journals (Sweden)

    Felipe Filardi da Rocha

    2011-08-01

    OBJECTIVE: This study aims to evaluate the process of decision-making in patients with obsessive-compulsive disorder (OCD) using the Iowa Gambling Task (IGT). In addition, we intend to expand the understanding of clinical and demographic characteristics that influence decision-making. METHOD: Our sample consisted of 214 subjects (107 diagnosed with OCD and 107 healthy controls) who were evaluated on their clinical, demographic and neuropsychological features. The IGT, a task that detects and measures decision-making impairments, was used. RESULTS: We found that OCD patients performed significantly worse on the IGT. Furthermore, features such as symptoms of anxiety did not influence IGT performance. CONCLUSION: Impaired decision-making seems to be a key feature of OCD. Given that OCD is a complex heterogeneous disorder, homogeneous groups are necessary for an accurate characterization of our findings.

  10. Quantitative And Qualitative Measurement Of Radio- Activity In Sand Samples From Chalet Beach In Songkhla Province

    International Nuclear Information System (INIS)

    Sukhowattanakit, Jirapa; Kessaratikoon, Prasong; Udomsomporn, Suchin; Thorarit, Wutthidej

    2005-10-01

    The quantitative and qualitative measurements of radioactivity in 39 sand samples collected from Chalatat beach in Songkhla province are presented. Experimental results were obtained by using a high-purity germanium detector and a gamma spectroscopy analysis system, with comparison to the standard soil (IAEA SOIL 6) at the Office of Atoms for Peace (OAP). The measuring time for all sand samples was 10,000 seconds. Radioisotopes such as K-40, Cs-137, Tl-208, Bi-212, Pb-212, Bi-214, Pb-214, Ra-226 and Ac-228 were found in the sand samples. In addition, the radioactivity of Ra-226 and Cs-137 in those samples was found to be at normal levels.

  11. Quantitation of specific myeloid cells in rat bone marrow measured by in vitro 35 S-sulphate incorporation

    Energy Technology Data Exchange (ETDEWEB)

    Wright, A F; Rose, M S

    1984-08-01

    A biochemical measurement which can be used for quantitation of specific early myeloid cells in rat bone marrow has been developed. This measurement consists of a rapid, simple assay for the in vitro quantitation of 35 S-sulfate incorporation into rat bone marrow cells. Incubation of bone marrow cells with 35 S-sulfate led to a time-dependent increase in radioactivity obtained in perchloric acid insoluble fractions of bone marrow cell suspensions. This incorporation was inhibited by cyanide and puromycin. Autoradiography has demonstrated the radiolabel to be specifically associated with immature cells of the myeloid series. The cells most active in this respect were eosinophils. When rats were treated with endotoxin, the rate of 35 S-sulfate incorporation was increased. Cell number measurements, using conventional histopathology and a Coulter Counter, demonstrated that endotoxin caused an initial release of mature granulocytes from the bone marrow. The regeneration of this mature population in the marrow was rapid, and was characterized by an increase in the number of immature cells and a concomitant increase in the rate of 35 S-sulfate incorporation measured in preparations of bone marrow cells in vitro. Furthermore, this response to endotoxin has demonstrated that Coulter Counting techniques can be used to distinguish specific populations of cells (e.g. mature granulocytes) within the bone marrow.

  12. [Teaching quantitative methods in public health: the EHESP experience].

    Science.gov (United States)

    Grimaud, Olivier; Astagneau, Pascal; Desvarieux, Moïse; Chambaud, Laurent

    2014-01-01

    Many scientific disciplines, including epidemiology and biostatistics, are used in the field of public health. These quantitative sciences are fundamental tools necessary for the practice of future professionals. What then should be the minimum quantitative sciences training, common to all future public health professionals? By comparing the teaching models developed in Columbia University and those in the National School of Public Health in France, the authors recognize the need to adapt teaching to the specific competencies required for each profession. They insist that all public health professionals, whatever their future career, should be familiar with quantitative methods in order to ensure that decision-making is based on a reflective and critical use of quantitative analysis.

  13. What is a Beryllium Measurement? A Critical Look at Beryllium Quantitation

    International Nuclear Information System (INIS)

    Charles Davis; Dan Field; John Hess; Dan Jensen

    2006-01-01

    DOE workplaces strive to comply with the 10 CFR 850.31(b)(1) surface concentration release criterion. The usual planning considerations for demonstrating compliance are these: how many swipes, and where; which sample preparation and analytical methods; what reporting limits; and what sample statistic to compare with the criterion. We have reviewed swipe samples from hundreds of Nevada Test Site workplaces: office buildings; experimental facilities; forward area field units; shops; and tunnels. Our experiences have led us to a critical examination of the inner workings of the measurement process itself, involving details generally taken for granted when those usual questions are asked. In this presentation we dissect the ICP-AES Be measurement process. We discuss calibration options and how they impact the distributions of analytical results. We look at distributions of blank results obtained from different labs, and discuss their relevance to determining reporting limits. We examine the way measurements are made from spectra, how that process impacts our understanding of the actual statistical distributions of Be measurements, and how interferences can affect Be measurements. Our objective is to gain sufficient confidence in the measurement process so that the usual questions will make sense and the survey results will be credible. Based on our observations, we offer these recommendations: prepare calibration samples in digested blank swipes; force the calibration line through (0,0); base reporting limits on field blank measurement distributions rather than 40 CFR 236 calculations; use, but do not believe, the usual lognormal distribution assumption; and avoid the 234.861 nm emission line

  14. Quantitative Literacy at Michigan State University, 2: Connection to Financial Literacy

    Directory of Open Access Journals (Sweden)

    Dennis Gilliland

    2011-07-01

    The lack of capability in making financial decisions among the adult United States population has recently been described. A concerted effort to increase awareness of this crisis, to improve education in quantitative and financial literacy, and to simplify financial decision-making processes is critical to the solution. This paper describes a study that was undertaken to explore the relationship between quantitative literacy and financial literacy for entering college freshmen. In summer 2010, incoming freshmen to Michigan State University were assessed. Well-tested financial literacy items and validated quantitative literacy assessment instruments were administered to 531 subjects. Logistic regression models were used to assess the relationship between the level of financial literacy and independent variables including quantitative literacy score, ACT mathematics score, and demographic variables including gender. The study establishes a strong positive association between quantitative literacy and financial literacy on top of the effects of the other independent variables. Adding one percent to the performance on a quantitative literacy assessment changes the odds of being at the highest level of financial literacy by a factor estimated to be 1.05. Gender is found to have a large, statistically significant effect as well, with being female changing the odds by a factor estimated to be 0.49.
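
    To make the reported effect size concrete (a worked arithmetic example, not a figure from the study): if each additional percentage point multiplies the odds by 1.05, a ten-point difference in quantitative literacy performance scales the odds of top-level financial literacy by

```latex
1.05^{10} \approx 1.63 .
```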

  15. A device for quantitative plutonium testing in mixed fuel by its neutron emission

    International Nuclear Information System (INIS)

    Gadzhiev, G.I.; Gorobets, A.K.; Golushko, V.V.; Dunaev, E.S.; Leshchenko, Yu.I.

    1987-01-01

    A device for quantitative plutonium testing in mixed fuel by its neutron emission is described. The device uses the ''assigned dead time'' method to isolate spontaneous-fission neutrons. The main characteristics of the registration equipment, which specify the measurement regime and affect testing errors, are presented. Spontaneous-fission neutron count rates for samples containing up to 100 g of plutonium depend linearly on the 240 Pu content. The testing sensitivity is about 28 pulses/s per 1 g of 240 Pu
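
    Taken at face value, the reported linearity and sensitivity give a simple calibration between the spontaneous-fission count rate R and the 240 Pu mass m (our reading of the stated figures, not an equation from the paper):

```latex
R \;\approx\; 28\ \mathrm{pulses\ s^{-1}\ g^{-1}} \times m,
\qquad
m \;\approx\; \frac{R}{28\ \mathrm{pulses\ s^{-1}\ g^{-1}}}.
```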

  16. Accurate quantitative XRD phase analysis of cement clinkers

    International Nuclear Information System (INIS)

    Kern, A.

    2002-01-01

    Knowledge about the absolute phase abundance in cement clinkers is a requirement for both research and quality control. Traditionally, quantitative analysis of cement clinkers has been carried out by theoretical normative calculation from chemical analysis using the so-called Bogue method or by optical microscopy. Therefore chemical analysis, mostly performed by X-ray fluorescence (XRF), forms the basis of cement plant control by providing information for proportioning raw materials, adjusting kiln and burning conditions, as well as cement mill feed proportioning. In addition, XRF is of highest importance with respect to the environmentally relevant control of waste recovery raw materials and alternative fuels, as well as filters, plants and sewage. However, the performance of clinkers and cements is governed by the mineralogy and not the elemental composition, and the deficiencies and inherent errors of Bogue as well as microscopic point counting are well known. With XRD and Rietveld analysis a full quantitative analysis of cement clinkers can be performed, providing detailed mineralogical information about the product. Until recently several disadvantages prevented the frequent application of the Rietveld method in the cement industry. As the measurement of a full pattern is required, extended measurement times made an integration of this method into existing automation environments difficult. In addition, several drawbacks of existing Rietveld software such as complexity, low performance and severe numerical instability were prohibitive for automated use. The latest developments of on-line instrumentation, as well as dedicated Rietveld software for quantitative phase analysis (TOPAS), now make a decisive breakthrough possible. TOPAS not only allows the analysis of extremely complex phase mixtures in the shortest time possible, but also a fully automated online phase analysis for production control and quality management, free of any human interaction

  17. Measurement of vertebral bone density. Quantitative CT or dual-photon absorptiometry

    International Nuclear Information System (INIS)

    Bergot, C.; Laval-Jeantet, A.M.; Laval-Jeantet, M.H.; Kuntz, D.

    1993-01-01

    We have compared vertebral bone density measurements (QCT and DXA) in women in the postmenopausal period who underwent both examinations. Our aim was to study the results and to define the respective indications of QCT and DXA in various clinical pictures of osteoporosis. The subjects of the study were distributed into various groups according to the presence or absence of vertebral collapse and/or peripheral fractures. The results of the measurements were expressed as Z-scores (deviation from the age-normal average) to suppress the age effect and to make comparison between the two methods possible. The values of both measurements are significantly lower in cases of vertebral involvement. QCT is more sensitive than DXA at discriminating vertebral collapse. A vertebral fragility threshold was defined at a Z-score of -1 with DXA and -1.25 with QCT, corresponding to the best sensitivity for an acceptable specificity. The results of densitometry suggest that there is a peripheral osteoporosis, different from vertebral osteoporosis, as early as the postmenopausal period. Since DXA is easy to implement, it can be used to screen for osteoporosis. When the vertebral measurement with DXA is normal although osteoporosis is obvious (previous collapse or fracture), QCT must be used as it is more sensitive

  18. Multi-Attribute Decision Making Based on Several Trigonometric Hamming Similarity Measures under Interval Rough Neutrosophic Environment

    Directory of Open Access Journals (Sweden)

    Surapati Pramanik

    2018-03-01

    In this paper, the sine, cosine and cotangent similarity measures of interval rough neutrosophic sets are proposed, and some properties of the proposed measures are discussed. We propose multi-attribute decision-making approaches based on the proposed similarity measures. To demonstrate the applicability, a numerical example is solved.

  19. Comparison of panoramic radiography and cone-beam computed tomography for qualitative and quantitative measurements regarding localization of permanent impacted maxillary canines

    Directory of Open Access Journals (Sweden)

    Çiğdem Sarıkır

    2017-01-01

    Objective: The purpose of this retrospective study was to compare the correlation between digital panoramic radiography (DPR) and cone-beam computed tomography (CBCT) evaluations for localization of impacted permanent maxillary canines (IPMCs) and for other qualitative and quantitative parameters. Materials and Method: DPR and CBCT images of 60 patients (17 men and 43 women) were examined independently by two observers. Correlations between DPR and CBCT images were evaluated regarding qualitative (bucco-palatal positioning of IPMCs, morphology and presence of root resorption of adjacent permanent lateral incisors, and contact relationship between IPMCs and adjacent permanent lateral incisors) and quantitative (angle measurements) variables. All evaluations were repeated 1 month later by each observer. Chi-square and t-tests were used for statistical analysis. Kappa statistics were used to assess intra- and interobserver agreement (Cohen's κ). Results: No correlation was observed for determination of bucco-palatal positioning of IPMCs between DPR and CBCT images (p>0.05). Correlations were observed for the other qualitative variables (p<0.05). Differences between DPR and CBCT images were seen for all examined quantitative variables (p<0.01). Intra- and interobserver agreements were substantial to almost perfect. Conclusion: No significant correlation was found between DPR and CBCT images for determination of bucco-palatal positioning of IPMCs. All quantitative measurements performed on DPR and CBCT images significantly differed from each other.

  20. Doing It Collaboratively! Addressing the Dilemmas of Designing Quantitative Effect Studies on Narrative Family Therapy in a Local Clinical Context

    DEFF Research Database (Denmark)

    Ejbye-Ernst, Ditte; Jørring, Nina Tejs

    2017-01-01

    This article suggests an approach for addressing the dilemmas narrative therapists face when wanting to make narrative therapy accessible to people seeking help in contexts favoring evidence-based therapy. The approach is inspired by participatory action research and involves clinicians and clients in a local clinical context. The article offers a detailed case description of implementing psychometric effect measurements on narrative family therapy and of creating a shared collaborative stance for researchers using quantitative effect measurements and clinicians using narrative therapy. Our findings suggest that involving narrative clinicians and clients in the development of a research design in the local clinical context might be helpful in overcoming narrative skepticism and criticism towards quantitative effect research. It is our hope that this article will inspire more narrative therapists.

  1. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [ 3 H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against the 3 H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [ 14 C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time are used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparisons of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)

  2. Quantitative measurement of hypertrophic scar: intrarater reliability, sensitivity, and specificity.

    Science.gov (United States)

    Nedelec, Bernadette; Correa, José A; Rachelska, Grazyna; Armour, Alexis; LaSalle, Léo

    2008-01-01

    The comparison of scar evaluation over time requires measurement tools with acceptable intrarater reliability and the ability to discriminate skin characteristics of interest. The objective of this study was to evaluate the intrarater reliability, sensitivity, and specificity of the Cutometer, the Mexameter, and the DermaScan C relative to the modified Vancouver Scar Scale (mVSS) in patient-matched normal skin, normal scar (donor sites), and hypertrophic scar (HSc). A single investigator evaluated four tissue types (severe HSc, less severe HSc, donor site, and normal skin) in 30 burn survivors with all four measurement tools. The intraclass correlation coefficient (ICC) for the Cutometer was acceptable (≥0.75) for the maximum deformation measure for the donor site and normal skin (>0.78) but was below the acceptable range for the HSc sites and all other parameters. The ICCs for the Mexameter erythema (>0.75) and melanin index (>0.89) and the DermaScan C total thickness measurement (>0.82) were acceptable for all sites. The ICC for the total of the height, pliability, and vascularity subscales of the mVSS was acceptable (0.81) for normal scar but below the acceptable range for the scar sites. The DermaScan C was clearly able to discriminate HSc from normal scar and normal skin based on the total thickness measure. The Cutometer was less discriminating but was still able to discriminate HSc from normal scar and normal skin. The Mexameter erythema index was not a good discriminator of HSc and normal scar. Receiver operating characteristic curves were generated to establish the best cutoff points for the DermaScan C total thickness and the Cutometer maximum deformation, which were 2.034 mm and 0.387 mm, respectively. This study showed that although the Cutometer, the DermaScan C, and the Mexameter have measurement properties that make them attractive substitutes for the mVSS, caution must be used when interpreting results since the Cutometer has a ceiling effect when

  3. Scanning transmission ion microscopy mass measurements for quantitative trace element analysis within biological samples and validation using atomic force microscopy thickness measurements

    Energy Technology Data Exchange (ETDEWEB)

    Deves, Guillaume [Laboratoire de chimie nucleaire analytique et bioenvironnementale, UMR 5084, CNRS-Universite de Bordeaux 1, BP 120 Chemin du solarium, F33175 Gradignan cedex (France)]. E-mail: deves@cenbg.in2p3.fr; Cohen-Bouhacina, Touria [Centre de Physique Moleculaire Optique et Hertzienne, Universite de Bordeaux 1, 351, cours de la Liberation, F33405 Talence cedex (France); Ortega, Richard [Laboratoire de chimie nucleaire analytique et bioenvironnementale, UMR 5084, CNRS-Universite de Bordeaux 1, BP 120 Chemin du solarium, F33175 Gradignan cedex (France)

    2004-10-08

    We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) in order to characterize the trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). The normalization of PIXE results is usually expressed in terms of sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limit of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples based on dry mass measurement performed by means of STIM. The validation of STIM cell mass measurements was obtained by comparison with AFM sample thickness measurements. Results indicated the reliability of STIM mass measurement performed on biological samples and suggested that STIM should be performed for PIXE normalization. Further information deriving from direct confrontation of AFM and STIM analysis could be obtained as well, such as in situ measurements of cell specific gravity within cell compartments (nucleolus and cytoplasm)

  4. QTest: Quantitative Testing of Theories of Binary Choice.

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
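
    A toy sketch in Python of the simplest version of the question this framework addresses: whether repeated binary choices are consistent with a fixed preference plus occasional errors. The data, the error bound, and the binomial test are illustrative assumptions for this sketch, not the QTest methodology itself:

```python
from scipy.stats import binomtest

# Hypothetical data: a participant chooses gamble A over gamble B
# in 14 of 20 repeated presentations of the same pair.
n_trials, a_choices = 20, 14

# Fixed-preference-plus-error account: the participant truly prefers A
# and errs with probability at most 0.25, so P(choose A) >= 0.75.
result = binomtest(a_choices, n_trials, p=0.75, alternative="less")
print(result.pvalue)  # a small p-value would reject the error model
```

    A wavering decision maker, by contrast, would be modeled as a mixture over preference states, which is the harder order-constrained inference problem the paper's framework and software address.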

  5. QTest: Quantitative Testing of Theories of Binary Choice

    Science.gov (United States)

    Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William

    2014-01-01

    The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495

  6. A proposal of group decision making procedure for supporting social consensus making

    International Nuclear Information System (INIS)

    Shimizu, Yoshiaki

    1996-01-01

    With the aim of supporting social consensus making, in this paper we propose a group decision-making procedure for conflict resolution in the following situation: each group has a different privilege in decision making, and the final goal must be evaluated by a few qualitative sub-goals besides quantitative ones. For this purpose, we developed a step-wise procedure of the kind commonly adopted for complicated, large-scale problem-solving. At the value system design phase as well as at the decision-making phase, we applied the analytic hierarchy process (AHP) to decide the weights representing each group's privilege. Then, after rearranging the hierarchy of the sub-goals depending on their nature, we provided an iterative procedure to derive a final solution from a discrete optimization problem. To reduce the difficulties of multi-objective decision making, we adopted a scoring method for the total evaluation and applied a genetic algorithm as the solution method. Through numerical experiments applied to a planning problem for a radioactive waste management system, we show numerically that the proposed approach is very promising for social consensus making. (author)

  7. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL-compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on demand.

  8. Quantitative MRI of kidneys in renal disease.

    Science.gov (United States)

    Kline, Timothy L; Edwards, Marie E; Garg, Ishan; Irazabal, Maria V; Korfiatis, Panagiotis; Harris, Peter C; King, Bernard F; Torres, Vicente E; Venkatesh, Sudhakar K; Erickson, Bradley J

    2018-03-01

    To evaluate the reproducibility and utility of quantitative magnetic resonance imaging (MRI) sequences for the assessment of kidneys in young adults with normal renal function (eGFR ranged from 90 to 130 mL/min/1.73 m2) and patients with early renal disease (autosomal dominant polycystic kidney disease). This prospective case-control study was performed on ten normal young adults (18-30 years old) and ten age- and sex-matched patients with early renal parenchymal disease (autosomal dominant polycystic kidney disease). All subjects underwent a comprehensive kidney MRI protocol, including qualitative imaging: T1w, T2w, FIESTA; and quantitative imaging: 2D cine phase contrast of the renal arteries, and parenchymal diffusion weighted imaging (DWI), magnetization transfer imaging (MTI), blood oxygen level dependent (BOLD) imaging, and magnetic resonance elastography (MRE). The normal controls were imaged on two separate occasions ≥24 h apart (range 24-210 h) to assess reproducibility of the measurements. Quantitative MR imaging sequences were found to be reproducible. The mean ± SD absolute percent differences between quantitative parameters measured ≥24 h apart were: MTI-derived ratio = 4.5 ± 3.6%, DWI-derived apparent diffusion coefficient (ADC) = 6.5 ± 3.4%, BOLD-derived R2* = 7.4 ± 5.9%, and MRE-derived tissue stiffness = 7.6 ± 3.3%. Compared with controls, the ADPKD patients' non-cystic renal parenchyma (NCRP) showed statistically significant differences in quantitative parenchymal measures, including lower MTI percent ratios (16.3 ± 4.4 vs. 23.8 ± 1.2); reproducible quantitative measurements were obtained in all cases. Significantly different quantitative MR parenchymal measurement parameters between ADPKD patients and normal controls were obtained by MT, DWI, BOLD, and MRE, indicating the potential for detecting and following renal disease at an earlier stage than the conventional qualitative imaging techniques.
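
    The reproducibility figures quoted appear to be absolute percent differences between the two imaging visits; assuming the usual definition for repeated measurements x1 and x2:

```latex
\Delta\% \;=\; \frac{\lvert x_{1} - x_{2} \rvert}{(x_{1} + x_{2})/2} \times 100\,\% .
```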

  9. First experiences with model based iterative reconstructions influence on quantitative plaque volume and intensity measurements in coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Precht, Helle; Kitslaar, Pieter H.; Broersen, Alexander

    2017-01-01

    Purpose: Investigate the influence of adaptive statistical iterative reconstruction (ASIR) and the model-based IR (Veo) reconstruction algorithm in coronary computed tomography angiography (CCTA) images on quantitative measurements in coronary arteries for plaque volumes and intensities.

  10. Listening to light scattering in turbid media: quantitative optical scattering imaging using photoacoustic measurements with one-wavelength illumination

    International Nuclear Information System (INIS)

    Yuan, Zhen; Li, Xiaoqi; Xi, Lei

    2014-01-01

    Biomedical photoacoustic tomography (PAT), as a potential imaging modality, can visualize tissue structure and function with high spatial resolution and excellent optical contrast. It is widely recognized that the ability to quantitatively image optical absorption and scattering coefficients from photoacoustic measurements is essential before PAT can become a powerful imaging modality. Existing quantitative PAT (qPAT), while successful, has focused on recovering the absorption coefficient only, by assuming the scattering coefficient is a constant. An effective method for photoacoustically recovering the optical scattering coefficient is presently not available. Here we propose and experimentally validate such a method for quantitative scattering coefficient imaging using photoacoustic data from one-wavelength illumination. The reconstruction method developed combines conventional PAT with the photon diffusion equation in a novel way to realize the recovery of the scattering coefficient. We demonstrate the method using various objects having scattering contrast only, or both absorption and scattering contrasts, embedded in turbid media. The listening-to-light-scattering method described will be able to provide high resolution scattering imaging for various biomedical applications ranging from breast to brain imaging.
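
    The photon diffusion equation the reconstruction builds on is, in its standard continuous-wave form (a textbook statement rather than a formula quoted from this abstract):

```latex
-\nabla \cdot \big( D(\mathbf{r})\, \nabla \Phi(\mathbf{r}) \big) + \mu_{a}(\mathbf{r})\, \Phi(\mathbf{r}) = S(\mathbf{r}),
\qquad
D = \frac{1}{3\,(\mu_{a} + \mu_{s}')},
```

    where Φ is the photon fluence, S the source term, and μa and μs' are the absorption and reduced scattering coefficients the method seeks to recover.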

  11. Electroencephalography and quantitative electroencephalography in mild traumatic brain injury.

    Science.gov (United States)

    Haneef, Zulfi; Levin, Harvey S; Frost, James D; Mizrahi, Eli M

    2013-04-15

    Mild traumatic brain injury (mTBI) causes brain injury resulting in electrophysiologic abnormalities visible in electroencephalography (EEG) recordings. Quantitative EEG (qEEG) makes use of quantitative techniques to analyze EEG characteristics such as frequency, amplitude, coherence, power, phase, and symmetry over time independently or in combination. QEEG has been evaluated for its use in making a diagnosis of mTBI and assessing prognosis, including the likelihood of progressing to the postconcussive syndrome (PCS) phase. We review the EEG and qEEG changes of mTBI described in the literature. An attempt is made to separate the findings seen during the acute, subacute, and chronic phases after mTBI. Brief mention is also made of the neurobiological correlates of qEEG using neuroimaging techniques or in histopathology. Although the literature indicates the promise of qEEG in making a diagnosis and indicating prognosis of mTBI, further study is needed to corroborate and refine these methods.

  12. Quantitative Analysis of Oxygen Gas Exhausted from Anode through In Situ Measurement during Electrolytic Reduction

    Directory of Open Access Journals (Sweden)

    Eun-Young Choi

    2017-01-01

    Quantitative analysis by in situ measurement of oxygen gas evolved from the anode was employed to monitor the progress of electrolytic reduction of simulated oxide fuel in a molten Li2O–LiCl salt. The electrolytic reduction of 0.6 kg of simulated oxide fuel was performed in 5 kg of 1.5 wt.% Li2O–LiCl molten salt at 650°C. Porous cylindrical pellets of simulated oxide fuel were used as the cathode by loading them into a stainless steel wire mesh cathode basket. A platinum plate was employed as the anode. The oxygen gas evolved from the anode was exhausted to the instrumentation for in situ measurement during electrolytic reduction. The instrumentation consisted of a mass flow controller, pump, wet gas meter, and oxygen gas sensor. The oxygen gas was successfully measured using the instrumentation in real time. The measured volume of the oxygen gas was comparable to the theoretically calculated volume generated by the charge applied to the simulated oxide fuel.
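
    The "theoretically calculated volume generated by the charge" presumably follows Faraday's law for the four-electron oxygen-evolution reaction at the anode; a sketch of that calculation, assuming ideal-gas behavior with molar volume Vm:

```latex
2\,\mathrm{O}^{2-} \rightarrow \mathrm{O}_{2} + 4\,e^{-},
\qquad
V_{\mathrm{O}_2} = \frac{Q}{4F}\, V_{m},
```

    with Q the applied charge and F ≈ 96485 C/mol the Faraday constant.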

  13. Marine radioactivity measurements with liquid scintillation spectrometers

    International Nuclear Information System (INIS)

    Liong Wee Kwong, L.; Povinec, P.P.

    1999-01-01

    Liquid Scintillation Spectrometry (LSS) has now become the most widespread method for quantitative analytical measurement of low levels of β-emitting radionuclides such as 3 H and 14 C. The high counting efficiency resulting from the latest developments in LSS makes this technique well suited to environmental work and enables direct measurement of environmental samples without excessive preparation. The introduction of several new cocktails based on solvents with a high flashpoint, containing surfactants and having a high degree of aqueous sample compatibility, has also contributed to the simplification of procedures

  14. Sensitivity, stability, and precision of quantitative Ns-LIBS-based fuel-air-ratio measurements for methane-air flames at 1-11 bar.

    Science.gov (United States)

    Hsu, Paul S; Gragston, Mark; Wu, Yue; Zhang, Zhili; Patnaik, Anil K; Kiefer, Johannes; Roy, Sukesh; Gord, James R

    2016-10-01

    Nanosecond laser-induced breakdown spectroscopy (ns-LIBS) is employed for quantitative local fuel-air (F/A) ratio measurements (i.e., the ratio of actual fuel-to-oxidizer mass over the ratio of fuel-to-oxidizer mass at stoichiometry) in well-characterized methane-air flames at pressures of 1-11 bar. We selected nitrogen and hydrogen atomic-emission lines at 568 nm and 656 nm, respectively, to establish a correlation between the line intensities and the F/A ratio. We have investigated the effects of laser-pulse energy, camera gate delay, and pressure on the sensitivity, stability, and precision of the quantitative ns-LIBS F/A ratio measurements. We determined the optimal laser energy and camera gate delay for each pressure condition and found that measurement stability and precision are degraded with an increase in pressure. We have identified the primary limitations of the F/A ratio measurement employing ns-LIBS at elevated pressures as instabilities caused by the higher-density laser-induced plasma and the presence of a higher level of soot. Potential improvements are suggested.
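
    A minimal sketch of how such a line-intensity correlation could be turned into a quantitative F/A measurement; the calibration points and the linear form are invented for illustration and are not the paper's data:

```python
import numpy as np

# Hypothetical calibration: H(656 nm) / N(568 nm) intensity ratios
# measured at known F/A ratios (all values invented for this sketch).
fa_known = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
ratio = np.array([0.45, 0.61, 0.78, 0.93, 1.10])

# Linear calibration: F/A ≈ a * ratio + b
a, b = np.polyfit(ratio, fa_known, 1)

def fa_from_ratio(r):
    """Infer the local fuel-air ratio from an H/N line-intensity ratio."""
    return a * r + b

print(fa_from_ratio(0.70))  # ≈ 0.91 with these made-up points
```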

  15. Facial Phenotyping by Quantitative Photography Reflects Craniofacial Morphology Measured on Magnetic Resonance Imaging in Icelandic Sleep Apnea Patients

    Science.gov (United States)

    Sutherland, Kate; Schwab, Richard J.; Maislin, Greg; Lee, Richard W.W.; Benedikstdsottir, Bryndis; Pack, Allan I.; Gislason, Thorarinn; Juliusson, Sigurdur; Cistulli, Peter A.

    2014-01-01

    Study Objectives: (1) To determine whether facial phenotype, measured by quantitative photography, relates to underlying craniofacial obstructive sleep apnea (OSA) risk factors, measured with magnetic resonance imaging (MRI); (2) To assess whether these associations are independent of body size and obesity. Design: Cross-sectional cohort. Setting: Landspitali, The National University Hospital, Iceland. Participants: One hundred forty patients (87.1% male) from the Icelandic Sleep Apnea Cohort who had both calibrated frontal and profile craniofacial photographs and upper airway MRI. Mean ± standard deviation age 56.1 ± 10.4 y, body mass index 33.5 ± 5.05 kg/m2, with on-average severe OSA (apnea-hypopnea index 45.4 ± 19.7 h-1). Interventions: N/A. Measurements and Results: Relationships between surface facial dimensions (photos) and facial bony dimensions and upper airway soft-tissue volumes (MRI) were assessed using canonical correlation analysis. The photo and MRI craniofacial datasets were related in four significant canonical correlations, primarily driven by measurements of the maxillary-mandibular relationship (r = 0.8), among other dimensions captured by both photography and MRI. This study confirms that facial photographic phenotype reflects underlying aspects of craniofacial skeletal abnormalities associated with OSA. Therefore, facial photographic phenotyping may be a useful tool to assess intermediate phenotypes for OSA, particularly in large-scale studies. Citation: Sutherland K, Schwab RJ, Maislin G, Lee RW, Benedikstdsottir B, Pack AI, Gislason T, Juliusson S, Cistulli PA. Facial phenotyping by quantitative photography reflects craniofacial morphology measured on magnetic resonance imaging in Icelandic sleep apnea patients. SLEEP 2014;37(5):959-968. PMID:24790275

  16. Quantitative simulation tools to analyze up- and downstream interactions of soil and water conservation measures: supporting policy making in the Green Water Credits program of Kenya.

    Science.gov (United States)

    Hunink, J E; Droogers, P; Kauffman, S; Mwaniki, B M; Bouma, J

    2012-11-30

    Upstream soil and water conservation measures in catchments can have a positive impact both upstream, in terms of less erosion and higher crop yields, and downstream, through less sediment flow into reservoirs and increased groundwater recharge. Green Water Credits (GWC) schemes are being developed to encourage upstream farmers to invest in soil and water conservation practices which will positively affect upstream and downstream water availability. Quantitative information on water and sediment fluxes is crucial as a basis for such financial schemes. A pilot design project in the large and strategically important Upper-Tana Basin in Kenya has the objective of developing a methodological framework for this purpose. The essence of the methodology is the integration and use of a collection of public domain tools and datasets: the so-called Green water and Blue water Assessment Toolkit (GBAT). This toolkit was applied in order to study different options to implement GWC in agricultural rainfed land for the pilot study. The impact of vegetative contour strips, mulching, and tied ridges was determined for: (i) three upstream key indicators: soil loss, crop transpiration and soil evaporation, and (ii) two downstream indicators: sediment inflow into reservoirs and groundwater recharge. All effects were compared with a baseline scenario of average conditions. Thus, not only actual land management was considered but also potential benefits of changed land use practices. Results of the simulations indicate that applying contour strips or tied ridges in particular significantly reduces soil losses and increases groundwater recharge in the catchment. The model was used to build spatial expressions of the proposed management practices in order to assess their effectiveness. The developed procedure allows exploring the effects of soil conservation measures in a catchment to support the implementation of GWC. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Validating Quantitative Measurement Using Qualitative Data: Combining Rasch Scaling and Latent Semantic Analysis in Psychiatry

    Science.gov (United States)

    Lange, Rense

    2015-02-01

    An extension of concurrent validity is proposed that uses qualitative data for the purpose of validating quantitative measures. The approach relies on Latent Semantic Analysis (LSA), which places verbal (written) statements in a high-dimensional semantic space. Using data from a medical/psychiatric domain as a case study - Near Death Experiences, or NDE - we established concurrent validity by connecting NDErs' qualitative (written) experiential accounts with their locations on a Rasch-scalable measure of NDE intensity. Concurrent validity received strong empirical support, since the variance in the Rasch measures could be predicted reliably from the coordinates of the accounts in the LSA-derived semantic space (R2 = 0.33). These coordinates also predicted NDErs' age with considerable precision (R2 = 0.25). Both estimates are probably artificially low due to the small available data samples (n = 588). It appears that Rasch scalability of NDE intensity is a prerequisite for these findings, as each intensity level is associated (at least probabilistically) with a well-defined pattern of item endorsements.
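
    A compact sketch of this LSA-based concurrent-validity check in Python with scikit-learn; the accounts, Rasch values, and two-dimensional semantic space are invented stand-ins for the study's data:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LinearRegression

# Toy corpus standing in for written NDE accounts (invented text).
accounts = [
    "I floated above my body and saw a bright light",
    "A tunnel of light and a feeling of deep peace",
    "Nothing unusual happened, I only felt dizzy",
    "I met beings of light and did not want to return",
]
rasch_intensity = np.array([1.8, 2.4, -0.5, 3.1])  # hypothetical Rasch measures

# LSA: project tf-idf vectors into a low-dimensional semantic space.
X = TfidfVectorizer().fit_transform(accounts)
coords = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

# Concurrent-validity check: predict Rasch measures from semantic coordinates.
model = LinearRegression().fit(coords, rasch_intensity)
print("R^2 =", model.score(coords, rasch_intensity))
```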

  18. Speech graphs provide a quantitative measure of thought disorder in psychosis.

    Science.gov (United States)

    Mota, Natalia B; Vasconcelos, Nivaldo A P; Lemos, Nathalia; Pieretti, Ana C; Kinouchi, Osame; Cecchi, Guillermo A; Copelli, Mauro; Ribeiro, Sidarta

    2012-01-01

    Psychosis has various causes, including mania and schizophrenia. Since the differential diagnosis of psychosis is exclusively based on subjective assessments of oral interviews with patients, an objective quantification of the speech disturbances that characterize mania and schizophrenia is in order. In principle, such quantification could be achieved by the analysis of speech graphs. A graph represents a network with nodes connected by edges; in speech graphs, nodes correspond to words and edges correspond to semantic and grammatical relationships. To quantify speech differences related to psychosis, interviews with schizophrenics, manics and normal subjects were recorded and represented as graphs. Manics scored significantly higher than schizophrenics in ten graph measures. Psychopathological symptoms such as logorrhea, poor speech, and flight of thoughts were grasped by the analysis even when verbosity differences were discounted. Binary classifiers based on speech graph measures sorted schizophrenics from manics with up to 93.8% of sensitivity and 93.7% of specificity. In contrast, sorting based on the scores of two standard psychiatric scales (BPRS and PANSS) reached only 62.5% of sensitivity and specificity. The results demonstrate that alterations of the thought process manifested in the speech of psychotic patients can be objectively measured using graph-theoretical tools, developed to capture specific features of the normal and dysfunctional flow of thought, such as divergence and recurrence. The quantitative analysis of speech graphs is not redundant with standard psychometric scales but rather complementary, as it yields a very accurate sorting of schizophrenics and manics. Overall, the results point to automated psychiatric diagnosis based not on what is said, but on how it is said.
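
    A minimal illustration in Python of the speech-graph idea, using networkx; the text, the graph-construction choice (edges between consecutive words), and the measures shown are simplified assumptions rather than the authors' exact pipeline:

```python
import networkx as nx

def speech_graph(text):
    """Build a directed word graph: nodes are words, edges link consecutive words."""
    words = text.lower().split()
    g = nx.DiGraph()
    g.add_edges_from(zip(words, words[1:]))
    return g

g = speech_graph("i went to the market and then i went back home")
print(g.number_of_nodes(), g.number_of_edges())              # graph-size measures
print(len(max(nx.weakly_connected_components(g), key=len)))  # largest connected component
print(sum(1 for _ in nx.simple_cycles(g)))                   # recurrence-related measure
```

    Measures of this kind, computed per interview, are what the classifiers described above would take as features.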

  19. Speech graphs provide a quantitative measure of thought disorder in psychosis.

    Directory of Open Access Journals (Sweden)

    Natalia B Mota

    BACKGROUND: Psychosis has various causes, including mania and schizophrenia. Since the differential diagnosis of psychosis is exclusively based on subjective assessments of oral interviews with patients, an objective quantification of the speech disturbances that characterize mania and schizophrenia is in order. In principle, such quantification could be achieved by the analysis of speech graphs. A graph represents a network with nodes connected by edges; in speech graphs, nodes correspond to words and edges correspond to semantic and grammatical relationships. METHODOLOGY/PRINCIPAL FINDINGS: To quantify speech differences related to psychosis, interviews with schizophrenics, manics and normal subjects were recorded and represented as graphs. Manics scored significantly higher than schizophrenics in ten graph measures. Psychopathological symptoms such as logorrhea, poor speech, and flight of thoughts were grasped by the analysis even when verbosity differences were discounted. Binary classifiers based on speech graph measures sorted schizophrenics from manics with up to 93.8% sensitivity and 93.7% specificity. In contrast, sorting based on the scores of two standard psychiatric scales (BPRS and PANSS) reached only 62.5% sensitivity and specificity. CONCLUSIONS/SIGNIFICANCE: The results demonstrate that alterations of the thought process manifested in the speech of psychotic patients can be objectively measured using graph-theoretical tools, developed to capture specific features of the normal and dysfunctional flow of thought, such as divergence and recurrence. The quantitative analysis of speech graphs is not redundant with standard psychometric scales but rather complementary, as it yields a very accurate sorting of schizophrenics and manics. Overall, the results point to automated psychiatric diagnosis based not on what is said, but on how it is said.

  20. Early Quantitative Assessment of Non-Functional Requirements

    NARCIS (Netherlands)

    Kassab, M.; Daneva, Maia; Ormandjieva, O.

    2007-01-01

    Non-functional requirements (NFRs) of software systems are a well known source of uncertainty in effort estimation. Yet, quantitatively approaching NFRs early in a project is hard. This paper makes a step towards reducing the impact of uncertainty due to NFRs. It offers a solution that incorporates

  1. Progress in quantitative GPR development at CNDE

    Science.gov (United States)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-02-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  2. Progress in quantitative GPR development at CNDE

    International Nuclear Information System (INIS)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott

    2014-01-01

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  3. Progress in quantitative GPR development at CNDE

    Energy Technology Data Exchange (ETDEWEB)

    Eisenmann, David; Margetan, F. J.; Chiou, C.-P.; Roberts, Ron; Wendt, Scott [Center for Nondestructive Evaluation, Iowa State University, 1915 Scholl Road, Ames, IA 50011-3042 (United States)

    2014-02-18

    Ground penetrating radar (GPR) uses electromagnetic (EM) radiation pulses to locate and map embedded objects. Commercial GPR instruments are generally geared toward producing images showing the location and extent of buried objects, and often do not make full use of available absolute amplitude information. At the Center for Nondestructive Evaluation (CNDE) at Iowa State University efforts are underway to develop a more quantitative approach to GPR inspections in which absolute amplitudes and spectra of measured signals play a key role. Guided by analogous work in ultrasonic inspection, there are three main thrusts to the effort. These focus, respectively, on the development of tools for: (1) analyzing raw GPR data; (2) measuring the EM properties of soils and other embedding media; and (3) simulating GPR inspections. This paper reviews progress in each category. The ultimate goal of the work is to develop model-based simulation tools that can be used to assess the usefulness of GPR for a given inspection scenario, to optimize inspection choices, and to determine inspection reliability.

  4. Assess the discrimination of Achilles InSight calcaneus quantitative ultrasound device for osteoporosis in Chinese women: Compared with dual energy X-ray absorptiometry measurements

    Energy Technology Data Exchange (ETDEWEB)

    Jin Ningning, E-mail: ningning_jin@163.com [Department of Obstetrics and Gynecology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing, 100032 (China); Lin Shouqing, E-mail: Shouqing_Lin2003@yahoo.com.cn [Department of Obstetrics and Gynecology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing, 100032 (China); Zhang Ying, E-mail: steel_lee@sina.com.cn [Department of Obstetrics and Gynecology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing, 100032 (China); Chen Fengling, E-mail: bjzqk@126.com [Department of Obstetrics and Gynecology, Peking Union Medical College Hospital, Peking Union Medical College, Chinese Academy of Medical Sciences, Beijing, 100032 (China)

    2010-11-15

    Since the implementation of quantitative ultrasound (QUS) technology may become a part of future clinical decision making to identify osteoporosis and prevent fractures, this study was initiated to evaluate the correlations of QUS parameters and axial bone mineral density (BMD) using dual energy X-ray absorptiometry (DXA) and to assess the discrimination of QUS measurements for osteoporosis and osteopenia defined by WHO criteria. 106 native Chinese women (aged 50.2 ± 10.9 SD, 21-74 years) were involved. Each subject received both QUS measurements at the left calcaneus with Achilles InSight and DXA measurements with DPX-L at the lumbar spine (L2-4), total hip and femoral neck. Achilles InSight provided the stiffness index (SI), which was derived from Broadband Ultrasound Attenuation (BUA) and Speed of Sound (SOS), and the T-scores of SI were calculated. We found that the QUS parameter SI was significantly but only moderately correlated (r = 0.458-0.587) with DXA at the lumbar spine, total hip and femoral neck (P < 0.0001 for all correlations). With ROC analysis, the areas under the ROC curves for the diagnosis of osteoporosis and osteopenia were 0.933 and 0.796, respectively. To identify osteoporosis, when the T-score threshold of SI was set at -1.4, the sensitivity was 100% and the specificity was 73.7%. Our study confirmed that QUS measurements performed with Achilles InSight were capable of identifying osteoporosis defined by axial BMD using DXA in Chinese women.
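    The threshold analysis reported here can be reproduced in outline with scikit-learn's ROC utilities; the snippet below uses synthetic T-scores (the study's patient-level data are not reproduced), negating the score because lower SI T-scores indicate osteoporosis:

```python
# ROC analysis for a diagnostic threshold (synthetic data).
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
t_score = np.concatenate([rng.normal(-2.0, 0.8, 30),   # osteoporotic (label 1)
                          rng.normal(0.0, 0.9, 70)])   # non-osteoporotic (label 0)
y = np.concatenate([np.ones(30), np.zeros(70)])

fpr, tpr, thr = roc_curve(y, -t_score)   # negate: lower T-score = positive
print("AUC:", auc(fpr, tpr))

# first threshold reaching 100% sensitivity, as in the abstract's example
i = np.argmax(tpr >= 1.0)
print("T-score cut-off:", -thr[i], "specificity:", 1 - fpr[i])
```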

  5. Assess the discrimination of Achilles InSight calcaneus quantitative ultrasound device for osteoporosis in Chinese women: Compared with dual energy X-ray absorptiometry measurements

    International Nuclear Information System (INIS)

    Jin Ningning; Lin Shouqing; Zhang Ying; Chen Fengling

    2010-01-01

    Since the implementation of quantitative ultrasound (QUS) technology may become a part of future clinical decision making to identify osteoporosis and prevent fractures, this study was initiated to evaluate the correlations of QUS parameters and axial bone mineral density (BMD) using dual energy X-ray absorptiometry (DXA) and to assess the discrimination of QUS measurements for osteoporosis and osteopenia defined by WHO criteria. 106 native Chinese women (aged 50.2 ± 10.9 SD, 21-74 years) were involved. Each subject received both QUS measurements at the left calcaneus with Achilles InSight and DXA measurements with DPX-L at the lumbar spine (L2-4), total hip and femoral neck. Achilles InSight provided the stiffness index (SI), which was derived from Broadband Ultrasound Attenuation (BUA) and Speed of Sound (SOS), and the T-scores of SI were calculated. We found that the QUS parameter SI was significantly but only moderately correlated (r = 0.458-0.587) with DXA at the lumbar spine, total hip and femoral neck (P < 0.0001 for all correlations). With ROC analysis, the areas under the ROC curves for the diagnosis of osteoporosis and osteopenia were 0.933 and 0.796, respectively. To identify osteoporosis, when the T-score threshold of SI was set at -1.4, the sensitivity was 100% and the specificity was 73.7%. Our study confirmed that QUS measurements performed with Achilles InSight were capable of identifying osteoporosis defined by axial BMD using DXA in Chinese women.

  6. Quantitative assessment of motor functions post-stroke: Responsiveness of upper-extremity robotic measures and its task dependence.

    Science.gov (United States)

    Hussain, Asif; Budhota, Aamani; Contu, Sara; Kager, Simone; Vishwanath, Deshmukh A; Kuah, Christopher W K; Yam, Lester H L; Chua, Karen S G; Masia, Lorenzo; Campolo, Domenico

    2017-07-01

    Technology-aided measures offer a sensitive, accurate and time-efficient approach for the assessment of sensorimotor function after neurological impairment compared to standard clinical assessments. This preliminary study investigated the relationship between task definition and its effect on robotic measures using a planar, two-degree-of-freedom robotic manipulator (H-Man). Four chronic stroke participants (49.5±11.95 years, 2 female, FMA: 37.5±13.96) and eight healthy control participants (26.25±4.70 years, 2 female) took part in the study. Motor functions were evaluated using line-tracing and circle-tracing tasks with the dominant and non-dominant hands of healthy participants and the affected vs. non-affected hands of stroke participants. The results show a significant dependence of the quantitative measures on the investigated tasks.

  7. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  8. Measurement issues associated with quantitative molecular biology analysis of complex food matrices for the detection of food fraud.

    Science.gov (United States)

    Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy

    2016-01-07

    Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.
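    One concrete example of the quantitative step discussed above is standard-curve quantification in qPCR, where uncertainty in the fitted calibration propagates directly into the reported copy number. The snippet below is a generic illustration with invented calibrant values, not a method from the review:

```python
# Generic qPCR standard-curve quantification (illustrative numbers only).
# Cq is linear in log10(copy number); the fitted line converts an unknown
# sample's Cq into an estimated copy number.
import numpy as np

copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cq = np.array([31.1, 27.8, 24.4, 21.0, 17.7])       # hypothetical calibrants

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1 / slope) - 1                 # ideal slope is ~ -3.32
print(f"PCR efficiency: {efficiency:.1%}")

cq_unknown = 25.2
est_copies = 10 ** ((cq_unknown - intercept) / slope)
print(f"estimated copies: {est_copies:.0f}")
```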

  9. Efficacy and toxicity in brain tumor treatment - quantitative Measurements using advanced MRI

    DEFF Research Database (Denmark)

    Ravn, Søren

    2016-01-01

    From the clinical introduction in the 1980s, MRI has grown to become an indispensable brain imaging modality, mainly due to its excellent ability to visualize soft tissues. Morphologically, T1- and T2-weighted brain tumor MRI have been part of routine diagnostic radiology for more than two decades...... with the introduction of magnets with higher field strength. Ongoing technical development has enabled a change from semiquantitative measurements to a true quantitative approach. This step is expected to have a great impact on the treatment of brain tumor patients in the future. The aim of this Ph.D. dissertation...... was to explore how different advanced MRI techniques could contribute to a higher degree of individualized treatment of brain tumor patients. The thesis is based on three studies in which advanced MRI is used to evaluate the possible role of fMRI in presurgical planning, Diffusion Tensor Imaging (DTI...

  10. Quantitative diffusion characteristics of the human brain depend on MRI sequence parameters

    International Nuclear Information System (INIS)

    Wilson, M.; Blumhardt, L.D.; Morgan, P.S.

    2002-01-01

    Quantitative diffusion-weighted MRI has been applied to the study of neurological diseases, including multiple sclerosis, where the molecular self-diffusion coefficient D has been measured in both lesions and normal-appearing white matter. Histograms of D have been used as a novel measure of the "lesion load", with potential applications that include the monitoring of efficacy in new treatment trials. However, different ways of measuring D may affect its value, making comparison between different centres and research groups impossible. We aimed to assess the effect, if any, of using two different MRI sequences on the value of D. We studied 13 healthy volunteers, using two different quantitative diffusion sequences (including different b_max values and gradient applications). Maps of D were analysed using both regions of interest (ROI) in white matter and "whole brain" histograms, and compared between the two sequences. In addition, we studied three standardised test liquids (with known values of D) using both sequences. Histograms from the two sequences had different distributions, with a greater spread and higher peak position from the sequence with lower b_max. This greater spread of D was also evident in the white matter and test liquid ROI. "Limits of agreement" analysis demonstrated that the differences could be clinically relevant, despite significant correlations between the sequences obtained using simple rank methods. We conclude that different quantitative diffusion sequences are unlikely to produce directly comparable values of D, particularly if different b_max values are used. In addition, the use of inappropriate statistical tests may give false impressions of close agreement. Standardisation of methods for the measurement of D is required if these techniques are to become useful tools, for example in monitoring changes in the disease burden of multiple sclerosis. (orig.)
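    For readers unfamiliar with how D is obtained, the minimal two-point estimate is sketched below (clinical sequences typically fit several b-values, so this is a simplification; the signal values are hypothetical). The dependence on which b-values enter the fit is exactly why the histograms above differ:

```python
# Two-point estimate of the diffusion coefficient D from DWI signals:
# S(b) = S0 * exp(-b * D)  =>  D = ln(S(b1)/S(b2)) / (b2 - b1)
import numpy as np

b1, b2 = 0.0, 1000.0          # s/mm^2 (b_max differs between sequences)
s1, s2 = 1000.0, 480.0        # hypothetical ROI signal intensities

D = np.log(s1 / s2) / (b2 - b1)
print(f"D = {D:.2e} mm^2/s")  # ~7.3e-4 mm^2/s for these numbers
```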

  11. Quantitative diffusion characteristics of the human brain depend on MRI sequence parameters

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, M.; Blumhardt, L.D. [University of Nottingham, Department of Neurology, Royal Preston Hospital, Preston (United Kingdom); Morgan, P.S. [Division of Academic Radiology, Queens Medical Centre, Nottingham (United Kingdom)

    2002-07-01

    Quantitative diffusion-weighted MRI has been applied to the study of neurological diseases, including multiple sclerosis, where the molecular self-diffusion coefficient D has been measured in both lesions and normal-appearing white matter. Histograms of D have been used as a novel measure of the "lesion load", with potential applications that include the monitoring of efficacy in new treatment trials. However, different ways of measuring D may affect its value, making comparison between different centres and research groups impossible. We aimed to assess the effect, if any, of using two different MRI sequences on the value of D. We studied 13 healthy volunteers, using two different quantitative diffusion sequences (including different b_max values and gradient applications). Maps of D were analysed using both regions of interest (ROI) in white matter and "whole brain" histograms, and compared between the two sequences. In addition, we studied three standardised test liquids (with known values of D) using both sequences. Histograms from the two sequences had different distributions, with a greater spread and higher peak position from the sequence with lower b_max. This greater spread of D was also evident in the white matter and test liquid ROI. "Limits of agreement" analysis demonstrated that the differences could be clinically relevant, despite significant correlations between the sequences obtained using simple rank methods. We conclude that different quantitative diffusion sequences are unlikely to produce directly comparable values of D, particularly if different b_max values are used. In addition, the use of inappropriate statistical tests may give false impressions of close agreement. Standardisation of methods for the measurement of D is required if these techniques are to become useful tools, for example in monitoring changes in the disease burden of multiple sclerosis. (orig.)

  12. Quantitative measurement of local elasticity of SiOx film by atomic force acoustic microscopy

    International Nuclear Information System (INIS)

    Cun-Fu, He; Gai-Mei, Zhang; Bin, Wu

    2010-01-01

    In this paper the elastic properties of SiOx film are investigated quantitatively at fixed local points and qualitatively over the whole area by atomic force acoustic microscopy (AFAM), in which the sample is vibrated at an ultrasonic frequency while the tip touches the surface at a fixed point or scans it in contact, for point and continuous measurements respectively. The SiOx films on silicon wafers are prepared by plasma enhanced chemical vapour deposition (PECVD). The local contact stiffness of the tip-SiOx film contact is calculated from the contact resonance spectrum measured with AFAM. Using a reference approach, the indentation modulus of the SiOx film at fixed points is obtained. Images of the cantilever amplitude are also visualized and analysed when the SiOx surface is excited at a fixed frequency. The results show that the acoustic amplitude images can reflect the elastic properties of the sample.

  13. Novel gravimetric measurement technique for quantitative volume calibration in the sub-microliter range

    International Nuclear Information System (INIS)

    Liang, Dong; Zengerle, Roland; Steinert, Chris; Ernst, Andreas; Koltay, Peter; Bammesberger, Stefan; Tanguy, Laurent

    2013-01-01

    We present a novel measurement method based on the gravimetric principles adapted from the ASTM E542 and ISO 4787 standards for quantitative volume determination in the sub-microliter range. Such a method is particularly important for the calibration of non-contact micro dispensers as well as other microfluidic devices. The novel method is based on the linear regression analysis of continuously monitored gravimetric results and is therefore referred to as the 'gravimetric regression method' (GRM). In this context, the regression analysis is necessary to compensate the mass loss due to evaporation, which is significant for very small dispensing volumes. A full assessment of the measurement uncertainty of GRM is presented and results in a standard measurement uncertainty around 6 nl for dosage volumes in the range from 40 nl to 1 µl. The GRM has been experimentally benchmarked with a dual-dye ratiometric photometric method (Artel Inc., Westbrook, ME, USA), which can provide traceability of measurement to the International System of Units (SI) through reference standards maintained by NIST. Good precision (max. CV = 2.8%) and consistency (bias around 7 nl in the volume range from 40 to 400 nl) have been observed comparing the two methods. Based on the ASTM and ISO standards on the one hand and the benchmark with the photometric method on the other hand, two different approaches for establishing traceability for the GRM are discussed. (paper)
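    The regression idea can be sketched as follows: fit straight lines to the balance readings before and after a dispense so that the steady evaporation loss is compensated, and read the dispensed mass as the step between the two lines at the dispense time. This is a minimal reconstruction under assumed numbers, not the authors' implementation:

```python
# Sketch of the gravimetric regression idea: linear fits to the balance
# readings before and after a dispense compensate the steady evaporation
# loss; the dispensed mass is the step between the two lines at t_disp.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 60, 601)                    # time, s
mass = 100.0 + (-2e-3) * t                     # evaporation: -2 µg/s (assumed)
t_disp = 30.0
mass[t > t_disp] += 0.100                      # 0.1 mg = 100 µg ~ 100 nl water
mass += rng.normal(0, 0.002, t.size)           # balance noise, mg

pre, post = t < t_disp, t > t_disp
line_pre = np.polyfit(t[pre], mass[pre], 1)
line_post = np.polyfit(t[post], mass[post], 1)
step = np.polyval(line_post, t_disp) - np.polyval(line_pre, t_disp)
print(f"dispensed mass: {step * 1000:.1f} µg")  # ~100 µg
```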

  14. Defect Depth Measurement of Straight Pipe Specimen Using Shearography

    International Nuclear Information System (INIS)

    Chang, Ho Seob; Kim, Kyung Suk

    2012-01-01

    In the nuclear industry, wall-thinning defects in straight pipes cause substantial losses in lifetime evaluation and safety evaluation. Conventional non-destructive techniques measure deformation, vibration, and defects, but they have the weakness that measuring a wide area is difficult and time-consuming. Steel pipes, which are mostly used on the secondary side of nuclear power plants, were prepared with artificial wall-thinning defects of different thicknesses in the pipe side, and the deformation of the wall-thinned parts was measured using shearography. In addition, optical measurement of deformation and vibration was used to evaluate quantitatively the pipes and the thickness defects of a pressure vessel. Using shearographic interferometry to measure the pipe's internal wall-thinning defects under varying pressure, the proposed technique evaluates the defect quantitatively in terms of the residual wall thickness. The residual thickness predicted from the measured deformation agreed with the actual thickness defect to within approximately 7% error, ensuring reliability. By building a database of deformation versus residual thickness as a function of pressure, the soundness of wall-thinned parts of nuclear power plant pipes can be evaluated. In this study, measurement of thickness defects in a pressure vessel is proposed for the development of wall-thinning prediction and integrity-assessment technology for nuclear pipes. As basic research combining theory and experiment, it is expected to contribute to establishing a foundation for improved stability, soundness, and maintainability of pressure vessels.

  15. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  16. Cerenkov radiation imaging as a method for quantitative measurements of beta particles in a microfluidic chip

    International Nuclear Information System (INIS)

    Cho, Jennifer S; Taschereau, Richard; Olma, Sebastian; Liu Kan; Chen Yichun; Shen, Clifton K-F; Van Dam, R Michael; Chatziioannou, Arion F

    2009-01-01

    It has been observed that microfluidic chips used for synthesizing 18F-labeled compounds demonstrate visible light emission without nearby scintillators or fluorescent materials. The origin of the light was investigated and found to be consistent with the emission characteristics of Cerenkov radiation. Since 18F decays through the emission of high-energy positrons, the energy threshold for beta particles, i.e. electrons or positrons, to generate Cerenkov radiation was calculated for water and polydimethylsiloxane (PDMS), the most commonly used polymer-based material for microfluidic chips. Beta particles emitted from 18F have a continuous energy spectrum, with a maximum energy that exceeds this energy threshold for both water and PDMS. In addition, the spectral characteristics of the emitted light from 18F in distilled water were also measured, yielding a broad distribution from 300 nm to 700 nm, with higher intensity at shorter wavelengths. A photograph of the 18F solution showed a bluish-white light emitted from the solution, further suggesting Cerenkov radiation. In this study, the feasibility of using this Cerenkov light emission as a method for quantitative measurements of the radioactivity within the microfluidic chip in situ was evaluated. A detector previously developed for imaging microfluidic platforms was used. The detector consisted of a charge-coupled device (CCD) optically coupled to a lens. The system spatial resolution, minimum detectable activity and dynamic range were evaluated. In addition, the calibration of the Cerenkov signal versus activity concentration in the microfluidic chip was determined. This novel method of Cerenkov radiation measurement will provide researchers with a simple yet robust quantitative imaging tool for microfluidic applications utilizing beta particles.
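    The energy threshold mentioned in the abstract follows from requiring the particle speed to exceed the phase velocity of light in the medium; a quick check with standard constants (the PDMS refractive index below is an assumed typical value):

```python
# Cerenkov energy threshold for beta particles in a medium of refractive
# index n: a particle radiates only if v > c/n, i.e. its kinetic energy
# exceeds E_th = m_e c^2 * (1 / sqrt(1 - 1/n^2) - 1).
import math

ME_C2 = 0.511  # electron rest energy, MeV

def cerenkov_threshold(n: float) -> float:
    return ME_C2 * (1 / math.sqrt(1 - 1 / n**2) - 1)

print(f"water (n=1.33): {cerenkov_threshold(1.33) * 1000:.0f} keV")  # ~264 keV
print(f"PDMS  (n=1.41): {cerenkov_threshold(1.41) * 1000:.0f} keV")  # n assumed
# 18F positrons have E_max ~ 634 keV, above both thresholds, hence the light.
```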

  17. Quantitative blood flow measurements in the small animal cardiopulmonary system using digital subtraction angiography

    Energy Technology Data Exchange (ETDEWEB)

    Lin Mingde; Marshall, Craig T.; Qi, Yi; Johnston, Samuel M.; Badea, Cristian T.; Piantadosi, Claude A.; Johnson, G. Allan [Department of Radiology, Center for In Vivo Microscopy and Department of Biomedical Engineering, Duke University Medical Center, Box 3302, Durham, North Carolina 27710 (United States); Division of Pulmonary and Critical Care Medicine and Center for Hyperbaric Medicine and Environmental Physiology, Duke University Medical Center, Box 3823, Durham, North Carolina 27710 (United States); Department of Radiology, Center for In Vivo Microscopy, Duke University Medical Center, Box 3302, Durham, North Carolina 27710 (United States); Department of Radiology, Center for In Vivo Microscopy and Department of Biomedical Engineering, Duke University Medical Center, Box 3302, Durham, North Carolina 27710 (United States); Department of Radiology, Center for In Vivo Microscopy, Duke University Medical Center, Box 3302, Durham, North Carolina 27710 (United States); Division of Pulmonary and Critical Care Medicine and Center for Hyperbaric Medicine and Environmental Physiology, Duke University Medical Center, Box 3823, Durham, North Carolina 27710 (United States); Department of Radiology, Center for In Vivo Microscopy and Department of Biomedical Engineering, Duke University Medical Center, Box 3302, Durham, North Carolina 27710 (United States)

    2009-11-15

    Purpose: The use of preclinical rodent models of disease continues to grow because these models help elucidate pathogenic mechanisms and provide robust test beds for drug development. Among the major anatomic and physiologic indicators of disease progression and genetic or drug modification of responses are measurements of blood vessel caliber and flow. Moreover, cardiopulmonary blood flow is a critical indicator of gas exchange. Current methods of measuring cardiopulmonary blood flow suffer from some or all of the following limitations: they produce relative values, are limited to global measurements, do not provide vasculature visualization, are not able to measure acute changes, are invasive, or require euthanasia. Methods: In this study, high-spatial and high-temporal resolution x-ray digital subtraction angiography (DSA) was used to obtain vasculature visualization, quantitative blood flow in absolute metrics (ml/min instead of arbitrary units or velocity), and relative blood volume dynamics from discrete regions of interest on a pixel-by-pixel basis (100×100 µm²). Results: A series of calibrations linked the DSA flow measurements to standard physiological measurement using thermodilution and Fick's method for cardiac output (CO), which in eight anesthetized Fischer-344 rats was found to be 37.0±5.1 ml/min. Phantom experiments were conducted to calibrate the radiographic density to vessel thickness, allowing a link of DSA cardiac output measurements to cardiopulmonary blood flow measurements in discrete regions of interest. The scaling factor linking relative DSA cardiac output measurements to Fick's absolute measurements was found to be 18.90 × CO_DSA = CO_Fick. Conclusions: This calibrated DSA approach allows repeated simultaneous visualization of vasculature and measurement of blood flow dynamics on a regional level in the living rat.
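    Applying the reported scaling factor is a one-line conversion; the example value below is chosen only so the output matches the mean cardiac output reported for the eight rats:

```python
# Applying the reported calibration: CO_Fick = 18.90 * CO_DSA converts the
# relative DSA cardiac-output measurement into absolute units (ml/min).
def co_fick_from_dsa(co_dsa: float) -> float:
    return 18.90 * co_dsa

print(co_fick_from_dsa(1.96))  # ~37.0 ml/min, the reported mean for 8 rats
```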

  18. Quantitative blood flow measurements in the small animal cardiopulmonary system using digital subtraction angiography

    International Nuclear Information System (INIS)

    Lin Mingde; Marshall, Craig T.; Qi, Yi; Johnston, Samuel M.; Badea, Cristian T.; Piantadosi, Claude A.; Johnson, G. Allan

    2009-01-01

    Purpose: The use of preclinical rodent models of disease continues to grow because these models help elucidate pathogenic mechanisms and provide robust test beds for drug development. Among the major anatomic and physiologic indicators of disease progression and genetic or drug modification of responses are measurements of blood vessel caliber and flow. Moreover, cardiopulmonary blood flow is a critical indicator of gas exchange. Current methods of measuring cardiopulmonary blood flow suffer from some or all of the following limitations: they produce relative values, are limited to global measurements, do not provide vasculature visualization, are not able to measure acute changes, are invasive, or require euthanasia. Methods: In this study, high-spatial and high-temporal resolution x-ray digital subtraction angiography (DSA) was used to obtain vasculature visualization, quantitative blood flow in absolute metrics (ml/min instead of arbitrary units or velocity), and relative blood volume dynamics from discrete regions of interest on a pixel-by-pixel basis (100×100 μm²). Results: A series of calibrations linked the DSA flow measurements to standard physiological measurement using thermodilution and Fick's method for cardiac output (CO), which in eight anesthetized Fischer-344 rats was found to be 37.0±5.1 ml/min. Phantom experiments were conducted to calibrate the radiographic density to vessel thickness, allowing a link of DSA cardiac output measurements to cardiopulmonary blood flow measurements in discrete regions of interest. The scaling factor linking relative DSA cardiac output measurements to Fick's absolute measurements was found to be 18.90 × CO_DSA = CO_Fick. Conclusions: This calibrated DSA approach allows repeated simultaneous visualization of vasculature and measurement of blood flow dynamics on a regional level in the living rat.

  19. A Fuzzy-Based Fusion Method of Multimodal Sensor-Based Measurements for the Quantitative Evaluation of Eye Fatigue on 3D Displays

    Directory of Open Access Journals (Sweden)

    Jae Won Bang

    2015-05-01

    Full Text Available With the rapid increase of 3-dimensional (3D) content, considerable research related to the 3D human factor has been undertaken for quantitatively evaluating visual discomfort, including eye fatigue and dizziness, caused by viewing 3D content. Various modalities such as electroencephalograms (EEGs), biomedical signals, and eye responses have been investigated. However, the majority of the previous research has analyzed each modality separately to measure user eye fatigue. This cannot guarantee the credibility of the resulting eye fatigue evaluations. Therefore, we propose a new method for quantitatively evaluating eye fatigue related to 3D content by combining multimodal measurements. This research is novel for the following four reasons: first, for the evaluation of eye fatigue with high credibility on 3D displays, a fuzzy-based fusion method (FBFM) is proposed based on the multimodalities of EEG signals, eye blinking rate (BR), facial temperature (FT), and subjective evaluation (SE); second, to measure a more accurate variation of eye fatigue (before and after watching a 3D display), we obtain the quality scores of the EEG signals, eye BR, FT and SE; third, for combining the values of the four modalities we obtain the optimal weights of the EEG signals, BR, FT and SE using a fuzzy system based on the quality scores; fourth, the quantitative level of the variation of eye fatigue is finally obtained using the weighted sum of the values measured by the four modalities. Experimental results confirm that the effectiveness of the proposed FBFM is greater than that of other conventional multimodal measurements. Moreover, the credibility of the variations of the eye fatigue using the FBFM before and after watching the 3D display is proven using a t-test and descriptive statistical analysis using effect size.
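    The final fusion step can be sketched as a weighted sum over the four modalities. In the paper the weights come from a fuzzy system driven by per-modality quality scores; in the stand-in below the weights are simply the normalized quality scores, and all numbers are illustrative placeholders:

```python
# Sketch of the FBFM's final fusion step: the eye-fatigue variation of each
# modality is combined as a weighted sum. The fuzzy weighting system is
# replaced here by quality-score normalization (an assumption).
import numpy as np

delta = np.array([0.62, 0.55, 0.40, 0.70])   # EEG, blink rate, facial temp, SE
quality = np.array([0.9, 0.7, 0.5, 0.8])     # hypothetical quality scores

weights = quality / quality.sum()            # stand-in for the fuzzy weights
fatigue_variation = weights @ delta
print(f"fused eye-fatigue variation: {fatigue_variation:.3f}")
```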

  20. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  1. Photocatalytic water splitting: Quantitative approaches toward photocatalysis by design

    KAUST Repository

    Takanabe, Kazuhiro

    2017-10-11

    A widely used term, “photocatalysis”, generally addresses photocatalytic (energetically down-hill) and photosynthetic (energetically up-hill) reactions and refers to the use of photonic energy as a driving force for chemical transformations, i.e., electron reorganization to form/break chemical bonds. Although there are many such important reactions, this contribution focuses on the fundamental aspects of photocatalytic water splitting into hydrogen and oxygen by using light from the solar spectrum, which is one of the most investigated photosynthetic reactions. Photocatalytic water splitting using solar energy is considered to be artificial photosynthesis that produces a solar fuel because the reaction mimics nature’s photosynthesis not only in its redox reaction type but also in its thermodynamics (water splitting: 1.23 eV vs. glucose formation: 1.24 eV). To achieve efficient photocatalytic water splitting, all of the parameters, though involved at different timescales and spatial resolutions, should be optimized because the overall efficiency is obtained as the multiplication of all these fundamental efficiencies. The purpose of this review article is to provide the guidelines of a concept, “photocatalysis by design”, which is the opposite of “black box screening”; this concept refers to making quantitative descriptions of the associated physical and chemical properties to determine which events/parameters have the most impact on improving the overall photocatalytic performance, in contrast to arbitrarily ranking different photocatalyst materials. First, the properties that can be quantitatively measured or calculated are identified. Second, the quantities of these identified properties are determined by performing adequate measurements and/or calculations. Third, the obtained values of these properties are integrated into equations so that the kinetic/energetic bottlenecks of specific properties/processes can be determined, and the properties can
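    The multiplicative structure of the overall efficiency mentioned above can be written out explicitly; the factor labels below are generic illustrations rather than the article's exact decomposition:

```latex
% Overall efficiency as a product of fundamental efficiencies
% (illustrative factor labels, not the article's exact decomposition):
\eta_{\mathrm{overall}} =
  \eta_{\mathrm{absorption}} \cdot \eta_{\mathrm{charge\,separation}} \cdot
  \eta_{\mathrm{charge\,transport}} \cdot \eta_{\mathrm{catalysis}}
```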

  2. Photocatalytic water splitting: Quantitative approaches toward photocatalysis by design

    KAUST Repository

    Takanabe, Kazuhiro

    2017-01-01

    A widely used term, “photocatalysis”, generally addresses photocatalytic (energetically down-hill) and photosynthetic (energetically up-hill) reactions and refers to the use of photonic energy as a driving force for chemical transformations, i.e., electron reorganization to form/break chemical bonds. Although there are many such important reactions, this contribution focuses on the fundamental aspects of photocatalytic water splitting into hydrogen and oxygen by using light from the solar spectrum, which is one of the most investigated photosynthetic reactions. Photocatalytic water splitting using solar energy is considered to be artificial photosynthesis that produces a solar fuel because the reaction mimics nature’s photosynthesis not only in its redox reaction type but also in its thermodynamics (water splitting: 1.23 eV vs. glucose formation: 1.24 eV). To achieve efficient photocatalytic water splitting, all of the parameters, though involved at different timescales and spatial resolutions, should be optimized because the overall efficiency is obtained as the multiplication of all these fundamental efficiencies. The purpose of this review article is to provide the guidelines of a concept, “photocatalysis by design”, which is the opposite of “black box screening”; this concept refers to making quantitative descriptions of the associated physical and chemical properties to determine which events/parameters have the most impact on improving the overall photocatalytic performance, in contrast to arbitrarily ranking different photocatalyst materials. First, the properties that can be quantitatively measured or calculated are identified. Second, the quantities of these identified properties are determined by performing adequate measurements and/or calculations. Third, the obtained values of these properties are integrated into equations so that the kinetic/energetic bottlenecks of specific properties/processes can be determined, and the properties can

  3. A bench-top K X-ray fluorescence system for quantitative measurement of gold nanoparticles for biological sample diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Ricketts, K., E-mail: k.ricketts@ucl.ac.uk [Division of Surgery and Interventional Sciences, University College London, Royal Free Campus, Rowland Hill Street, London NW3 2PF (United Kingdom); Guazzoni, C.; Castoldi, A. [Dipartimento di Elettronica, Informazione e Bioingegneria Politecnico di Milano and INFN, Sezione di Milano P.za Leonardo da Vinci, 32-20133 Milano (Italy); Royle, G. [Department of Medical Physics and Bioengineering, University College London, Malet Place Engineering Building, Gower Street, London WC1E 6BT (United Kingdom)

    2016-04-21

    Gold nanoparticles can be targeted to biomarkers to give functional information on a range of tumour characteristics. X-ray fluorescence (XRF) techniques offer potential quantitative measurement of the distribution of such heavy metal nanoparticles. Biologists are developing 3D tissue engineered cellular models on the centimetre scale to optimise targeting techniques of nanoparticles to a range of tumour characteristics. Here we present a high energy bench-top K-X-ray fluorescence system designed for sensitivity to bulk measurement of gold nanoparticle concentration for intended use in such thick biological samples. Previous work has demonstrated use of a L-XRF system in measuring gold concentrations but being a low energy technique it is restricted to thin samples or superficial tumours. The presented system comprised a high purity germanium detector and filtered tungsten X-ray source, capable of quantitative measurement of gold nanoparticle concentration of thicker samples. The developed system achieved a measured detection limit of between 0.2 and 0.6 mgAu/ml, meeting specifications of biologists and being approximately one order of magnitude better than the detection limit of alternative K-XRF nanoparticle detection techniques. The scatter-corrected K-XRF signal of gold was linear with GNP concentrations down to the detection limit, thus demonstrating potential in GNP concentration quantification. The K-XRF system demonstrated between 5 and 9 times less sensitivity than a previous L-XRF bench-top system, due to a fundamental limitation of lower photoelectric interaction probabilities at higher K-edge energies. Importantly, the K-XRF technique is however less affected by overlying thickness, and so offers future potential in interrogating thick biological samples.

  4. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

    Quantitative Secondary Electron Detection (QSED) using an array of solid-state-device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. The methods and devices effect a quantitative detection of secondary electrons with an array comprising a number of solid state detectors: the array senses the secondary electrons with a plurality of solid state detectors, and their number is counted with a time-to-digital converter circuit in counter mode.

  5. Are Quantitative Measures of Academic Productivity Correlated with Academic Rank in Plastic Surgery? A National Study.

    Science.gov (United States)

    Susarla, Srinivas M; Lopez, Joseph; Swanson, Edward W; Miller, Devin; O'Brien-Coon, Devin; Zins, James E; Serletti, Joseph M; Yaremchuk, Michael J; Manson, Paul N; Gordon, Chad R

    2015-09-01

    The purpose of this study was to investigate the correlation between quantitative measures of academic productivity and academic rank among full-time academic plastic surgeons. Bibliometric indices were computed for all full-time academic plastic surgeons in the United States. The primary study variable was academic rank. Bibliometric predictors included the Hirsch index, I-10 index, number of publications, number of citations, and highest number of citations for a single publication. Descriptive, bivariate, and correlation analyses were computed. Multiple comparisons testing was used to calculate adjusted associations for subgroups. For all analyses, a value of p < 0.05 was considered statistically significant. The bibliometric measures were positively associated with academic rank and research productivity. Although academic promotion is the result of success in multiple different areas, bibliometric measures may be useful adjuncts for assessment of research productivity.
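    Two of the bibliometric indices named above are simple to compute from a list of per-paper citation counts; the record below is hypothetical:

```python
# Computing the Hirsch index (h-index) and i10-index from citation counts.
def h_index(citations: list[int]) -> int:
    # largest h such that h papers have at least h citations each
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

def i10_index(citations: list[int]) -> int:
    # number of papers with at least 10 citations
    return sum(1 for c in citations if c >= 10)

pubs = [48, 33, 30, 12, 9, 7, 4, 2, 1, 0]   # hypothetical publication record
print(h_index(pubs), i10_index(pubs))        # -> 6 4
```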

  6. Application of quantitative and qualitative methods for determination ...

    African Journals Online (AJOL)

    This article covers the integration of qualitative and quantitative methods applied when justifying management decision-making in companies implementing lean manufacturing. The authors define goals and subgoals and justify the evaluation criteria which, if achieved, lead to increased company value.

  7. [Quantitative data analysis for live imaging of bone].

    Science.gov (United States)

    Seno, Shigeto

    Because bone is a hard tissue, it has long been difficult to observe the interior of living bone. With recent progress in microscopy and fluorescent-probe technology, it has become possible to observe the various activities of the many cell types that make up bone. On the other hand, the quantitative increase in data and the diversification and complexity of the images make quantitative analysis by visual inspection difficult, and the development of methodologies for microscope-image processing and data analysis has been awaited. In this article, we introduce bioimage informatics, a research field at the boundary of biology and information science, and then outline basic image-processing techniques for the quantitative analysis of live-imaging data of bone.

  8. Quantitative measurement of Au and Fe in ferromagnetic nanoparticles with Laser Induced Breakdown Spectroscopy using a polymer-based gel matrix

    International Nuclear Information System (INIS)

    Borowik, T.; Przybyło, M.; Pala, K.; Otlewski, J.; Langner, M.

    2011-01-01

    The medical applications of nanomaterials require substantial changes at the research and development stage, such as the introduction of new processes and methods, and adequate modifications of national and international laws on medical product registration. To accomplish this, proper parameterizations of nano-scaled products need to be developed and implemented, accompanied by suitable measuring methods. The introduction of metallic particles to medical practice requires a precise, quantitative evaluation of the production process and later quantification and characterization of the nanoparticles in biological matrices for bioavailability and biodistribution evaluation. In order to address these issues we propose a method for the quantitative analysis of metallic nanoparticle composition by Laser Induced Breakdown Spectroscopy (LIBS). Au/Fe ferromagnetic nanoparticles were used to evaluate the applicability of the method. Since the powder form of nanoparticles spatters upon laser ablation, we first had to develop a fast, convenient and quantitative method for preparing nano-powdered samples. The proposed method is based on polymer gelation of nanopowders or their water suspensions. It has been shown that compositional changes of nanopowders throughout the production process, along with their final characterization, can be reliably monitored with the LIBS technique. The quantitative values obtained were successfully correlated with those derived with the ICP technique. - Highlights: ► The atomic composition of nanoparticles was analyzed with LIBS. ► The amount of gold on ferromagnetic particles was quantified by the method. ► Gel fixation was used as a new way of handling powdered samples. ► LIBS results are comparable with other equivalent methods (ICP). ► There was a difference between the measured and assumed nanoparticle composition.

  9. A Study on the Quantitative Assessment Method of Software Requirement Documents Using Software Engineering Measures and Bayesian Belief Networks

    International Nuclear Information System (INIS)

    Eom, Heung Seop; Kang, Hyun Gook; Park, Ki Hong; Kwon, Kee Choon; Chang, Seung Cheol

    2005-01-01

    One of the major challenges in using digital systems in a NPP is the reliability estimation of the safety critical software embedded in the digital safety systems. Precise quantitative assessment of the reliability of safety critical software is nearly impossible, since many of the aspects to be considered are of a qualitative nature and not directly measurable, but they have to be estimated for practical use. Therefore an expert's judgment plays an important role in estimating the reliability of the software embedded in safety-critical systems in practice, because experts can deal with all the diverse evidence relevant to the reliability and can perform an inference based on that evidence. But, in general, the experts' way of combining the diverse evidence and performing an inference is usually informal and qualitative, which is hard to discuss and will eventually lead to a debate about the conclusion. We have been carrying out research on a quantitative assessment of the reliability of safety critical software using Bayesian Belief Networks (BBN). BBN has been proven to be a useful modeling formalism because a user can represent a complex set of events and relationships in a fashion that can easily be interpreted by others. In previous works we assessed a software requirement specification of a reactor protection system by using our BBN-based assessment model. The BBN model mainly employed an expert's subjective probabilities as inputs. In the process of assessing the software requirement documents we found that the BBN model was excessively dependent on experts' subjective judgments in large part. Therefore, to overcome this weakness of our methodology we incorporated conventional software engineering measures into the BBN model, as shown in this paper. The quantitative relationship between conventional software measures and software reliability was not well identified in the past. Recently, however, a few studies have appeared on a ranking of

  10. Cross Entropy Measures of Bipolar and Interval Bipolar Neutrosophic Sets and Their Application for Multi-Attribute Decision-Making

    Directory of Open Access Journals (Sweden)

    Surapati Pramanik

    2018-03-01

    Full Text Available The bipolar neutrosophic set is an important extension of the bipolar fuzzy set. The bipolar neutrosophic set is a hybridization of the bipolar fuzzy set and neutrosophic set. Every element of a bipolar neutrosophic set consists of three independent positive membership functions and three independent negative membership functions. In this paper, we develop cross entropy measures of bipolar neutrosophic sets and prove their basic properties. We also define cross entropy measures of interval bipolar neutrosophic sets and prove their basic properties. Thereafter, we develop two novel multi-attribute decision-making strategies based on the proposed cross entropy measures. In the decision-making framework, we calculate the weighted cross entropy measures between each alternative and the ideal alternative to rank the alternatives and choose the best one. We solve two illustrative examples of multi-attribute decision-making problems and compare the obtained result with the results of other existing strategies to show the applicability and effectiveness of the developed strategies. At the end, the main conclusion and future scope of research are summarized.
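    The decision-making framework described here (rank alternatives by their weighted cross entropy to an ideal alternative) can be illustrated with an ordinary fuzzy cross entropy on plain membership vectors; this is a generic stand-in, not the paper's bipolar neutrosophic measure:

```python
# Generic fuzzy cross-entropy ranking sketch (NOT the paper's exact bipolar
# neutrosophic measure): each alternative is a vector of membership degrees
# in (0, 1); the alternative with the smallest divergence from the ideal wins.
import numpy as np

def cross_entropy(a: np.ndarray, b: np.ndarray) -> float:
    # symmetric fuzzy cross entropy between membership vectors a and b
    eps = 1e-12
    d = a * np.log((a + eps) / (b + eps)) \
        + (1 - a) * np.log((1 - a + eps) / (1 - b + eps))
    e = b * np.log((b + eps) / (a + eps)) \
        + (1 - b) * np.log((1 - b + eps) / (1 - a + eps))
    return float(np.sum(d + e))

ideal = np.array([0.9, 0.9, 0.9])                # ideal alternative
alts = {"A1": np.array([0.7, 0.8, 0.6]),
        "A2": np.array([0.9, 0.7, 0.8]),
        "A3": np.array([0.5, 0.6, 0.9])}

ranking = sorted(alts, key=lambda k: cross_entropy(alts[k], ideal))
print("best to worst:", ranking)                 # smallest divergence first
```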

  11. Qualitative and quantitative methods in health research

    OpenAIRE

    Vázquez Navarrete, M. Luisa

    2009-01-01

    Introduction Research in the area of health has traditionally been dominated by quantitative research. However, the complexity of ill-health, which is socially constructed by individuals, health personnel and health authorities, has motivated the search for other ways to approach knowledge. Aim To discuss the complementarity of qualitative and quantitative research methods in the generation of knowledge. Contents The purpose of quantitative research is to measure the magnitude of an event,...

  12. Quantitative measurement of pathogen specific human memory T cell repertoire diversity using a CDR3β-specific microarray

    Directory of Open Access Journals (Sweden)

    Gorski Jack

    2007-09-01

    Full Text Available Abstract Background Providing quantitative microarray data that is sensitive to very small differences in target sequence would be a useful tool in any number of venues where a sample can consist of multiple related sequences present in various abundances. Examples of such applications include the measurement of pseudo-species in viral infections and the measurement of species of antibodies or T cell receptors that constitute immune repertoires. Difficulties that must be overcome in such a method are to account for cross-hybridization and for differences in hybridization efficiencies between the arrayed probes and their corresponding targets. We have used the memory T cell repertoire to an influenza-derived peptide as a test case for developing such a method. Results The arrayed probes corresponded to a 17 nucleotide TCR-specific region that distinguished sequences differing by as little as a single nucleotide. Hybridization efficiency between highly related Cy5-labeled subject sequences was normalized by including an equimolar mixture of Cy3-labeled synthetic targets representing all 108 arrayed probes. The same synthetic targets were used to measure the degree of cross-hybridization between probes. Reconstitution studies found the system sensitive to input ratios as low as 0.5% and accurate in measuring known input percentages (R² = 0.81, R = 0.90, p < 0.05). Conclusion This novel strategy appears to be robust and can be adapted to any situation where complex mixtures of highly similar sequences need to be quantitatively resolved.
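    The two-colour normalization described above reduces to a per-probe ratio: the Cy5 signal of each probe is divided by the Cy3 signal from the equimolar synthetic-target mix, cancelling probe-specific hybridization efficiencies before repertoire fractions are computed. A minimal sketch with hypothetical intensities (cross-hybridization correction is omitted):

```python
# Two-colour normalization sketch: Cy5 subject signal divided by the Cy3
# equimolar-reference signal cancels probe-specific hybridization efficiency.
import numpy as np

cy5 = np.array([1200.0, 300.0, 40.0])    # subject targets (3 of 108 probes)
cy3 = np.array([800.0, 1000.0, 650.0])   # equimolar reference mix

norm = cy5 / cy3                          # efficiency-corrected abundances
fractions = norm / norm.sum()
print(np.round(100 * fractions, 1))       # percent of repertoire per probe
```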

  13. The profile of quantitative risk indicators in Krsko NPP

    International Nuclear Information System (INIS)

    Vrbanic, I.; Basic, I.; Bilic-Zabric, T.; Spiler, J.

    2004-01-01

    During the past decade a strong initiative was observed aimed at incorporating information on risk into various aspects of the operation of nuclear power plants. The initiative was observable in activities carried out by regulators as well as utilities and industry. It resulted in establishing the process, or procedure, that is often referred to as integrated decision making or risk informed decision making. In this process, engineering analyses and evaluations that are usually termed traditional, and that rely on considerations of safety margins and defense in depth, are supplemented by quantitative indicators of risk. Throughout the process, the plant risk was most commonly expressed in terms of the likelihood of events involving damage to the reactor core and events with radiological releases to the environment. These became two commonly used quantitative indicators or metrics of plant risk (or, reciprocally, plant safety). They were evaluated for their magnitude (e.g. the expected number of events per specified time interval), as well as their profile (e.g. the types of contributing events). The information for quantitative risk indicators (to be used in the risk informing process) is obtained from the plant's probabilistic safety analyses or analyses of hazards. It depends on issues such as the availability of input data and the quality of the model or analysis. Nuclear power plant Krsko has recently performed a Periodic Safety Review, which was a good opportunity to evaluate and integrate the plant specific information on quantitative plant risk indicators and their profile. The paper discusses some aspects of the quantitative plant risk profile and its perception.(author)

  14. Sacroiliac Joint/Sacrum Uptake Ratio Measured by Quantitative Sacroiliac Joint Scintigraphy

    International Nuclear Information System (INIS)

    Lee, Young Yiul; Park, Seon Yang; Lee, Myung Chul; Choi, Sang Jae; Cho, Bo Youn; Choe, Kang Won; Koh, Chang Soon

    1982-01-01

    To evaluate the diagnostic usefulness and significance of quantitative sacroiliac joint scintigraphy in the assessment of sacroiliitis, we measured the Sacroiliac Joint/Sacrum Uptake Ratio (SIS ratio) by the region of interest (ROI) method using 99mTc-methylene diphosphonate. The observed results were as follows: 1) Using the ROI method, the SIS ratios for the control group of 65 persons were 1.05±0.08 (left) and 1.06±0.07 (right), which were narrower in range than those of the slice method (mean±S.D.). 2) The effects of age, gender and laterality on the SIS ratio were not significant. 3) On the left side, one of 6 patients with rheumatoid arthritis had a SIS ratio in excess of 2 standard deviations of the normal control group, and the remainder had SIS ratios within normal limits. On the right side, 3 patients had SIS ratios in excess of 2 standard deviations of the normal control group, and the remainder were within normal limits. 4) Of 3 patients with Reiter's syndrome, 2 who clinically had sacroiliitis in both sacroiliac joints but whose pelvis A-P X-ray findings were normal had high SIS ratios (left/right: 1.31/1.69, 1.90/1.80), whereas the SIS ratio of the one patient who had no clinical evidence of sacroiliitis was within normal limits. 5) Of 6 patients with ankylosing spondylitis in both sacroiliac joints, those whose pelvis A-P X-ray findings showed severe sclerotic change of the sacroiliac joints had SIS ratios within normal limits or below those of the normal control group, while the SIS ratios of the other 2 patients were increased. 6) 4 of 5 patients with low back pain whose cause could not be determined clinically and radiologically had SIS ratios in excess of that of the normal control group. It is concluded that quantitative sacroiliac joint scintigraphy is a useful and sensitive screening method for the diagnosis of sacroiliitis as well as for the assessment of its clinical activity.
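    The ROI method itself is a simple ratio of mean counts; a minimal sketch with a hypothetical 2D count array and assumed ROI coordinates:

```python
# ROI-method SIS ratio: mean counts in a sacroiliac-joint ROI divided by
# mean counts in a sacrum ROI (hypothetical scintigram and ROI positions).
import numpy as np

scan = np.random.default_rng(2).poisson(100, size=(128, 128)).astype(float)
sij_roi = scan[40:60, 30:45]      # assumed left SI-joint region
sacrum_roi = scan[45:65, 55:70]   # assumed sacrum region

sis_ratio = sij_roi.mean() / sacrum_roi.mean()
print(f"SIS ratio: {sis_ratio:.2f}")   # controls averaged ~1.05 in the study
```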

  15. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    Science.gov (United States)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases: uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  16. Control measurement system in purex process

    International Nuclear Information System (INIS)

    Mani, V.V.S.

    1985-01-01

    The dependence of a bulk facility handling the Purex process on the control measurement system for evaluating process performance hardly needs to be emphasized. Process control, plant control, inventory control and quality control are the four components of the control measurement system. The scope and requirements of each component are different, and the measurement methods are selected accordingly. However, each measurement system has six important elements, which are described in detail. The quality assurance programme carried out by the laboratory, as a mechanism through which the quality of measurements is regularly tested and stated in quantitative terms, is also explained in terms of internal and external quality assurance, with examples. Suggestions for making the control measurement system more responsive to operational needs in the future are also briefly discussed. (author)

  17. Quantitative head ultrasound measurements to determine thresholds for preterm neonates requiring interventional therapies following intraventricular hemorrhage

    Science.gov (United States)

    Kishimoto, Jessica; Fenster, Aaron; Salehi, Fateme; Romano, Walter; Lee, David S. C.; de Ribaupierre, Sandrine

    2016-04-01

    Dilation of the cerebral ventricles is a common condition in preterm neonates with intraventricular hemorrhage (IVH). This post-hemorrhagic ventricle dilation (PHVD) can lead to lifelong neurological impairment through ischemic injury due to increased intracranial pressure and, without treatment, can lead to death. Clinically, 2D ultrasound (US) images through the fontanelles ('soft spots') of the patients are serially acquired to monitor the progression of the ventricle dilation. These images are used to determine when interventional therapies such as needle aspiration of the built-up cerebrospinal fluid (CSF) ('ventricle tap', VT) might be indicated for a patient; however, quantitative measurements of the growth of the ventricles are often not performed. There is no consensus on when a neonate with PHVD should have an intervention, and often interventions are performed only after the potential for brain damage is already quite high. Previously, we developed and validated a 3D US system to monitor the progression of ventricle volumes (VV) in IVH patients. We describe the potential utility of quantitative 2D and 3D US to monitor and manage PHVD in neonates. Specifically, we look to determine image-based measurement thresholds for patients who will require VT, in comparison to patients with PHVD who resolve without intervention. Additionally, since many patients who have an initial VT will require subsequent interventions, we look at the potential for US to determine which PHVD patients will require additional VTs after the initial one has been performed.

  18. Age-Related Quantitative and Qualitative Changes in Decision Making Ability

    OpenAIRE

    Isella, Valeria; Mapelli, Cristina; Morielli, Nadia; Pelati, Oriana; Franceschi, Massimo; Appollonio, Ildebrando Marco

    2008-01-01

    The 'frontal aging hypothesis' predicts that brain senescence affects predominantly the prefrontal regions. Preliminary evidence has recently been gathered in favour of an age-related change in a typically frontal process, i.e. decision making, using the Iowa Gambling Task (IGT), but overall findings have been conflicting. Following the traditional scoring method, coupled with a qualitative analysis, in the present study we compared IGT performance of 40 young (mean age: 27.9 ± 4.7) and 40 ol...

  19. Quantitative imaging of magnetic nanoparticles by magneto-relaxometric tomography for biomedical applications; Quantitative Bildgebung magnetischer Nanopartikel mittels magnetrelaxometrischer Tomographie fuer biomedizinische Anwendungen

    Energy Technology Data Exchange (ETDEWEB)

    Liebl, Maik

    2016-11-18

    generate a time-multiplexed sequence of precise magnetic fields for spatially constrained magnetizing of the MNP distribution. The unit has been integrated into a sensor system containing 304 superconducting quantum interference devices (SQUIDs), used for spatially resolved detection of the MNP responses after each magnetizing step. Furthermore, for the evaluation of MRX tomography, MNP phantoms reflecting the MNP distribution after magnetic drug targeting therapy in animals were designed and implemented. Using these phantoms, MNP distributions with clinical MNP doses in the milligram range could be quantitatively reconstructed by MRX tomography within a field of view of up to 600 cm³ and with a spatial resolution of a few cubic centimeters. The deviation between the quantified and nominal MNP amounts was found to be below 10%. With the present experimental setup, MRX tomography measurements of a complete MNP distribution were performed within the typical anesthesia time interval of a few minutes prevailing in preclinical animal studies. By implementing advanced magnetizing sequences, this measurement time could be reduced to below 30 s. Finally, using the same MRX tomography setup, binding-state-specific quantitative imaging of MNP distributions was achieved by incorporating the temporal MNP relaxation behavior into the reconstruction. Hence, MRX tomography has the potential to image the influence of the local biological environment on the physical properties of the MNPs. The presented MRX tomography setup allows for sensitive and specific spatially resolved 3D quantification of MNPs in small animals. This represents an important step towards the development of a clinical imaging tool for the control and assessment of MNP-based cancer treatments. Moreover, by adjusting the excitation coils, the field of view could easily be enlarged, making MRX tomography quite conceivable for human application.

  20. Quantitative measurement of lung density with x-ray CT and positron CT, (2). Diseased subjects

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Kengo; Ito, Masatoshi; Kubota, Kazuo

    1985-05-01

    Lung density was quantitatively measured in six diseased patients with X-ray CT (XCT) and positron CT (PCT). The findings are as follows: In silicosis, extravascular lung density was found to be remarkably increased compared to normals (0.29 g/cm³), but blood volume was in the normal range. In the post-irradiation lung cancers, extravascular lung density increased in the irradiated sites compared to the non-irradiated opposite sites, and blood volume varied in each case. In a patient with chronic heart failure, blood volume decreased (0.11 ml/cm³) with increased extravascular lung density (0.23 g/cm³). In chronic obstructive pulmonary disease, both extravascular lung density and blood volume decreased (0.11 g/cm³ and 0.10 ml/cm³, respectively). Lung density measured with XCT was consistently lower than that with PCT in all cases, but the changes in the measured lung density values correlated well with each other. In conclusion, the method presented here may clarify the etiology of diffuse pulmonary diseases and be used to differentiate and grade the diseases.

  1. Quantitative metal magnetic memory reliability modeling for welded joints

    Science.gov (United States)

    Xing, Haiyan; Dang, Yongbin; Wang, Ben; Leng, Jiancheng

    2016-03-01

    Metal magnetic memory (MMM) testing has been widely used to inspect welded joints. However, load levels, environmental magnetic fields, and measurement noise make the MMM data dispersive and bring difficulty to quantitative evaluation. In order to promote the development of quantitative MMM reliability assessment, a new MMM model is presented for welded joints. Steel Q235 welded specimens were tested along longitudinal and horizontal lines with a TSC-2M-8 instrument in tensile fatigue experiments. X-ray testing was carried out synchronously to verify the MMM results. It was found that MMM testing can detect a hidden crack earlier than X-ray testing. Moreover, the MMM gradient vector sum K_vs is sensitive to the damage degree, especially at the early and hidden damage stages. Considering the dispersion of MMM data, the statistical law of K_vs was investigated, which shows that K_vs obeys a Gaussian distribution; K_vs is therefore a suitable MMM parameter for establishing a reliability model of welded joints. Finally, an original quantitative MMM reliability model is presented, based on the improved stress-strength interference theory. It is shown that the reliability degree R gradually decreases with the decreasing residual life ratio T, and the maximal error between the predicted reliability degree R_1 and the verified reliability degree R_2 is 9.15%. The presented method provides a novel tool for reliability testing and evaluation of welded joints in practical engineering.

  2. Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.

    Science.gov (United States)

    Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R

    2014-08-01

    To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. The values of 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
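
    The normalization step described in this record lends itself to a short sketch: the FD perfusion amplitude of each lung pixel is expressed as a fraction of the fully blood-filled aortic signal and combined with the heart rate to yield flow in mL/min/100 mL. This is a simplified reading of the method under stated assumptions; the arrays, the aorta mask and the full-exchange-per-beat assumption are illustrative, not the authors' code.

      import numpy as np

      # Minimal sketch, assuming `amp` is a 2D Fourier-decomposition perfusion amplitude map
      # and `aorta_mask` marks fully blood-filled aortic pixels (both hypothetical inputs).
      def quantify_pbf(amp: np.ndarray, aorta_mask: np.ndarray, heart_rate_bpm: float) -> np.ndarray:
          """Convert a dimensionless FD perfusion map into pulmonary blood flow (mL/min/100 mL)."""
          aorta_amp = amp[aorta_mask].mean()                    # reference: 100% blood-filled signal
          blood_fraction = np.clip(amp / aorta_amp, 0.0, 1.0)   # fractional blood volume per pixel
          # Assume each heartbeat exchanges the pixel's blood fraction once; scale to
          # per-minute, per-100-mL-of-lung units.
          return blood_fraction * heart_rate_bpm * 100.0

      amp = np.random.rand(64, 64)                              # placeholder amplitude map
      aorta_mask = np.zeros((64, 64), dtype=bool)
      aorta_mask[30:34, 30:34] = True
      amp[aorta_mask] = 1.0
      pbf = quantify_pbf(amp, aorta_mask, heart_rate_bpm=60.0)
      print(pbf.mean())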

  3. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    Science.gov (United States)

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of the multivariate reference signals of fused silica and sapphire, generated from a ball-lens fiber-optic Raman probe, for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., the shoulder of the prominent fused silica boson peak (~130 cm⁻¹); the distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm⁻¹)) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R² = 0.981) with a root-mean-square error of cross-validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R² = 0.985) can also be achieved for laser power prediction in real time when the multivariate method is applied independently to five new subjects (n = 166 spectra). We further apply the multivariate reference technique to the quantitative analysis of gelatin tissue phantoms, which gives rise to an RMSEP of ~2.0% (R² = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct variations of laser excitation power and fiber coupling efficiency in situ, standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in
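
    A sketch of the multivariate-reference idea follows: partial least-squares regression maps the intensities of the internal reference bands (the fused silica and sapphire peaks listed above) to the quantity being standardized, here laser excitation power. The synthetic data, the band-intensity extraction and the scikit-learn usage are illustrative assumptions, not the authors' pipeline.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      # Hypothetical training data: rows are spectra, columns are intensities at the
      # internal reference bands named in the abstract (~130, 380, 417, 646, 751 cm^-1).
      rng = np.random.default_rng(0)
      true_power = rng.uniform(5, 65, size=200)                 # mW, the study's range
      band_weights = np.array([1.0, 0.6, 0.5, 0.3, 0.4])        # assumed band responses
      X = np.outer(true_power, band_weights)
      X += rng.normal(scale=0.5, size=X.shape)                  # measurement noise

      pls = PLSRegression(n_components=2)
      pls.fit(X, true_power)

      # Predict excitation power for new spectra from their reference-band intensities.
      X_new = np.outer(np.array([10.0, 40.0]), band_weights)
      print(pls.predict(X_new).ravel())                         # ~[10, 40] mW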

  4. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among the hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales is essential. Quantitative remote sensing plays an important role in this respect. The

  5. Effort-Based Decision Making in Schizophrenia: Evaluation of Paradigms to Measure Motivational Deficits.

    Science.gov (United States)

    Green, Michael F; Horan, William P

    2015-09-01

    Effort-based decision making requires one to decide how much effort to expend for a certain amount of reward. As the amount of reward goes up, most people are willing to exert more effort. This relationship between reward level and effort expenditure can be measured with specialized performance-based tasks that have only recently been applied to schizophrenia. Such tasks provide a way to objectively measure motivational deficits in schizophrenia, which are currently assessed only with clinical interviews of negative symptoms. The articles in this theme provide reviews of the relevant animal and human literatures (first 2 articles), followed by a psychometric evaluation of 5 effort-based decision making paradigms (last 2 articles). This theme section is intended to stimulate interest in this emerging area among basic scientists developing paradigms for preclinical studies, human experimentalists trying to disentangle factors that contribute to performance on effort-based tasks, and investigators looking for objective endpoints for clinical trials of negative symptoms in schizophrenia. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. An experimental method for making spectral emittance and surface temperature measurements of opaque surfaces

    International Nuclear Information System (INIS)

    Moore, Travis J.; Jones, Matthew R.; Tree, Dale R.; Daniel Maynes, R.; Baxter, Larry L.

    2011-01-01

    An experimental procedure has been developed to make spectral emittance and surface temperature measurements. The spectral emittance of an object is calculated from measurements of the spectral emissive power and of the surface temperature of the object, both obtained using a Fourier transform infrared (FTIR) spectrometer. A calibration procedure, which accounts for the temperature dependence of the detector, is described in detail. The methods used to extract the spectral emissive power and surface temperature from measured infrared spectra were validated using a blackbody radiator at known temperatures. The average error in the measured spectral emittance was 2.1%, and the average difference between the temperature inferred from the recorded spectra and the temperature indicated on the blackbody radiator was 1.2%. The method was used to measure the spectral emittance of oxidized copper at various temperatures.
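
    Because spectral emittance is the ratio of the measured spectral emissive power to that of a blackbody at the same surface temperature, a minimal worked example can be written directly from Planck's law. The measured-spectrum array below is a hypothetical stand-in for FTIR output.

      import numpy as np

      H = 6.62607015e-34   # Planck constant, J*s
      C = 2.99792458e8     # speed of light, m/s
      KB = 1.380649e-23    # Boltzmann constant, J/K

      def planck_emissive_power(wavelength_m: np.ndarray, temp_k: float) -> np.ndarray:
          """Blackbody spectral emissive power, W / (m^2 * m), from Planck's law."""
          return (2.0 * np.pi * H * C**2 / wavelength_m**5
                  / (np.exp(H * C / (wavelength_m * KB * temp_k)) - 1.0))

      # Hypothetical measured spectrum of a surface known (or inferred) to be at 800 K.
      wl = np.linspace(2e-6, 20e-6, 500)                       # 2-20 micrometres
      measured = 0.85 * planck_emissive_power(wl, 800.0)       # stand-in for FTIR data

      emittance = measured / planck_emissive_power(wl, 800.0)  # ratio to the blackbody
      print(emittance.mean())                                  # ~0.85 for this synthetic surface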

  7. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  8. Managerial Decision Making in Geopolitically Turbulent Environments

    OpenAIRE

    Gawlik, Remigiusz

    2010-01-01

    This paper presents the final results of research conducted over the past years on a group of Polish and international SMEs. The essential aim was the elaboration of a decision-making model including both the qualitative and quantitative factors that influence decision-making processes. Most focus has been put on the geopolitical determinants of international companies' development. In order to narrow the research field, a further limitation has been made in the type of undertaken s...

  9. Urban flooding and health risk analysis by use of quantitative microbial risk assessment

    DEFF Research Database (Denmark)

    Andersen, Signe Tanja

    ... are expected to increase in the future. To ensure public health during extreme rainfall, solutions are needed, but limited knowledge on microbial water quality, and related health risks, makes it difficult to implement microbial risk analysis as a part of the basis for decision making. The main aim of this PhD thesis is to identify the limitations and possibilities for optimising microbial risk assessments of urban flooding through more evidence-based solutions, including quantitative microbial data and hydrodynamic water quality models. The focus falls especially on the problem of data needs and the causes ..., but also when wading through a flooded area. The results in this thesis have brought microbial risk assessments one step closer to more uniform and repeatable risk analysis by using actual and relevant measured data and hydrodynamic water quality models to estimate the risk from flooding caused ...

  10. Moving towards tangible decision-making tools for policy makers: Measuring and monitoring energy access provision

    International Nuclear Information System (INIS)

    Bhanot, Jaya; Jha, Vivek

    2012-01-01

    Access to energy services has been recognised as central to achieving economic growth and sustainable development. However, almost 1.3 billion people in the world still lack access to electricity and 2.7 billion lack access to clean cooking facilities. Against this backdrop, the issue of energy access is receiving more interest than ever before, and this has brought to the fore the need for a robust decision support tool for policy makers, both to measure the progress of energy access provision and to provide direction for future policy making. The paper studies existing definitions of energy access and identifies the key requirements for an appropriate decision-making tool to measure and monitor energy access provision. In this context, the paper assesses the strengths and weaknesses of the metrics currently being used to measure energy access in policy, as well as of contemporary monitoring and evaluation frameworks used in other sectors. Based on these insights, a dashboard of indicators is proposed as an alternative decision support tool for policy makers to measure energy access. The paper concludes with a discussion of what is needed to operationalise the proposed framework. - Highlights: ► No one indicator or metric can successfully capture progress on energy access. ► A service-oriented approach is necessary to measure energy access. ► Socio-economic and political contexts influence the success of energy access policies.

  11. Patients' and observers' perceptions of involvement differ. Validation study on inter-relating measures for shared decision making.

    Directory of Open Access Journals (Sweden)

    Jürgen Kasper

    Full Text Available OBJECTIVE: Patient involvement in medical decisions, as conceived in the shared decision making (SDM) method, is essential in evidence based medicine. However, it is not conclusively evident how best to define, realize and evaluate involvement so as to enable patients to make informed choices. We aimed at investigating the ability of four measures to indicate patient involvement. While the use and reporting of these instruments might imply wide overlap regarding the addressed constructs, this assumption seems questionable with respect to the diversity of the perspectives from which the assessments are administered. METHODS: The study investigated a nested cohort (N = 79) of a randomized trial evaluating a patient decision aid on immunotherapy for multiple sclerosis. Convergent validities were calculated between observer ratings of videotaped physician-patient consultations (OPTION) and patients' perceptions of the communication (Shared Decision Making Questionnaire, Control Preference Scale & Decisional Conflict Scale). RESULTS: OPTION reliability was high to excellent. Communication performance was low according to OPTION and high according to the three patient-administered measures. No correlations were found between observer and patient judgements, neither for means nor for single items. Patient report measures showed some moderate correlations. CONCLUSION: Existing SDM measures do not refer to a single construct. A gold standard is missing to decide whether any of these measures has the potential to indicate patient involvement. PRACTICE IMPLICATIONS: Pronounced heterogeneity of the underpinning constructs implies difficulties regarding the interpretation of existing evidence on the efficacy of SDM. Consideration of communication theory and basic definitions of SDM would recommend an inter-subjective focus of measurement. TRIAL REGISTRATION: Controlled-Trials.com ISRCTN25267500.

  12. Patients' and observers' perceptions of involvement differ. Validation study on inter-relating measures for shared decision making.

    Science.gov (United States)

    Kasper, Jürgen; Heesen, Christoph; Köpke, Sascha; Fulcher, Gary; Geiger, Friedemann

    2011-01-01

    Patient involvement in medical decisions, as conceived in the shared decision making (SDM) method, is essential in evidence based medicine. However, it is not conclusively evident how best to define, realize and evaluate involvement so as to enable patients to make informed choices. We aimed at investigating the ability of four measures to indicate patient involvement. While the use and reporting of these instruments might imply wide overlap regarding the addressed constructs, this assumption seems questionable with respect to the diversity of the perspectives from which the assessments are administered. The study investigated a nested cohort (N = 79) of a randomized trial evaluating a patient decision aid on immunotherapy for multiple sclerosis. Convergent validities were calculated between observer ratings of videotaped physician-patient consultations (OPTION) and patients' perceptions of the communication (Shared Decision Making Questionnaire, Control Preference Scale & Decisional Conflict Scale). OPTION reliability was high to excellent. Communication performance was low according to OPTION and high according to the three patient-administered measures. No correlations were found between observer and patient judgements, neither for means nor for single items. Patient report measures showed some moderate correlations. Existing SDM measures do not refer to a single construct. A gold standard is missing to decide whether any of these measures has the potential to indicate patient involvement. Pronounced heterogeneity of the underpinning constructs implies difficulties regarding the interpretation of existing evidence on the efficacy of SDM. Consideration of communication theory and basic definitions of SDM would recommend an inter-subjective focus of measurement. Controlled-Trials.com ISRCTN25267500.

  13. Consideration of Normal Variation of Perfusion Measurements in the Quantitative Analysis of Myocardial Perfusion SPECT: Usefulness in Assessment of Viable Myocardium

    International Nuclear Information System (INIS)

    Paeng, Jin Chul; Lim, Il Han; Kim, Ki Bong; Lee, Dong Soo

    2008-01-01

    Although automatic quantification software for myocardial perfusion SPECT provides highly objective and reproducible quantitative measurements, there are still some limitations in the direct use of the quantitative measurements. In this study we derived parameters using the normal variation of perfusion measurements and tested the usefulness of these parameters. In order to calculate the normal variation of perfusion measurements on myocardial perfusion SPECT, 55 patients (M:F = 28:27) with low likelihood of coronary artery disease were enrolled, and 201Tl rest / 99mTc-MIBI stress SPECT studies were performed. Using a 20-segment model, the mean (m) and standard deviation (SD) of perfusion were calculated for each segment. As a myocardial viability assessment group, another 48 patients with known coronary artery disease who underwent coronary artery bypass graft surgery (CABG) were enrolled. 201Tl rest / 99mTc-MIBI stress / 201Tl 24-hr delayed SPECT was performed before CABG, and SPECT was followed up 3 months after CABG. From the preoperative 24-hr delayed SPECT, Q_delay (the perfusion measurement), Δ_delay (Q_delay - m) and Z_delay ((Q_delay - m)/SD) were defined, and their diagnostic performances for myocardial viability were evaluated using the area under the curve (AUC) on receiver operating characteristic (ROC) curve analysis. Segmental perfusion measurements showed considerable normal variation among segments. In men, the lowest segmental perfusion measurement was 51.8±6.5 and the highest was 87.0±5.9; in women they were 58.7±8.1 and 87.3±6.0, respectively. In the viability assessment, Q_delay showed an AUC of 0.633, while those for Δ_delay and Z_delay were 0.735 and 0.716, respectively. The AUCs of Δ_delay and Z_delay were significantly higher than that of Q_delay (p=0.001 and 0.018, respectively). The diagnostic performance of Δ_delay, which showed the highest AUC, was 85% sensitivity and 53% specificity at the optimal cutoff of -24.7. On automatic
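
    The derived parameters above are deviation scores against a segment-wise normal database, and a minimal sketch of that normalization is shown below, following the definitions Δ_delay = Q_delay - m and Z_delay = (Q_delay - m)/SD. The segment values and normal-database statistics are hypothetical placeholders.

      import numpy as np

      # Minimal sketch: normalize segmental perfusion against a normal database.
      def delta_and_z(q: np.ndarray, norm_mean: np.ndarray, norm_sd: np.ndarray):
          delta = q - norm_mean            # deviation from the segment's normal mean
          z = delta / norm_sd              # deviation in units of normal variation
          return delta, z

      # Hypothetical 20-segment measurement and normal-database statistics.
      q = np.array([55.0] * 10 + [80.0] * 10)
      norm_mean = np.linspace(52.0, 87.0, 20)
      norm_sd = np.full(20, 6.5)
      delta, z = delta_and_z(q, norm_mean, norm_sd)
      print(delta.round(1), z.round(2))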

  14. Theoretical approach on microscopic bases of stochastic functional self-organization: quantitative measures of the organizational degree of the environment

    Energy Technology Data Exchange (ETDEWEB)

    Oprisan, Sorinel Adrian [Department of Psychology, University of New Orleans, New Orleans, LA (United States)]. E-mail: soprisan@uno.edu

    2001-11-30

    There has been increased theoretical and experimental research interest in autonomous mobile robots exhibiting cooperative behaviour. This paper provides consistent quantitative measures of the organizational degree of a two-dimensional environment. We prove, by way of numerical simulations, that the theoretically derived values of the feature are reliable measures of the aggregation degree. The slope of the feature's dependence on memory radius leads to an optimization criterion for stochastic functional self-organization. We also describe the intellectual heritage that has guided our research, as well as possible future developments. (author)

  15. Student evaluations of teaching: teaching quantitative courses can be hazardous to one's career.

    Science.gov (United States)

    Uttl, Bob; Smibert, Dylan

    2017-01-01

    Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and evaluate the significance of TEIFs in personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and that the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  16. Laser-induced breakdown spectroscopy (LIBS) to measure quantitatively soil carbon with emphasis on soil organic carbon. A review.

    Science.gov (United States)

    Senesi, Giorgio S; Senesi, Nicola

    2016-09-28

    Soil organic carbon (OC) measurement is crucial for quantifying soil C pools and inventories and for monitoring the inherent temporal and spatial heterogeneity and changes of soil OC content. These are relevant issues in addressing sustainable management of terrestrial OC aiming to enhance C sequestration in soil, thus mitigating the impact of increasing CO2 concentration in the atmosphere and related effects on global climate change. Nowadays, dry combustion with an elemental analyzer or wet combustion by dichromate oxidation of the soil sample are the most recommended and commonly used methods for quantitative soil OC determination. However, the widely recognized uncertainties and limitations of these classical, laborious methods have prompted research efforts focusing on the development and application of more advanced and appealing techniques and methods for the measurement of soil OC in the laboratory and possibly in situ in the field. Among these, laser-induced breakdown spectroscopy (LIBS) has attracted the greatest interest for its unique advantages. After an introduction and a highlight of LIBS basic principles, instrumentation, methodologies and supporting chemometric methods, the main body of this review provides a historical and critical overview of the developments and results obtained so far by the application of LIBS to the quantitative measurement of soil C, and especially OC, content. A brief critical summary of LIBS advantages and limitations/drawbacks, including some final remarks and future perspectives, concludes this review. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. A quantitative method for measurement of lysosomal acid phosphatase latency in cultured rat heart cells with 210Pb

    International Nuclear Information System (INIS)

    Hale, T.W.; Wenzel, D.G.

    1978-01-01

    A method is described for measuring the latency of lysosomal acid phosphatase in cultured rat heart endotheloid cells. 210Pb was added to a medium used to demonstrate acid phosphatase activity by the Gomori lead method, and the amount of lead deposited was measured with a liquid scintillation counter. Deposition rates were measured after enzyme activation pretreatments with acetate buffer (pH 5.0) at various osmolalities, and after formaldehyde fixation. Formaldehyde, alloxan, or fluoride in the Gomori medium were evaluated for their differential effects on lysosomal and non-lysosomal acid phosphatase. The method was found to provide a sensitive, rapid and quantitative evaluation of acid phosphatase latency and should be useful for studying the integrity of lysosomes within cells. (author)

  18. Quantitative radioisotope measurement of duodenogastric reflux in patients with ulcer or gastrectomized for ulcer

    International Nuclear Information System (INIS)

    Hyoedynmaa, S.; Paeaekkoenen, A.; Laensimies, E.; Korhonen, K.; Paeaekkoenen, M.; Aukee, S.

    1985-01-01

    In this work, duodenogastric reflux was quantified as the amount of radioactivity entering the stomach after i.v. administration of 99mTc-HIDA in ulcer patients and in patients who had undergone BI gastrectomy. The results were compared with visual evidence of gastric activity in the gamma camera images and with biochemical determination of gastric bile reflux. The method is useful in quantifying the reflux if the activity is above the background activity, and it allows the determination of an upper limit for the reflux when the reflux is evident visually. Only two or three images are needed for the quantitation. No correlation was found between biochemical measurements of fasting bile reflux in the stomach and the radioisotopic quantification. (orig.)

  19. Quantitative radioisotope measurement of duodenogastric reflux in patients with ulcer or gastrectomized for ulcer

    Energy Technology Data Exchange (ETDEWEB)

    Hyoedynmaa, S.; Paeaekkoenen, A.; Laensimies, E.; Korhonen, K.; Paeaekkoenen, M.; Aukee, S.

    1985-06-01

    In this work, duodenogastric reflux was quantified as the amount of radioactivity entering the stomach after i.v. administration of 99mTc-HIDA in ulcer patients and in patients who had undergone BI gastrectomy. The results were compared with visual evidence of gastric activity in the gamma camera images and with biochemical determination of gastric bile reflux. The method is useful in quantifying the reflux if the activity is above the background activity, and it allows the determination of an upper limit for the reflux when the reflux is evident visually. Only two or three images are needed for the quantitation. No correlation was found between biochemical measurements of fasting bile reflux in the stomach and the radioisotopic quantification.

  20. Actual interaction effects between policy measures for energy efficiency-A qualitative matrix method and quantitative simulation results for households

    International Nuclear Information System (INIS)

    Boonekamp, Piet G.M.

    2006-01-01

    Starting from the conditions for a successful implementation of saving options, a general framework was developed to investigate possible interaction effects in sets of energy policy measures. Interaction here means the influence of one measure on the energy saving effect of another measure. The method delivers a matrix for all combinations of measures, with each cell containing qualitative information on the strength and type of interaction: overlapping, reinforcing, or independent of each other. Results are presented for the set of policy measures on household energy efficiency in the Netherlands for 1990-2003. The second part presents a quantitative analysis of the interaction effects between three major measures: a regulatory energy tax, investment subsidies and regulation of gas use for space heating. Using a detailed bottom-up model, household energy use in the period 1990-2000 was simulated with and without these measures. The results indicate that combinations of two or three policy measures yield 13-30% less effect than the sum of the effects of the separate measures.

  1. The making of a real-time pressure measurement device for heating-02

    International Nuclear Information System (INIS)

    Giarno; Kussigit Santosa; Agus Nur Rachman; G B Heru K

    2013-01-01

    In order to modify the BETA Test Section installation integrated with heating-02 into a closed loop, an additional system is required that can measure pressure changes in the closed-loop system. By building a device to measure pressure in the heating-02 test system, researchers can monitor the pressure changes that occur in the system. The fabrication of the pressure measurement device followed a methodology of measurement simulation, preparation of the hardware and software, and function testing. The measurement simulation used a HIOKI DC signal source as the current source; the hardware of the pressure measurement device comprises pressure transducers, an NI cRIO-9074 controller, an NI 9203 analog input module and a computer, with LabVIEW 2011 as the programming software. In the function test, a simulated current was supplied to the NI 9203 module connected to the NI cRIO-9074, matched to the specification of the pressure transducer, 4 mA to 20 mA. Based on the test results, the lowest current value of 4.00 mA corresponds to 0.001 bar, and the highest current value of 20.00 mA corresponds to 4.995 bar. From calculations using the linear equation, a correlation coefficient (R²) of 0.999 was obtained, so it is evident that the pressure reading in LabVIEW is governed by the change in current. The result of this activity is a device that can measure the pressure in the heating-02 test. (author)
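
    The linear 4-20 mA relationship reported above can be written out directly. Assuming a 0-5 bar transducer span, which is consistent with the reported endpoints (0.001 bar at 4.00 mA, 4.995 bar at 20.00 mA), a minimal conversion sketch follows; the function name is illustrative and not part of the LabVIEW program.

      # Minimal sketch: convert a 4-20 mA transducer current into pressure,
      # assuming a linear 0-5 bar span as suggested by the reported endpoints.
      def current_to_pressure_bar(current_ma: float,
                                  span_bar: float = 5.0,
                                  i_min_ma: float = 4.0,
                                  i_max_ma: float = 20.0) -> float:
          return (current_ma - i_min_ma) / (i_max_ma - i_min_ma) * span_bar

      print(current_to_pressure_bar(4.00))   # ~0.0 bar (reported: 0.001 bar)
      print(current_to_pressure_bar(20.00))  # ~5.0 bar (reported: 4.995 bar)
      print(current_to_pressure_bar(12.00))  # mid-scale: 2.5 bar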

  2. Quantitative measurement of natural radioactivity in some roofing tile materials used in upper Egypt

    International Nuclear Information System (INIS)

    Uosif, M. A. M.

    2013-01-01

    The quantitative measurement of radionuclides (226Ra, 232Th and 40K) in some roofing tile materials (granite, alabaster, marble, traditional and advanced ceramic) used in Upper Egypt is presented in this paper. Measurements were done using gamma spectrometry (3'' x 3'' NaI(Tl)). The concentrations of the natural radionuclides were in the following ranges: 12-78.9 Bq kg⁻¹ for 226Ra, 8.4-113.1 Bq kg⁻¹ for 232Th and 94.9-509 Bq kg⁻¹ for 40K. The activity concentration index (I), the specific indoor dose rate (Ḋ) and the annual effective dose (D_E) due to gamma radiation were calculated for each investigated sample. The lowest value of I is 0.19 for alabaster, while the highest is 0.88 for traditional and advanced ceramic. The range of D_E is between 0.03 and 0.13 mSv, below the maximal permitted values, so the examined materials can be used as roofing tiles in the construction of new buildings. (authors)
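
    For illustration, the activity concentration index for building materials is commonly computed as I = C_Ra/300 + C_Th/200 + C_K/3000 with activities in Bq/kg, following European Commission guidance (Radiation Protection 112). The abstract does not state which formula was used, so treat the formula as an assumption; the sample values below are likewise hypothetical.

      # Minimal sketch, assuming the EC RP-112 activity concentration index:
      #   I = C_Ra/300 + C_Th/200 + C_K/3000   (activities in Bq/kg)
      def activity_concentration_index(c_ra: float, c_th: float, c_k: float) -> float:
          return c_ra / 300.0 + c_th / 200.0 + c_k / 3000.0

      # Hypothetical sample within the reported concentration ranges.
      print(round(activity_concentration_index(c_ra=40.0, c_th=60.0, c_k=300.0), 2))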

  3. Importance measures in risk-informed decision making: Ranking, optimisation and configuration control

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, Jussi K., E-mail: jussi.vaurio@pp1.inet.fi [Prometh Solutions, Hiihtaejaenkuja 3K, 06100 Porvoo (Finland)

    2011-11-15

    This paper describes roles, extensions and applications of importance measures of components and configurations for making risk-informed decisions relevant to system operations, maintenance and safety. Basic importance measures and their relationships are described for independent and mutually exclusive events and for groups of events associated with common cause failures. The roles of importances are described mainly in two groups of activities: (a) ranking safety significance of systems, structures, components and human actions for preventive safety assurance activities, and (b) making decisions about permissible permanent and temporary configurations and allowed configuration times for regulation, technical specifications and for on-line risk monitoring. Criticality importance and sums of criticalities turn out to be appropriate measures for ranking and optimization. Several advantages are pointed out and consistent ranking of pipe segments for in-service inspection is provided as an example. Risk increase factor and its generalization risk gain are most appropriately used to assess corrective priorities and acceptability of a situation when components are already failed or when planning to take one or more components out of service for maintenance. Precise definitions are introduced for multi-failure configurations and it is shown how they can be assessed under uncertainties, in particular when common cause failures or success states may be involved. A general weighted average method is compared to other candidate methods in benchmark cases. It is the preferable method for prediction when a momentary configuration is known or only partially known. Potential applications and optimization of allowed outage times are described. The results show how to generalize and apply various importance measures to ranking and optimization and how to manage configurations in uncertain multi-failure situations. - Highlights: > Rigorous methods developed for using importances

  4. Importance measures in risk-informed decision making: Ranking, optimisation and configuration control

    International Nuclear Information System (INIS)

    Vaurio, Jussi K.

    2011-01-01

    This paper describes roles, extensions and applications of importance measures of components and configurations for making risk-informed decisions relevant to system operations, maintenance and safety. Basic importance measures and their relationships are described for independent and mutually exclusive events and for groups of events associated with common cause failures. The roles of importances are described mainly in two groups of activities: (a) ranking safety significance of systems, structures, components and human actions for preventive safety assurance activities, and (b) making decisions about permissible permanent and temporary configurations and allowed configuration times for regulation, technical specifications and for on-line risk monitoring. Criticality importance and sums of criticalities turn out to be appropriate measures for ranking and optimization. Several advantages are pointed out and consistent ranking of pipe segments for in-service inspection is provided as an example. Risk increase factor and its generalization risk gain are most appropriately used to assess corrective priorities and acceptability of a situation when components are already failed or when planning to take one or more components out of service for maintenance. Precise definitions are introduced for multi-failure configurations and it is shown how they can be assessed under uncertainties, in particular when common cause failures or success states may be involved. A general weighted average method is compared to other candidate methods in benchmark cases. It is the preferable method for prediction when a momentary configuration is known or only partially known. Potential applications and optimization of allowed outage times are described. The results show how to generalize and apply various importance measures to ranking and optimization and how to manage configurations in uncertain multi-failure situations. - Highlights: → Rigorous methods developed for using importances
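
    As a concrete illustration of the importance measures discussed in these two records, the sketch below computes Birnbaum importance, criticality importance and the risk increase factor (risk achievement worth) for components of a small series-parallel system. The system structure and unavailabilities are hypothetical, and the formulas are the standard textbook definitions rather than the paper's extended ones.

      # Minimal sketch: importance measures for a hypothetical system where
      # component 1 is in series with the parallel pair {2, 3}.
      def system_unavailability(q: dict) -> float:
          """System fails if component 1 fails, or if both 2 and 3 fail."""
          return 1.0 - (1.0 - q[1]) * (1.0 - q[2] * q[3])

      q = {1: 0.01, 2: 0.05, 3: 0.05}
      u0 = system_unavailability(q)

      for i in q:
          u_hi = system_unavailability({**q, i: 1.0})  # component i failed for certain
          u_lo = system_unavailability({**q, i: 0.0})  # component i perfectly reliable
          birnbaum = u_hi - u_lo                       # sensitivity of U to q_i
          criticality = birnbaum * q[i] / u0           # share of risk driven by i
          raw = u_hi / u0                              # risk increase factor (RAW)
          print(f"component {i}: criticality={criticality:.3f}, RAW={raw:.1f}")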

  5. A far-field-viewing sensor for making analytical measurements in remote locations.

    Science.gov (United States)

    Michael, K L; Taylor, L C; Walt, D R

    1999-07-15

    We demonstrate a far-field-viewing GRINscope sensor for making analytical measurements in remote locations. The GRINscope was fabricated by permanently affixing a micro gradient-index (GRIN) lens on the distal face of a 350-micron-diameter optical imaging fiber. The GRINscope can obtain both chemical and visual information. In one application, a thin, pH-sensitive polymer layer was immobilized on the distal end of the GRINscope. The ability of the GRINscope to visually image its far-field surroundings and concurrently detect pH changes in a flowing stream was demonstrated. In a different application, the GRINscope was used to image pH- and O2-sensitive particles on a remote substrate and simultaneously measure their fluorescence intensity in response to pH or pO2 changes.

  6. Quantitative Market Research Regarding Funding of District 8 Construction Projects

    Science.gov (United States)

    1995-05-01

    The primary objective of this quantitative research is to provide information for more effective decision making regarding the level of investment in various transportation systems in District 8. This objective was accomplished by establishing ...

  7. Identification of Phosphoglycerate Kinase 1 (PGK1 as a reference gene for quantitative gene expression measurements in human blood RNA

    Directory of Open Access Journals (Sweden)

    Unger Elizabeth R

    2011-09-01

    Full Text Available Background: Blood is a convenient sample and is increasingly used for quantitative gene expression measurements in a variety of diseases, including chronic fatigue syndrome (CFS). Quantitative gene expression measurements require normalization of target genes to reference genes that are stable and independent of the variables being tested in the experiment. Because no genes are useful in all situations, reference gene selection is an essential step in any quantitative reverse transcription-PCR protocol. Many publications have described appropriate genes for a wide variety of tissues and experimental conditions; however, reference genes suitable for the analysis of CFS, or of human blood RNA derived from whole blood as well as from isolated peripheral blood mononuclear cells (PBMCs), have not been described. Findings: A literature review and analyses of our unpublished microarray data were used to narrow the pool of candidate reference genes down to six. We assayed whole blood RNA from Tempus tubes and cell preparation tube (CPT)-collected PBMC RNA from 46 subjects, and used the geNorm and NormFinder algorithms to select the most stable reference genes. Phosphoglycerate kinase 1 (PGK1) was one of the optimal normalization genes for both whole blood and PBMC RNA; however, the additional genes differed for the two sample types: ribosomal protein large P0 (RPLP0) for PBMC RNA and peptidylprolyl isomerase B (PPIB) for whole blood RNA. We also show that the use of a single reference gene is sufficient for normalization when the most stable candidates are used. Conclusions: We have identified PGK1 as a stable reference gene for use with whole blood RNA and RNA derived from PBMC. When stable genes are selected, it is possible to use a single gene for normalization rather than two or three. Optimal normalization will improve the ability of results from PBMC RNA to be compared with those from whole blood RNA and potentially allows comparison of
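
    To show how a single stable reference gene such as PGK1 is used in practice, here is a minimal relative-quantification sketch based on the common 2^(-ΔΔCt) convention. The Ct values are hypothetical, and the method is the general qPCR convention rather than anything prescribed by this paper.

      # Minimal sketch: relative expression of a target gene normalized to PGK1
      # using the standard 2^(-ΔΔCt) method (hypothetical Ct values).
      def relative_expression(ct_target_sample: float, ct_ref_sample: float,
                              ct_target_control: float, ct_ref_control: float) -> float:
          d_ct_sample = ct_target_sample - ct_ref_sample      # ΔCt in the sample
          d_ct_control = ct_target_control - ct_ref_control   # ΔCt in the control
          return 2.0 ** -(d_ct_sample - d_ct_control)         # fold change vs. control

      # Example: target amplifies 1 cycle earlier (relative to PGK1) in the sample.
      print(relative_expression(24.0, 18.0, 25.0, 18.0))      # -> 2.0 (two-fold up)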

  8. Fuzzy Group Decision Making Approach for Ranking Work Stations Based on Physical Pressure

    Directory of Open Access Journals (Sweden)

    Hamed Salmanzadeh

    2014-06-01

    Full Text Available This paper proposes a fuzzy group decision making approach for ranking work stations based on physical pressure. The fuzzy group decision making approach allows experts to evaluate different ergonomic factors using linguistic terms such as very high, high, medium, low and very low, rather than precise numerical values. In this way, there is no need to measure parameters, and the evaluation can easily be made as a group. Given that much ergonomic work content and many work situations are accompanied by multiple parameters and uncertainties, fuzzy group decision making is well suited to evaluating such a multifaceted concept. A case study was carried out to apply the approach and illustrate its use in ergonomic assessment and in ranking work stations based on work pressure; it found that this approach provides flexibility, practicality and efficiency in decision making around ergonomics. The normalized defuzzified numbers resulting from this method were compared with the results of a quantitative assessment with the Automotive Assembly Work Sheet; the result of the proposed method is approximately 10% lower, as shown in the sketch that follows.
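
    A minimal sketch of the linguistic-to-fuzzy workflow implied above: map each expert's linguistic rating to a triangular fuzzy number, average across the group, and defuzzify by the centroid to obtain a crisp ranking score. The term-to-number mapping and the ratings are illustrative assumptions, not the paper's exact scales.

      # Minimal sketch: fuzzy group ranking with triangular fuzzy numbers (TFNs).
      # Assumed linguistic scale: each term maps to a TFN (low, mode, high) on [0, 1].
      SCALE = {
          "very low": (0.0, 0.0, 0.25), "low": (0.0, 0.25, 0.5),
          "medium": (0.25, 0.5, 0.75), "high": (0.5, 0.75, 1.0),
          "very high": (0.75, 1.0, 1.0),
      }

      def aggregate(ratings: list[str]) -> tuple:
          """Average the experts' TFNs component-wise."""
          tfns = [SCALE[r] for r in ratings]
          n = len(tfns)
          return tuple(sum(t[i] for t in tfns) / n for i in range(3))

      def defuzzify(tfn: tuple) -> float:
          """Centroid of a triangular fuzzy number."""
          return sum(tfn) / 3.0

      # Hypothetical expert ratings of physical pressure at two work stations.
      stations = {"A": ["high", "very high", "medium"], "B": ["low", "medium", "low"]}
      ranked = sorted(stations.items(),
                      key=lambda kv: defuzzify(aggregate(kv[1])), reverse=True)
      print([(name, round(defuzzify(aggregate(r)), 3)) for name, r in ranked])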

  9. A development of a quantitative situation awareness measurement tool: Computational Representation of Situation Awareness with Graphical Expressions (CoRSAGE)

    International Nuclear Information System (INIS)

    Yim, Ho Bin; Lee, Seung Min; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose a quantitative situation awareness (SA) evaluation technique. • We developed a computer-based SA evaluation tool for the NPP training environment. • We introduced three rules and components to express more human-like results. • We conducted three sets of training with real plant operators. • Results showed that the tool could reasonably represent operators' SA. - Abstract: Operator performance measures are used for multiple purposes, such as control room design, human system interface (HSI) evaluation, training, and so on. Performance measures often focus on results; however, especially for training purposes - at least in the nuclear industry - more detailed descriptions of the processes are required. Situation awareness (SA) measurements have directly or indirectly served as a complementary measure and provided descriptive insights on how to improve operator performance in the next training. Unfortunately, most well-developed SA measurement techniques, such as the Situation Awareness Global Assessment Technique (SAGAT), need expert opinion, which sometimes hampers the wide application or usage of the measurement. A quantitative SA measurement tool named Computational Representation of Situation Awareness with Graphical Expressions (CoRSAGE) is introduced to resolve some of these concerns. CoRSAGE is based on production rules to represent a human operator's cognitive process of problem solving, and on Bayesian inference to quantify it. The Petri net concept is also used for graphical expressions of SA flow. Three components - an inference transition and volatile/non-volatile memory tokens - were newly developed to achieve the required functions. Training data from a Loss of Coolant Accident (LOCA) scenario for an emergency condition and an earthquake scenario for an abnormal condition, generated by real plant operators, were used to validate the tool. The validation result showed that CoRSAGE performed a reasonable match to other performance

  10. Accuracy of quantitative visual soil assessment

    Science.gov (United States)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually while standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding of soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four of the eight quantitative visual observations could be validated well with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7

  11. Quantitative Measures of Mineral Supply Risk

    Science.gov (United States)

    Long, K. R.

    2009-12-01

    Almost all metals and many non-metallic minerals are traded internationally. An advantage of global mineral markets is that minerals can be obtained from the globally lowest-cost source. For example, one rare-earth element (REE) mine in China, Bayan Obo, is able to supply most of world demand for rare earth elements at a cost significantly less than its main competitors. Concentration of global supplies at a single mine raises significant political risks, illustrated by China’s recent decision to prohibit the export of some REEs and severely limit the export of others. The expected loss of REE supplies will have a significant impact on the cost and production of important national defense technologies and on alternative energy programs. Hybrid vehicles and wind-turbine generators, for example, require REEs for magnets and batteries. Compact fluorescent light bulbs use REE-based phosphors. These recent events raise the general issue of how to measure the degree of supply risk for internationally sourced minerals. Two factors, concentration of supply and political risk, must first be addressed. Concentration of supply can be measured with standard economic tools for measuring industry concentration, using countries rather than firms as the unit of analysis. There are many measures of political risk available. That of the OECD is a measure of a country’s commitment to rule-of-law and enforcement of contracts, as well as political stability. Combining these measures provides a comparative view of mineral supply risk across commodities and identifies several minerals other than REEs that could suddenly become less available. Combined with an assessment of the impact of a reduction in supply, decision makers can use these measures to prioritize risk reduction efforts.
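
    One standard tool this record alludes to is the Herfindahl-Hirschman index (HHI), the usual measure of industry concentration, here applied to country production shares and optionally weighted by a per-country political-risk score. The production figures and risk weights below are hypothetical.

      # Minimal sketch: supply concentration via the Herfindahl-Hirschman index (HHI),
      # computed over country production shares, plus a political-risk-weighted variant.
      def hhi(production: dict) -> float:
          total = sum(production.values())
          return sum((p / total) ** 2 for p in production.values())

      def risk_weighted_hhi(production: dict, risk: dict) -> float:
          """Weight each country's squared share by a 0-1 political-risk score."""
          total = sum(production.values())
          return sum(risk[c] * (p / total) ** 2 for c, p in production.items())

      # Hypothetical rare-earth production (tonnes) and risk scores (1 = highest risk).
      production = {"country A": 120_000, "country B": 20_000, "country C": 10_000}
      risk = {"country A": 0.7, "country B": 0.2, "country C": 0.3}
      print(round(hhi(production), 3), round(risk_weighted_hhi(production, risk), 3))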

  12. Assessment of cognitive bias in decision-making and leadership styles among critical care nurses: a mixed methods study.

    Science.gov (United States)

    Lean Keng, Soon; AlQudah, Hani Nawaf Ibrahim

    2017-02-01

    To raise awareness of critical care nurses' cognitive bias in decision-making, its relationship with leadership styles, and its impact on care delivery. The relationship between critical care nurses' decision-making and leadership styles in hospitals has been widely studied, but the influence of cognitive bias on decision-making and leadership styles in critical care environments remains poorly understood, particularly in Jordan. Design: two-phase mixed methods sequential explanatory design and grounded theory. Setting: critical care unit, Prince Hamza Hospital, Jordan. Participant sampling: convenience sampling in Phase 1 (quantitative, n = 96), purposive sampling in Phase 2 (qualitative, n = 20). A pilot-tested quantitative survey of 96 critical care nurses was conducted in 2012, followed by qualitative in-depth interviews, informed by the quantitative results, with 20 critical care nurses in 2013. Quantitative data were analysed with descriptive statistics and simple linear regression; qualitative data were analysed thematically (constant comparative method). Quantitative findings: correlations were found between rationality and cognitive bias, rationality and task-oriented leadership styles, cognitive bias and democratic communication styles, and cognitive bias and task-oriented leadership styles. Qualitative findings: 'being competent', 'organizational structures', 'feeling self-confident' and 'being supported' in the work environment were identified as key factors influencing critical care nurses' cognitive bias in decision-making and leadership styles. Cognitive bias in decision-making and leadership styles had a two-way impact (strengthening and weakening) on critical care nurses' practice performance. There is a need to heighten critical care nurses' consciousness of cognitive bias in decision-making and leadership styles and its impact, and to develop organization-level strategies to increase non-biased decision-making. © 2016 John Wiley & Sons Ltd.

  13. Susceptibility Testing by Polymerase Chain Reaction DNA Quantitation: A Method to Measure Drug Resistance of Human Immunodeficiency Virus Type 1 Isolates

    Science.gov (United States)

    Eron, Joseph J.; Gorczyca, Paul; Kaplan, Joan C.; D'Aquila, Richard T.

    1992-04-01

    Polymerase chain reaction (PCR) DNA quantitation (PDQ) susceptibility testing rapidly and directly measures nucleoside sensitivity of human immunodeficiency virus type 1 (HIV-1) isolates. PCR is used to quantitate the amount of HIV-1 DNA synthesized after in vitro infection of peripheral blood mononuclear cells. The relative amounts of HIV-1 DNA in cell lysates from cultures maintained at different drug concentrations reflect drug inhibition of virus replication. The results of PDQ susceptibility testing of 2- or 3-day cultures are supported by assays measuring HIV-1 p24 antigen production in supernatants of 7- or 10-day cultures. DNA sequence analyses to identify mutations in the reverse transcriptase gene that cause resistance to 3'-azido-3'-deoxythymidine also support the PDQ results. With the PDQ method, both infectivity titration and susceptibility testing can be performed on supernatants from primary cultures of peripheral blood mononuclear cells. PDQ susceptibility testing should facilitate epidemiologic studies of the clinical significance of drug-resistant HIV-1 isolates.

  14. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    International Nuclear Information System (INIS)

    Egan, James; McMillan, Normal; Denieffe, David

    2011-01-01

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology to the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement: the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.
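
    For readers unfamiliar with Currie's three limits, they are conventionally expressed as multiples of the standard deviation σ₀ of the blank signal; the coefficients in the sketch below (1.645, 3.29, 10) are the standard choices at 5% error rates from the chemical-metrology literature, not values given in this record.

```python
# Currie's fundamental limits of measurement, expressed relative to the
# standard deviation sigma0 of the blank (net-signal domain), with the
# conventional 5% false-positive/false-negative rates and ~10% relative
# standard deviation at the determination limit.

def currie_limits(sigma0):
    critical_level = 1.645 * sigma0      # L_C: decision threshold ("detected?")
    detection_limit = 3.29 * sigma0      # L_D: true signal detected 95% of the time
    determination_limit = 10.0 * sigma0  # L_Q: quantifiable with ~10% RSD
    return critical_level, detection_limit, determination_limit

lc, ld, lq = currie_limits(sigma0=0.5)   # e.g. blank noise of 0.5 signal units
print(f"L_C = {lc:.2f}, L_D = {ld:.2f}, L_Q = {lq:.2f}")
```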

  15. Fundamental limits of measurement in telecommunications: Experimental and modeling studies in a test optical network on proposal for the reform of telecommunication quantitations

    Energy Technology Data Exchange (ETDEWEB)

    Egan, James; McMillan, Normal; Denieffe, David, E-mail: eganj@itcarlow.ie [IT Carlow (Ireland)

    2011-08-17

    Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology to the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement: the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental quantitations over existing methods are discussed.

  16. Making Riverscapes Real (Invited)

    Science.gov (United States)

    Marcus, A.; Carbonneau, P.; Fonstad, M. A.; Walther, S. C.

    2009-12-01

    The structure and function of rivers have long been characterized either by: (1) qualitative models such as the River Continuum Concept or Serial Discontinuity Concept which paint broad descriptive portraits of how river habitats and communities vary, or (2) quantitative models, such as Downstream Hydraulic Geometry, which rely on a limited number of measurements spread widely throughout a river basin. In contrast, Fausch et al. (2002) proposed applying landscape ecology methods to rivers to create “riverscapes.” Application of the riverscape concept requires information on the spatial distribution of organism-scale habitats throughout entire river systems. In practical terms, this means that researchers must replicate maps of local habitat continuously throughout entire rivers to document and predict total habitat availability, structure, and function. Likewise, information on time-dependent variations in these river habitats is necessary. Given these requirements, it is not surprising that the riverscape approach has largely remained a conceptual framework with limited practical application. Recent advances in remote sensing and desktop computing, however, make the riverscape concept more achievable from a mapping perspective. Remote sensing methods now enable sub-meter measurements of depth, water surface slope, grain size, biotypes, algae, and plants, as well as estimation of derived parameters such as velocity and stream power. Although significant obstacles remain to basin-extent sub-meter mapping of physical habitat, recent advances are overcoming these obstacles and allowing the riverscape concept to be put into use by different agencies - at least from a physical habitat perspective. More problematic to the riverscape approach, however, are two major issues that cannot be solved with technical solutions. First is the difficulty in acquiring maps of fauna, whether they be macroinvertebrates, fish, or microorganisms, at scales and spatial extents

  17. Quantitative images of metals in plant tissues measured by laser ablation inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Becker, J.S.; Dietrich, R.C.; Matusch, A.; Pozebon, D.; Dressler, V.L.

    2008-01-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) was used for quantitative imaging of toxic and essential elements in thin sections (thickness of 30 or 40 μm) of tobacco plant tissues. Two-dimensional images of Mg, Fe, Mn, Zn, Cu, Cd, Rh, Pt and Pb in leaves, shoots and roots of tobacco were produced. Sections of the plant tissues (fixed onto glass slides) were scanned by the focused beam of a Nd:YAG laser in a laser ablation chamber. The ablated material was transported with argon as carrier gas to the ICP ion source of a quadrupole ICP-MS instrument. Ion intensities of the investigated elements were measured together with ¹³C⁺, ³³S⁺ and ³⁴S⁺ within the entire plant tissue section. Matrix-matched standards (prepared using powder of dried tobacco leaves) were used to construct calibration curves, and the regression coefficient of the obtained calibration curves was typically 0.99. The variability of the LA-ICP-MS process, sample heterogeneity and water content in the sample were corrected for by using ¹³C⁺ as internal standard. Quantitative imaging of the selected elements revealed their inhomogeneous distribution in leaves, shoots and roots
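
    A minimal sketch of the quantification workflow the abstract describes — ratio each analyte intensity to the ¹³C⁺ internal standard, then map the ratio to concentration through a linear calibration built from matrix-matched standards. All intensities and concentrations below are invented.

```python
import numpy as np

# Internal-standard calibration as described in the abstract: analyte counts
# are divided by the 13C+ internal-standard counts, then a linear fit to
# matrix-matched standards maps ratio -> concentration. Invented numbers.

std_conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])          # ug/g in dried-leaf standards
std_analyte = np.array([120., 980., 1890., 3850., 7600.])  # raw analyte ion counts
std_carbon = np.array([1.00e5, 0.98e5, 1.02e5, 1.01e5, 0.99e5])  # 13C+ counts

ratio = std_analyte / std_carbon                   # normalizes out ablation yield
slope, intercept = np.polyfit(std_conc, ratio, 1)  # linear calibration
r = np.corrcoef(std_conc, ratio)[0, 1]
print(f"calibration r = {r:.4f}")                  # abstract reports ~0.99

# Quantify one pixel of the elemental image:
pixel_ratio = 2500. / 1.0e5
conc = (pixel_ratio - intercept) / slope
print(f"estimated concentration = {conc:.2f} ug/g")
```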

  18. INTEGRATED QUANTITATIVE ASSESSMENT OF CHANGES IN NEURO-ENDOCRINE-IMMUNE COMPLEX AND METABOLISM IN RATS EXPOSED TO ACUTE COLD-IMMOBILIZATION STRESS

    Directory of Open Access Journals (Sweden)

    Sydoruk O Sydoruk

    2016-09-01

    Abstract. Background. It is known that the reactions of the neuroendocrine-immune complex to acute and chronic stress differ, and that there are sex differences in stress reactions. Previously we carried out an integrated quantitative estimation of neuroendocrine and immune responses to chronic restraint stress in male rats. The purpose of this study was to carry out an integrated quantitative estimation of neuroendocrine, immune and metabolic responses to acute stress in male and female rats. Material and research methods. The experiment was performed on 58 white Wistar rats (28 male and 30 female) weighing 170-280 g (mean = 220 g; SD = 28 g). The day after acute (water-immersion restraint) stress, HRV, endocrine, immune and metabolic parameters as well as gastric mucosa injuries were determined and compared with the parameters of intact animals. Results. Acute cold-immobilization stress caused moderate injury to the gastric mucosa in the form of erosions and ulcers. Among the metabolic parameters, increased activity of acid phosphatase, aspartate and alanine aminotransferase as well as creatine phosphokinase was revealed. Plasma testosterone as well as serum potassium and phosphate were found to be reduced, probably due to increased parathyrin and mineralocorticoid activity and a sympathotonic shift of the sympatho-vagal balance. The integrated quantitative measure of the manifestations of acute stress, computed as the mean of the moduli of Z-scores, was 0.75±0.10 σ for the 10 metabolic parameters and 0.40±0.07 σ for the 8 neuro-endocrine parameters. Among the immune parameters, some proved resistant to acute stress factors, while 10 were significantly suppressed and 12 activated; the integrated quantitative measure of post-stress changes was 0.73±0.08 σ. Significant differences were found between the integrated status of intact males and females, whereas after stress the differences were insignificant. Conclusion. The approach to integrated quantitative assessment of the neuroendocrine-immune complex and metabolism may be useful for testing the
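
    The 'integrated quantitative measure' reported here — the mean of the moduli of Z-scores — is easy to reproduce; a minimal sketch, assuming the reference means and SDs come from the intact group (all values invented):

```python
import numpy as np

# Integrated measure as described: mean of |Z|, where each parameter is
# Z-scored against the intact (reference) group. Values are illustrative.

def integrated_measure(values, ref_mean, ref_sd):
    z = (values - ref_mean) / ref_sd
    abs_z = np.abs(z)
    return abs_z.mean(), abs_z.std(ddof=1) / np.sqrt(abs_z.size)

values = np.array([3.1, 0.8, 5.6, 2.2])    # stressed-group parameter means
ref_mean = np.array([2.5, 1.0, 4.0, 2.0])  # intact-group means
ref_sd = np.array([0.5, 0.3, 1.2, 0.4])    # intact-group SDs

mean_abs_z, sem = integrated_measure(values, ref_mean, ref_sd)
print(f"integrated measure = {mean_abs_z:.2f} ± {sem:.2f} sigma")
```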

  19. Development of film dosimetric measurement system for verification of RTP

    International Nuclear Information System (INIS)

    Chen Yong; Bao Shanglian; Ji Changguo; Zhang Xin; Wu Hao; Han Shukui; Xiao Guiping

    2007-01-01

    Objective: To develop a novel film dosimetry system based on a general-purpose laser scanner, in order to verify patient-specific radiotherapy treatment plans (RTPs) in three-dimensional adaptable radiotherapy (3D ART) and intensity modulated radiotherapy (IMRT). Methods: Several advanced methods, including film saturated development, wavelet filtering with multi-resolution thresholds and discrete Fourier reconstruction, are employed in this system to reduce the artifacts, noise and distortion induced by film digitization with a general scanner; a set of coefficients derived from Monte Carlo (MC) simulation is adopted to correct the film over-response to low-energy scattered photons; and a set of newly emerging criteria, including the γ index and the Normalized Agreement Test (NAT) method, is employed to quantitatively evaluate the agreement of 2D dose distributions between the results measured by the films and those calculated by the Treatment Planning System (TPS), so as to obtain straightforward presentations, displays and results with high accuracy and reliability. Results: Radiotherapy doses measured by the developed system agree within 2% with those measured by ionization chamber and the VeriSoft Film Dosimetry System, and quantitative evaluation indexes are within 3%. Conclusions: The developed system can be used to accurately measure radiotherapy dose and to reliably provide quantitative evaluation for RTP dose verification. (authors)
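
    The γ index used for the 2D dose comparison combines a dose-difference criterion with a distance-to-agreement criterion; a point passes when γ ≤ 1. Below is a brute-force sketch with a global 3%/3 mm criterion and invented dose planes — production film-dosimetry code uses optimized implementations, but the logic is the same.

```python
import numpy as np

def gamma_2d(measured, calculated, spacing_mm, dd=0.03, dta_mm=3.0):
    """Brute-force global gamma index: for each measured point, minimize
    sqrt((dose diff/DD)^2 + (distance/DTA)^2) over calculated points
    within a small search window."""
    norm = dd * calculated.max()                 # global dose-difference criterion
    win = int(np.ceil(2 * dta_mm / spacing_mm))  # search radius in pixels
    ny, nx = measured.shape
    gamma = np.full_like(measured, np.inf, dtype=float)
    for iy in range(ny):
        for ix in range(nx):
            for jy in range(max(0, iy - win), min(ny, iy + win + 1)):
                for jx in range(max(0, ix - win), min(nx, ix + win + 1)):
                    dist = spacing_mm * np.hypot(jy - iy, jx - ix)
                    ddiff = calculated[jy, jx] - measured[iy, ix]
                    g = np.sqrt((ddiff / norm) ** 2 + (dist / dta_mm) ** 2)
                    gamma[iy, ix] = min(gamma[iy, ix], g)
    return gamma

# Illustrative dose planes: a Gaussian field vs. a slightly shifted copy.
y, x = np.mgrid[0:40, 0:40]
calc = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 200.0)
meas = np.exp(-((x - 20.5) ** 2 + (y - 20) ** 2) / 200.0)
g = gamma_2d(meas, calc, spacing_mm=1.0)
print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")
```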

  20. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with ⁹⁹ᵐTc pyrophosphate and ⁹⁹ᵐTc methylene diphosphonate in the years between 1977 and 1979. Course control examinations were carried out in 12 patients. The collective of patients presented with primary skeletal tumours, metastases, and inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the 'region of interest' technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as regards diagnosis, course control, and differential diagnosis. Quantitative skeletal scintiscanning by means of the 'region of interest' technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG) [de]

  1. Determining the psychometric properties of the Enhancing Decision-making Assessment in Midwifery (EDAM) measure in a cross cultural context.

    Science.gov (United States)

    Jefford, Elaine; Jomeen, Julie; Martin, Colin R

    2016-04-28

    The ability to act on and justify clinical decisions as autonomous, accountable midwifery practitioners is encompassed within many international regulatory frameworks, yet decision-making within midwifery is poorly defined. Decision-making theories from medicine and nursing may have something to offer, but fail to take into consideration midwifery context and philosophy and the decisional autonomy of women. Using an underpinning qualitative methodology, a decision-making framework was developed, which identified Good Clinical Reasoning and Good Midwifery Practice as two conditions necessary to facilitate optimal midwifery decision-making during the second stage of labour. This study aims to confirm the robustness of the framework and describe the development of Enhancing Decision-making Assessment in Midwifery (EDAM) as a measurement tool through testing of its factor structure, validity and reliability. A cross-sectional design was used for instrument development, and a 2 (country: Australia/UK) x 2 (decision-making: optimal/sub-optimal) between-subjects design for instrument evaluation, using exploratory and confirmatory factor analysis, internal consistency and known-groups validity. Two 'expert' maternity panels, based in Australia and the UK and comprising 42 participants, assessed 16 vignettes of real midwifery care episodes using the empirically derived 26-item framework. Each item was answered on a 5-point Likert scale based on the level of agreement to which the participant felt each item was present in each of the vignettes. Participants were then asked to rate the overall decision-making (optimal/sub-optimal). After factor analysis the framework was reduced to a 19-item EDAM measure, and confirmed as two distinct scales of 'Clinical Reasoning' (CR) and 'Midwifery Practice' (MP). The CR scale comprised two subscales: 'the clinical reasoning process' and 'integration and intervention'. The MP scale also comprised two subscales: 'women's relationship with the midwife' and 'general

  2. Commentary: moving toward cost-effectiveness in using psychophysiological measures in clinical assessment: validity, decision making, and adding value.

    Science.gov (United States)

    Youngstrom, Eric A; De Los Reyes, Andres

    2015-01-01

    Psychophysiological measures offer a variety of potential advantages, including more direct assessment of certain processes, as well as provision of information that may contrast with other sources. The role of psychophysiological measures in clinical practice will be best defined when researchers (a) switch to research designs and statistical models that better approximate how clinicians administer assessments and make clinical decisions in practice, (b) systematically compare the validity of psychophysiological measures to incumbent methods for assessing similar criteria, (c) test whether psychophysiological measures show either greater validity or clinically meaningful incremental validity, and (d) factor in fiscal costs as well as the utilities that the client attaches to different assessment outcomes. The statistical methods are now readily available, along with the interpretive models for integrating assessment results into client-centered decision making. These, combined with technology reducing the cost of psychophysiological measurement and improving ease of interpretation, poise the field for a rapid transformation of assessment practice, but only if we let go of old habits of research.

  3. PCA-based groupwise image registration for quantitative MRI

    NARCIS (Netherlands)

    Huizinga, W.; Poot, D. H. J.; Guyader, J.-M.; Klaassen, R.; Coolen, B. F.; van Kranenburg, M.; van Geuns, R. J. M.; Uitterdijk, A.; Polfliet, M.; Vandemeulebroucke, J.; Leemans, A.; Niessen, W. J.; Klein, S.

    2016-01-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different

  4. Quantitative Phase Imaging Using Hard X Rays

    International Nuclear Information System (INIS)

    Nugent, K.A.; Gureyev, T.E.; Cookson, D.J.; Paganin, D.; Barnea, Z.

    1996-01-01

    The quantitative imaging of a phase object using 16 keV x rays is reported. The theoretical basis of the technique is presented along with its implementation using a synchrotron x-ray source. We find that our phase image is in quantitative agreement with independent measurements of the object. copyright 1996 The American Physical Society

  5. Quantitatively accurate activity measurements with a dedicated cardiac SPECT camera: Physical phantom experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pourmoghaddas, Amir, E-mail: apour@ottawaheart.ca; Wells, R. Glenn [Physics Department, Carleton University, Ottawa, Ontario K1S 5B6, Canada and Cardiology, The University of Ottawa Heart Institute, Ottawa, Ontario K1Y4W7 (Canada)

    2016-01-15

    Purpose: Recently, there has been increased interest in dedicated cardiac single photon emission computed tomography (SPECT) scanners with pinhole collimation and improved detector technology due to their improved count sensitivity and resolution over traditional parallel-hole cameras. With traditional cameras, energy-based approaches are often used in the clinic for scatter compensation because they are fast and easily implemented. Some of the cardiac cameras use cadmium-zinc-telluride (CZT) detectors which can complicate the use of energy-based scatter correction (SC) due to the low-energy tail—an increased number of unscattered photons detected with reduced energy. Modified energy-based scatter correction methods can be implemented, but their level of accuracy is unclear. In this study, the authors validated by physical phantom experiments the quantitative accuracy and reproducibility of easily implemented correction techniques applied to {sup 99m}Tc myocardial imaging with a CZT-detector-based gamma camera with multiple heads, each with a single-pinhole collimator. Methods: Activity in the cardiac compartment of an Anthropomorphic Torso phantom (Data Spectrum Corporation) was measured through 15 {sup 99m}Tc-SPECT acquisitions. The ratio of activity concentrations in organ compartments resembled a clinical {sup 99m}Tc-sestamibi scan and was kept consistent across all experiments (1.2:1 heart to liver and 1.5:1 heart to lung). Two background activity levels were considered: no activity (cold) and an activity concentration 1/10th of the heart (hot). A plastic “lesion” was placed inside of the septal wall of the myocardial insert to simulate the presence of a region without tracer uptake and contrast in this lesion was calculated for all images. The true net activity in each compartment was measured with a dose calibrator (CRC-25R, Capintec, Inc.). A 10 min SPECT image was acquired using a dedicated cardiac camera with CZT detectors (Discovery NM530c, GE

  6. Problems of standardized handling and quantitative evaluation of autoradiograms

    International Nuclear Information System (INIS)

    Treutler, H.C.; Freyer, K.

    1985-01-01

    In recent years autoradiography has gained increasing importance as a quantitative method for measuring radioactivity or element concentration. Mostly relative measurements are carried out: the optical density of the photographic emulsion produced by a calibrated radiation source is compared with that produced by a sample. The influences of different parameters, such as beta particle energy, backscattering, fading of the latent image, developing conditions, matrix effects and others, on the results are described, and the errors of the quantitative evaluation of autoradiograms are assessed. The performance of the method is demonstrated taking the quantitative determination of gold in silicon as an example

  7. Model-independent quantitative measurement of nanomechanical oscillator vibrations using electron-microscope linescans

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Huan; Fenton, J. C.; Chiatti, O. [London Centre for Nanotechnology, University College London, 17–19 Gordon Street, London WC1H 0AH (United Kingdom); Warburton, P. A. [London Centre for Nanotechnology, University College London, 17–19 Gordon Street, London WC1H 0AH (United Kingdom); Department of Electronic and Electrical Engineering, University College London, Torrington Place, London WC1E 7JE (United Kingdom)

    2013-07-15

    Nanoscale mechanical resonators are highly sensitive devices and, therefore, for application as highly sensitive mass balances, they are potentially superior to micromachined cantilevers. The absolute measurement of nanoscale displacements of such resonators remains a challenge, however, since the optical signal reflected from a cantilever whose dimensions are sub-wavelength is at best very weak. We describe a technique for quantitative analysis and fitting of scanning-electron microscope (SEM) linescans across a cantilever resonator, involving deconvolution from the vibrating resonator profile using the stationary resonator profile. This enables determination of the absolute amplitude of nanomechanical cantilever oscillations even when the oscillation amplitude is much smaller than the cantilever width. This technique is independent of any model of secondary-electron emission from the resonator and is, therefore, applicable to resonators with arbitrary geometry and material inhomogeneity. We demonstrate the technique using focussed-ion-beam–deposited tungsten cantilevers of radius ∼60–170 nm inside a field-emission SEM, with excitation of the cantilever by a piezoelectric actuator allowing measurement of the full frequency response. Oscillation amplitudes approaching the size of the primary electron-beam can be resolved. We further show that the optimum electron-beam scan speed is determined by a compromise between deflection of the cantilever at low scan speeds and limited spatial resolution at high scan speeds. Our technique will be an important tool for use in precise characterization of nanomechanical resonator devices.
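
    The deconvolution idea can be pictured as follows: a harmonically oscillating cantilever spends most of its time near the turning points, so the time-averaged linescan is the stationary profile convolved with the oscillator's arcsine position density p(x) = 1/(π√(A² − x²)). The sketch below recovers the amplitude A from synthetic profiles by a least-squares search; it illustrates the principle and is not the authors' fitting code.

```python
import numpy as np

def arcsine_kernel(amplitude_nm, dx_nm, n=501):
    """Position density of a harmonic oscillator, p(x) = 1/(pi*sqrt(A^2 - x^2)),
    discretized on the scan grid and normalized to unit sum."""
    x = (np.arange(n) - n // 2) * dx_nm
    p = np.zeros(n)
    inside = np.abs(x) < amplitude_nm
    p[inside] = 1.0 / (np.pi * np.sqrt(amplitude_nm ** 2 - x[inside] ** 2))
    return p / p.sum()

def fit_amplitude(stationary, vibrating, dx_nm, candidates):
    """Least-squares search: which amplitude's convolution with the
    stationary linescan best reproduces the vibrating linescan?"""
    errs = [np.sum((np.convolve(stationary, arcsine_kernel(a, dx_nm), mode="same")
                    - vibrating) ** 2) for a in candidates]
    return candidates[int(np.argmin(errs))]

# Synthetic demo: 100 nm wide cantilever oscillating with 150 nm amplitude.
dx = 2.0                                          # nm per scan point
x = np.arange(-1000.0, 1000.0, dx)
stationary = np.where(np.abs(x) < 50, 1.0, 0.0)   # idealized top-hat SE profile
vibrating = np.convolve(stationary, arcsine_kernel(150.0, dx), mode="same")

candidates = np.arange(50.0, 300.0, 5.0)
print(f"recovered amplitude ~ {fit_amplitude(stationary, vibrating, dx, candidates)} nm")
```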

  8. Automatic and quantitative measurement of collagen gel contraction using model-guided segmentation

    Science.gov (United States)

    Chen, Hsin-Chen; Yang, Tai-Hua; Thoreson, Andrew R.; Zhao, Chunfeng; Amadio, Peter C.; Sun, Yung-Nien; Su, Fong-Chin; An, Kai-Nan

    2013-08-01

    Quantitative measurement of collagen gel contraction plays a critical role in the field of tissue engineering because it provides spatial-temporal assessment (e.g., changes of gel area and diameter during the contraction process) reflecting the cell behavior and tissue material properties. So far the assessment of collagen gels relies on manual segmentation, which is time-consuming and suffers from serious intra- and inter-observer variability. In this study, we propose an automatic method combining various image processing techniques to resolve these problems. The proposed method first detects the maximal feasible contraction range of circular references (e.g., culture dish) and avoids the interference of irrelevant objects in the given image. Then, a three-step color conversion strategy is applied to normalize and enhance the contrast between the gel and background. We subsequently introduce a deformable circular model which utilizes regional intensity contrast and circular shape constraint to locate the gel boundary. An adaptive weighting scheme was employed to coordinate the model behavior, so that the proposed system can overcome variations of gel boundary appearances at different contraction stages. Two measurements of collagen gels (i.e., area and diameter) can readily be obtained based on the segmentation results. Experimental results, including 120 gel images for accuracy validation, showed high agreement between the proposed method and manual segmentation with an average dice similarity coefficient larger than 0.95. The results also demonstrated obvious improvement in gel contours obtained by the proposed method over two popular, generic segmentation methods.
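
    The Dice similarity coefficient used for validation, and the two gel measurements themselves, follow directly from binary segmentation masks; a minimal sketch with toy masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|); 1.0 = perfect overlap."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Toy example: two slightly offset circular gel segmentations.
y, x = np.mgrid[0:200, 0:200]
auto = (x - 100) ** 2 + (y - 100) ** 2 < 60 ** 2    # automatic segmentation
manual = (x - 102) ** 2 + (y - 101) ** 2 < 59 ** 2  # manual "ground truth"
print(f"Dice = {dice(auto, manual):.3f}")

# The two gel measurements follow directly from the mask:
area_px = auto.sum()
diameter_px = 2.0 * np.sqrt(area_px / np.pi)        # equivalent-circle diameter
print(f"area = {area_px} px, diameter ~ {diameter_px:.1f} px")
```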

  9. Single beam Fourier transform digital holographic quantitative phase microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Anand, A., E-mail: arun-nair-in@yahoo.com; Chhaniwal, V. K.; Mahajan, S.; Trivedi, V. [Optics Laboratory, Applied Physics Department, Faculty of Technology and Engineering, M.S. University of Baroda, Vadodara 390001 (India); Faridian, A.; Pedrini, G.; Osten, W. [Institut für Technische Optik, Universität Stuttgart, Pfaffenwaldring 9, 70569 Stuttgart (Germany); Dubey, S. K. [Siemens Technology and Services Pvt. Ltd, Corporate Technology—Research and Technology Centre, Bangalore 560100 (India); Javidi, B. [Department of Electrical and Computer Engineering, U-4157, University of Connecticut, Storrs, Connecticut 06269-2157 (United States)

    2014-03-10

    Quantitative phase contrast microscopy reveals thickness or height information of a biological or technical micro-object under investigation. The information obtained from this process provides a means to study their dynamics. Digital holographic (DH) microscopy is one of the most used, state-of-the-art single-shot quantitative techniques for three-dimensional imaging of living cells. Conventional off-axis DH microscopy directly provides phase contrast images of the objects. However, this process requires two separate beams and their ratio adjustment for high-contrast interference fringes. Also, the use of two separate beams may make the system more vulnerable to vibrations. Single-beam techniques can overcome these hurdles while remaining compact as well. Here, we describe the development of a single-beam DH microscope providing whole-field imaging of micro-objects. A hologram of the magnified object projected onto a diffuser co-located with a pinhole is recorded with the use of a commercially available diode laser and an arrayed sensor. A Fourier transform of the recorded hologram directly yields the complex amplitude at the image plane. The method proposed was investigated using various phase objects. It was also used to image the dynamics of human red blood cells, in which sub-micrometer-level thickness variations were measurable.
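
    A sketch of the single-FFT demodulation step the abstract alludes to: the Fourier transform of the recorded intensity separates the object term from its twin image and the dc autocorrelation, and the object phase is the angle of the selected sideband. The hologram below is synthetic, and the carrier frequency is an assumed value, not the authors' setup.

```python
import numpy as np

# Synthetic off-axis hologram of a unit-amplitude phase object, demodulated
# by a single FFT: crop one sideband, inverse-transform, remove the carrier.
n = 512
y, x = np.mgrid[0:n, 0:n]
phase_obj = 2.0 * np.exp(-((x - n / 2) ** 2 + (y - n / 2) ** 2) / (2 * 40.0 ** 2))
obj = np.exp(1j * phase_obj)                 # unit-amplitude phase object ("cell")
ref = np.exp(2j * np.pi * 0.25 * x)          # tilted reference wave (carrier)
hologram = np.abs(obj + ref) ** 2            # recorded intensity

spectrum = np.fft.fftshift(np.fft.fft2(hologram))
cx = n // 2 - n // 4                         # O*conj(R) sideband at -0.25 cycles/px
window = np.zeros_like(spectrum)
r = n // 8
window[:, cx - r:cx + r] = spectrum[:, cx - r:cx + r]  # crop one sideband
field = np.fft.ifft2(np.fft.ifftshift(window))         # complex amplitude ~ O*conj(R)
recovered = np.angle(field * ref)                      # remove carrier -> object phase

print(f"peak recovered phase ~ {recovered.max():.2f} rad (ground truth 2.00 rad)")
```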

  10. Student evaluations of teaching: teaching quantitative courses can be hazardous to one’s career

    Directory of Open Access Journals (Sweden)

    Bob Uttl

    2017-05-01

    Full Text Available Anonymous student evaluations of teaching (SETs) are used by colleges and universities to measure teaching effectiveness and to make decisions about faculty hiring, firing, re-appointment, promotion, tenure, and merit pay. Although numerous studies have found that SETs correlate with various teaching effectiveness irrelevant factors (TEIFs) such as subject, class size, and grading standards, it has been argued that such correlations are small and do not undermine the validity of SETs as measures of professors' teaching effectiveness. However, previous research has generally used inappropriate parametric statistics and effect sizes to examine and to evaluate the significance of TEIFs on personnel decisions. Accordingly, we examined the influence of quantitative vs. non-quantitative courses on SET ratings and SET-based personnel decisions using 14,872 publicly posted class evaluations, where each evaluation represents a summary of SET ratings provided by individual students responding in each class. In total, 325,538 individual student evaluations from a US mid-size university contributed to these class evaluations. The results demonstrate that class subject (math vs. English) is strongly associated with SET ratings, has a substantial impact on professors being labeled satisfactory vs. unsatisfactory and excellent vs. non-excellent, and the impact varies substantially depending on the criteria used to classify professors as satisfactory vs. unsatisfactory. Professors teaching quantitative courses are far more likely not to receive tenure, promotion, and/or merit pay when their performance is evaluated against common standards.

  11. Measurement of strains by means of electro-optics holography

    Science.gov (United States)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.; Albertazzi, Armando, Jr.

    1991-03-01

    The use of a TV camera as a recording medium and the observation of whole-field displacements in real time make holographic TV a very interesting and powerful tool in a variety of areas, from NDE to research and development. The paper presents new developments in the field that add to the versatility of the technique by introducing portability and methods to obtain accurate quantitative results. Examples of applications are given for the measurement of strains both at room and at high temperatures, and for strain measurements at the microscopic level.

  12. HOSPITAL SITE SELECTION USING TWO-STAGE FUZZY MULTI-CRITERIA DECISION MAKING PROCESS

    Directory of Open Access Journals (Sweden)

    Ali Soltani

    2011-06-01

    Full Text Available Site selection for the siting of urban activities/facilities is one of the crucial policy-related decisions taken by urban planners and policy makers. The process of site selection is inherently complicated: a carelessly chosen site imposes exorbitant costs on a city budget and inevitably damages the environment. Nowadays, multi-attribute decision-making approaches are suggested to improve the precision of decision making and reduce surplus side effects. Two well-known techniques, the analytical hierarchy process and the analytical network process, are among the multi-criteria decision-making systems which can easily accommodate both quantitative and qualitative criteria. These have also been developed into fuzzy analytical hierarchy process and fuzzy analytical network process systems, which are capable of accommodating the inherent uncertainty and vagueness in multi-criteria decision-making. This paper reports the process and results of a hospital site selection within Region 5 of the Shiraz metropolitan area, Iran, using a fuzzy analytical network process system integrated with a Geographic Information System (GIS). The weights of the alternatives were calculated using the fuzzy analytical network process. Then a sensitivity analysis was conducted to measure the elasticity of the decision with regard to the different criteria. This study contributes to planning practice by suggesting a more comprehensive decision-making tool for site selection.

  13. HOSPITAL SITE SELECTION USING TWO-STAGE FUZZY MULTI-CRITERIA DECISION MAKING PROCESS

    Directory of Open Access Journals (Sweden)

    Ali Soltani

    2011-01-01

    Full Text Available Site selection for the siting of urban activities/facilities is one of the crucial policy-related decisions taken by urban planners and policy makers. The process of site selection is inherently complicated: a carelessly chosen site imposes exorbitant costs on a city budget and inevitably damages the environment. Nowadays, multi-attribute decision-making approaches are suggested to improve the precision of decision making and reduce surplus side effects. Two well-known techniques, the analytical hierarchy process and the analytical network process, are among the multi-criteria decision-making systems which can easily accommodate both quantitative and qualitative criteria. These have also been developed into fuzzy analytical hierarchy process and fuzzy analytical network process systems, which are capable of accommodating the inherent uncertainty and vagueness in multi-criteria decision-making. This paper reports the process and results of a hospital site selection within Region 5 of the Shiraz metropolitan area, Iran, using a fuzzy analytical network process system integrated with a Geographic Information System (GIS). The weights of the alternatives were calculated using the fuzzy analytical network process. Then a sensitivity analysis was conducted to measure the elasticity of the decision with regard to the different criteria. This study contributes to planning practice by suggesting a more comprehensive decision-making tool for site selection.

  14. A quantitative risk assessment model to evaluate effective border control measures for rabies prevention

    Science.gov (United States)

    Weng, Hsin-Yi; Wu, Pei-I; Yang, Ping-Cheng; Tsai, Yi-Lun; Chang, Chao-Chin

    2009-01-01

    Border control is the primary method to prevent rabies emergence. This study developed a quantitative risk model incorporating stochastic processes to evaluate whether border control measures could efficiently prevent rabies introduction through the importation of cats and dogs, using Taiwan as an example. Both legal importation and illegal smuggling were investigated. The impacts of a reduced quarantine and/or waiting period on the risk of rabies introduction were also evaluated. The results showed that Taiwan's current animal importation policy could effectively prevent rabies introduction through legal importation of cats and dogs. The median risk of a rabid animal penetrating current border control measures and entering Taiwan was 5.33 × 10−8 (95th percentile: 3.20 × 10−7). However, illegal smuggling may expose Taiwan to a great risk of rabies emergence. Reduction of the quarantine and/or waiting period would affect the risk differently, depending on the applied assumptions, such as increased vaccination coverage, enforced customs checking, and/or a change in the number of legal importations. Although the changes in the estimated risk under the assumed alternatives were not substantial, except for completely abolishing quarantine, the consequences of rabies introduction may yet be considered significant in a rabies-free area. Therefore, a comprehensive benefit-cost analysis needs to be conducted before recommending these alternative measures. PMID:19822125
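
    The stochastic flavor of such a model can be caricatured in a few lines: an imported animal introduces rabies only if it is infected and every border-control barrier fails. All probabilities below are invented placeholders, not the parameters of the published model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy Monte Carlo in the spirit of the abstract (all numbers invented):
n_sim = 100_000           # simulated years of importation
n_animals = 5_000         # cats/dogs legally imported per year
p_infected = 1e-5         # rabies prevalence in source populations
p_vacc_fail = 0.05        # vaccination fails to protect or is falsified
p_quarantine_miss = 0.02  # infection not detected during quarantine + waiting

p_slip = p_infected * p_vacc_fail * p_quarantine_miss
introductions = rng.binomial(n_animals, p_slip, size=n_sim)
print(f"P(at least one rabid animal enters per year) ~ {(introductions > 0).mean():.2e}")
print(f"median introductions per year: {np.median(introductions):.0f}")
```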

  15. Relationships between diffusing capacity for carbon monoxide (D{sub L}CO), and quantitative computed tomography measurements and visual assessment for chronic obstructive pulmonary disease

    Energy Technology Data Exchange (ETDEWEB)

    Nambu, Atsushi, E-mail: nambu-a@gray.plala.or.jp [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO 80206 (United States); Department of Radiology, Teikyo University Mizonokuchi Hospital (Japan); Zach, Jordan, E-mail: ZachJ@NJHealth.org [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO 80206 (United States); Schroeder, Joyce, E-mail: Joyce.schroeder@stanfordalumni.org [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO 80206 (United States); Jin, Gong Yong, E-mail: gyjin@chonbuk.ac.kr [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO 80206 (United States); Department of Radiology, Chonbuk National University Hospital (Korea, Republic of); Kim, Song Soo, E-mail: haneul88@hanmail.net [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO 80206 (United States); Department of Radiology, Chungnam National Hospital, Chungnam National University School of Medicine (Korea, Republic of); Kim, Yu-IL, E-mail: kyionly@chonnam.ac.kr [Department of Medicine, National Jewish Health, Denver, CO (United States); Department of Internal Medicine, Chonnam National University Hospital, Gwangju (Korea, Republic of); Schnell, Christina, E-mail: SchnellC@NJHealth.org [Department of Medicine, National Jewish Health, Denver, CO (United States); Bowler, Russell, E-mail: BowlerR@NJHealth.org [Division of Pulmonary Medicine, Department of Medicine, National Jewish Health (United States); Lynch, David A., E-mail: LynchD@NJHealth.org [Department of Radiology, National Jewish Health, 1400 Jackson Street, Denver, CO 80206 (United States)

    2015-05-15

    Highlights: • Quantitative CT measurements significantly correlated with D{sub L}CO/V{sub A}. • The 15th percentile HU had the strongest correlation with D{sub L}CO/V{sub A}. • Visual scoring of emphysema had independent significant correlations with D{sub L}CO/V{sub A}. - Abstract: Purpose: To evaluate the relationships between D{sub L}CO and quantitative CT (QCT) measurements and visual assessment of pulmonary emphysema, and to test the relative roles of visual and quantitative assessment of emphysema. Materials and methods: The subjects included 199 current and former cigarette smokers from the COPDGene cohort who underwent inspiratory and expiratory CT and also had diffusing capacity for carbon monoxide corrected for alveolar volume (D{sub L}CO/V{sub A}). Quantitative CT measurements included % low attenuation areas (%LAA{sub −950ins} = voxels ≤ −950 Hounsfield units (HU), %LAA{sub −910ins}, and %LAA{sub −856ins}), mean CT attenuation and 15th percentile HU value on inspiratory CT, and %LAA{sub −856exp} (voxels ≤ −856 HU on expiratory CT). The extent of emphysema was visually assessed using a 5-point grading system. Univariate and multiple-variable linear regression analyses were employed to evaluate the correlations between D{sub L}CO/V{sub A} and the QCT parameters and visual extent of emphysema. Results: D{sub L}CO/V{sub A} correlated most strongly with the 15th percentile HU (R{sup 2} = 0.440, p < 0.001), closely followed by %LAA{sub −950ins} (R{sup 2} = 0.417, p < 0.001) and the visual extent of emphysema (R{sup 2} = 0.411, p < 0.001). Multiple-variable analysis showed that the visual extent of emphysema and the 15th percentile HU were independent significant predictors of D{sub L}CO/V{sub A}, with an R{sup 2} of 0.599. Conclusions: The 15th percentile HU seems to be the best parameter for representing the respiratory condition in COPD. Visual assessment of emphysema provides information complementary to QCT analysis.
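
    The QCT indices in this study are simple summary statistics of the lung voxel histogram; a sketch with a synthetic attenuation sample (invented numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic inspiratory lung attenuation values in HU (invented): a mix of
# normal parenchyma and emphysematous voxels.
normal = rng.normal(-860, 40, size=90_000)
emphysema = rng.normal(-970, 15, size=10_000)
lung_hu = np.concatenate([normal, emphysema])

laa_950 = np.mean(lung_hu <= -950) * 100   # %LAA-950ins: fraction of voxels <= -950 HU
perc15 = np.percentile(lung_hu, 15)        # 15th percentile HU of the histogram
print(f"%LAA-950ins = {laa_950:.1f}%")
print(f"15th percentile HU = {perc15:.0f} HU")
```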

  16. Fundamental quantitative security in quantum key generation

    International Nuclear Information System (INIS)

    Yuen, Horace P.

    2010-01-01

    We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.

  17. Quantitative diffusion and swelling kinetic measurements using large-angle interferometric refractometry.

    Science.gov (United States)

    Saunders, John E; Chen, Hao; Brauer, Chris; Clayton, McGregor; Chen, Weijian; Barnes, Jack A; Loock, Hans-Peter

    2015-12-07

    The uptake and release of sorbates into films and coatings is typically accompanied by changes in the films' refractive index and thickness. We provide a comprehensive model to calculate the concentration of the sorbate from the average refractive index and the film thickness, and validate the model experimentally. The mass fraction of the analyte partitioned into a film is described quantitatively by the Lorentz-Lorenz equation and the Clausius-Mossotti equation. To validate the model, the uptake kinetics of water and other solvents into SU-8 films (d = 40-45 μm) were explored. Large-angle interferometric refractometry measurements can be used to characterize films between 15 μm and 150 μm thick, and Fourier analysis is used to independently determine the thickness, the average refractive index and the refractive index at the film-substrate interface at one-second time intervals. From these values the mass fraction of water in SU-8 was calculated. The kinetics were best described by two independent uptake processes having different rates. Each process followed one-dimensional Fickian diffusion kinetics, with diffusion coefficients for water into SU-8 photoresist film of 5.67 × 10(-9) cm(2) s(-1) and 61.2 × 10(-9) cm(2) s(-1).
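
    A minimal sketch of how a mass fraction can follow from the Lorentz-Lorenz relation, assuming ideal volume-fraction mixing of water and polymer; the dry-film index and densities below are assumed values, and the model in the paper also uses the measured thickness change.

```python
# Sorbate content from refractive index via the Lorentz-Lorenz relation,
# assuming ideal volume-fraction mixing of the two components. This mirrors
# the approach described in the abstract, with invented parameters; it is
# not the authors' exact model.

def lorentz_lorenz(n):
    return (n ** 2 - 1.0) / (n ** 2 + 2.0)

def water_mass_fraction(n_film, n_dry, n_water=1.333,
                        rho_dry=1.2, rho_water=1.0):
    """Solve L(n_film) = phi*L(n_water) + (1 - phi)*L(n_dry) for the water
    volume fraction phi, then convert to a mass fraction via the densities."""
    phi = ((lorentz_lorenz(n_film) - lorentz_lorenz(n_dry))
           / (lorentz_lorenz(n_water) - lorentz_lorenz(n_dry)))
    m_water = phi * rho_water
    return m_water / (m_water + (1.0 - phi) * rho_dry)

# Assuming a dry SU-8 index of ~1.59 at visible wavelengths, a measured
# average film index of 1.585 then implies:
print(f"water mass fraction ~ {water_mass_fraction(1.585, 1.59):.3f}")
```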

  18. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    Science.gov (United States)

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator: a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric-tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated on four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric-tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Conformational analysis by quantitative NOE measurements of the β-proton pairs across individual disulfide bonds in proteins

    International Nuclear Information System (INIS)

    Takeda, Mitsuhiro; Terauchi, Tsutomu; Kainosho, Masatsune

    2012-01-01

    NOEs between the β-protons of cysteine residues across disulfide bonds in proteins provide direct information on the connectivities and conformations of these important cross-links, which are otherwise difficult to investigate. With conventional [U-¹³C,¹⁵N]-proteins, however, fast spin diffusion processes mediated by strong dipolar interactions between geminal β-protons prohibit the quantitative measurements and thus the analyses of long-range NOEs across disulfide bonds. We describe a robust approach for alleviating such difficulties, by using proteins selectively labeled with an equimolar mixture of (2R,3S)-[β-¹³C; α,β-²H₂]Cys and (2R,3R)-[β-¹³C; α,β-²H₂]Cys, but otherwise fully deuterated. Since either one of the prochiral methylene protons, namely β2 (proS) or β3 (proR), is always replaced with a deuteron and no other protons remain in proteins prepared by this labeling scheme, all four of the expected NOEs for the β-protons across disulfide bonds could be measured without any spin diffusion interference, even with long mixing times. Therefore, the NOEs for the β2 and β3 pairs across each of the disulfide bonds could be observed at high sensitivity, even though they are 25% of the theoretical maximum for each pair. With the NOE information, the disulfide bond connectivities can be unambiguously established for proteins with multiple disulfide bonds. In addition, the conformations around disulfide bonds, namely χ₂ and χ₃, can be determined based on the precise proton distances of the four β-proton pairs, by quantitative measurements of the NOEs across the disulfide bonds. The feasibility of this method is demonstrated for bovine pancreatic trypsin inhibitor, which has three disulfide bonds.
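
    Once clean cross-disulfide NOEs are available, distances follow from the standard initial-rate r⁻⁶ calibration against a reference pair of known separation; a minimal sketch (intensities invented, and the reference choice is illustrative rather than taken from this paper):

```python
# Standard initial-rate NOE distance calibration: cross-peak intensities
# scale approximately as r^-6, so r = r_ref * (I_ref / I)^(1/6).
# All intensities below are invented.

def noe_distance(intensity, ref_intensity, ref_distance_A):
    return ref_distance_A * (ref_intensity / intensity) ** (1.0 / 6.0)

# In conventional samples, geminal beta-protons (~1.75 A apart, fixed
# geometry) are a common calibration reference; here we calibrate a
# hypothetical cross-disulfide beta-beta NOE against such a reference.
r = noe_distance(intensity=0.012, ref_intensity=1.0, ref_distance_A=1.75)
print(f"estimated H-H distance ~ {r:.2f} A")
```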

  20. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square displacements ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)

  1. Simultaneous measurement and quantitation of 4-hydroxyphenylacetic acid and dopamine with fast-scan cyclic voltammetry.

    Science.gov (United States)

    Shin, Mimi; Kaplan, Sam V; Raider, Kayla D; Johnson, Michael A

    2015-05-07

    Caged compounds have been used extensively to investigate neuronal function in a variety of preparations, including cell culture, ex vivo tissue samples, and in vivo. As a first step toward electrochemically measuring the extent of caged compound photoactivation while also measuring the release of the catecholamine neurotransmitter dopamine, fast-scan cyclic voltammetry at carbon-fiber microelectrodes (FSCV) was used to electrochemically characterize 4-hydroxyphenylacetic acid (4HPAA) in the absence and presence of dopamine. 4HPAA is a by-product formed during the photoactivation of p-hydroxyphenacyl-based caged compounds, such as p-hydroxyphenylglutamate (pHP-Glu). Our data suggest that the oxidation of 4HPAA occurs through the formation of a conjugated species. Moreover, we found that a triangular waveform of −0.4 V to +1.3 V to −0.4 V at 600 V s⁻¹, repeated every 100 ms, provided an enhanced 4HPAA oxidation current with a limit of detection of 100 nM, while also allowing the detection and quantitation of dopamine within the same scan. Along with quantifying 4HPAA in biological preparations, the results from this work will allow the electrochemical measurement of photoactivation reactions that generate 4HPAA as a by-product, as well as provide a framework for measuring the photorelease of electroactive by-products from caged compounds that incorporate other chromophores.
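
    The reported scan parameters fully determine the waveform timing; a sketch generating one cycle (the 1 MHz sampling rate is an assumption, not a value from the abstract):

```python
import numpy as np

# Generate the triangular FSCV waveform reported in the abstract:
# -0.4 V -> +1.3 V -> -0.4 V at 600 V/s, repeated every 100 ms.
v_lo, v_hi, scan_rate, period = -0.4, 1.3, 600.0, 0.100  # V, V, V/s, s
ramp_time = (v_hi - v_lo) / scan_rate                     # ~2.83 ms each way
fs = 1_000_000                                            # 1 MHz sampling (assumed)

up = np.linspace(v_lo, v_hi, int(ramp_time * fs), endpoint=False)
down = np.linspace(v_hi, v_lo, int(ramp_time * fs), endpoint=False)
hold = np.full(int((period - 2 * ramp_time) * fs), v_lo)  # rest at holding potential
one_cycle = np.concatenate([up, down, hold])

print(f"scan occupies {2 * ramp_time * 1e3:.2f} ms of each {period * 1e3:.0f} ms cycle")
print(f"samples per cycle: {one_cycle.size}")
```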

  2. Effect of attenuation by the cranium on quantitative SPECT measurements of cerebral blood flow and a correction method

    International Nuclear Information System (INIS)

    Iwase, Mikio; Kurono, Kenji; Iida, Akihiko.

    1998-01-01

    Attenuation correction for cerebral blood flow SPECT image reconstruction is usually performed by considering the head as a whole to be equivalent to water, and the effects of differences in attenuation between subjects produced by the cranium have not been taken into account. We determined the differences in attenuation between subjects and assessed a method of correcting quantitative cerebral blood flow values. Attenuation by the head on the right and left sides was measured before intravenous injection of ¹²³I-IMP, and water-equivalent diameters of both sides (Ta) were calculated from the measurements obtained. After acquiring SPECT images, attenuation correction was conducted according to the method of Sorenson, and images were reconstructed. The diameters of the right and left sides in the same position as Ta (Tt) were calculated from the contours determined by threshold values. Using Ts given by 2Ts = Ta − Tt, the correction factor λ = exp(μ₁Ts) was calculated and applied as a multiplicative correction when rCBF was determined. The results revealed significant differences between Tt and Ta. Although no gender differences were observed in Tt, they were seen in both Ta and Ts. Thus, interindividual differences in attenuation by the cranium were found to have an influence that cannot be ignored, and inter-subject correction is needed to obtain accurate quantitative values. (author)
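
    The correction reduces to two measured diameters and an attenuation coefficient; a sketch with illustrative values — μ₁ here is an assumed effective water attenuation coefficient for the 159 keV photons of ¹²³I, not a value quoted in the record:

```python
import math

# Cranium correction as described in the abstract:
# Ta = water-equivalent diameter from the transmission measurement,
# Tt = diameter from the SPECT contour, Ts = (Ta - Tt) / 2,
# and rCBF is multiplied by lambda = exp(mu1 * Ts). Values illustrative;
# mu1 is an assumed effective attenuation coefficient of water at 159 keV.

def cranium_correction(ta_cm, tt_cm, mu1_per_cm=0.146):
    ts = (ta_cm - tt_cm) / 2.0
    return math.exp(mu1_per_cm * ts)

rcbf_uncorrected = 45.0   # ml/100 g/min, say
lam = cranium_correction(ta_cm=16.8, tt_cm=15.6)
print(f"lambda = {lam:.3f}, corrected rCBF = {rcbf_uncorrected * lam:.1f} ml/100 g/min")
```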

  3. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    Science.gov (United States)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate

  4. Reliability and short-term intra-individual variability of telomere length measurement using monochrome multiplexing quantitative PCR.

    Directory of Open Access Journals (Sweden)

    Sangmi Kim

    Full Text Available Studies examining the association between telomere length and cancer risk have often relied on measurement of telomere length from a single blood draw using a real-time PCR technique. We examined the reliability of telomere length measurement using sequential samples collected over a 9-month period. Relative telomere length in peripheral blood was estimated using a single-tube monochrome multiplex quantitative PCR assay in blood DNA samples from 27 non-pregnant adult women (aged 35 to 74 years) collected across 7 visits over a 9-month period. A linear mixed model was used to estimate the components of variance for telomere length measurements attributed to variation among women and variation between time points within women. Mean telomere length measurement at any single visit was not significantly different from the average of the 7 visits. Plates had a significant systematic influence on telomere length measurements, although measurements between different plates were highly correlated. After controlling for plate effects, 64% of the remaining variance was estimated to be accounted for by variance due to subject. Variance explained by time of visit within a subject was minor, contributing 5% of the remaining variance. Our data demonstrate good short-term reliability of telomere length measurement using blood from a single draw. However, the existence of technical variability, particularly plate effects, reinforces the need for technical replicates and balancing of case and control samples across plates.

  5. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    Science.gov (United States)

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.

  6. Integrating a quantitative risk appraisal in a health impact assessment

    DEFF Research Database (Denmark)

    Adám, Balázs; Molnár, Agnes; Gulis, Gabriel

    2013-01-01

    BACKGROUND: Although the quantification of health outcomes in a health impact assessment (HIA) is scarce in practice, it is preferred by policymakers, as it assists various aspects of the decision-making process. This article provides an example of integrating a quantitative risk appraisal...... in an HIA performed for the recently adopted Hungarian anti-smoking policy which introduced a smoking ban in closed public places, workplaces and public transport vehicles, and is one of the most effective measures to decrease smoking-related ill health. METHODS: A comprehensive, prospective HIA...... to decrease the prevalence of active and passive smoking and result in a considerably positive effect on several diseases, among which lung cancer, chronic pulmonary diseases, coronary heart diseases and stroke have the greatest importance. The health gain calculated for the quantifiable health outcomes...

  7. Dominance in Domestic Dogs : A Quantitative Analysis of Its Behavioural Measures

    NARCIS (Netherlands)

    van der Borg, Joanne A M; Schilder, Matthijs B H; Vinke, Claudia M; de Vries, Han

    2015-01-01

    A dominance hierarchy is an important feature of the social organisation of group living animals. Although formal and/or agonistic dominance has been found in captive wolves and free-ranging dogs, applicability of the dominance concept in domestic dogs is highly debated, and quantitative data are

  8. Quantitative Phase Determination by Using a Michelson Interferometer

    Science.gov (United States)

    Pomarico, Juan A.; Molina, Pablo F.; D'Angelo, Cristian

    2007-01-01

    The Michelson interferometer is one of the best established tools for quantitative interferometric measurements. It has been, and is still successfully used, not only for scientific purposes, but it is also introduced in undergraduate courses for qualitative demonstrations as well as for quantitative determination of several properties such as…

  9. Quantitative method of measuring cancer cell urokinase and metastatic potential

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    1993-01-01

    The metastatic potential of tumors can be evaluated by the quantitative detection of urokinase and DNA. The cell sample selected for examination is analyzed for the presence of high levels of urokinase and abnormal DNA using analytical flow cytometry and digital image analysis. Other factors such as membrane associated urokinase, increased DNA synthesis rates and certain receptors can be used in the method for detection of potentially invasive tumors.

  10. Quantitative analytical hierarchy process to marketing store location selection

    Directory of Open Access Journals (Sweden)

    Harwati

    2018-01-01

    Full Text Available The selection of a store location for marketing a product is a multi-criteria decision-making problem: the criteria involved conflict with one another, and an optimal location must balance them. This research uses four criteria, consistent with the literature, to select a new marketing store location: distance to the location, level of competition, number of potential customers, and rental cost. Quantitative data are used with the AHP method to determine the optimum location; quantitative inputs are preferred because they avoid the inconsistency that can arise when relying on expert opinion. The AHP analysis yields the optimum location among the three alternative sites.
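
    The record does not give the pairwise judgments behind the AHP result, so the sketch below is a generic AHP computation with hypothetical numbers: a reciprocal pairwise comparison matrix over the four criteria named above, criterion weights from the principal eigenvector, Saaty's consistency ratio, and weighted scores for three candidate locations.

```python
import numpy as np

# Hypothetical pairwise comparison matrix over the four criteria in the
# abstract (distance, competition, potential customers, rent cost); the
# actual judgments are not given in the record, so these are illustrative.
A = np.array([
    [1,   3,   1/2, 2],
    [1/3, 1,   1/4, 1],
    [2,   4,   1,   3],
    [1/2, 1,   1/3, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # criterion weights, summing to 1

# Saaty consistency check: CI = (lambda_max - n)/(n - 1), RI = 0.90 for n = 4
n = A.shape[0]
CR = ((eigvals[k].real - n) / (n - 1)) / 0.90
print("weights:", w.round(3), "| consistency ratio:", round(CR, 3))  # CR < 0.1 is acceptable

# Normalized per-criterion scores for three candidate locations (rows);
# again purely illustrative numbers.
scores = np.array([
    [0.5, 0.3, 0.4, 0.2],
    [0.3, 0.4, 0.4, 0.5],
    [0.2, 0.3, 0.2, 0.3],
])
print("overall scores:", (scores @ w).round(3))  # highest = preferred site
```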

  11. Interrelationships Among Several Variables Reflecting Quantitative Thinking in Elementary School Children with Particular Emphasis upon Those Measures Involving Metric and Decimal Skills

    Science.gov (United States)

    Selman, Delon; And Others

    1976-01-01

    The relationships among measures of quantitative thinking in first through fifth grade children assigned either to an experimental math program emphasizing tactile, manipulative, or individual activity in learning metric and decimal concepts, or to a control group, were examined. Tables are presented and conclusions discussed. (Author/JKS)

  12. Presentation of a Software Method for Use of Risk Assessment in Building Fire Safety Measure Optimization

    Directory of Open Access Journals (Sweden)

    A. R. Koohpaei

    2012-05-01

    Full Text Available Background and aims: The property loss and physical injuries caused by building fires demonstrate the need for efficient, performance-based fire safety measures. Protection is effective and efficient when the design and selection of protective measures are based on risk assessment. This study presents a software method that makes it possible to select and design building fire safety measures based on quantitative risk assessment and building characteristics. Methods: A program was written in MATLAB based on the Fire Risk Assessment Method for Engineers (FRAME). The first section of the program, following the FRAME method, calculates the potential risk and the acceptable risk level from a building's specifications. The second section uses the potential risk, the acceptable risk level and the fire risk level the user wants to calculate the required protection factor for that building. Results: The prepared software makes it possible to choose fire safety measures based on a quantitative risk level and the full specification of a building. All calculations were performed to a precision of 0.001, and the accuracy of the software was checked against hand calculations. If an error occurs in a calculation while the software is in use, it can be identified in the output. Conclusion: Quantitative risk assessment is a suitable tool for increasing the efficiency of the design and execution of fire protection measures in buildings. With this software, the selected fire safety measures are more efficient and appropriate, since selection is based on risk assessment and the particular specification of the building. Moreover, fire risk in the building can be managed easily and carefully.
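
    FRAME is commonly summarized by the relation R = P / (A · D) between potential risk P, acceptance level A and protection level D. The sketch below, a simplification with illustrative inputs rather than the paper's MATLAB program, inverts this relation to obtain the protection factor required for a target risk, which is the kind of calculation the second section of the software performs.

```python
# Minimal sketch of the core FRAME relation as commonly stated, R = P / (A * D):
# potential risk P, acceptance level A, protection level D. The values below
# are illustrative only, not taken from the paper.

def required_protection(P: float, A: float, R_target: float) -> float:
    """Protection level D needed so that R = P / (A * D) <= R_target."""
    return P / (A * R_target)

P, A = 1.6, 1.2          # hypothetical potential risk and acceptance level
for R_target in (1.0, 0.5):
    D = required_protection(P, A, R_target)
    print(f"target risk {R_target}: required protection factor D = {D:.2f}")
```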

  13. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    Science.gov (United States)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measuring exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background nanoparticles.

  14. From inverse problems in mathematical physiology to quantitative differential diagnoses.

    Directory of Open Access Journals (Sweden)

    Sven Zenker

    2007-11-01

    Full Text Available The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of

  15. From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses

    Science.gov (United States)

    Zenker, Sven; Rubin, Jonathan; Clermont, Gilles

    2007-01-01

    The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses.
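
    A minimal sketch of the inference idea, not the authors' cardiovascular model: a toy observation model in which mean pressure is the product of an unknown output and an unknown resistance, combined with a bimodal population prior and sampled with a random-walk Metropolis algorithm. A single pressure reading leaves two clusters of parameter values compatible with the data, the two "differential diagnoses" of the multimodal posterior described above; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "hypotension" model: pressure = output * resistance, both unknown.
# One pressure reading cannot separate "low output" from "low resistance",
# so the posterior over (q, r) has two modes when the population prior is
# itself a two-component mixture (two patient clusters).
def log_prior(theta):
    q, r = theta
    if not (0 < q < 10 and 0 < r < 10):
        return -np.inf
    lp1 = -0.5 * (((q - 6) / 1.0) ** 2 + ((r - 1.5) / 0.5) ** 2)  # cluster 1
    lp2 = -0.5 * (((q - 2) / 0.7) ** 2 + ((r - 4.5) / 1.0) ** 2)  # cluster 2
    return np.logaddexp(lp1, lp2)         # mixture log-density up to a constant

def log_lik(theta, p_obs, sigma=0.5):
    q, r = theta
    return -0.5 * ((p_obs - q * r) / sigma) ** 2

def metropolis(p_obs, n=20000, step=0.3):
    theta = np.array([4.0, 2.0])
    lp = log_prior(theta) + log_lik(theta, p_obs)
    out = []
    for _ in range(n):
        prop = theta + step * rng.standard_normal(2)
        lp_prop = log_prior(prop) + log_lik(prop, p_obs)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        out.append(theta.copy())
    return np.array(out)

samples = metropolis(p_obs=9.0)  # both clusters are compatible with p ~ 9
print("posterior mean:", samples.mean(axis=0))   # averages over two diagnoses
```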

  16. Combining disparate data for decision making

    Science.gov (United States)

    Gettings, M. E.

    2010-12-01

    Combining information of disparate types from multiple data or model sources is a fundamental task in decision making theory. Procedures for combining and utilizing quantitative data with uncertainties are well-developed in several approaches, but methods for including qualitative and semi-quantitative data are much less so. Possibility theory offers an approach to treating all three data types in an objective and repeatable way. In decision making, biases are frequently present in several forms, including those arising from data quality, data spatial and temporal distribution, and the analyst's knowledge and beliefs as to which data or models are most important. The latter bias is particularly evident in the case of qualitative data and there are numerous examples of analysts feeling that a qualitative dataset is more relevant than a quantified one. Possibility theory and fuzzy logic now provide fairly general rules for quantifying qualitative and semi-quantitative data in ways that are repeatable and minimally biased. Once a set of quantified data and/or model layers is obtained, there are several methods of combining them to obtain insight useful in decision making. These include: various combinations of layers using formal fuzzy logic (for example, layer A and (layer B or layer C) but not layer D); connecting the layers with varying influence links in a Fuzzy Cognitive Map; and using the set of layers for the universe of discourse for agent based model simulations. One example of logical combinations that have proven useful is the definition of possible habitat for valley fever fungus (Coccidioides sp.) using variables such as soil type, altitude, aspect, moisture and temperature. A second example is the delineation of the lithology and possible mineralization of several areas beneath basin fill in southern Arizona. A Fuzzy Cognitive Map example is the impacts of development and operation of a hypothetical mine in an area adjacent to a city. In this model
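
    The quoted combination maps directly onto the standard fuzzy-logic operators (and = minimum, or = maximum, not = complement). A minimal sketch with made-up membership grids, not the study's actual habitat layers:

```python
import numpy as np

# The abstract's example combination, "layer A and (layer B or layer C) but
# not layer D", on layers of membership grades in [0, 1] over a common grid.
# Values are illustrative only.
A = np.array([[0.9, 0.2], [0.6, 0.8]])
B = np.array([[0.3, 0.7], [0.5, 0.1]])
C = np.array([[0.8, 0.4], [0.2, 0.9]])
D = np.array([[0.1, 0.9], [0.4, 0.3]])

possible = np.minimum(np.minimum(A, np.maximum(B, C)), 1.0 - D)
print(possible)   # membership of each cell in, say, "possible habitat"
```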

  17. Making College Count: An Examination of Quantitative Reasoning Activities in Higher Education

    Directory of Open Access Journals (Sweden)

    Louis M. Rocconi

    2013-07-01

    Full Text Available Findings from national studies along with more frequent calls from those who employ college graduates suggest an urgent need for colleges and universities to increase opportunities for students to develop quantitative reasoning (QR skills. To address this issue, the current study examines the relationship between the frequency of QR activities during college and student and institutional characteristics, as well as whether students at institutions with an emphasis on QR (at least one QR course requirement for all students report more QR activity. Results show that gender, race-ethnicity, major, full-time status, first-generation status, age, institutional enrollment size, and institutional control are related to the frequency of QR activities. Findings also suggest that such activities are indeed more common among institutions that emphasize QR.

  18. Quantitative Nuclear Medicine Imaging: Concepts, Requirements and Methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-15

    The absolute quantification of radionuclide distribution has been a goal since the early days of nuclear medicine. Nevertheless, the apparent complexity and sometimes limited accuracy of these methods have prevented them from being widely used in important applications such as targeted radionuclide therapy or kinetic analysis. The intricacy of the effects degrading nuclear medicine images and the lack of availability of adequate methods to compensate for these effects have frequently been seen as insurmountable obstacles in the use of quantitative nuclear medicine in clinical institutions. In the last few decades, several research groups have consistently devoted their efforts to the filling of these gaps. As a result, many efficient methods are now available that make quantification a clinical reality, provided appropriate compensation tools are used. Despite these efforts, many clinical institutions still lack the knowledge and tools to adequately measure and estimate the accumulated activities in the human body, thereby using potentially outdated protocols and procedures. The purpose of the present publication is to review the current state of the art of image quantification and to provide medical physicists and other related professionals facing quantification tasks with a solid background of tools and methods. It describes and analyses the physical effects that degrade image quality and affect the accuracy of quantification, and describes methods to compensate for them in planar, single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. The fast paced development of the computational infrastructure, both hardware and software, has made drastic changes in the ways image quantification is now performed. The measuring equipment has evolved from the simple blind probes to planar and three dimensional imaging, supported by SPECT, PET and hybrid equipment. Methods of iterative reconstruction have been developed to allow for

  19. What to measure next to improve decision making? On top-down task driven feature saliency

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Karadogan, Seliz; Marchegiani, Letizia

    2011-01-01

    Top-down attention is modeled as decision making based on incomplete information. We consider decisions made in a sequential measurement situation where initially only an incomplete input feature vector is available, however, where we are given the possibility to acquire additional input values...... among the missing features. The procedure thus poses the question: what to do next? We take an information-theoretical approach, implemented for generality in a generative mixture model. The framework allows us to reduce the decision about what to measure next in a classification problem to the estimation......
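
    A minimal sketch of the "what to measure next?" rule under an information-theoretic criterion: choose the unmeasured feature with the greatest expected reduction in class entropy (its mutual information with the class, given the evidence so far). A tiny discrete naive Bayes model stands in for the paper's generative mixture model; all probabilities are invented.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Current belief p(class), and p(feature value | class) for two binary features.
p_c = np.array([0.5, 0.5])
p_f_given_c = {
    "f1": np.array([[0.9, 0.1], [0.2, 0.8]]),   # rows: class, cols: value
    "f2": np.array([[0.6, 0.4], [0.5, 0.5]]),   # nearly uninformative
}

def expected_info_gain(feature, p_c):
    like = p_f_given_c[feature]                 # p(v | c)
    p_v = p_c @ like                            # predictive p(v)
    gain = entropy(p_c)
    for v in range(like.shape[1]):
        post = p_c * like[:, v] / p_v[v]        # Bayes update if value v is seen
        gain -= p_v[v] * entropy(post)          # expected posterior entropy
    return gain

for f in p_f_given_c:
    print(f, round(expected_info_gain(f, p_c), 3))   # measure the larger one next
```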

  20. A quantitative risk-based model for reasoning over critical system properties

    Science.gov (United States)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time.

  1. Quantitative measurement of zinc secretion from pancreatic islets with high temporal resolution using droplet-based microfluidics.

    Science.gov (United States)

    Easley, Christopher J; Rocheleau, Jonathan V; Head, W Steven; Piston, David W

    2009-11-01

    We assayed glucose-stimulated insulin secretion (GSIS) from live, murine islets of Langerhans in microfluidic devices by the downstream formation of aqueous droplets. Zinc ions, which are cosecreted with insulin from beta-cells, were quantitatively measured from single islets with high temporal resolution using a fluorescent indicator, FluoZin-3. Real-time storage of secretions into droplets (volume of 0.470 ± 0.009 nL) effectively preserves the temporal chemical information, allowing reconstruction of the secretory time record. The use of passive flow control within the device removes the need for syringe pumps, requiring only a single hand-held syringe. Under stimulatory glucose levels (11 mM), bursts of zinc as high as approximately 800 fg islet⁻¹ min⁻¹ were measured. Treatment with diazoxide effectively blocked zinc secretion, as expected. High temporal resolution reveals two major classes of oscillations in secreted zinc, with predominant periods at approximately 20-40 s and approximately 5-10 min. The more rapid oscillation periods match closely with those of intraislet calcium oscillations, while the slower oscillations are consistent with insulin pulses typically measured in bulk islet experiments or in the bloodstream. This droplet sampling technique should be widely applicable to time-resolved cellular secretion measurements, either in real-time or for postprocessing.

  2. A measurement of the LPM effect

    Energy Technology Data Exchange (ETDEWEB)

    Klein, S.R. [California Univ., Santa Cruz, CA (United States). Inst. for Particle Physics; Anthony, P. [Lawrence Livermore National Lab., CA (United States)]|[Stanford Linear Accelerator Center, Menlo Park, CA (United States); Becker-Szendy, R. [Stanford Linear Accelerator Center, Menlo Park, CA (United States)] [and others

    1993-11-01

    We have performed an experiment to measure accurately the Landau-Pomeranchuk-Migdal (LPM) effect in the production of 5 to 500 MeV photons due to bremsstrahlung of 8 and 25 GeV electron beams traversing thin (2 to 6% X₀) targets of varying densities. Our measurements confirm that the LPM effect exists and that the Migdal calculations are accurate. We see that, for thin targets, LPM suppression disappears leaving a Bethe-Heitler spectrum, as predicted by theory. For intermediate target thicknesses, we lack an acceptable theory, but have measured energy spectra for targets of differing thickness. We have also measured the production rate of 500 keV to 5 MeV photons at the same electron energies, to study dielectric suppression. We see qualitative agreement with the theory of Ter-Mikaelian; more work is needed before we can make quantitative comparisons.

  3. Citizen surveillance for environmental monitoring: combining the efforts of citizen science and crowdsourcing in a quantitative data framework.

    Science.gov (United States)

    Welvaert, Marijke; Caley, Peter

    2016-01-01

    Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They could be gathered under the overarching term "citizen surveillance". The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. When considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the degree of control over observer detection effort. By classifying the observation process in these dimensions we distinguish between crowdsourcing, unstructured citizen science and structured citizen science. This classification helps determine the data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, and there are well-developed statistical methods for the resulting data. In contrast, methods for making useful inference from purely crowdsourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward for solving the statistical challenges inherent to citizen-sourced data.

  4. Deconvolution map-making for cosmic microwave background observations

    International Nuclear Information System (INIS)

    Armitage, Charmaine; Wandelt, Benjamin D.

    2004-01-01

    We describe a new map-making code for cosmic microwave background observations. It implements fast algorithms for convolution and transpose convolution of two functions on the sphere [B. Wandelt and K. Gorski, Phys. Rev. D 63, 123002 (2001)]. Our code can account for arbitrary beam asymmetries and can be applied to any scanning strategy. We demonstrate the method using simulated time-ordered data for three beam models and two scanning patterns, including a coarsened version of the WMAP strategy. We quantitatively compare our results with a standard map-making method and demonstrate that the true sky is recovered with high accuracy using deconvolution map-making.
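
    For contrast with the deconvolution approach, a minimal sketch of the simplest "standard" map-maker such codes are benchmarked against: with a pointing matrix A that assigns each time sample to one pixel and white noise, the maximum-likelihood map is m = (AᵀA)⁻¹Aᵀd, i.e. an average of the samples hitting each pixel. Beam asymmetries, which deconvolution map-making models explicitly, are ignored here; the data are synthetic.

```python
import numpy as np

# Binned map-making under white noise: the pointing matrix A has one 1 per
# time sample marking the observed pixel, so (A^T A) is the diagonal hit-count
# matrix and m = (A^T A)^{-1} A^T d is a per-pixel average of the samples.
rng = np.random.default_rng(4)
npix, nsamp = 12, 5000
sky = rng.normal(0, 100e-6, npix)            # true pixel temperatures, K
pix = rng.integers(0, npix, nsamp)           # scanning strategy: pixel hit list
d = sky[pix] + rng.normal(0, 300e-6, nsamp)  # time-ordered data with noise

hits = np.bincount(pix, minlength=npix)                        # A^T A (diagonal)
binned = np.bincount(pix, weights=d, minlength=npix) / hits    # A^T d / hits
print("rms map error:", np.sqrt(((binned - sky) ** 2).mean()))
```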

  5. Quantitative measurement of brightness from living cells in the presence of photodepletion.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness of fluorescently labeled proteins provides an excellent marker for identifying protein interactions in living cells. Quantitative interpretation of brightness, however, hinges on a detailed understanding of the processes that affect the signal fluctuation of the fluorescent label. Here, we focus on the cumulative influence of photobleaching on brightness measurements in cells. Photobleaching within the finite volume of the cell leads to a depletion of the population of fluorescently labeled proteins with time. The process of photodepletion reduces the fluorescence signal which biases the analysis of brightness data. Our data show that even small reductions in the signal can introduce significant bias into the analysis of the data. We develop a model that quantifies the bias and introduce an analysis method that accurately determines brightness in the presence of photodepletion as verified by experiments with mammalian and yeast cells. In addition, photodepletion experiments with the fluorescent protein EGFP reveal the presence of a photoconversion process, which leads to a marked decrease in the brightness of the EGFP protein. We also identify conditions where the effect of EGFP's photoconversion on brightness experiments can be safely ignored.

  6. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    The author deduces the equation relating coupler frequency deviation Δf and coupling coefficient β, rather than only giving the adjustment direction in the process of matching a coupler, on the basis of a coupling-cavity-chain equivalent-circuit model. Based on this equation, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems.

  7. Quantitative outcome measures for systemic sclerosis-related Microangiopathy - Reliability of image acquisition in Nailfold Capillaroscopy.

    Science.gov (United States)

    Dinsdale, Graham; Moore, Tonia; O'Leary, Neil; Berks, Michael; Roberts, Christopher; Manning, Joanne; Allen, John; Anderson, Marina; Cutolo, Maurizio; Hesselstrand, Roger; Howell, Kevin; Pizzorni, Carmen; Smith, Vanessa; Sulli, Alberto; Wildt, Marie; Taylor, Christopher; Murray, Andrea; Herrick, Ariane L

    2017-09-01

    Nailfold capillaroscopic parameters hold increasing promise as outcome measures for clinical trials in systemic sclerosis (SSc). Their inclusion as outcomes would often naturally require capillaroscopy images to be captured at several time points during any one study. Our objective was to assess the repeatability of image acquisition (which has been little studied), as well as of measurement. 41 patients (26 with SSc, 15 with primary Raynaud's phenomenon) and 10 healthy controls returned for repeat high-magnification (300×) videocapillaroscopy mosaic imaging of 10 digits one week after initial imaging (as part of a larger study of reliability). Images were assessed in a random order by an expert blinded observer and 4 outcome measures were extracted: (1) overall image grade; then (where possible) distal vessel locations were marked, allowing calculation of (2) vessel density across the whole nailfold, (3) apex width and (4) giant vessel count. Intra-rater intra-visit and intra-rater inter-visit (baseline vs. 1 week) reliability were examined in 475 and 392 images, respectively. A linear mixed-effects model was used to estimate variance components, from which intra-class correlation coefficients (ICCs) were determined. Intra-visit and inter-visit reliability estimates (ICCs) were, respectively: overall image grade, 0.97 and 0.90; vessel density, 0.92 and 0.65; mean vessel width, 0.91 and 0.79; presence of giant capillary, 0.68 and 0.56. These estimates were conditional on each parameter being measurable. Within-operator image analysis and acquisition are reproducible. Quantitative nailfold capillaroscopy, at least with a single observer, provides reliable outcome measures for clinical studies including randomised controlled trials. Copyright © 2017 Elsevier Inc. All rights reserved.
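
    A simplified stand-in for the reliability computation: a one-way random-effects ICC(1,1) from ANOVA mean squares, rather than the paper's full linear mixed-effects variance-component model. Rows are images, columns are the two assessments; the data are synthetic.

```python
import numpy as np

# One-way random-effects ICC(1,1): (MSB - MSW) / (MSB + (k - 1) * MSW),
# where MSB/MSW are the between- and within-subject ANOVA mean squares.
def icc_1_1(x):
    n, k = x.shape
    grand = x.mean()
    msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)               # between
    msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))  # within
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(1)
true = rng.normal(50, 10, size=200)                  # true vessel density per image
x = true[:, None] + rng.normal(0, 3, size=(200, 2))  # two visits, fairly repeatable
print("ICC:", round(icc_1_1(x), 2))                  # near 0.9, like image grade above
```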

  8. Statistical mechanics and the evolution of polygenic quantitative traits

    NARCIS (Netherlands)

    Barton, N.H.; De Vladar, H.P.

    The evolution of quantitative characters depends on the frequencies of the alleles involved, yet these frequencies cannot usually be measured. Previous groups have proposed an approximation to the dynamics of quantitative traits, based on an analogy with statistical mechanics. We present a modified

  9. Quantitative Reasoning and the Sine Function: The Case of Zac

    Science.gov (United States)

    Moore, Kevin C.

    2014-01-01

    A growing body of literature has identified quantitative and covariational reasoning as critical for secondary and undergraduate student learning, particularly for topics that require students to make sense of relationships between quantities. The present study extends this body of literature by characterizing an undergraduate precalculus…

  10. Test–retest repeatability of quantitative cardiac 11C-meta-hydroxyephedrine measurements in rats by small animal positron emission tomography

    International Nuclear Information System (INIS)

    Thackeray, James T.; Renaud, Jennifer M.; Kordos, Myra; Klein, Ran; Kemp, Robert A. de; Beanlands, Rob S.B.; DaSilva, Jean N.

    2013-01-01

    Introduction: The norepinephrine analogue 11C-meta-hydroxyephedrine (HED) has been used to interrogate sympathetic neuronal reuptake in cardiovascular disease. Application to longitudinal studies in small animal models of disease necessitates an understanding of test–retest variability. This study evaluated the repeatability of multiple quantitative cardiac measurements of HED retention and washout, and the pharmacological response to reuptake blockade and enhanced norepinephrine levels. Methods: Small animal PET images were acquired over 60 min following HED administration to healthy male Sprague Dawley rats. Paired test and retest scans were undertaken in individual animals over 7 days. Additional HED scans were conducted following administration of the norepinephrine reuptake inhibitor desipramine or continuous infusion of exogenous norepinephrine. HED retention was quantified by retention index, standardized uptake value (SUV), monoexponential and one-compartment washout. Plasma and cardiac norepinephrine were measured by high performance liquid chromatography. Results: Test–retest variability was lower for retention index (15% ± 12%) and SUV (19% ± 15%) than for monoexponential washout rates (21% ± 13%). Desipramine pretreatment reduced myocardial HED retention index by 69% and SUV by 85%. Chase treatment with desipramine increased monoexponential HED washout by 197% compared to untreated controls. Norepinephrine infusion dose-dependently reduced HED accumulation, reflected by both retention index and SUV, with a corresponding increase in monoexponential washout. Plasma and cardiac norepinephrine levels correlated with the HED quantitative measurements. Conclusion: The repeatability of HED retention index, SUV, and monoexponential washout supports its suitability for longitudinal PET studies in rats. Uptake and washout of HED are sensitive to acute increases in norepinephrine concentration.

  11. A heteroscedastic measurement error model for method comparison data with replicate measurements.

    Science.gov (United States)

    Nawarathna, Lakshika S; Choudhary, Pankaj K

    2015-03-30

    Measurement error models offer a flexible framework for modeling data collected in studies comparing methods of quantitative measurement. These models generally make two simplifying assumptions: (i) the measurements are homoscedastic, and (ii) the unobservable true values of the methods are linearly related. One or both of these assumptions may be violated in practice. In particular, error variabilities of the methods may depend on the magnitude of measurement, or the true values may be nonlinearly related. Data with these features call for a heteroscedastic measurement error model that allows nonlinear relationships in the true values. We present such a model for the case when the measurements are replicated, discuss its fitting, and explain how to evaluate similarity of measurement methods and agreement between them, which are two common goals of data analysis, under this model. Model fitting involves dealing with lack of a closed form for the likelihood function. We consider estimation methods that approximate either the likelihood or the model to yield approximate maximum likelihood estimates. The fitting methods are evaluated in a simulation study. The proposed methodology is used to analyze a cholesterol dataset. Copyright © 2015 John Wiley & Sons, Ltd.

  12. Radiation applications in art and archaeometry X-ray fluorescence applications to archaeometry. Possibility of obtaining non-destructive quantitative analyses

    International Nuclear Information System (INIS)

    Milazzo, Mario

    2004-01-01

    The possibility of obtaining quantitative XRF analysis in archaeometric applications is considered for the following cases: - Examination of metallic objects with irregular surfaces: coins, for instance. - Metallic objects with a natural or artificial patina on the surface. - Glass or ceramic samples, for which the problems for quantitative analysis arise from the non-detectability of low-Z matrix elements. The fundamental parameter method for quantitative XRF analysis is based on a numerical procedure involving the relative values of XRF line intensities. As a consequence, it can also be applied to the experimental XRF spectra obtained from metallic objects, provided the correction for the irregular shape consists only of a constant factor that does not affect the relative XRF intensities; this is in fact possible under not-very-restrictive conditions for the experimental setup. The fineness of coins with a superficial patina can be evaluated by measuring the Rayleigh-to-Compton scattering intensity ratio at an incident energy higher than that of the characteristic X-rays. For glasses and ceramics, measurement of the Compton-scattered intensity of the exciting radiation, together with a proper scaling law, makes it possible to evaluate the matrix absorption coefficients at all characteristic X-ray line energies.

  13. Extracting quantitative measures from EAP: a small clinical study using BFOR.

    Science.gov (United States)

    Hosseinbor, A Pasha; Chung, Moo K; Wu, Yu-Chien; Fleming, John O; Field, Aaron S; Alexander, Andrew L

    2012-01-01

    The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents, and hence providing rich information about complex tissue microstructure properties. Bessel Fourier orientation reconstruction (BFOR) is one of several analytical, non-Cartesian EAP reconstruction schemes employing multiple shell acquisitions that have recently been proposed. Such modeling bases have not yet been fully exploited in the extraction of rotationally invariant q-space indices that describe the degree of diffusion anisotropy/restrictivity. Such quantitative measures include the zero-displacement probability (P(o)), mean squared displacement (MSD), q-space inverse variance (QIV), and generalized fractional anisotropy (GFA), and all are simply scalar features of the EAP. In this study, a general relationship between MSD and q-space diffusion signal is derived and an EAP-based definition of GFA is introduced. A significant part of the paper is dedicated to utilizing BFOR in a clinical dataset, comprised of 5 multiple sclerosis (MS) patients and 4 healthy controls, to estimate P(o), MSD, QIV, and GFA of corpus callosum, and specifically, to see if such indices can detect changes between normal appearing white matter (NAWM) and healthy white matter (WM). Although the sample size is small, this study is a proof of concept that can be extended to larger sample sizes in the future.
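
    Of the scalar indices above, GFA has the simplest form: the standard deviation of an orientation function divided by its root mean square. A minimal sketch on synthetic samples (the paper computes its indices from the BFOR-reconstructed EAP, and normalization conventions vary slightly between authors):

```python
import numpy as np

# Generalized fractional anisotropy as a scalar feature of an orientation
# function psi sampled on a set of directions: GFA = std(psi) / rms(psi).
# The profiles below are synthetic stand-ins for an ODF/EAP profile.
def gfa(psi):
    psi = np.asarray(psi, dtype=float)
    rms = np.sqrt((psi ** 2).mean())
    return psi.std() / rms if rms > 0 else 0.0

isotropic = np.ones(64)          # equal response in every direction
anisotropic = np.ones(64)
anisotropic[:8] = 6.0            # strongly peaked profile
print(round(gfa(isotropic), 2), round(gfa(anisotropic), 2))  # 0.0 vs clearly > 0
```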

  14. Measurement, monitoring, and verification: make it work!

    Science.gov (United States)

    Coeli M. Hoover

    2011-01-01

    The capacity of forests to absorb and store carbon is certainly, as the authors note, an important tool in the greenhouse gas mitigation toolbox. Our understanding of what elements can make forest carbon offset projects successful has grown a great deal over time, as the global community has come to understand that forest degradation and conversion are the result of a...

  15. Innovative Performance Measurement: an Integrative Perspective of Stakeholder's View

    Directory of Open Access Journals (Sweden)

    López-Fresno Palmira

    2014-11-01

    Full Text Available Business Process Management (BPM has been increasingly focused as an holistic approach to manage organizations for better organizational effectiveness. BPM involves the use of innovative performance measurement systems to follow up, coordinate, control and improve processes and overall business efficacy and efficiency. In this paper we propose a global holistic perspective of integrated information, combining the view of all stakeholders and both qualitative and quantitative information, as a basic prerequisite for quality of information for better decision making. The paper includes findings from an empirical case study of measuring Parkinson's Disease Neurosurgery process, including stakeholder's view with an integrative perspective.

  16. Quantitative measurement of post-irradiation neck fibrosis based on the young modulus: description of a new method and clinical results.

    Science.gov (United States)

    Leung, Sing-Fai; Zheng, Yongping; Choi, Charles Y K; Mak, Suzanne S S; Chiu, Samuel K W; Zee, Benny; Mak, Arthur F T

    2002-08-01

    Postirradiation fibrosis is one of the most common late effects of radiation therapy for patients with head and neck carcinoma. An objective and quantitative method for its measurement is much desired, but the criteria currently used to score fibrosis are mostly semiquantitative and partially subjective. The Young Modulus (YM) is a physical parameter that characterizes the deformability of a material under stress. The authors measured the YM in soft tissues of the neck, at defined reference points, using an ultrasound probe and a computer algorithm that quantified the indentation (deformation) of tissue under a measured applied force. One hundred five patients who had received previous radiation therapy to the entire neck were assessed, the results were compared with hand palpation scores and with a functional parameter, the range of neck rotation, and all results were correlated with symptoms. The YM was obtained successfully in all patients examined. It had a significant positive correlation with the palpation score and a significant negative correlation with the range of neck rotation. The YM was significantly higher on the side of the neck that received a boost dose of radiation, although the corresponding palpation scores were similar. The results of all three measurement methods were correlated with symptoms. Postirradiation neck fibrosis can be measured in absolute units based on the YM. The results showed a significant correlation with hand palpation scores, with restriction of neck rotation, and with symptoms. Compared with the palpation method, the YM is more quantitative and objective, focuses on small subregions, and better discriminates regions subject to differential radiation dose levels. Its inclusion in the Analytic category of the Late Effects of Normal Tissues-SOMA system should be considered to facilitate comparative studies. Copyright 2002 American Cancer Society.
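
    A hedged sketch of the underlying stress-deformation idea using the classical rigid flat-punch relation for indentation of an elastic half-space, E = F(1 − ν²)/(2aw); the study's ultrasound probe and algorithm are more elaborate, and the radius, Poisson ratio and force-indentation data below are invented.

```python
import numpy as np

# Rigid flat circular punch on an elastic half-space: w = F (1 - nu^2) / (2 a E),
# so E follows from the slope dF/dw of a linear force-indentation fit.
# All numbers are illustrative, not from the paper.
a = 0.0045                                  # indenter radius, m (assumed)
nu = 0.45                                   # Poisson ratio typical of soft tissue
F = np.array([0.5, 1.0, 1.5, 2.0])          # applied force, N
w = np.array([0.9, 1.8, 2.6, 3.5]) * 1e-3   # indentation depth, m

stiffness = np.polyfit(w, F, 1)[0]          # slope dF/dw, N/m
E = stiffness * (1 - nu ** 2) / (2 * a)
print(f"Young modulus ~ {E / 1000:.1f} kPa")  # stiffer (fibrotic) tissue -> larger E
```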

  17. Prediction of trabecular bone qualitative properties using scanning quantitative ultrasound

    Science.gov (United States)

    Qin, Yi-Xian; Lin, Wei; Mittra, Erik; Xia, Yi; Cheng, Jiqi; Judex, Stefan; Rubin, Clint; Müller, Ralph

    2013-11-01

    Microgravity-induced bone loss represents a critical health problem for astronauts. It occurs particularly in the weight-bearing skeleton and leads to osteopenia and an increased risk of fracture. The lack of a suitable evaluation modality makes it difficult to monitor skeletal status on long-term space missions and increases the potential risk of complications. Such disuse osteopenia and osteoporosis compromise trabecular bone density as well as architectural and mechanical properties. While X-ray based imaging would not be practical in space, quantitative ultrasound may provide advantages for characterizing bone density and strength through wave propagation in the complex trabecular structure. This study used a scanning confocal acoustic diagnostic and navigation system (SCAN) to evaluate trabecular bone quality in 60 cubic trabecular samples harvested from adult sheep. Ultrasound-image-based SCAN measurements of structural and strength properties were validated against μCT and compressive mechanical testing. The results indicated a moderately strong negative correlation between broadband ultrasonic attenuation (BUA) and μCT-determined bone volume fraction (BV/TV, R²=0.53). Strong correlations were observed between ultrasound velocity (UV) and the bone's mechanical strength and structural parameters, i.e., bulk Young's modulus (R²=0.67) and BV/TV (R²=0.85). The predictions of bone density and mechanical strength were significantly improved by using a linear combination of both BUA and UV, yielding R²=0.92 for BV/TV and R²=0.71 for bulk Young's modulus. These results imply that quantitative ultrasound can characterize trabecular structural and mechanical properties through measurements of particular ultrasound parameters, and can potentially provide an excellent estimate of bone's structural integrity.
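
    The improved prediction quoted above is an ordinary least-squares fit of BV/TV on a linear combination of BUA and UV. A sketch on synthetic data that mimics the reported signs (BUA negatively, UV positively related to BV/TV); the coefficients and R² here depend entirely on the made-up numbers.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 60
bvtv = rng.uniform(0.1, 0.4, n)                  # bone volume fraction (synthetic)
bua = 80 - 120 * bvtv + rng.normal(0, 4, n)      # dB/MHz, negative trend (assumed)
uv = 1500 + 900 * bvtv + rng.normal(0, 25, n)    # m/s, positive trend (assumed)

X = np.column_stack([np.ones(n), bua, uv])
beta, *_ = np.linalg.lstsq(X, bvtv, rcond=None)  # ordinary least squares
resid = bvtv - X @ beta
r2 = 1 - (resid ** 2).sum() / ((bvtv - bvtv.mean()) ** 2).sum()
print("coefficients:", beta.round(4), "| R^2:", round(r2, 2))
```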

  18. Fast automatic quantitative cell replication with fluorescent live cell imaging

    Directory of Open Access Journals (Sweden)

    Wang Ching-Wei

    2012-01-01

    Full Text Available Abstract Background: Live cell imaging is a useful tool for monitoring cellular activities in living systems. In cancer research and other experimental work it is often necessary to quantify the dividing capability of cells, or the level of cell proliferation, when investigating manipulations of the cells or their environment. Manual quantification of fluorescence microscopy images is difficult because humans are neither sensitive to fine differences in color intensity nor effective at counting cells and averaging fluorescence levels among them. However, auto-quantification is not a straightforward problem to solve. As the sampling location of the microscope changes, the number of cells in individual images varies, which makes simple measurement methods, such as the sum of stain intensity values or the total amount of positive stain within each image, inapplicable. Thus, automated quantification with robust cell segmentation techniques is required. Results: An automated quantification system with a robust cell segmentation technique is presented. Experimental results from monitoring cellular replication activities show that the quantitative score is a promising representation of the cell replication level, and scores for images from different cell replication groups are demonstrated to be statistically significantly different using ANOVA, LSD and Tukey HSD tests (p-value Conclusion: A robust automated quantification method for live cell imaging is built to measure the cell replication level, providing a robust quantitative analysis system for fluorescent live cell imaging. In addition, the presented unsupervised entropy-based cell segmentation for live cell images is demonstrated to be applicable also to nuclear segmentation of IHC tissue images.
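
    A sketch of entropy-based thresholding in the spirit of the unsupervised segmentation mentioned in the conclusion, using Kapur's maximum-entropy criterion on the gray-level histogram; the authors' exact algorithm may differ.

```python
import numpy as np

# Kapur's method: pick the threshold that maximizes the sum of the entropies
# of the background and foreground gray-level distributions.
def max_entropy_threshold(img, bins=256):
    hist, edges = np.histogram(img, bins=bins)
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1          # class-conditional histograms
        h = (-(q0[q0 > 0] * np.log(q0[q0 > 0])).sum()
             - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum())
        if h > best_h:
            best_t, best_h = t, h
    return edges[best_t]

rng = np.random.default_rng(3)
img = rng.normal(40, 8, (128, 128))       # dim background (synthetic image)
img[40:80, 40:80] += 60                   # bright fluorescent "cell" region
mask = img > max_entropy_threshold(img)
print("segmented fraction:", round(mask.mean(), 3))   # ~0.1 expected here
```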

  19. Instrumentation and quantitative methods of evaluation

    International Nuclear Information System (INIS)

    Beck, R.N.; Cooper, M.D.

    1991-01-01

    This report summarizes goals and accomplishments of the research program entitled Instrumentation and Quantitative Methods of Evaluation, during the period January 15, 1989 through July 15, 1991. This program is very closely integrated with the radiopharmaceutical program entitled Quantitative Studies in Radiopharmaceutical Science. Together, they constitute the PROGRAM OF NUCLEAR MEDICINE AND QUANTITATIVE IMAGING RESEARCH within The Franklin McLean Memorial Research Institute (FMI). The program addresses problems involving the basic science and technology that underlie the physical and conceptual tools of radiotracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 234 refs., 11 figs., 2 tabs

  20. First quantitative measurements by IR spectroscopy of dioxins and furans by means of broadly tunable quantum cascade lasers

    International Nuclear Information System (INIS)

    Siciliani de Cumis, M; D’Amato, F; Viciani, S; Patrizi, B; Foggi, P; Galea, C L

    2013-01-01

    We demonstrate the possibility of a quantitative analysis of the concentration of several dioxins and furans, among the most toxic ones, using only infrared absorption laser spectroscopy. Two broadly tunable quantum cascade lasers, emitting in the mid-infrared, have been used to measure the absorption spectra of dioxins and furans, dissolved in CCl₄, in direct absorption mode. The minimum detectable concentrations are inferred by analyzing diluted samples. A comparison between this technique and standard Fourier transform spectroscopy has been carried out, and an analysis of future perspectives is reported. (paper)
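
    The quantitative step behind such absorption measurements is the Beer-Lambert law, A = εcL, inverted for concentration. A minimal sketch with invented absorptivity and path length (the paper's calibration values are not given in the record):

```python
import numpy as np

# Beer-Lambert: A = epsilon * c * L, so c = A / (epsilon * L).
epsilon = 250.0          # molar absorptivity, L mol^-1 cm^-1 (assumed)
L = 1.0                  # cuvette path length, cm (assumed)

I0, I = 1.00, 0.82       # incident and transmitted intensities (illustrative)
A = -np.log10(I / I0)    # absorbance
c = A / (epsilon * L)
print(f"concentration ~ {c * 1e6:.1f} umol/L")
```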