WorldWideScience

Sample records for elastique quantitative applications

  1. Quantitative elastic migration. Applications to 3D borehole seismic surveys; Migration elastique quantitative. Applications a la sismique de puits 3D

    Energy Technology Data Exchange (ETDEWEB)

    Clochard, V.

    1998-12-02

    3D VSP imaging is nowadays a strategic requirement for petroleum companies. It is used to delineate in detail the geology close to the well. Because of the lack of redundancy and the limited coverage of the data, this kind of technology is more restrictive than surface seismic, which allows investigation at a larger scale. Our contribution was to develop an elastic quantitative imaging method (GRT migration) which can be applied to 3-component borehole datasets. The method is similar to Kirchhoff migration, using sophisticated weighting of the seismic amplitudes. In practice, GRT migration uses pre-calculated Green's functions (travel time, amplitude, polarization). The maps are obtained by 3D ray tracing (wavefront construction) in the velocity model. The migration algorithm works with elementary and independent tasks, which is useful for processing different kinds of datasets (fixed or moving geophone antenna). The study was followed by validations against an asymptotic analytical solution. The reconstruction capability in a 3D borehole survey has been tested on the Overthrust synthetic model. The application to a real circular 3D VSP raised various problems, such as velocity model building, the anisotropy factor, and the preprocessing (deconvolution, wave-mode separation), which can destroy seismic amplitudes. An isotropic 3-component preprocessing of the whole dataset allows a better lateral reconstruction. The choice of a large migration aperture can help the reconstruction of strong geological dips in spite of migration smiles. Finally, the methodology can be applied to PS converted waves. (author)
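
    GRT migration is, like Kirchhoff migration, a weighted diffraction stack: for each image point, recorded amplitudes are summed along the diffraction traveltime curve obtained from precomputed traveltime maps. A minimal constant-velocity, zero-offset sketch of that idea (the GRT amplitude/polarization weights and 3D ray-traced Green's functions of the abstract are omitted; geometry and names are illustrative):

```python
import numpy as np

def diffraction_stack(traces, rec_x, dt, v, xgrid, zgrid):
    """Unweighted Kirchhoff-style diffraction stack (zero-offset,
    constant velocity v). traces: (n_receivers, n_samples)."""
    n_samples = traces.shape[1]
    image = np.zeros((len(zgrid), len(xgrid)))
    for iz, z in enumerate(zgrid):
        for ix, x in enumerate(xgrid):
            # two-way traveltime from each surface receiver to the image point
            t = 2.0 * np.hypot(rec_x - x, z) / v
            it = np.round(t / dt).astype(int)
            ok = it < n_samples
            image[iz, ix] = traces[np.flatnonzero(ok), it[ok]].sum()
    return image

# synthetic test: one point diffractor at (x, z) = (500 m, 300 m)
v, dt = 2000.0, 0.001
rec_x = np.arange(0.0, 1000.0, 50.0)          # 20 surface receivers
traces = np.zeros((rec_x.size, 1024))
t_diff = 2.0 * np.hypot(rec_x - 500.0, 300.0) / v
traces[np.arange(rec_x.size), np.round(t_diff / dt).astype(int)] = 1.0

xgrid = np.arange(0.0, 1000.0, 25.0)
zgrid = np.arange(100.0, 600.0, 25.0)
image = diffraction_stack(traces, rec_x, dt, v, xgrid, zgrid)
iz, ix = np.unravel_index(image.argmax(), image.shape)
```

    At the true diffractor position all traces stack coherently, so the image peaks there; the "migration smiles" mentioned in the abstract are the residual arcs this stack leaves around the peak.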

  2. High temperature elastic constant measurements: application to plutonium; Mesure des constantes elastiques a haute temperature application au plutonium

    Energy Technology Data Exchange (ETDEWEB)

    Bouchet, J M [Commissariat a l' Energie Atomique, Fontenay-aux-Roses (France). Centre d' Etudes Nucleaires

    1969-03-01

    We present an apparatus with which we have measured the Young's modulus and the Poisson's ratio of several compounds from the resonance frequency of cylinders in the temperature range 0 deg. C-700 deg. C. We especially studied the elastic constants of plutonium and measured, for the first time to our knowledge, the Young's modulus of Pu{sub {delta}} and Pu{sub {epsilon}}: E{sub {delta}} (360 deg. C) = 1.6 10{sup 11} dy/cm{sup 2}; E{sub {epsilon}} (490 deg. C) = 1.1 10{sup 11} dy/cm{sup 2}; {sigma}{sub {epsilon}} = 0.25 {+-} 0.03. Using our results, we have calculated the compressibility, the Debye temperature, the Grueneisen constant and the electronic specific heat of Pu{sub {epsilon}}. (author) [French] Nous decrivons un appareil qui permet de mesurer les constantes elastiques (module de Young et module de Poisson) jusqu'a 700 deg. C a partir des frequences de resonance de barreaux cylindriques. Nous avons plus specialement etudie le plutonium et determine pour la premiere fois a notre connaissance le module de Young des phases {delta} et {epsilon}: E{sub {delta}} 360 deg. C = 1.6 10{sup 11} dy/cm{sup 2}; E{sub {epsilon}} 490 deg. C = 1.1 10{sup 11} dy/cm{sup 2}, {sigma}{sub {epsilon}} = 0.25 {+-} 0.03. Nos mesures nous ont permis de calculer la compressibilite, la temperature de Debye, la constante de Gruneisen et la chaleur specifique electronique de Pu{sub {epsilon}}. (auteur)
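
    The compressibility mentioned above follows from E and σ through the standard isotropic relations K = E / (3(1 − 2σ)) (bulk modulus; compressibility is 1/K) and G = E / (2(1 + σ)) (shear modulus). A quick check with the ε-Pu values quoted in the record (a sketch only; units are the record's dyn/cm²):

```python
def bulk_modulus(E, sigma):
    # isotropic elasticity: K = E / (3 * (1 - 2*sigma))
    return E / (3.0 * (1.0 - 2.0 * sigma))

def shear_modulus(E, sigma):
    # isotropic elasticity: G = E / (2 * (1 + sigma))
    return E / (2.0 * (1.0 + sigma))

E_eps = 1.1e11      # Young's modulus of epsilon-Pu at 490 deg. C, dyn/cm^2
sigma_eps = 0.25    # Poisson's ratio of epsilon-Pu

K = bulk_modulus(E_eps, sigma_eps)   # compressibility = 1 / K
G = shear_modulus(E_eps, sigma_eps)
```

    With σ = 0.25 the bulk modulus comes out as E/1.5, i.e. about 7.3 10{sup 10} dyn/cm².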

  3. 3.55 GeV/c Kp elastic scattering near 180 deg; Diffusion elastique des K de 3.55 GeV/C par les protons, au voisinage de 180 deg

    Energy Technology Data Exchange (ETDEWEB)

    Duflo, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-07-01

    Backward elastic K{sup +}p and K{sup -}p scattering has been measured in the angular interval 168 deg. < {theta}c.m. < 177 deg. The experimental apparatus included optical spark chambers to measure the three tracks of the scattered event, a set of scintillation counters and Cerenkov counters to select events and trigger the chambers, and a magnet to determine the momentum of the recoil proton. 106000 photographs were taken. Of these, 22 satisfied all requirements for elastic K{sup +}p scattering events. No event was found to satisfy the kinematical criteria for K{sup -}p elastic scattering. The corresponding values of the differential cross sections are: (d{sigma}/d{omega})K{sup +}p {yields} pK{sup +} = 17 {+-} 4 {mu}b/ster; (d{sigma}/d{omega})K{sup -}p {yields} pK{sup -} {<=} 0.6 {mu}b/ster. Contamination by {pi}p backward scattering and by inelastic scattering was estimated. Elastic K{sup +}p scattering exhibits a backward peak. A reasonably satisfactory interpretation of our results is obtained with exchange models. There is, in fact, no definitely established particle which could mediate the K{sup -}p {yields} pK{sup -} process in the exchange channel, which is in good agreement with our small value of the K{sup -}p backward elastic scattering cross section. Our results lend support to the conclusions of the interference model developed by Barger and Cline for {pi}p backward scattering, and to the qualitative predictions of quark models. (author) [French] La diffusion elastique en arriere des K{sup +} et K{sup -} par les protons a ete mesuree dans l'intervalle angulaire 168 deg. < {theta}cm < 177 deg. Le dispositif experimental comprenait des chambres a etincelles 'optiques' pour mesurer les trois trajectoires de l'evenement diffuse, un ensemble de compteurs a scintillations et de compteurs Cerenkov pour selectionner les evenements et declencher les chambres, et un aimant pour determiner le moment du proton de recul. 106000 photographies ont ete prises, dont 22 ont satisfait a


  5. Some properties of the Boltzmann elastic collision operator; Quelques proprietes particulieres de l'operateur de collision elastique de Boltzmann

    Energy Technology Data Exchange (ETDEWEB)

    Delcroix, J. L. [Ecole Normale Superieure (France); Salmon, J. [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1959-07-01

    The authors point out some properties (an important one is a variational property) of the Boltzmann elastic collision operator, valid in a more general framework than that of the Lorentz gas. Reprint of a paper published in 'Le journal de physique et le radium', tome 20, Jun 1959, p. 594-596 [French] Les auteurs mettent en evidence quelques proprietes (dont notamment une propriete variationnelle) de l'operateur de collision elastique de Boltzmann valables dans un cadre plus general que celui du gaz de Lorentz. Reproduction d'un article publie dans 'Le journal de physique et le radium', tome 20, Jun 1959, p. 594-596.

  6. Quantitative multi-waves migration in elastic anisotropic media; Migration quantitative multi-ondes en milieu elastique anisotrope

    Energy Technology Data Exchange (ETDEWEB)

    Borgne, H.

    2004-12-01

    modelling of wave propagation in anisotropic media. Within the approximations of ray theory, I develop an expression for the geometrical spreading, the amplitude, and their reciprocity relations. I set up imaging formulas in order to reconstruct the reflection coefficients of the subsurface in elastic anisotropic media. First, I solve the direct problem by expressing the integral relation between the scattered wave field recorded by the receivers and the subsurface reflection coefficients. Second, I apply an elastic anisotropic quantitative migration method, based on the properties of inverse Radon transforms (Beylkin's approach), in order to express the reflection coefficient in 2D, 2.5D and 3D media. I implemented these formulas in a new preserved-amplitude migration algorithm, where the images are sorted into angle classes. Finally, I apply these theoretical results to synthetic and real datasets. I show that migration is able to reconstruct the correct AVA (amplitude versus angle) behavior of anisotropic reflection coefficients if both modifications are achieved. Then I degrade the process by keeping an anisotropic ray tracing but using the classical isotropic imaging formula. For this commonly used configuration, I evaluate the error that can be expected in the AVA response of the migrated reflection coefficient. Methodological applications show the sensitivity of the migration results to the smoothing of the velocity model and to an error on the anisotropy axis. (author)

  7. Phase-shift analysis of pion-nucleon elastic scattering below 1.6 GeV; Analyse en ondes partielles de la diffusion elastique meson {pi} - nucleon au-dessous de 1.6 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Bareyre, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

    Experimental results of pion-nucleon elastic scattering below 1.6 GeV (total cross sections, angular distributions of elastic scattering, and recoil-nucleon polarizations) have been described by a partial wave analysis. This analysis has been developed, one energy at a time, using a least-squares fitting method. A single solution was extracted by requiring continuity of the different solutions with energy. Resonant behaviour has been clearly established for several partial waves. In addition to these important effects, some phase shifts show rapid variations with energy. The present experimental situation does not make it possible to say whether these variations are due to experimental biases or to physical effects. (author) [French] Les resultats experimentaux de la diffusion elastique meson {pi} - nucleon au-dessous de 1.6 GeV (sections efficaces totales, distributions angulaires de diffusion elastique et de polarisation du nucleon de recul) sont decrits a l'aide d'une analyse en ondes partielles. Cette analyse est developpee energie par energie au moyen d'une methode d'ajustement en moindres carres. Un critere empirique de continuite des solutions en fonction de l'energie a permis d'isoler une solution unique. Des resonances sont clairement etablies pour plusieurs ondes partielles, ainsi que certains petits effets moins caracteristiques. Pour ceux-ci, la situation experimentale presente ne permet pas d'affirmer s'ils sont dus a des effets physiques ou a des biais experimentaux. (auteur)

  8. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as comparative approaches (graph similarity or distance) and graph measures to characterize graphs quantitatively.
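
    A classic example of the graph measures such a quantitative approach relies on is the Wiener index: the sum of shortest-path distances over all vertex pairs. A small self-contained sketch (plain BFS on an unweighted graph; the adjacency-list format is illustrative):

```python
from collections import deque

def wiener_index(adj):
    """Sum of shortest-path distances over all unordered vertex pairs
    of a connected, unweighted graph given as {vertex: [neighbors]}."""
    total = 0
    for source in adj:
        dist = {source: 0}
        queue = deque([source])
        while queue:                      # breadth-first search from source
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
    return total // 2                     # each pair was counted twice

# path graph 1-2-3-4: pairwise distances 1+2+3+1+2+1 = 10
path4 = {1: [2], 2: [1, 3], 3: [2, 4], 4: [3]}
```

    Indices like this one give a single number per graph, which is what makes graph comparison and characterization quantitative rather than purely structural.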

  9. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  10. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Ferguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  11. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method that can readily ensure traceability to the SI unit system, and discussions of its accuracy and uncertainty have also begun. This paper focuses on the literature on advances in quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual cases of quantitative NMR analysis. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it can achieve a precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method easily prevents contamination of the samples and avoids consuming them, there are many reported cases of quantitative analysis of biological samples and of highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads, discussions are also progressing on how to adopt this analytical method in the official methods of various countries around the world. In Japan, this method is listed in the Pharmacopoeia and the Japanese Standards for Food Additives, and it is also used as an official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analytical method that can ensure traceability to the SI unit system. (A.O.)
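
    In the internal reference method mentioned above, the purity of the analyte follows from the ratio of integrated signal areas, scaled by proton counts, molar masses, and weighed masses. A sketch of that standard relation (all numerical values below are illustrative, not taken from the paper):

```python
def qnmr_purity(I_s, I_ref, N_s, N_ref, M_s, M_ref, m_s, m_ref, P_ref):
    """Purity of a sample by the qNMR internal reference method:
        P = (I_s/I_ref) * (N_ref/N_s) * (M_s/M_ref) * (m_ref/m_s) * P_ref
    I: integrated signal area, N: number of protons giving the signal,
    M: molar mass, m: weighed mass, P_ref: purity of the reference."""
    return (I_s / I_ref) * (N_ref / N_s) * (M_s / M_ref) * (m_ref / m_s) * P_ref

# illustrative numbers only (made-up sample and reference)
P = qnmr_purity(I_s=1.5, I_ref=1.0, N_s=3, N_ref=2,
                M_s=180.0, M_ref=200.0, m_s=10.0, m_ref=11.0, P_ref=0.999)
```

    Because every factor is a ratio, the method is traceable to the SI as long as the reference purity and the weighings are.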

  12. Elastic and plastic properties of iron-aluminium alloys. Special problems raised by the brittleness of alloys of high aluminium content; Proprietes elastiques et plastiques des alliages fer-aluminium. Problemes particuliers poses par la fragilite des alliages a forte teneur en aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Mouturat, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1966-06-01

    This study presents the results obtained with iron-aluminium alloys whose composition runs from 0 to nearly 50 atomic per cent aluminium. The conditions of preparation and transformation were studied successively, as well as the Young's modulus and the flow stress; the last chapter presents a study of the Portevin-Le Chatelier effect in alloys with 40 atomic per cent aluminium. I) The principal difficulty to overcome was the intergranular brittleness of the ordered alloys; this brittleness has been considerably reduced by appropriate conditions of preparation and transformation. II) The studies of the Young's modulus concern the iron-aluminium alloys; the transformation temperatures are clearly revealed. The formation of covalent bonds from 25 atomic per cent onwards gives the highest values of the modulus. III) The analysis of the variations of the flow stress with temperature shows a connection with the ordered structures, the existence of antiphase domains and the existence of superstructure dislocations. IV) In the ordered FeAl domain, the kinetics of the Portevin-Le Chatelier effect could be explained by a mechanism of vacancy diffusion. The role played by vacancies has been specified by the influence they exert upon the dislocations; this has led us to the inhomogeneous Rudman order, which could explain the shape of the tensile curves. (author) [French] Cette etude comporte les resultats obtenus avec des alliages fer-aluminium dont la composition s'etend de 0 a pres de 50 atomes pour cent d'aluminium. Nous avons etudie successivement les conditions d'elaboration et de transformation, le module elastique et la limite elastique; un dernier chapitre est consacre a l'etude du phenomene Portevin-le-Chatelier dans les alliages a 40 atomes pour cent d'aluminium. I) La principale difficulte a resoudre residait dans la fragilite intergranulaire des alliages ordonnes; celle-ci a ete considerablement reduite par des conditions

  13. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)


  15. Quantitative and qualitative applications of the neutron-gamma borehole logging

    International Nuclear Information System (INIS)

    Charbucinski, J.; Aylmer, J.A.; Eisler, P.L.; Borsaru, M.

    1989-01-01

    Two neutron-γ borehole logging applications are described. In a quantitative application of the prompt-gamma neutron-activation analysis (PGNAA) technique, research was carried out both in the laboratory and at a mine to establish a suitable borehole logging technology for manganese-grade predictions. As an example of the qualitative application of PGNAA, the use of this method has been demonstrated for the determination of lithology. (author)

  16. Quantitative and qualitative applications of the neutron-gamma borehole logging

    International Nuclear Information System (INIS)

    Charbucinski, J.; Eisler, P.L.; Borsaru, M.; Aylmer, J.A.

    1990-01-01

    Two examples of neutron-gamma borehole logging application are described. In the quantitative application of the PGNAA technique, research was carried out both in the laboratory and at a mine to establish a suitable borehole logging technology for Mn-grade predictions. As an example of qualitative application of PGNAA, use of this method has been demonstrated for determination of lithology. (author). 4 refs, 10 figs, 7 tabs

  17. MO-E-12A-01: Quantitative Imaging: Techniques, Applications, and Challenges

    International Nuclear Information System (INIS)

    Jackson, E; Jeraj, R; McNitt-Gray, M; Cao, Y

    2014-01-01

    The first symposium in the Quantitative Imaging Track focused on the introduction of quantitative imaging (QI) by illustrating the potential of QI in diagnostic and therapeutic applications in research and patient care, highlighting key challenges in the implementation of such QI applications, and reviewing the QI efforts of selected national and international agencies and organizations, including the FDA, NCI, NIST, and RSNA. This second QI symposium will focus more specifically on the techniques, applications, and challenges of QI. The first talk of the session will focus on modality-agnostic challenges of QI, beginning with the challenges of the development and implementation of QI applications in single-center, single-vendor settings and progressing to the challenges encountered in the most general setting of multi-center, multi-vendor settings. The subsequent three talks will focus on specific QI challenges and opportunities in the modality-specific settings of CT, PET/CT, and MR. Each talk will provide information on modality-specific QI techniques, applications, and challenges, including current efforts focused on solutions to such challenges. Learning Objectives: Understand key general challenges of QI application development and implementation, regardless of modality. Understand selected QI techniques and applications in CT, PET/CT, and MR. Understand challenges, and potential solutions for such challenges, for the applications presented for each modality.

  18. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review on the last achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progresses and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  19. Mathematics of quantitative kinetic PCR and the application of standard curves.

    Science.gov (United States)

    Rutledge, R G; Côté, C

    2003-08-15

    Fluorescent monitoring of DNA amplification is the basis of real-time PCR, from which target DNA concentration can be determined from the fractional cycle at which a threshold amount of amplicon DNA is produced. Absolute quantification can be achieved using a standard curve constructed by amplifying known amounts of target DNA. In this study, the mathematics of quantitative PCR are examined in detail, from which several fundamental aspects of the threshold method and the application of standard curves are illustrated. The construction of five replicate standard curves for two pairs of nested primers was used to examine the reproducibility and degree of quantitative variation using SYBR Green I fluorescence. Based upon this analysis, the application of a single, well-constructed standard curve could provide an estimated precision of +/-6-21%, depending on the number of cycles required to reach threshold. A simplified method for absolute quantification is also proposed, in which quantitative scale is determined by DNA mass at threshold.
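
    The standard-curve arithmetic behind this is a log-linear fit: the threshold cycle Ct is linear in log10 of the starting amount N0, Ct = m·log10(N0) + b, with slope m ≈ −3.32 at 100% amplification efficiency (E = 10^(−1/m) − 1). A sketch with a perfectly efficient synthetic dilution series (all numbers illustrative):

```python
import math
import numpy as np

def fit_standard_curve(n0_values, ct_values):
    """Least-squares fit of Ct = m*log10(N0) + b; returns (m, b, efficiency)."""
    m, b = np.polyfit(np.log10(n0_values), ct_values, 1)
    efficiency = 10.0 ** (-1.0 / m) - 1.0   # 1.0 means perfect doubling
    return m, b, efficiency

def quantify(ct, m, b):
    """Invert the standard curve to estimate the starting amount N0."""
    return 10.0 ** ((ct - b) / m)

# synthetic dilution series with exact doubling each cycle
slope_true = -1.0 / math.log10(2.0)          # about -3.3219 cycles/decade
n0 = np.array([1e2, 1e3, 1e4, 1e5, 1e6])     # known template amounts
ct = slope_true * np.log10(n0) + 40.0        # simulated threshold cycles
m, b, eff = fit_standard_curve(n0, ct)
```

    The fitted slope directly reports amplification efficiency, and inverting the curve at an unknown sample's Ct gives its absolute starting amount.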

  20. Application of an image processing software for quantitative autoradiography

    International Nuclear Information System (INIS)

    Sobeslavsky, E.; Bergmann, R.; Kretzschmar, M.; Wenzel, U.

    1993-01-01

    The present communication deals with the utilization of an image processing device for quantitative whole-body autoradiography, cell counting and also for interpretation of chromatograms. It is shown that the system parameters allow an adequate and precise determination of optical density values. Also shown are the main error sources limiting the applicability of the system. (orig.)

  1. Influence of the anisotropy of expansion coefficients on the elastic properties of uranium of zirconium and of zinc; Influence de l'anisotropie des coefficients de dilatation sur les proprietes elastiques de l'uranium du zirconium et du zinc

    Energy Technology Data Exchange (ETDEWEB)

    Calais, Daniel; Saada, Georges; Simenel, Nicole [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1959-07-01

    The anisotropy of the expansion coefficients of uranium, zirconium and zinc provokes internal stresses during the cooling of these metals. In the case of zinc these stresses are eliminated by recovery at room temperature, but they persist in uranium and zirconium and are responsible for the absence of an elastic limit in these two metals. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 249, p. 1225-1227, sitting of 5 October 1959. [French] L'anisotropie des coefficients de dilatation de l'uranium, du zirconium et du zinc provoque au cours du refroidissement de ces metaux des tensions internes. Eliminees par restauration a la temperature ambiante dans le cas du zinc, ces tensions persistent pour l'uranium et le zirconium et sont responsable de l'absence de limite elastique dans ces deux metaux. Reproduction d'un article publie dans les Comptes rendus des seances de l'Academie des Sciences, t. 249, p. 1225-1227, seance du 5 octobre 1959.

  2. Automated quantitative micro-mineralogical characterization for environmental applications

    Science.gov (United States)

    Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.

    2013-01-01

    Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.

  3. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    NARCIS (Netherlands)

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from

  4. Quantitative Security Risk Assessment of Android Permissions and Applications

    OpenAIRE

    Wang , Yang; Zheng , Jun; Sun , Chen; Mukkamala , Srinivas

    2013-01-01

    Part 6: Mobile Computing; International audience; The booming of the Android platform in recent years has attracted the attention of malware developers. However, the permissions-based model used in Android system to prevent the spread of malware, has shown to be ineffective. In this paper, we propose DroidRisk, a framework for quantitative security risk assessment of both Android permissions and applications (apps) based on permission request patterns from benign apps and malware, which aims ...
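
    A permission-based risk framework of this kind can be sketched as follows: each permission receives a weight reflecting how much more often malware requests it than benign apps do, and an app's risk score is the sum of the weights of its requested permissions. This is only an illustration of the idea, not the DroidRisk model itself (the frequencies and permission sets below are made up):

```python
import math

def permission_weights(freq_malware, freq_benign):
    """Weight each permission by the log-ratio of its request frequency
    in malware vs. benign apps (higher = more malware-associated)."""
    return {p: math.log(freq_malware[p] / freq_benign[p])
            for p in freq_malware}

def app_risk(permissions, weights):
    """Risk score of an app = sum of the weights of its permissions."""
    return sum(weights.get(p, 0.0) for p in permissions)

# made-up request frequencies for three Android permissions
freq_malware = {"SEND_SMS": 0.60, "INTERNET": 0.95, "CAMERA": 0.10}
freq_benign  = {"SEND_SMS": 0.02, "INTERNET": 0.90, "CAMERA": 0.20}
w = permission_weights(freq_malware, freq_benign)

risky_app  = app_risk(["SEND_SMS", "INTERNET"], w)   # malware-skewed set
benign_app = app_risk(["CAMERA", "INTERNET"], w)     # benign-typical set
```

    Permissions requested far more often by malware (here SEND_SMS) dominate the score, which is the intuition behind scoring apps by their permission request patterns.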

  5. Quantitative Susceptibility Mapping: Contrast Mechanisms and Clinical Applications

    Science.gov (United States)

    Liu, Chunlei; Wei, Hongjiang; Gong, Nan-Jie; Cronin, Matthew; Dibb, Russel; Decker, Kyle

    2016-01-01

    Quantitative susceptibility mapping (QSM) is a recently developed MRI technique for quantifying the spatial distribution of magnetic susceptibility within biological tissues. It first uses the frequency shift in the MRI signal to map the magnetic field profile within the tissue. The resulting field map is then used to determine the spatial distribution of the underlying magnetic susceptibility by solving an inverse problem. The solution is achieved by deconvolving the field map with a dipole field, under the assumption that the magnetic field is a result of the superposition of the dipole fields generated by all voxels and that each voxel has its unique magnetic susceptibility. QSM provides an improved contrast-to-noise ratio for certain tissues and structures compared to its magnitude counterpart. More importantly, magnetic susceptibility is a direct reflection of the molecular composition and cellular architecture of the tissue. Consequently, by quantifying magnetic susceptibility, QSM is becoming a quantitative imaging approach for characterizing normal and pathological tissue properties. This article reviews the mechanism generating susceptibility contrast within tissues and some associated applications. PMID:26844301
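The dipole-deconvolution step described in this abstract can be sketched numerically. Below is a minimal illustration (not taken from the article) of thresholded k-space division (TKD), one of the simplest QSM inversion schemes; the threshold value, grid size, and unit voxel spacing are assumptions for demonstration only:

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0), b0_dir=(0, 0, 1)):
    """k-space dipole kernel D(k) = 1/3 - (k . B0)^2 / |k|^2."""
    ks = [np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[k2 == 0] = np.inf  # avoid division by zero at the DC term
    kb = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    return 1.0 / 3.0 - kb**2 / k2

def qsm_tkd(field_map, thresh=0.2, voxel_size=(1.0, 1.0, 1.0)):
    """Invert field = D * chi by thresholded k-space division (TKD)."""
    D = dipole_kernel(field_map.shape, voxel_size)
    D_inv = np.zeros_like(D)
    mask = np.abs(D) > thresh
    D_inv[mask] = 1.0 / D[mask]            # straight inversion where D is safe
    D_inv[~mask] = np.sign(D[~mask]) / thresh  # clamp near the kernel zeros
    return np.real(np.fft.ifftn(np.fft.fftn(field_map) * D_inv))
```

The clamping near the zeros of the dipole kernel is what makes the otherwise ill-posed deconvolution stable; more sophisticated QSM methods replace it with regularized or multi-orientation inversions.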

  6. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. Applying Fourier analysis additionally yields criteria for evaluating the volume curve as a whole, since the entire information contained in the volume curve is completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal smoothing method because, for this type of function, it converges with minimum quadratic error. (orig./MG)
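The Fourier resynthesis described here can be sketched as follows: the cyclic ventricular time-activity curve is transformed, all but the first few harmonics are discarded, and the inverse transform gives the smoothed curve. This is a minimal sketch; the number of retained harmonics is an assumption, not a value from the article:

```python
import numpy as np

def fourier_smooth(activity, n_harmonics=3):
    """Resynthesize a cyclic time-activity curve from its DC term
    plus the first n_harmonics Fourier harmonics (low-pass smoothing)."""
    c = np.fft.rfft(activity)
    c[n_harmonics + 1:] = 0.0          # zero everything above the kept harmonics
    return np.fft.irfft(c, n=len(activity))
```

Because the curve is assumed periodic over the representative heart cycle, truncating the spectrum smooths without introducing the endpoint artefacts of a running average.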

  7. Quantitative application study on the control system of contract progress

    International Nuclear Information System (INIS)

    Hu Xiaocong; Kang Rujie; Zhan Li

    2012-01-01

    This quantitative application study of a contract progress control system, based on project management theory and PDCA cycle methods, provides a new approach to enterprise contract management that fits both the current situation and the performance management needs of nuclear power enterprises. The system concept, system development, program design, and ERP development (VBA design), all distilled from the working experience of business managers, are convenient and feasible in practical applications. Through its application to overhaul contract management in 2009, 2010, and 2011, and through continuous adjustment, the system has become an important business management tool: it not only effectively guarantees contract schedules and efficiency, but also links performance management with contract progress management. This study provides a useful reference for enterprise management. (authors)

  8. Quantitative estimation of seafloor features from photographs and their application to nodule mining

    Digital Repository Service at National Institute of Oceanography (India)

    Sharma, R.

    Methods developed for quantitative estimation of seafloor features from seabed photographs and their application for estimation of nodule sizes, coverage, abundance, burial, sediment thickness, extent of rock exposure, density of benthic organisms...

  9. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow ^29Si NMR is also presented. 10 references, 4 figures, 3 tables
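The chain-length determination mentioned above rests on simple integral ratios from the quantitative spectrum. A hedged sketch, assuming a linear polydimethylsiloxane whose trimethylsilyl end groups (two Si nuclei per chain) are resolved from the backbone Si signal; the exact signal assignment in the paper may differ:

```python
def average_chain_length(area_backbone, area_end_groups):
    """Average number of backbone Si units per chain from quantitative
    29Si NMR integrals. Each linear chain contributes 2 end-group Si
    nuclei, so n = 2 * (backbone area) / (end-group area)."""
    return 2.0 * area_backbone / area_end_groups
```

This ratio is only quantitative if the slowly relaxing ^29Si nuclei are fully relaxed between scans, which is precisely the condition the recycled flow system is designed to satisfy.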

  10. Brittleness and elastic limit of iron-aluminium 40 at high strain rates; Fragilite et limite elastique du fer-aluminium 40 aux grandes vitesses de deformation

    Energy Technology Data Exchange (ETDEWEB)

    Cottu, J P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

    Iron-aluminium 40 - a B2 ordered solid solution - was tensile tested to provide information on the brittleness of this alloy and its dependence on strain rate and temperature. For slow strain rates (0.34 per cent s{sup -1}), cleaved fracture prevails when the temperature is kept below 400 deg. C, while a ductile rupture is observed, with almost 100 per cent necking, at higher temperatures; in this case, recrystallization occurs during the deformation. For higher strain rates (335 per cent s{sup -1}), a ductility reduction - due to intergranular fracture - precedes the brittle-ductile transition. This property may be linked to the peak in the yield stress-temperature curve, which is itself connected to the ordered structure of this alloy. (author)

  11. Applications of phosphorus/silicon standards in quantitative autoradiography

    International Nuclear Information System (INIS)

    Treutler, H.Ch.; Freyer, K.

    1983-01-01

    Quantitative autoradiography requires a careful selection of suitable standard preparations. After several basic comments related to the problems of standardization in autoradiography an example is given of the autoradiographic study of semiconductor materials and it is used for describing the system of standardization using silicon discs with diffused phosphorus. These standardized samples are processed in the same manner as the evaluated samples, i.e., from activation to exposure to sensitive material whereby optimal comparability is obtained. All failures of the processing cycle caused by the fluctuation of the neutron flux in the reactor, deviations at the time of activation, afterglow, etc. are eliminated by this standardization procedure. Experience is presented obtained with the application of this procedure. (author)

  12. Quantitative carbon-14 autoradiography at the cellular level: principles and application for cell kinetic studies

    International Nuclear Information System (INIS)

    Doermer, P.

    1981-01-01

    Amounts of radio-labelled substances as low as 10^-18 moles incorporated into individual cells can be measured by utilizing techniques of quantitative autoradiography. The principles and application of quantitative carbon-14 autoradiography are reviewed. Silver grain densities can be counted by automated microphotometry, allowing on-line data processing by an interfaced computer. Rate measurements of ^14C-thymidine incorporation into individual cells yield values of the DNA synthesis rate, and the DNA synthesis time of a cell compartment can be derived. This is an essential time parameter for the evaluation of kinetic events in proliferating cell populations. This method is applicable to human cells without radiation hazard to man and provides an optimal source of detailed information on the kinetics of normal and diseased human haematopoiesis. Examples of application include thalassaemia, malaria infection, iron deficiency anaemia and acute myelogenous leukaemia. (author)

  13. Novel applications of quantitative MRI for the fetal brain

    Energy Technology Data Exchange (ETDEWEB)

    Clouchoux, Cedric [Children' s National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); Limperopoulos, Catherine [Children' s National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); McGill University, McConnell Brain Imaging Center, Montreal Neurological Institute, Montreal (Canada); McGill University, Department of Neurology and Neurosurgery, Montreal (Canada); Children' s National Medical Center, Division of Fetal and Transitional Medicine, Washington, DC (United States)

    2012-01-15

    The advent of ultrafast MRI acquisitions is offering vital insights into the critical maturational events that occur throughout pregnancy. Concurrent with the ongoing enhancement of ultrafast imaging has been the development of innovative image-processing techniques that are enabling us to capture and quantify the exuberant growth, and organizational and remodeling processes that occur during fetal brain development. This paper provides an overview of the role of advanced neuroimaging techniques to study in vivo brain maturation and explores the application of a range of new quantitative imaging biomarkers that can be used clinically to monitor high-risk pregnancies. (orig.)

  14. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    Science.gov (United States)

    Rong, Xing; Frey, Eric C

    2013-08-01

    Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different than typical for these collimators. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters). 
To make the optimization results for quantitative 90Y bremsstrahlung SPECT more
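The mass-weighted RMSE figure of merit described in this abstract can be sketched as follows. This is a minimal illustration under stated assumptions (the array layout and the equal treatment of bias and variance terms are mine, not the authors'):

```python
import numpy as np

def collimator_fom(est_activities, true_activities, masses):
    """Mass-weighted RMSE of volume-of-interest activity estimates.
    est_activities: (n_realizations, n_vois) array of repeated estimates.
    Lower FOM is better: it penalizes both bias and variance, with
    low-mass VOIs weighted more heavily (weight = 1/mass)."""
    est = np.asarray(est_activities, dtype=float)
    true = np.asarray(true_activities, dtype=float)
    bias = est.mean(axis=0) - true
    var = est.var(axis=0)
    rmse = np.sqrt(bias**2 + var)          # RMSE^2 = bias^2 + variance
    w = 1.0 / np.asarray(masses, dtype=float)
    return float(np.sum(w * rmse) / np.sum(w))
```

Evaluating this FOM over a grid of the two collimator parameters (e.g., hole diameter and length) and picking the minimum is the generic shape of such an optimization.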

  15. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    Science.gov (United States)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.

  16. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    OpenAIRE

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from smartphones and survey (perception) data. We find that there are dimensions of domestication that explain how the use of smartphones affects our daily routines. Contributions are stronger for downloaded a...

  17. Contribution to the study of proton elastic and inelastic scattering on {sup 12}C; Contribution a l'etude des diffusions elastiques et inelastiques des protons sur le carbone 12

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, A

    1966-07-01

    The results of absolute measurements of cross sections for the scattering of protons by {sup 12}C to the two first excited levels are given. The measurements were made from 4.6 to 11.4 MeV at 17 angles for (p,p), at 15 angles for (p,p') (first excited level), and at 8 angles for (p,p'') (second excited level). A gaseous target with differential pumping was used. The elastic scattering was analyzed using the R-matrix theory with the optical model. A new analysis of both (p,p) and (p,p') was then achieved using the coupled-wave formalism. The information on the levels of the compound nucleus was completed and confirmed. (author)

  18. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.
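The abstract defines magnetophoretic mobility as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A minimal sketch of that definition, assuming the common force-field-strength form S_m = B(dB/dx)/mu0; the numeric values in the test are illustrative, not data from the paper:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability [T*m/A]

def magnetophoretic_mobility(velocity, b_field, field_gradient):
    """Mobility m = v / S_m, where S_m = B * (dB/dx) / mu0 is the
    magnetic force-field strength driving the labeled cell.
    velocity in m/s, b_field in T, field_gradient in T/m."""
    s_m = b_field * field_gradient / MU0
    return velocity / s_m
```

In a velocimeter of the kind described, `velocity` would come from particle tracking and `b_field`/`field_gradient` from the characterized magnet geometry, so mobility plays the same per-cell role that fluorescence intensity plays in flow cytometry.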

  19. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharamaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  20. Quantitative carbon-14 autoradiography at the cellular level: principles and application for cell kinetic studies. [Review

    Energy Technology Data Exchange (ETDEWEB)

    Doermer, P [Gesellschaft fuer Strahlen- und Umweltforschung m.b.H., Muenchen (Germany, F.R.). Inst. fuer Haematologie

    1981-03-01

    Amounts of radio-labelled substances as low as 10^-18 moles incorporated into individual cells can be measured by utilizing techniques of quantitative autoradiography. The principles and application of quantitative carbon-14 autoradiography are reviewed. Silver grain densities can be counted by automated microphotometry allowing on-line data processing by an interfaced computer. Rate measurements of ^14C-thymidine incorporation into individual cells yield values of the DNA synthesis rate and the DNA synthesis time of a cell compartment can be derived. This is an essential time parameter for the evaluation of kinetic events in proliferating cell populations. This method is applicable to human cells without radiation hazard to man and provides an optimal source of detailed information on the kinetics of normal and diseased human haematopoiesis. Examples of application consist of thalassaemia, malaria infection, iron deficiency anaemia and acute myelogenous leukaemia.

  1. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among the hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales is essential. Quantitative remote sensing plays an important role in this respect. The

  2. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  3. A New Apparatus for Inelastic, Quasi-Elastic and Elastic Cold Neutron Measurements; Un nouveau appareil pour les mesures de diffusion inelastique , quasi-elastique et elastique des neutrons lents; Novyj pribor dlya izmereniya neuprugogo, kvaziuprugogo i uprugogo rasseyaniya kholodnykh nejtrohov; Nuevo aparato para mediciones inelastic as, cuasi elasticas y elasticas de neutrones frios

    Energy Technology Data Exchange (ETDEWEB)

    Otnes, K; Palevsky, H [Brookhaven National Laboratory, Upton, NY (United States)

    1963-01-15

    A new mechanical neutron chopper is being built for the Brookhaven high-flux reactor. The apparatus is of the three-phased-rotor type. The rotors, 80 cm in diameter, spin at up to 15,000 rev/min and are designed to emit three bursts of monochromatic neutrons per revolution. Two of the rotors turn about a horizontal axis, while the third turns about a vertical one. The system can operate with one, two or three chopper elements, depending on the kind of measurement to be made. For inelastic measurements in which the neutrons gain energy, a two-rotor configuration is best suited: the burst duration and wavelength spread (full width at half maximum) will be 16 {mu}s and 0.16 A respectively for incident neutrons of 4 A wavelength, and the burst intensity on the sample (4 x 1.6 cm) will be 2 x 10{sup 6} n/s. For quasi-elastic and elastic measurements, the three-rotor configuration is most appropriate: the burst duration and the corresponding wavelength spread can reach values as small as 8 {mu}s and 0.04 A, giving an intensity of 10{sup 4} n/s on a 4 x 0.8 cm sample. The wavelength and the time resolution can be adjusted between these two limits so as to obtain the maximum flux intensity for a given experiment. (author)

  4. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of Abort Triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of Abort Triggers.

  5. Application of a nitrocellulose immunoassay for quantitation of proteins secreted in cultured media

    International Nuclear Information System (INIS)

    LaDuca, F.M.; Dang, C.V.; Bell, W.R.

    1986-01-01

    A macro immunoassay was developed to quantitate proteins (antigens) secreted into the culture media of primary rat hepatocytes. Dilutions of protein standards and undiluted spent culture media were applied to numbered sheets of nitrocellulose (NC) paper by vacuum filtration (in volumes up to 1 ml) through a specially designed macrofiltration apparatus constructed of plexiglas. Sequential incubation of the NC with bovine serum albumin blocking buffer, monospecific antibody, and ^125I-protein A enabled quantitation of protein concentration by determination of NC-bound radioactivity. Linear and reproducible standard curves were obtained with fibrinogen, albumin, transferrin, and haptoglobin. A high coefficient of correlation between radioactivity (cpm) and protein concentration was found. Intra- and inter-test reproducibility was excellent. By using monospecific antibodies, single proteins (i.e., fibrinogen) as low as 32 ng/ml could be quantified in heterogeneous protein mixtures and in spent culture media. The assay was sensitive to the difference in fibrinogen secretion under nonstimulatory (serum-free hormonally defined medium, SFHD) and stimulatory (SFHD plus hydrocortisone) culture conditions. The procedure and techniques described are applicable to the quantitation of any protein in a suitable buffer.
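The linear standard-curve quantitation described above can be sketched as a fit of bound counts against known concentrations, then inversion for unknowns. The counts and concentrations below are invented for illustration, not the paper's data:

```python
import numpy as np

def fit_standard_curve(conc, cpm):
    """Least-squares linear fit of bound radioactivity (cpm) vs.
    standard concentration; returns (slope, intercept)."""
    slope, intercept = np.polyfit(conc, cpm, 1)
    return slope, intercept

def quantify(cpm_sample, slope, intercept):
    """Invert the standard curve to get the unknown concentration."""
    return (cpm_sample - intercept) / slope
```

In practice one would also check the correlation coefficient of the fit (the abstract reports high correlation) before trusting interpolated concentrations.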

  6. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    Science.gov (United States)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixated samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. In this overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the usage of partially coherent light and multi-wavelength approaches is discussed. Finally, potentials of digital holographic microscopy for quantitative cell imaging are illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights in dynamic cell biology, with applications in cancer research and for drug and toxicity testing.

  7. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  8. Radiation applications in art and archaeometry X-ray fluorescence applications to archaeometry. Possibility of obtaining non-destructive quantitative analyses

    International Nuclear Information System (INIS)

    Milazzo, Mario

    2004-01-01

    The possibility of obtaining quantitative XRF analysis in archaeometric applications is considered in the following cases: examination of metallic objects with irregular surfaces (coins, for instance); metallic objects with a natural or artificial patina on the surface; and glass or ceramic samples, for which the problems for quantitative analysis arise from the non-detectability of low-Z matrix elements. The fundamental parameter method for quantitative XRF analysis is based on a numerical procedure involving the relative values of the XRF line intensities. As a consequence, it can also be applied to the experimental XRF spectra obtained for metallic objects, provided the correction for the irregular shape consists only in introducing a constant factor that does not affect the relative XRF intensities; this is in fact possible under conditions that are not very restrictive for the experimental set-up. The fineness of coins with a superficial patina can be evaluated by measuring the ratio of Rayleigh to Compton scattering intensity at an incident energy higher than that of the characteristic X-rays. For glasses and ceramics, measurement of the Compton-scattered intensity of the exciting radiation, together with a proper scaling law, makes it possible to evaluate the matrix absorption coefficients at all characteristic X-ray line energies.
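In practice, the Rayleigh-to-Compton approach for patinated coins reduces to interpolating the measured intensity ratio against standards of known fineness, since the ratio grows with the effective atomic number of the alloy. A minimal sketch; the calibration values are invented for illustration and are not from the article:

```python
import numpy as np

def fineness_from_rc(rc_sample, rc_cal, fineness_cal):
    """Estimate alloy fineness (parts per thousand of precious metal)
    from the Rayleigh/Compton intensity ratio, by interpolating a
    calibration curve measured on standards of known composition."""
    order = np.argsort(rc_cal)               # np.interp needs ascending x
    return float(np.interp(rc_sample,
                           np.asarray(rc_cal, dtype=float)[order],
                           np.asarray(fineness_cal, dtype=float)[order]))
```

The method is attractive for patinated objects precisely because the scattered radiation samples the bulk, whereas the characteristic lines are attenuated by the surface layer.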

  9. In-focal-plane characterization of excitation distribution for quantitative fluorescence microscopy applications

    Science.gov (United States)

    Dietrich, Klaus; Brülisauer, Martina; Çağın, Emine; Bertsch, Dietmar; Lüthi, Stefan; Heeb, Peter; Stärker, Ulrich; Bernard, André

    2017-06-01

    The applications of fluorescence microscopy span medical diagnostics, bioengineering and biomaterial analytics. Full exploitation of fluorescence microscopy is hampered by imperfections in illumination, detection and filtering. Errors stem mainly from real-world components that induce spatial or angular variations of propagation properties along the optical path; they can be addressed through consistent and accurate calibration. Many applications require a uniform signal-to-noise ratio (SNR) over the imaging area. Homogeneous SNR can be achieved by quantifying and compensating for the signal bias. We present a method to quantitatively characterize novel reference materials as a calibration reference for biomaterials analytics. The reference materials under investigation comprise thin layers of fluorophores embedded in polymer matrices. These layers are highly homogeneous in their fluorescence response, with cumulative variations not exceeding 1% over the field of view (1.5 x 1.1 mm). An automated and reproducible measurement methodology, enabling sufficient correction for measurement artefacts, is reported. The measurement setup is equipped with an autofocus system, ensuring that the measured film quality is not artificially inflated by out-of-focus reduction of the system modulation transfer function. The quantitative characterization method is suitable for the analysis of modified biomaterials, especially through patterned protein decoration. The imaging method presented here can be used to statistically analyze protein patterns, thereby increasing both precision and throughput. Further, the method can be developed to include a reference emitter and detector pair on the image surface of the reference object, in order to provide traceable measurements.
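A common way to use such a homogeneous fluorescent reference layer is flat-field (shading) correction: divide each image by the unit-mean bias map recorded from the reference film. A minimal sketch assuming NumPy and fully synthetic images; the function name and the vignetting model are illustrative, not the authors' procedure:

```python
import numpy as np

def flat_field_correct(raw, reference):
    """Divide a raw fluorescence image by the unit-mean bias map obtained
    from the image of a homogeneous fluorescent reference film."""
    gain = reference.astype(float) / reference.mean()  # unit-mean bias map
    return raw.astype(float) / gain

# Synthetic example: a uniform sample imaged through a vignetting bias.
y, x = np.mgrid[0:64, 0:64]
bias = 1.0 - 0.4 * (((x - 32) ** 2 + (y - 32) ** 2) / 32.0 ** 2)
truth = np.full((64, 64), 100.0)
raw = truth * bias          # biased image of the uniform sample
reference = 50.0 * bias     # image of the homogeneous reference film

corrected = flat_field_correct(raw, reference)
# The spatial bias is removed: the corrected image is flat again.
print(float(raw.std()) > 1.0, float(corrected.std()) < 1e-6)  # True True
```

The correction only equalizes the bias pattern; absolute intensity calibration would still need a known reference level.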

  10. Some experiments on the high-low transition of quartz; Recherches experimentales sur une transformation du quartz

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, G [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1959-12-15

    First section. - We present, on the one hand, a theory of the specific heat, thermal expansion and variation of the elastic constants with temperature, which is applicable only in the absence of transformation phenomena affecting the symmetry or periodicity of the crystal lattice. On the other hand, we discuss some theories relating to the phenomena which accompany phase transformations. Second section. - We have gathered together numerical results concerning the elastic, piezoelectric and optical properties of quartz. Some were collected from the literature; others were obtained in our laboratories with the help of experimental methods which we describe. As a result, we are able to present a complete picture of the evolution of these constants over a large temperature range containing the critical temperature of 574 deg. C, at which these constants exhibit discontinuities. New phenomena were observed in the course of these studies. Third section. - We show that the evolution of the two piezoelectric and elastic constants which cancel out in the high-temperature form is described by the same function. With the inclusion of one other function, it is possible to explain quantitatively the behaviour in the transformation range of all the other constants under study. With the help of crystallographic considerations and of hypotheses concerning the nature of the transformation entropy, we finally try to account for the experimental values of these two functions. (author) [French] Dans une premiere partie, nous exposons d'une part une theorie de la chaleur specifique, de la dilatation thermique et des variations des constantes elastiques des solides avec la temperature qui n'est valable qu'en l'absence de phenomenes de transformation affectant la symetrie ou la periodicite de l'edifice cristallin, et nous rappelons d'autre part quelques theories relatives aux phenomenes qui accompagnent les changements de phase. 
Dans une seconde partie, nous avons rassemble un grand

  11. Quantitative imaging of magnetic nanoparticles by magneto-relaxometric tomography for biomedical applications

    International Nuclear Information System (INIS)

    Liebl, Maik

    2016-01-01

    Current biomedical research focuses on the development of novel biomedical applications based on magnetic nanoparticles (MNPs), e.g. for local cancer treatment. These therapy approaches employ MNPs as remotely controlled drug carriers or local heat generators. Since the location and quantity of MNPs determine drug enrichment and heat production, quantitative knowledge of the MNP distribution inside a body is essential for the development and success of these therapies. Magnetorelaxometry (MRX) is capable of providing such quantitative information based on the specific response of the MNPs after switching off an applied magnetic field. Applying a uniform (homogeneous) magnetic field to an MNP distribution and measuring the MNP response with multiple sensors at different locations allows for spatially resolved MNP quantification. However, to reconstruct the MNP distribution from this spatially resolved MRX data, an ill-posed inverse problem has to be solved. So far, the solution of this problem was stabilized by incorporating a priori knowledge in the forward model, e.g. by setting priors on the vertical position of the distribution using a 2D reconstruction grid, or on the number and geometry of the MNP sources inside the body. MRX tomography represents a novel approach for quantitative 3D imaging of MNPs, in which the inverse solution is stabilized by a series of MRX measurements. In MRX tomography, only parts of the MNP distribution are sequentially magnetized by means of inhomogeneous magnetic fields. Each magnetization step is followed by detection of the response of the corresponding part of the distribution by multiple sensors. The 3D reconstruction of the MNP distribution is then accomplished by a joint evaluation of the distinct MRX measurement series. In this thesis the first experimental setup for MRX tomography was developed for quantitative 3D imaging of biomedical MNP distributions. 
It is based on a multi-channel magnetizing unit which has been engineered to
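The ill-posed linear inversion described above, relating sensor readings to voxel-wise MNP amounts through a forward model b = A·c, is commonly stabilized with Tikhonov regularization. A toy sketch with a random, hypothetical sensitivity matrix, not the thesis's actual forward model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Forward model: m sensor readings are linear in the n voxel MNP amounts,
# b = A @ c.  Sequentially applying inhomogeneous excitation fields stacks
# additional rows into A, which is what stabilizes the inversion.
n_voxels, n_meas = 20, 60
A = rng.uniform(0.0, 1.0, size=(n_meas, n_voxels))
c_true = np.zeros(n_voxels)
c_true[[3, 7, 12]] = [5.0, 2.0, 1.0]            # sparse MNP distribution
b = A @ c_true + rng.normal(0.0, 1e-3, n_meas)  # noisy measurements

# Tikhonov-regularized least squares: min ||A c - b||^2 + lam * ||c||^2
lam = 1e-4
c_est = np.linalg.solve(A.T @ A + lam * np.eye(n_voxels), A.T @ b)

print(bool(np.allclose(c_est, c_true, atol=0.1)))  # True
```

With too few independent excitations the normal matrix becomes near-singular and the regularization term, or stronger priors, must carry more of the burden.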

  12. Elastic and inelastic scattering of 2 to 10 MeV protons by lithium isotopes; Diffusion elastique et inelastique des protons de 2 a 10 MeV par les isotopes du lithium

    Energy Technology Data Exchange (ETDEWEB)

    Laurat, M [Commissariat a l' Energie Atomique, Bruyeres-le-Chatel (France). Centre d' Etudes

    1969-07-01

    A description is given of the experimental set-up which has been devised for carrying out spectrometric and absolute cross-section measurements on the reactions induced by protons accelerated in a 12 MeV Van de Graaff Tandem. The particles are detected by silicon junctions; the weight of the targets (about ten {mu}g/cm{sup 2}) is determined by the quartz method. The experimental equipment has been controlled by a study of proton scattering by lithium-6, and has made it possible to evaluate the elastic and inelastic scattering (1. level excitation) by lithium 7 of 2 to 9 MeV protons. The most probable spin and parity values for the six levels of {sup 8}Be between 19 and 25 MeV excitation energy have been determined from a knowledge of the observed structure. (author) [French] Nous decrivons le dispositif experimental mis au point pour effectuer les mesures de spectrometrie et de section efficace absolue pour les reactions induites par des protons acceleres par un Van de Graaff Tandem 12 MeV. Les particules sont detectees par des jonctions au silicium, le poids des cibles (de l'ordre d'une dizaine de {mu}g/cm{sup 2}), mesure par la methode du quartz. L'ensemble de l'appareillage a ete controle par l'etude de la diffusion des protons par le lithium 6, et nous a permis de preciser les diffusions elastiques et inelastiques (excitation du 1er niveau) des protons de 2 a 9 MeV par le lithium 7. La structure observee a permis de determiner les spin et parite les plus probables de six niveaux du {sup 8}Be entre 19 et 25 MeV d'energie d'excitation. (auteur)

  14. Laser ablation ICP-MS for quantitative biomedical applications

    International Nuclear Information System (INIS)

    Konz, Ioana; Fernandez, Beatriz; Fernandez, M.L.; Pereiro, Rosario; Sanz-Medel, Alfredo

    2012-01-01

    LA-ICP-MS allows precise, relatively fast, and spatially resolved measurements of elements and isotope ratios at trace and ultratrace concentration levels with minimal sample preparation. Over the past few years this technique has undergone rapid development, and it has been increasingly applied in many different fields, including biological and medical research. The analysis of essential, toxic, and therapeutic metals, metalloids, and nonmetals in biomedical tissues is a key task in the life sciences today, and LA-ICP-MS has proven to be an excellent complement to the organic MS techniques that are much more commonly employed in the biomedical field. In order to provide an appraisal of the fast progress that is occurring in this field, this review critically describes new developments for LA-ICP-MS as well as the most important applications of LA-ICP-MS, with particular emphasis placed on the quantitative imaging of elements in biological tissues, the analysis of heteroatom-tagged proteins after their separation and purification by gel electrophoresis, and the analysis of proteins that do not naturally have ICP-MS-detectable elements in their structures, thus necessitating the use of labelling strategies. (orig.)

  15. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    Science.gov (United States)

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and in Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis methods have been widely used in acupuncture and TCM clinical trials, while between-group quantitative analyses of clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences was discussed based on data from RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis, while qualitative analysis can serve as a secondary criterion. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.
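The contrast between the two analysis styles can be illustrated on synthetic data: a between-group quantitative analysis compares mean symptom-score changes (here via a Welch t statistic), while a qualitative analysis dichotomizes patients into responders and compares proportions. All numbers below are invented for illustration:

```python
import math
from statistics import mean, variance

# Hypothetical pain-score reductions (0-10 scale) in two trial arms.
acupuncture = [3.1, 2.4, 4.0, 2.8, 3.5, 1.9, 3.3, 2.7]
control     = [1.2, 1.8, 0.9, 2.1, 1.5, 1.1, 1.7, 1.4]

def welch_t(a, b):
    """Between-group quantitative analysis: Welch's t statistic on the
    raw symptom-score changes (unequal variances, n-1 sample variance)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

def responder_rate(scores, threshold=2.0):
    """Qualitative analysis: dichotomize each patient into responder
    (score reduction >= threshold) or non-responder."""
    return sum(s >= threshold for s in scores) / len(scores)

t = welch_t(acupuncture, control)
rate_a, rate_c = responder_rate(acupuncture), responder_rate(control)
print(round(t, 2), rate_a, rate_c)  # 5.52 0.875 0.125
```

Dichotomizing discards information (a 1.9-point drop counts the same as no change), which is one argument for keeping the quantitative comparison primary.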

  16. Quantitative Phase Imaging Techniques for the Study of Cell Pathophysiology: From Principles to Applications

    Directory of Open Access Journals (Sweden)

    Hyunjoo Park

    2013-03-01

    A cellular-level study of pathophysiology is crucial for understanding the mechanisms behind human diseases. Recent advances in quantitative phase imaging (QPI) techniques show promise for a cellular-level understanding of the pathophysiology of diseases. To provide important insight into how QPI techniques can potentially improve the study of cell pathophysiology, we present here the principles of QPI and highlight some of its recent applications, ranging from cell homeostasis to infectious diseases and cancer.

  17. Effects of ROI definition and reconstruction method on quantitative outcome and applicability in a response monitoring trial

    International Nuclear Information System (INIS)

    Krak, Nanda C.; Boellaard, R.; Hoekstra, Otto S.; Hoekstra, Corneline J.; Twisk, Jos W.R.; Lammertsma, Adriaan A.

    2005-01-01

    Quantitative measurement of tracer uptake in a tumour can be influenced by a number of factors, including the method of defining regions of interest (ROIs) and the reconstruction parameters used. The main purpose of this study was to determine the effects of different ROI methods on quantitative outcome, using two reconstruction methods and the standard uptake value (SUV) as a simple quantitative measure of FDG uptake. Four commonly used methods of ROI definition (manual placement, fixed dimensions, threshold based and maximum pixel value) were used to calculate SUV (SUV_MAN, SUV_15mm, SUV_50, SUV_75 and SUV_max, respectively) and to generate "metabolic" tumour volumes. Test-retest reproducibility of SUVs and of "metabolic" tumour volumes and the applicability of ROI methods during chemotherapy were assessed. In addition, SUVs calculated on ordered subsets expectation maximisation (OSEM) and filtered back-projection (FBP) images were compared. ROI definition had a direct effect on quantitative outcome. On average, SUV_MAN, SUV_15mm, SUV_50 and SUV_75 were respectively 48%, 27%, 34% and 15% lower than SUV_max when calculated on OSEM images. No statistically significant differences were found between SUVs calculated on OSEM and FBP reconstructed images. Highest reproducibility was found for SUV_15mm and SUV_MAN (ICC 0.95 and 0.94, respectively) and for "metabolic" volumes measured with the manual and 50% threshold ROIs (ICC 0.99 for both). Manual, 75% threshold and maximum pixel ROIs could be used throughout therapy, regardless of changes in tumour uptake or geometry. SUVs showed the same trend in relative change in FDG uptake after chemotherapy, irrespective of the ROI method used. The method of ROI definition has a direct influence on quantitative outcome. In terms of simplicity, user-independence, reproducibility and general applicability the threshold-based and fixed dimension methods are the best ROI methods. Threshold methods are in
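As background, SUV is the tissue activity concentration normalized by the injected dose per unit body weight, and a threshold ROI keeps the voxels above a fraction of the maximum. A minimal sketch with hypothetical numbers, not the study's data:

```python
import numpy as np

def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    """Standard uptake value: tissue activity concentration divided by
    the injected dose per gram of body weight (decay-corrected inputs)."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

# Hypothetical tumour slice (activity concentration per voxel, Bq/ml).
img = np.array([[1000.0,  2000.0,  1500.0],
                [3000.0, 20000.0, 16000.0],
                [2500.0, 18000.0,  4000.0]])
dose, weight = 370e6, 75e3          # 370 MBq injected, 75 kg patient

suv_img = suv(img, dose, weight)
suv_max = float(suv_img.max())      # maximum-pixel "ROI"

# Threshold-based ROI: mean SUV over voxels above 50% of the maximum.
roi_50 = suv_img >= 0.5 * suv_max
suv_50 = float(suv_img[roi_50].mean())

print(round(suv_max, 2), round(suv_50, 2))  # 4.05 3.65
```

The averaging over the thresholded region is why SUV_50 sits below SUV_max, consistent with the ordering reported in the abstract.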

  18. Quantitative imaging of magnetic nanoparticles by magneto-relaxometric tomography for biomedical applications; Quantitative Bildgebung magnetischer Nanopartikel mittels magnetrelaxometrischer Tomographie fuer biomedizinische Anwendungen

    Energy Technology Data Exchange (ETDEWEB)

    Liebl, Maik

    2016-11-18

    Current biomedical research focuses on the development of novel biomedical applications based on magnetic nanoparticles (MNPs), e.g. for local cancer treatment. These therapy approaches employ MNPs as remotely controlled drug carriers or local heat generators. Since the location and quantity of MNPs determine drug enrichment and heat production, quantitative knowledge of the MNP distribution inside a body is essential for the development and success of these therapies. Magnetorelaxometry (MRX) is capable of providing such quantitative information based on the specific response of the MNPs after switching off an applied magnetic field. Applying a uniform (homogeneous) magnetic field to an MNP distribution and measuring the MNP response with multiple sensors at different locations allows for spatially resolved MNP quantification. However, to reconstruct the MNP distribution from this spatially resolved MRX data, an ill-posed inverse problem has to be solved. So far, the solution of this problem was stabilized by incorporating a priori knowledge in the forward model, e.g. by setting priors on the vertical position of the distribution using a 2D reconstruction grid, or on the number and geometry of the MNP sources inside the body. MRX tomography represents a novel approach for quantitative 3D imaging of MNPs, in which the inverse solution is stabilized by a series of MRX measurements. In MRX tomography, only parts of the MNP distribution are sequentially magnetized by means of inhomogeneous magnetic fields. Each magnetization step is followed by detection of the response of the corresponding part of the distribution by multiple sensors. The 3D reconstruction of the MNP distribution is then accomplished by a joint evaluation of the distinct MRX measurement series. In this thesis the first experimental setup for MRX tomography was developed for quantitative 3D imaging of biomedical MNP distributions. 
It is based on a multi-channel magnetizing unit which has been engineered to

  19. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron-absorbing honeycomb collimator. With the collimator set between the object and the imaging system, neutrons scattered in the object are absorbed by the honeycomb material and eliminated before reaching the imaging system, while neutrons transmitted through the object without interaction can reach it. The image formed by purely transmitted neutrons carries the quantitative information. Two honeycombs, coated with boron nitride and gadolinium oxide respectively, were prepared and evaluated for quantitative application. The relation between the neutron total cross section and the attenuation coefficient confirmed that they were in fairly good agreement. Application to quantitative computed tomography was also conducted successfully. The new neutron radiography method using the neutron-absorbing honeycomb collimator to eliminate scattered neutrons remarkably improved the quantitativeness of neutron radiography and computed tomography. (author)
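The quantitative link the abstract mentions, between the neutron total cross section and the attenuation coefficient, is mu = N * sigma_total, with the purely transmitted beam following the Beer-Lambert law. A sketch with illustrative values; the material and cross section are hypothetical:

```python
import math

AVOGADRO = 6.022e23

def attenuation_coefficient(sigma_total_barn, density_g_cm3, molar_mass_g):
    """mu = N * sigma_total, N being the number density of scatterers."""
    n_per_cm3 = AVOGADRO * density_g_cm3 / molar_mass_g
    return n_per_cm3 * sigma_total_barn * 1e-24   # barn -> cm^2

def transmission(mu_per_cm, thickness_cm):
    """Beer-Lambert law for the purely transmitted beam: I/I0 = exp(-mu*t)."""
    return math.exp(-mu_per_cm * thickness_cm)

# Illustrative water-like material with a hypothetical thermal-neutron
# total cross section per molecule.
mu = attenuation_coefficient(sigma_total_barn=103.0,
                             density_g_cm3=1.0, molar_mass_g=18.0)
trans = transmission(mu, 0.5)
print(round(mu, 2), round(trans, 2))  # 3.45 0.18
```

Scattered neutrons reaching the detector add to the transmitted signal and break this exponential law, which is exactly what the honeycomb collimator is meant to prevent.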

  20. {pi}{sup -}-p proton scattering at 516, 616, 710, 887 and 1085 MeV (1961); Diffusion de protons {pi}{sup -}-p aux energies de 516, 616, 710, 887 et 1085 MeV (1961)

    Energy Technology Data Exchange (ETDEWEB)

    Barloutaud, R; Choquet, C; Gaillard, J M; Heughebaert, J; Leveque, A; Lehmann, P; Meyer, J; Revel, D [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires; Grard, F; Heughebaert, J [I.I.S.N., Lab. des Hautes Energies, Bruxelles (Belgium); Grard, F; Macleod, G; Montanet, L [Conseil Europeen pour la recherche nucleaire, Lab. europeen pour la physique des particules, Geneve (Switzerland)

    1961-07-01

    {pi}{sup -}-p collisions at energies of 516, 616, 710, 887 and 1085 MeV were observed by means of the 20 cm Saclay bubble chamber. Angular distributions for elastic scattering were obtained and analyzed. Total cross sections for elastic and inelastic scattering for {pi}{sup -}-p collisions and for the T = 1/2 state were determined. (authors) [French] Nous avons etudie des collisions entre pions negatifs et protons aux energies de 516, 616, 710, 887 et 1085 MeV, au moyen de la chambre a bulles de 20 cm de Saclay. Les distributions angulaires de diffusion elastique ont ete obtenues et analysees. Nous avons determine les sections efficaces totales pour les diffusions elastiques et inelastiques {pi}{sup -}-p et pour ces processus dans l'etat T = 1/2. (auteurs)

  1. Proceedings First Workshop on Quantitative Formal Methods : theory and applications (QFM'09, Eindhoven, The Netherlands, November 3, 2009)

    NARCIS (Netherlands)

    Andova, S.; McIver, A.; D'Argenio, P.R.; Cuijpers, P.J.L.; Markovski, J.; Morgan, C.; Núñez, M.

    2009-01-01

    This volume contains the papers presented at the 1st Workshop on Quantitative Formal Methods: Theory and Applications, held in Eindhoven on 3 November 2009 as part of the International Symposium on Formal Methods 2009. It contains the final versions of all accepted contributions.

  2. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses, using the dispersive properties of light in both the spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves continuous acquisition of both intensity and phase images at a throughput of more than 10,000 particles or cells per second, overcoming the speed limitations of conventional quantitative phase imaging methods. The applications enabled by such capabilities are versatile and include the characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. The Application of the Photographic Plate to the Quantitative Determination of Activities by Track Counts

    International Nuclear Information System (INIS)

    Broda, E.

    1946-01-01

    This report was written by E. Broda at the Cavendish Laboratory (Cambridge) in August 1946 and is about the application of the photographic plate to the quantitative determination of activities by track counts. This report includes the experiment description and the discussion of the results and consists of 4 parts: 1) Introduction 2) Estimation of Concentrations 3) The uptake of U in different conditions 4) The upper limits of the fission Cross sections of Bi and Pb. (nowak)

  4. The Application of the Photographic Plate to the Quantitative Determination of Activities by Track Counts

    Energy Technology Data Exchange (ETDEWEB)

    Broda, E.

    1946-07-01

    This report was written by E. Broda at the Cavendish Laboratory (Cambridge) in August 1946 and is about the application of the photographic plate to the quantitative determination of activities by track counts. This report includes the experiment description and the discussion of the results and consists of 4 parts: 1) Introduction 2) Estimation of Concentrations 3) The uptake of U in different conditions 4) The upper limits of the fission Cross sections of Bi and Pb. (nowak)

  5. Effects of Single and Combined Application of Organic and Biological Fertilizers on Quantitative and Qualitative Yield of Anisum (Pimpinella anisum)

    Directory of Open Access Journals (Sweden)

    N Kamayestani

    2015-07-01

    In order to study the effects of single and combined applications of biofertilizers and organic fertilizers on quantitative and qualitative characteristics of anisum (Pimpinella anisum), an experiment was conducted based on a randomized complete block design with three replications and fifteen treatments at the Research Station, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2011. The treatments were: (1) mycorrhiza (Glomus intraradices), (2) mycorrhiza + cow manure, (3) mycorrhiza + vermicompost, (4) mycorrhiza + compost, (5) mycorrhiza + chemical fertilizer, (6) biosulfur (Thiobacillus sp. + bentonite), (7) biosulfur + chemical fertilizer, (8) biosulfur + cow manure, (9) biosulfur + vermicompost, (10) biosulfur + compost, (11) cow manure, (12) vermicompost, (13) chemical fertilizer (NPK), (14) compost and (15) control. The results showed that the fertilizer treatments had a significant effect on most characteristics of anisum. The highest number of seeds per umbellet (7.24) and the highest economic yield (1263.4 kg/ha) were obtained from the biosulfur treatment. The highest dry matter yield (4504.1 kg/ha) resulted from the combined application of biosulfur + chemical fertilizer, and the highest harvest index (25.97%) was observed with biosulfur + cow manure. Combined applications of mycorrhiza affected some qualitative traits: the highest number of umbels per plant (65.7), 1000-seed weight (3.24 g) and essential oil percentage (5.3%) resulted from the combined application of mycorrhiza + chemical fertilizer. In general, it can be concluded that the application of organic and biological fertilizers, particularly mycorrhiza and biosulfur, had a significant effect on improving the quantitative and qualitative characteristics of anisum. Furthermore, the combined application of organic and biological fertilizers had greater positive effects than their single application.

  6. Quantitative real-time PCR approaches for microbial community studies in wastewater treatment systems: applications and considerations.

    Science.gov (United States)

    Kim, Jaai; Lim, Juntaek; Lee, Changsoo

    2013-12-01

    Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, aiding a better understanding of the complexity of wastewater microbial communities. Although qPCR can provide more specific and accurate quantification than other molecular techniques, it has limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and the limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is, and will continue to be, increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results. Copyright © 2013 Elsevier Inc. All rights reserved.
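The standard-curve quantification underlying most qPCR studies can be sketched briefly: fit Cq against log10 of the standard's copy number, invert the fit for unknown samples, and derive the amplification efficiency from the slope. The dilution series below is invented for illustration:

```python
import math

def fit_standard_curve(copies, cq):
    """Least-squares fit of Cq = slope * log10(copies) + intercept."""
    x = [math.log10(n) for n in copies]
    mx, my = sum(x) / len(x), sum(cq) / len(cq)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, cq))
    slope = sxy / sxx
    return slope, my - slope * mx

def quantify(cq_sample, slope, intercept):
    """Invert the standard curve to estimate the sample's copy number."""
    return 10 ** ((cq_sample - intercept) / slope)

def amplification_efficiency(slope):
    """E = 10^(-1/slope) - 1; perfect doubling per cycle gives E = 1."""
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical 10-fold dilution series of a gene standard (copies, Cq).
copies = [1e7, 1e6, 1e5, 1e4, 1e3]
cq     = [14.1, 17.4, 20.8, 24.1, 27.5]

slope, intercept = fit_standard_curve(copies, cq)
eff = amplification_efficiency(slope)
estimate = quantify(22.5, slope, intercept)
print(round(slope, 2), round(eff, 2))  # -3.35 0.99
```

Several of the limitations listed in the abstract (extraction efficiency, copy number variation) act as multiplicative biases on `estimate` that the standard curve alone cannot remove.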

  7. Some experiments on the high-low transition of quartz; Recherches experimentales sur une transformation du quartz

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, G. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1959-12-15

    First section. - We present, on the one hand, a theory of the specific heat, thermal expansion and variation of the elastic constants with temperature, which is applicable only in the absence of transformation phenomena affecting the symmetry or periodicity of the crystal lattice. On the other hand, we discuss some theories relating to the phenomena which accompany phase transformations. Second section. - We have gathered together numerical results concerning the elastic, piezoelectric and optical properties of quartz. Some were collected from the literature; others were obtained in our laboratories with the help of experimental methods which we describe. As a result, we are able to present a complete picture of the evolution of these constants over a large temperature range containing the critical temperature of 574 deg. C, at which these constants exhibit discontinuities. New phenomena were observed in the course of these studies. Third section. - We show that the evolution of the two piezoelectric and elastic constants which cancel out in the high-temperature form is described by the same function. With the inclusion of one other function, it is possible to explain quantitatively the behaviour in the transformation range of all the other constants under study. With the help of crystallographic considerations and of hypotheses concerning the nature of the transformation entropy, we finally try to account for the experimental values of these two functions. (author) [French] Dans une premiere partie, nous exposons d'une part une theorie de la chaleur specifique, de la dilatation thermique et des variations des constantes elastiques des solides avec la temperature qui n'est valable qu'en l'absence de phenomenes de transformation affectant la symetrie ou la periodicite de l'edifice cristallin, et nous rappelons d'autre part quelques theories relatives aux phenomenes qui accompagnent les changements de phase. Dans une seconde partie

  8. Applicability of a set of tomographic reconstruction algorithms for quantitative SPECT on irradiated nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson Svärd, Staffan, E-mail: staffan.jacobsson_svard@physics.uu.se; Holcombe, Scott; Grape, Sophie

    2015-05-21

    A fuel assembly operated in a nuclear power plant typically contains 100–300 fuel rods, depending on fuel type, which become strongly radioactive during irradiation in the reactor core. For operational and security reasons, it is of interest to deduce rod-wise information from the fuel experimentally, preferably by means of non-destructive measurements. The tomographic SPECT technique offers such possibilities through its two-step application: (1) recording the gamma-ray flux distribution around the fuel assembly, and (2) reconstructing the assembly's internal source distribution from the recorded radiation field. In this paper, algorithms for performing the latter step and extracting quantitative relative rod-by-rod data are presented. Compared with SPECT in nuclear medicine, nuclear fuel assemblies present a much more heterogeneous distribution of internal attenuation to gamma radiation than the human body, typically with rods containing pellets of heavy uranium dioxide surrounded by cladding of a zirconium alloy placed in water or air. This inhomogeneity severely complicates the tomographic quantification of the rod-wise relative source content, and deducing conclusive data requires detailed modelling of the attenuation in the reconstructions. However, as shown in this paper, simplified models may still produce valuable information about the fuel. Here, a set of reconstruction algorithms for SPECT on nuclear fuel assemblies is described and discussed in terms of quantitative performance for two applications: verification of fuel assemblies' completeness in nuclear safeguards, and rod-wise fuel characterization. It is argued that a request not to base the former assessment on any a priori information constrains which reconstruction methods may be used in that case, whereas the use of a priori information on geometry and material content enables highly accurate quantitative
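The role of attenuation modelling in quantitative reconstruction can be illustrated with a toy linear model: each detector reading sums the rod activities along a ray, weighted by geometry and by attenuation, and inverting the attenuated system matrix recovers the rod-wise content while ignoring attenuation biases it. The geometry and attenuation factors below are hypothetical, not the paper's algorithms:

```python
import numpy as np

# Toy rod-wise emission tomography with 4 rods and 8 rays.
geometry = np.array([[1, 1, 0, 0],
                     [0, 1, 1, 0],
                     [0, 0, 1, 1],
                     [1, 0, 0, 1],
                     [1, 0, 1, 0],
                     [0, 1, 0, 1],
                     [1, 1, 1, 0],
                     [0, 1, 1, 1]], dtype=float)
activities_true = np.array([1.0, 0.8, 1.2, 0.0])   # last rod missing

# Each rod on a ray's path attenuates the ray by a factor of 0.55.
attenuation = 0.55 ** geometry.sum(axis=1, keepdims=True)
A = geometry * attenuation
b = A @ activities_true                 # noise-free detector readings

# Quantitative reconstruction: invert the attenuated system matrix.
s_att, *_ = np.linalg.lstsq(A, b, rcond=None)
# Naive reconstruction ignoring attenuation: clearly biased.
s_naive, *_ = np.linalg.lstsq(geometry, b, rcond=None)

print(bool(np.allclose(s_att, activities_true)))              # True
print(bool(np.allclose(s_naive, activities_true, atol=0.1)))  # False
```

Even this crude model shows why completeness verification (is the last rod empty?) tolerates simplified attenuation models better than absolute rod-wise characterization does.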

  9. Validity of spherical quantitative refractometry: application to laser-produced plasmas

    International Nuclear Information System (INIS)

    Benattar, R.; Popovics, C.

    1983-01-01

    We report an experimental laser technique of quantitative Schlieren imaging of spherical plasmas combined with streak camera recording. We show that quantitative refractometry applies for small values of refraction angles, i.e., when the law giving the refraction angle versus the impact parameter of rays passing through the plasma is a linearly decreasing function

  10. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  11. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    Full Text Available The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment of web-based applications, while giving users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The method builds on the OWASP Foundation's work on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork. It was tested in vulnerability assessment activities on real production systems, and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
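    The simulation idea in this abstract, discrete uniform assumptions on every risk attribute followed by many Monte Carlo rounds, can be sketched as follows. The factor counts and scales are placeholders loosely modeled on the OWASP-style scheme (risk = likelihood × impact), not the author's exact model:

```python
import random

# Hedged sketch of the Monte Carlo approach described above. Each risk
# attribute is drawn from a discrete uniform distribution (the "no better
# knowledge" assumption), and the rounds build an empirical risk distribution.
random.seed(1)

def one_round():
    # Four hypothetical likelihood factors and four impact factors, each 0-9,
    # averaged as in the OWASP risk rating scheme.
    likelihood = sum(random.randint(0, 9) for _ in range(4)) / 4.0
    impact = sum(random.randint(0, 9) for _ in range(4)) / 4.0
    return likelihood * impact

N = 100_000  # the paper reports 1.5 million rounds; fewer suffice for a sketch
scores = sorted(one_round() for _ in range(N))
median = scores[N // 2]
p95 = scores[int(N * 0.95)]
print(f"median risk {median:.1f}, 95th percentile {p95:.1f} (scale 0-81)")
```

    Reporting percentiles of the simulated distribution, rather than a single guessed score, is what lets users match the assessment to their risk appetite.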

  12. Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture

    Science.gov (United States)

    Thomas J. Dean

    1999-01-01

    Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring the identification and application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average-live crown ratio and relative stand density...

  13. Application of quantitative and qualitative methods for determination ...

    African Journals Online (AJOL)

    This article covers the issues of integration of qualitative and quantitative methods applied when justifying management decision-making in companies implementing lean manufacturing. The authors defined goals and subgoals and justified the evaluation criteria which lead to the increased company value if achieved.

  14. Quantitative valuation of platform technology based intangibles companies

    OpenAIRE

    Achleitner, Ann-Kristin; Nathusius, Eva; Schraml, Stephanie

    2007-01-01

    In the course of raising external equity, e.g. from venture capitalists, a quantitative valuation is usually required for entrepreneurial ventures. This paper examines the challenges of quantitatively valuing platform technology based entrepreneurial ventures. The distinct characteristics of such companies pose specific requirements on the applicability of quantitative valuation methods. The entrepreneur can choose from a wide range of potential commercialization strategies to pursue in the c...

  15. A combined usage of stochastic and quantitative risk assessment methods in the worksites: Application on an electric power provider

    International Nuclear Information System (INIS)

    Marhavilas, P.K.; Koulouriotis, D.E.

    2012-01-01

    An individual method cannot build either a realistic forecasting model or a risk assessment process for worksites; future perspectives should focus on a combined forecasting/estimation approach. The main purpose of this paper is to gain insight into a risk prediction and estimation methodological framework that combines three different methods: the proportional quantitative-risk-assessment technique (PRAT), the time-series stochastic process (TSP), and the method of estimating societal risk (SRE) by F–N curves. In order to prove the usefulness of the combined usage of stochastic and quantitative risk assessment methods, an application to an electric power provider is presented, using empirical data.

  16. [The study of tomato fruit weight quantitative trait locus and its application in genetics teaching].

    Science.gov (United States)

    Wang, Hai-yan

    2015-08-01

    The classical research cases, which have greatly promoted the development of genetics in history, can be combined with the content of courses in genetics teaching to train students' ability of scientific thinking and genetic analysis. The localization and clone of gene controlling tomato fruit weight is a pioneer work in quantitative trait locus (QTL) studies and represents a complete process of QTL research in plants. Application of this integrated case in genetics teaching, which showed a wonderful process of scientific discovery and the fascination of genetic research, has inspired students' interest in genetics and achieved a good teaching effect.

  17. Quantitative 3-D imaging topogrammetry for telemedicine applications

    Science.gov (United States)

    Altschuler, Bruce R.

    1994-01-01

    The technology to reliably transmit high-resolution visual imagery over short to medium distances in real time has led to serious consideration of the use of telemedicine, telepresence, and telerobotics in the delivery of health care. These concepts may involve, and evolve toward: consultation from remote expert teaching centers; diagnosis; triage; real-time remote advice to the surgeon; and real-time remote surgical instrument manipulation (telerobotics with virtual reality). Further extrapolation leads to teledesign and telereplication of spare surgical parts through quantitative teleimaging of 3-D surfaces tied to CAD/CAM devices and an artificially intelligent archival database of 'normal' shapes. The ability to generate 'topograms' or 3-D surface numerical tables of coordinate values capable of creating computer-generated virtual holographic-like displays, machine part replication, and statistical diagnostic shape assessment is critical to the progression of telemedicine. Any virtual reality simulation will remain in the 'video-game' realm until realistic dimensional and spatial relational inputs from real measurements in vivo during surgeries are added to an ever-growing statistical data archive. The challenges of managing and interpreting this 3-D database, which would include radiographic and surface quantitative data, are considerable. As technology drives toward dynamic and continuous 3-D surface measurements, presenting millions of X, Y, Z data points per second of flexing, stretching, moving human organs, the knowledge base and interpretive capabilities of 'brilliant robots' able to work as a surgeon's tireless assistants become imaginable. The brilliant robot would 'see' what the surgeon sees--and more, for the robot could quantify its 3-D sensing and would 'see' in a wider spectral range than humans, and could zoom its 'eyes' from the macro world to long-distance microscopy. Unerring robot hands could rapidly perform machine-aided suturing with

  18. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)

  19. Elastic and inelastic {alpha}-scattering cross-sections obtained with the 44 MeV fixed energy Saclay cyclotron on separated targets of {sup 24}Mg, {sup 25}Mg, {sup 26}Mg, {sup 40}Ca, {sup 46}Ti, {sup 48}Ti, {sup 50}Ti, {sup 52}Cr, {sup 54}Fe, {sup 56}Fe, {sup 58}Fe, {sup 58}Ni, {sup 60}Ni, {sup 62}Ni, {sup 64}Ni, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 112}Sn, {sup 114}Sn, {sup 116}Sn, {sup 118}Sn, {sup 120}Sn, {sup 122}Sn, {sup 124}Sn and {sup 208}Pb using the Saclay fixed-energy cyclotron; Sections efficaces differentielles elastiques et inelastiques obtenues par diffusion de particules {alpha} de 44 MeV sur des cibles de {sup 24}Mg, {sup 25}Mg, {sup 26}Mg, {sup 40}Ca, {sup 46}Ti, {sup 48}Ti, {sup 50}Ti, {sup 52}Cr, {sup 54}Fe, {sup 56}Fe, {sup 58}Fe, {sup 58}Ni, {sup 60}Ni, {sup 62}Ni, {sup 64}Ni, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 112}Sn, {sup 114}Sn, {sup 116}Sn, {sup 118}Sn, {sup 120}Sn, {sup 122}Sn, {sup 124}Sn et {sup 208}Pb au cyclotron a energie fixe de saclay

    Energy Technology Data Exchange (ETDEWEB)

    Bruge, G [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires. Departement de physique nucleaire, service de physique nucleaire a moyenne energie

    1967-01-01

    This report contains elastic and inelastic {alpha}-scattering cross-sections obtained with the 44 MeV fixed energy Saclay cyclotron on Mg, Ca, Ti, Cr, Fe, Ni, Co, Zn, Sn and Pb enriched targets. (author) [French] Ce rapport contient les tableaux des sections efficaces differentielles obtenues par diffusion elastique et inelastique des particules {alpha} de 44 MeV, fournies par le cyclotron a energie fixe de Saclay, sur des cibles d'isotopes separes de Mg, Ca, Ti, Cr, Fe, Ni, Co, Zn, Sn et Pb. (auteur)

  20. Application of quantitative DTI metrics in sporadic CJD

    Directory of Open Access Journals (Sweden)

    E. Caverzasi

    2014-01-01

    Full Text Available Diffusion Weighted Imaging is extremely important for the diagnosis of probable sporadic Jakob–Creutzfeldt disease, the most common human prion disease. Although visual assessment of DWI MRI is critical diagnostically, a more objective, quantifiable approach might more precisely identify the pattern of brain involvement. Furthermore, a quantitative, systematic tracking of MRI changes occurring over time might provide insights regarding the underlying histopathological mechanisms of human prion disease and provide information useful for clinical trials. The purposes of this study were: (1) to describe quantitatively the average cross-sectional pattern of reduced mean diffusivity, fractional anisotropy, atrophy and T1 relaxation in the gray matter (GM) in sporadic Jakob–Creutzfeldt disease, (2) to study changes in mean diffusivity and atrophy over time and (3) to explore their relationship with clinical scales. Twenty-six sporadic Jakob–Creutzfeldt disease subjects and nine control subjects had MRIs on the same scanner; seven sCJD subjects had a second scan after approximately two months. Cortical and subcortical gray matter regions were parcellated with Freesurfer. Average cortical thickness (or subcortical volume), T1-relaxation and mean diffusivity from co-registered diffusion maps were calculated in each region for each subject. Quantitatively on cross-sectional analysis, certain brain regions were preferentially affected by reduced mean diffusivity (parietal and temporal lobes, posterior cingulate, thalamus and deep nuclei), but with relative sparing of the frontal and occipital lobes. Serial imaging surprisingly showed that mean diffusivity did not have a linear or unidirectional reduction over time, but tended to decrease initially and then reverse and increase towards normalization. Furthermore, there was a strong correlation between worsening of patient clinical function (based on modified Barthel score) and increasing mean diffusivity.

  1. Parametric biomedical imaging - what defines the quality of quantitative radiological approaches?

    International Nuclear Information System (INIS)

    Glueer, C.C.; Barkmann, R.; Bolte, H.; Heller, M.; Hahn, H.K.; Dicken, V.; Majumdar, S.; Eckstein, F.; Nickelsen, T.N.

    2006-01-01

    Quantitative parametric imaging approaches provide new perspectives for radiological imaging. These include quantitative 2D, 3D, and 4D visualization options along with the parametric depiction of biological tissue properties and tissue function. This allows the interpretation of radiological data from a biochemical, biomechanical, or physiological perspective. Quantification permits the detection of small changes that are not yet visually apparent, thus allowing application in early disease diagnosis and monitoring therapy with enhanced sensitivity. This review outlines the potential of quantitative parametric imaging methods and demonstrates this on the basis of a few exemplary applications. One field of particular interest, the use of these methods for investigational new drug application studies, is presented. Assessment criteria for judging the quality of quantitative imaging approaches are discussed in the context of the potential and the limitations of these methods. While quantitative parametric imaging methods do not replace but rather supplement established visual interpretation methods in radiology, they do open up new perspectives for diagnosis and prognosis and in particular for monitoring disease progression and therapy. (orig.)

  2. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14–17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in fields such as: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  3. Results in pion proton scattering near the higher resonances (1961); Resultats pour la diffusion des mesons pi par les protons dans le domaine des hautes resonances (1962)

    Energy Technology Data Exchange (ETDEWEB)

    Falk-Vairant, P; Valladas, G [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1961-07-01

    We present briefly the available information on the total cross sections for pion proton scattering in the energy region from 400 MeV to 1.5 GeV. We also have collected all results on total cross sections for particular channels like elastic scattering, inelastic scattering and charge exchange. Using new results on the total cross section for neutral events, we have plotted separately the cross sections for elastic and for inelastic scattering in the T = 1/2 state. (authors) [French] On presente brievement les donnees connues concernant la section efficace totale pour la diffusion des mesons pi par les protons dans le domaine d'energie de 400 MeV a 1,5 GeV. On a egalement rassemble tous les resultats concernant les sections efficaces totales pour des canaux particuliers: diffusion elastique, diffusion inelastique et echange de charge. En partant des nouveaux resultats sur la section efficace pour la diffusion elastique et inelastique dans l'etat T = 1/2. (auteurs)

  4. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    Science.gov (United States)

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
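    Approach (ii) above can be illustrated numerically: the laboratory's proficiency-test differences from the participant means yield a bias and a spread, which combine into an expanded uncertainty. The data and the combination rule (|bias| + 2·sd, one common recipe with coverage factor k = 2) are illustrative assumptions, not values from the article:

```python
import statistics

# Sketch of approach (ii): estimate a blood-alcohol laboratory's measurement
# uncertainty from its history of proficiency-test results. Values invented.
lab_results = [0.081, 0.152, 0.099, 0.203, 0.118]  # g/100 mL, this laboratory
group_means = [0.080, 0.150, 0.102, 0.200, 0.120]  # consensus of participants

diffs = [a - b for a, b in zip(lab_results, group_means)]
bias = statistics.mean(diffs)   # systematic offset from the consensus
sd = statistics.stdev(diffs)    # spread of the differences
# One common expanded-uncertainty recipe: U = |bias| + 2*sd (k = 2).
U = abs(bias) + 2 * sd
print(f"bias {bias:+.4f}, sd {sd:.4f}, expanded U = {U:.4f} g/100 mL")
```

    As the abstract notes, such an estimate is empirical and reflects comparability between laboratories rather than absolute accuracy.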

  5. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    Science.gov (United States)

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the solubility of MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into the LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the values of the test line and MET when its concentration ranges from 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with the HCG antibody; the MWCNT-conjugated method is 10 times more sensitive than the one conjugated with classical colloidal gold nanoparticles. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection, which can be used in forensic analysis.

  6. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  7. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  8. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.) [de
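    The physics behind such a determination can be sketched with the Beer–Lambert law: measured neutron transmission through the sample inverts to a boron areal density once the absorption cross-section is known (boron's strong thermal-neutron absorption is dominated by B-10). The numbers below are illustrative assumptions, not calibration data from the paper:

```python
import math

# Illustrative only: thermal-neutron transmission follows
#   T = exp(-sigma * N_areal),
# so a transmission measured on the radiograph inverts to the B-10 areal
# density after calibration against known standards.
sigma_B10 = 3840e-24   # cm^2 (~3840 barn), thermal absorption of B-10
T_measured = 0.60      # transmission through the sample, from the image

N_areal = -math.log(T_measured) / sigma_B10  # B-10 atoms per cm^2
print(f"B-10 areal density = {N_areal:.3e} atoms/cm^2")
```

    In practice the calibration experiments mentioned in the abstract would anchor this inversion, since beam spectrum and scattering shift the effective cross-section.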

  9. Quantitative phase imaging using quadri-wave lateral shearing interferometry. Application to X-ray domain

    International Nuclear Information System (INIS)

    Rizzi, Julien

    2013-01-01

    Since Roentgen discovered X-rays, X-ray imaging systems have been based on absorption contrast. This technique is inefficient for weakly absorbing objects: standard X-ray radiography can detect bone lesions but cannot detect ligament lesions. Phase contrast imaging, however, can overcome this limitation. Since the 2000s, relying on earlier work by opticians, X-ray scientists have been developing phase-sensitive devices compatible with industrial applications such as medical imaging and non-destructive testing. Standard architectures for interferometry are challenging to implement in the X-ray domain, which is why grating-based interferometers became the most promising devices for envisioning industrial applications; they provided the first X-ray phase contrast images of living human samples. Nevertheless, current grating-based architectures require the use of at least two gratings and are challenging to adapt to an industrial product. The aim of my thesis was therefore to develop a single-phase-grating interferometer. I demonstrated that such a device can provide achromatic and propagation-invariant interference patterns. I used this interferometer to perform quantitative phase contrast imaging of a biological fossil sample and X-ray mirror metrology. (author)

  10. Quantitative nuclear medicine imaging: application of computers to the gamma camera and whole-body scanner

    International Nuclear Information System (INIS)

    Budinger, T.F.

    1974-01-01

    The following topics are reviewed: properties of computer systems for nuclear medicine quantitation; quantitative information concerning the relation between organ isotope concentration and detected projections of the isotope distribution; quantitation using two conjugate views; three-dimensional reconstruction from projections; quantitative cardiac radioangiography; and recent advances leading to quantitative nuclear medicine of clinical importance. (U.S.)

  11. Quantitative analysis of the pendulum test: application to multiple sclerosis patients treated with botulinum toxin.

    Science.gov (United States)

    Bianchi, L; Monaldi, F; Paolucci, S; Iani, C; Lacquaniti, F

    1999-01-01

    The aim of this study was to develop quantitative analytical methods in the application of the pendulum test to both normal and spastic subjects. The lower leg was released by a torque motor from different starting positions. The resulting changes in the knee angle were fitted by means of a time-varying model. Stiffness and viscosity coefficients were derived for each half-cycle oscillation in both flexion and extension, and for all knee starting positions. This method was applied to the assessment of the effects of Botulinum toxin A (BTX) in progressive multiple sclerosis patients in a follow-up study. About half of the patients showed a significant decrement in stiffness and viscosity coefficients.
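    The parameter-extraction step can be sketched as follows: if the lower leg is modeled as a damped pendulum, I·θ̈ + B·θ̇ + K·θ = 0, then stiffness K and viscosity B follow from a least-squares fit of the recorded angle trajectory. The constant-coefficient model and the values below are stand-ins for real recordings, not the authors' time-varying model:

```python
import numpy as np

# Synthetic knee-drop data with known parameters (assumed, illustrative).
I, B_true, K_true = 0.35, 0.9, 12.0  # inertia, viscosity, stiffness
dt = 0.001
t = np.arange(0, 2.0, dt)

# Integrate the damped pendulum numerically (semi-implicit Euler).
x, v = np.empty_like(t), np.empty_like(t)
x[0], v[0] = 0.8, 0.0                # released from 0.8 rad, at rest
for k in range(len(t) - 1):
    a = -(B_true * v[k] + K_true * x[k]) / I
    v[k + 1] = v[k] + a * dt
    x[k + 1] = x[k] + v[k + 1] * dt

# Recover B and K by least squares on the model  -I*a = B*v + K*x.
a = np.gradient(v, dt)
coef, *_ = np.linalg.lstsq(np.column_stack([v, x]), -I * a, rcond=None)
B_est, K_est = coef
print(f"viscosity = {B_est:.2f}, stiffness = {K_est:.2f}")
```

    The study's refinement is to run such a fit per half-cycle and per starting position, so the coefficients can vary across flexion and extension.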

  12. Stable isotope dimethyl labelling for quantitative proteomics and beyond

    Science.gov (United States)

    Hsu, Jue-Liang; Chen, Shu-Hui

    2016-01-01

    Stable-isotope reductive dimethylation, a cost-effective, simple, robust, reliable and easy-to-multiplex labelling method, is widely applied to quantitative proteomics using liquid chromatography-mass spectrometry. This review focuses on biological applications of stable-isotope dimethyl labelling for a large-scale comparative analysis of protein expression and post-translational modifications based on its unique properties of the labelling chemistry. Some other applications of the labelling method for sample preparation and mass spectrometry-based protein identification and characterization are also summarized. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644970

  13. Effect of Biofertilizers Application on the Quantitative and Qualitative Characteristics of Linseed (Linum usitatissimum L.) Lines

    Directory of Open Access Journals (Sweden)

    B. Motalebizadeh

    2015-09-01

    Full Text Available In order to investigate the effect of bio-fertilizers on the yield and yield components of flax lines, a study was conducted during the 2010 growing season at the Agricultural Research Station of Saatlo in Urmia. A split plot design based on randomized complete blocks with four replications was used. The main factor (a) was the fertilizer application form (a1 = control without nitrogen fertilizer, a2 = nitrogen fertilizer, a3 = nitroxin + N, a4 = phosphate barvar 2 + N, and a5 = nitroxin + phosphate barvar 2 + N) and the sub factor (b) was five lines of oily flax (b1 = 97-26, b2 = 97-14, b3 = 97-3, b4 = 97-21, b5 = 97-19). Quantitative and qualitative traits such as number of sub stems, leaf weight, capsule weight per main stem and sub stems, seed yield, and oil and protein content were measured or estimated. Results showed that the main factor (fertilizer form) had a significant effect (at the α = 0.01 probability level) on all the parameters studied in this experiment. The sub factor (linseed lines) and the interaction between the two factors had statistically significant effects on all traits. The highest seed yield (4781 kg ha-1) and the highest seed oil content (36.5%) were obtained from applying nitroxin + phosphate barvar 2 + N on the 97-14 and 97-3 lines. Results showed that the use of nitroxin and phosphate barvar 2 biofertilizers could be effective in increasing grain yield of linseed. Therefore, these biofertilizers could be applied to improve soil physio-chemical properties and to increase quantitative and qualitative yield parameters of linseed.

  14. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  15. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields
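    The dose bookkeeping underlying this approach is simple charge integration: for singly charged ions, the integrated beam current divided by the elementary charge gives the number of implanted ions, and hence the implanted mass. A minimal sketch with invented beam parameters (ion species, current, and time are illustrative):

```python
# Minimal sketch of quantitative ion implantation dose accounting.
# Integrating the (singly charged) ion beam current gives the ion count,
# and hence the implanted mass. All beam numbers below are illustrative.
E_CHARGE = 1.602176634e-19  # C, elementary charge
AVOGADRO = 6.02214076e23    # 1/mol

current_A = 2.0e-6          # measured beam current, 2 microamps
time_s = 100.0              # integration time
molar_mass = 115.0          # g/mol, e.g. a heavy ion species (assumed)

n_ions = current_A * time_s / E_CHARGE
mass_g = n_ions / AVOGADRO * molar_mass
print(f"{n_ions:.3e} ions implanted, about {mass_g * 1e9:.1f} ng")
```

    This is exactly the submicrogram portioning regime the study targets; the report's resputtering model then corrects the count for ions lost at higher doses.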

  16. UV SPECTROPHOTOMETRY APPLICATION FOR QUANTITATIVE DETERMINATION OF VINPOCETINE IN DRUG FORMULATIONS

    Directory of Open Access Journals (Sweden)

    J. V. Monaykina

    2014-12-01

    procedure was successfully applied for the analysis of two new pharmaceutical formulations. The results obtained with the proposed procedure were statistically analyzed. Validation studies of the methods confirmed their proper precision and recovery (for cream: 99.73%, RSD = 0.924%, n = 9; for suppositories: 100.3%, RSD = 0.378%, n = 9) and linearity (for cream: r = 0.9999, n = 6; for suppositories: r = 0.9998, n = 6). The parameters obtained enable the use of the developed methods in quantitative pharmaceutical analysis. Conclusions. The applicability of the new procedure is well established by the vinpocetine assay in the new drug formulations, namely 0.01 suppositories and 0.5% nasal cream. The developed UV-spectrophotometric methods are potentially useful because of their simplicity, rapidity and accuracy. The methods are valid according to the validation requirements of the Ukrainian Pharmacopeia.
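    Validation figures of the kind quoted here (mean recovery and relative standard deviation over n replicates) follow from elementary statistics. A minimal sketch, using hypothetical replicate values rather than the study's data:

```python
import statistics

def recovery_and_rsd(measured, nominal):
    """Mean recovery (%) and relative standard deviation (%) for a set
    of replicate assay results against the nominal (true) content."""
    recoveries = [100.0 * m / nominal for m in measured]
    mean_rec = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, rsd

# nine hypothetical replicate determinations of a 10.0 mg/g preparation
replicates = [9.97, 10.02, 9.95, 10.05, 9.99, 10.01, 9.96, 10.03, 9.98]
rec, rsd = recovery_and_rsd(replicates, 10.0)
```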

  17. Applications and limitations of quantitative sacroiliac joint scintigraphy

    International Nuclear Information System (INIS)

    Goldberg, R.P.; Genant, H.K.; Shimshak, R.; Shames, D.

    1978-01-01

    Evaluation of sacroiliac joint pathology by quantitative analysis of radionuclide bone scanning has been advocated as a useful technique. We have examined this technique in 61 patients and controls. The procedure was useful in detecting early sacroiliitis but was of limited value in patients with advanced sacroiliac joint findings radiographically. False-positive values were found in patients with metabolic bone disease or structural abnormalities of the low back. Normative data must be determined for each laboratory.

  18. Application of quantitative real-time PCR compared to filtration methods for the enumeration of Escherichia coli in surface waters within Vietnam.

    Science.gov (United States)

    Vital, Pierangeli G; Van Ha, Nguyen Thi; Tuyet, Le Thi Hong; Widmer, Kenneth W

    2017-02-01

    Surface water samples in Vietnam were collected from the Saigon River, rural and suburban canals, and urban runoff canals in Ho Chi Minh City, and were processed to enumerate Escherichia coli. Quantification was done by membrane filtration and by quantitative real-time polymerase chain reaction (PCR). In the dry season, mean E. coli counts for river/suburban canals and urban canals were log 2.8 and log 3.7 colony-forming units (CFU)/100 ml, respectively, using the membrane filtration method, and log 2.4 and log 2.8, respectively, using TaqMan quantitative real-time PCR. For the wet season, the membrane filtration method gave mean counts of log 3.7 and log 4.1 for river/suburban canal and urban canal samples, respectively, while quantitative PCR gave log 3 and log 2. Notably, the wet-season quantitative PCR counts for the urban canal samples were significantly lower than those determined by conventional culture methods. These results show that while quantitative real-time PCR can be used to determine levels of fecal indicator bacteria in surface waters, there are some limitations to its application, and it may be affected by sources of runoff, based on the surveyed samples.

  19. Cellular Phone-Based Image Acquisition and Quantitative Ratiometric Method for Detecting Cocaine and Benzoylecgonine for Biological and Forensic Applications

    OpenAIRE

    Cadle, Brian A.; Rasmus, Kristin C.; Varela, Juan A.; Leverich, Leah S.; O’Neill, Casey E.; Bachtell, Ryan K.; Cooper, Donald C.

    2010-01-01

    Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA) is an automated method requiring end-users to utilize inexpensive (~ $1 USD/each) immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is descri...
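    The abstract above is truncated, but the ratiometric idea behind QRPDA (a background-subtracted band density of a test line, expressed relative to a control line so the result is independent of camera exposure) can be sketched as follows. The pixel values and background level are invented for illustration; the authors' actual pipeline is not reproduced here.

```python
def ratiometric_density(test_band, control_band, background):
    """Background-subtracted mean pixel intensity of the test band,
    expressed as a ratio to the control band (QRPDA-style readout)."""
    def mean(px):
        return sum(px) / len(px)
    t = mean(test_band) - background
    c = mean(control_band) - background
    if c <= 0:
        raise ValueError("control band not above background")
    return t / c

# hypothetical 8-bit pixel intensities sampled from a scanned test strip
ratio = ratiometric_density([120, 118, 122, 121], [180, 182, 178, 180], 40)
```

    Because both bands are imaged in the same frame, dividing by the control band cancels illumination and gain differences between phone cameras, which is what makes the strips comparable across heterogeneous consumer devices.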

  20. ANALYSIS AND QUANTITATIVE ASSESSMENT FOR RESULTS OF EDUCATIONAL PROGRAMS APPLICATION BY MEANS OF DIAGNOSTIC TESTS

    Directory of Open Access Journals (Sweden)

    E. L. Kon

    2015-07-01

    Subject of Research. The relevance of the problem of creating, controlling and assessing the results of competence-oriented educational programs is formulated and substantiated. Elements and components of competences, assembled into modules, course units and parts of an educational program, are defined as the objects of control. Specific tasks of proficiency examination for competences and their components are stated, and the subject matter of the paper is formulated. Methods of Research. Several statements and methods adapted from engineering science are applied to the tasks of control, decoding and assessment of education results. An approach to the quantitative assessment of testing results using an additive integral-differential assessment criterion is proposed. Main Results. Statements are formulated and proved that define the conditions for certain and uncertain (indeterminate) decision-making about proficiency in the discipline components controlled by a test, based on the test results. The probabilistic characteristics of both decision-making variants are estimated. Variants of applying deterministic and fuzzy-logic mathematical methods to reduce decision-making indeterminacy are proposed, and a direction for further research is selected: the development of methods and algorithms for decoding the results of a set of diagnostic tests. Practical Relevance. It is shown that the proposed approach to the quantitative assessment of testing results makes it possible to automate the formation and analysis of education results specified in the competence format.

  1. Approaches to quantitative risk assessment with applications to PP

    International Nuclear Information System (INIS)

    Geiger, G.; Schaefer, A.

    2002-01-01

    Experience with accidents such as Goiania in Brazil, and indications of a considerable number of orphan sources, suggest that improved protection would be desirable for some widely used types of radioactive material, such as radiation sources for civil purposes. Given the large potential health and economic consequences (particularly if terrorist attacks cannot be excluded), the significant costs of preventive actions, and the large uncertainties about both the likelihood of occurrence and the potential consequences of PP safety and security incidents, an optimum balance between preventive and mitigative efforts is likely to be a key issue for successful risk management in this field. Possible violations of physical protection, combined with threats of misuse of nuclear materials including terrorist attack, thus pose considerable challenges to global security from various perspectives. In view of these challenges, recent advances in applied risk and decision analysis suggest methodological and procedural improvements in quantitative risk assessment, the demarcation of acceptable risk, and risk management. These advances are based on a recently developed model of optimal risky choice suitable for assessing and comparing the cumulative probability distribution functions attached to safety and security risks. Besides the quantification of risk (e.g., in economic terms), the standardization of various risk assessment models frequently used in operations research can be approached on this basis. The paper explores possible applications of these improved methods to the safety and security management of nuclear materials, the cost efficiency of risk management measures, and the establishment of international safety and security standards for PP. Examples are presented that are based on selected scenarios of misuse involving typical radioactive sources. (author)

  2. APPLICATION OF UV-SPECTROPHOTOMETRY FOR THE QUANTITATIVE DETERMINATION OF CAPTOPRIL IN DRUG

    Directory of Open Access Journals (Sweden)

    Yu. V. Monaykina

    2015-04-01

    formulations. The results obtained by applying the proposed method were statistically analyzed. Validation of the method confirmed its proper precision and recovery (for gel: 100.2%, RSD = 0.572%, n = 9; for suppositories: 99.87%, RSD = 0.420%, n = 9) and linearity (for gel: r = 0.9978, n = 6; for suppositories: r = 0.9982, n = 6). The parameters obtained enable the developed procedure to be used in quantitative pharmaceutical analysis. Conclusions. The applicability of the new procedure is well established by the assay of the new captopril drug formulations: 0.05 suppositories and 2.5% nasal gel. The developed UV-spectrophotometric method is potentially useful because of its simplicity, rapidity and accuracy. The procedure is valid according to the validation requirements of the Ukrainian Pharmacopeia.

  3. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems using annular sources are widely used in view of their simplicity, wide availability, relatively low price for the complete system, and good overall performance with respect to accuracy and detection limits. However, some problems arise when fundamental-parameter techniques are used for quantitative analysis. These problems are due to the fact that such systems operate with large solid angles for the incoming and emerging radiation, so that neither the incident nor the take-off angle is trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF setups and in predicting the response of inhomogeneous samples. A study of the sensitivity of the results to sample characteristics and a comparison with experimentally determined values for the incident and take-off angles are also presented. A flexible and user-friendly computer program was developed in order to perform the lengthy calculations involved efficiently. (author). 14 refs. 5 figs
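    The core of such an MC integration can be sketched very simply: sample source positions uniformly by area over the annulus and average the incidence angle they subtend at a point on the sample axis. This toy version ignores the intensity/solid-angle weighting and the extended sample that the paper's program handles; all dimensions are invented for illustration.

```python
import math, random

def effective_incident_angle(r_in, r_out, h, n=200_000, seed=1):
    """Monte Carlo estimate of the mean incidence angle (measured from
    the sample normal) for excitation from an annular source with inner
    radius r_in and outer radius r_out, a height h above a point on the
    sample axis. Unweighted average; a realistic version would weight
    each ray by its solid angle and source intensity."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # uniform sampling by area over the annulus
        r = math.sqrt(rng.uniform(r_in**2, r_out**2))
        total += math.atan2(r, h)
    return math.degrees(total / n)

angle = effective_incident_angle(r_in=8.0, r_out=12.0, h=15.0)
```

    The resulting effective angle lies between the extremes subtended by the inner and outer rims, which is exactly the quantity needed to replace the "trivial" single-angle assumption in fundamental-parameter calculations.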

  4. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-ray diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
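    The Internal Standard method listed first reduces, in its simplest form, to a linear relation between intensity ratio and weight-fraction ratio: I_a / I_s = k (x_a / x_s), with k fixed by calibration against mixtures of known composition. A minimal sketch with invented numbers (spike fraction, calibration constant and intensities are illustrative, not from the paper):

```python
def internal_standard_fraction(i_analyte, i_standard, x_standard, k):
    """Internal-standard quantitative XRD: the intensity ratio of an
    analyte line to a spiked-standard line is proportional to the ratio
    of their weight fractions, I_a / I_s = k * (x_a / x_s), so
    x_a = (I_a / I_s) * x_s / k with k from a calibration curve."""
    return (i_analyte / i_standard) * x_standard / k

# hypothetical: 10% fluorite spike (x_s = 0.10), calibration constant k = 1.25
x_quartz = internal_standard_fraction(i_analyte=450.0, i_standard=300.0,
                                      x_standard=0.10, k=1.25)
```

    The attraction of the method, compared with an external standard, is that matrix absorption affects the analyte and spike lines almost identically and so cancels in the ratio.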

  5. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.

  6. Real-time quantitative PCR of Staphylococcus aureus and application in restaurant meals.

    Science.gov (United States)

    Berrada, H; Soriano, J M; Mañes, J; Picó, Y

    2006-01-01

    Staphylococcus aureus is considered the second most common pathogen to cause outbreaks of food poisoning, exceeded only by Campylobacter. Consumption of foods containing this microorganism is often identified as the cause of illness. In this study, a rapid, reliable, and sensitive real-time quantitative PCR was developed and compared with conventional culture methods. Real-time quantitative PCR was carried out by purifying DNA extracts of S. aureus with a Staphylococcus sample preparation kit and quantifying them in the LightCycler system with hybridization probes. The assay was linear over a range of 10 to 10^6 S. aureus cells (r^2 > 0.997), and the PCR reaction had an efficiency of >85%. Accuracy of the PCR-based assay, expressed as percent bias, was around 13%, and the precision, expressed as percent coefficient of variation, was 7 to 10%. Intraday and interday variability, studied at 10^2 CFU/g, were 12 and 14%, respectively. The proposed method was applied to the analysis of 77 samples of restaurant meals in Valencia (Spain). S. aureus was detected in 11.6% of the samples by real-time quantitative PCR as well as by the conventional microbiological method. An excellent correspondence between real-time quantitative PCR and microbiological counts (CFU/g) was observed, with deviations of <28%.
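    Quantification in such an assay rests on a standard curve relating threshold cycle to the log of the starting cell number, Ct = slope * log10(N) + intercept; the slope also implies the amplification efficiency (a perfect doubling per cycle gives slope -3.32). The slope, intercept and Ct below are hypothetical, not the paper's fitted values:

```python
def cells_from_ct(ct, slope, intercept):
    """Invert a real-time PCR standard curve Ct = slope*log10(N) + intercept
    to estimate the starting cell (or genome-copy) number N."""
    return 10 ** ((ct - intercept) / slope)

def pcr_efficiency(slope):
    """Amplification efficiency implied by the standard-curve slope;
    efficiency 1.0 means perfect doubling each cycle (slope -3.32)."""
    return 10 ** (-1.0 / slope) - 1.0

# hypothetical curve fitted over the 10..10^6 cell linear range
n = cells_from_ct(ct=24.5, slope=-3.45, intercept=38.0)
eff = pcr_efficiency(-3.45)
```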

  7. Determination of correction coefficients for quantitative analysis by mass spectrometry. Application to uranium impurities analysis; Recherche des coefficients de correction permettant l'analyse quantitative par spectrometrie de masse. Application a l'analyse d'impuretes dans l'uranium

    Energy Technology Data Exchange (ETDEWEB)

    Billon, J P [Commissariat a l' Energie Atomique, Bruyeres-le-Chatel (France). Centre d' Etudes

    1970-07-01

    Some basic principles of spark-source mass spectrometry are recalled, and it is shown that, provided a number of precautions are taken, the method can be used for quantitative analysis. Assuming a time-constant relation between the analysed solid sample and the ion beam it produces, we first determined experimental relative sensitivity factors for impurities in uranium matrices. Since these first practical results were in fairly good agreement with a simple theory of the ionization yield in the spark source, we then studied the direct application of the theoretically derived relative sensitivity factors, again to uranium matrices. (author)

  8. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    Science.gov (United States)

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, using qualitative methods special benefits may arise if researchers strive to identify and organize unknown information aspects (inductive purpose). Particularly, quantitative research methods require a high degree of standardization and transparency of the research process. Furthermore, a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be regarded if researchers aim to select or combine those approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  9. Potential application of microfocus X-ray techniques for quantitative analysis of bone structure

    International Nuclear Information System (INIS)

    Takahashi, Kenta

    2006-01-01

    With the progress of micro-focus X-ray computed tomography (micro-CT), it has become possible to evaluate bone structure quantitatively and three-dimensionally. The advantages of micro-CT are that sample preparation is not required and that it provides not only two-dimensional parameters but also three-dimensional stereological indices. This study was carried out to evaluate the potential application of micro-focus X-ray techniques for quantitative analysis of the new bone produced inside the hollow chamber of an experimental titanium miniature implant. Twenty-five male Wistar rats (9 weeks of age) received an experimental titanium miniature implant containing a hollow chamber in the left femur. The rats were sacrificed, and the femurs excised, at 4 or 8 weeks after implantation. Micro-CT analysis was performed on the femur samples and the volume of the new bone induced in the hollow chamber of the implant was calculated. Percentages of new bone area on the undecalcified histological slides were also measured, and linear regression analysis was carried out to evaluate the correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. New bone formation occurred in the hollow chamber of the implant; its volume was measured by micro-CT, and the percentage of new bone area within the chamber was calculated on the undecalcified slides. Linear regression analysis showed a high correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. Consequently, the new bone produced inside the hollow chamber of the experimental titanium miniature implant could be quantified three-dimensionally by micro-CT, and the precision of this measurement was supported by the high correlation with the conventional two-dimensional measurement of histological slides. (author)
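    The correlation underpinning the validation is an ordinary Pearson coefficient between paired pixel counts. A minimal sketch, with invented paired counts standing in for the study's measurements:

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between paired measurements (e.g. new-bone
    pixel counts from micro-CT images vs. undecalcified histology)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# hypothetical paired pixel counts from five samples
ct_pixels    = [1200, 1850, 2400, 3100, 3900]
histo_pixels = [1150, 1900, 2350, 3200, 3800]
r = pearson_r(ct_pixels, histo_pixels)
```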

  10. Quantitative reliability assessment for safety critical system software

    International Nuclear Information System (INIS)

    Chung, Dae Won; Kwon, Soon Man

    2005-01-01

    An essential issue in the replacement of old analogue I&C systems with computer-based digital systems in nuclear power plants is the quantitative software reliability assessment. Software reliability models have been successfully applied to many industrial applications, but have the unfortunate drawback of requiring failure data from which one can formulate a model. Software developed for safety-critical applications is frequently unable to produce such data, for at least two reasons: first, the software is frequently one-of-a-kind, and second, it rarely fails. Safety-critical software is normally expected to pass every unit test, producing precious little failure data. The basic premise of the rare-events approach is that well-tested software does not fail under normal routine and input signals, which means that failures must be triggered by unusual input data and computer states. The failure data found under reasonable testing cases and testing times for these conditions should be considered for the quantitative reliability assessment. In this paper we present a quantitative reliability assessment methodology for safety-critical software in such rare-failure cases.

  11. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method
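    One of the listed advantages is control of the false discovery rate when flagging image regions. The paper's control is built into its Bayesian functional mixed model; the classical Benjamini-Hochberg step-up procedure sketched below is shown only to illustrate the FDR idea itself, on invented p-values, and is not the authors' method:

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Benjamini-Hochberg step-up procedure: sort p-values, find the
    largest rank k with p_(k) <= q*k/m, and declare the k smallest
    p-values significant. Returns the indices of significant tests."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= q * rank / m:
            k = rank
    return sorted(order[:k])

# seven hypothetical per-region p-values
sig = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.74])
```

    Note the step-up logic: a region can be declared significant even if its own p-value fails the threshold at its rank, as long as some larger rank passes.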

  12. Water volume quantitation using nuclear magnetic resonance imaging: application to cerebrospinal fluid

    International Nuclear Information System (INIS)

    Lecouffe, P.; Huglo, D.; Dubois, P.; Rousseau, J.; Marchandise, X.

    1990-01-01

    Quantitation in proton NMR imaging is applied to cerebrospinal fluid (CSF). Total intracranial CSF volume was measured by Condon's method: the CSF signal was compared with the signal of a distilled-water standard in a single sagittal thick slice. The brain signal was reduced to a minimum using a 5000/360/400 sequence. Software constraints did not permit easy implementation on the imager, so the uniformity correction was performed on a microcomputer. Accuracy was better than 4%. Total intracranial CSF was found to be between 91 and 164 ml in 5 healthy volunteers. Extraventricular CSF quantitation appears much improved by this method, but planimetric methods seem better for quantifying ventricular CSF. The technique is compared to total lung water measurement from proton density according to Mac Lennan's method. Water volume quantitation confirms the ability of NMR imaging to quantify biologic parameters, but image defects have to be characterized by strict quality control.
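    The volume estimate in a Condon-style measurement reduces to a signal ratio: the integrated CSF signal in the thick slice divided by the signal produced per millilitre of the distilled-water standard under the same acquisition. A one-line sketch with hypothetical signal values (arbitrary units, not from the study):

```python
def csf_volume_ml(csf_signal_sum, water_signal_per_ml):
    """Condon-style volume estimate: total CSF signal in the thick slice
    divided by the signal produced by 1 ml of the distilled-water
    standard under the same acquisition (assumes uniformity-corrected
    signal proportional to water volume)."""
    return csf_signal_sum / water_signal_per_ml

# hypothetical integrated signals, landing inside the reported 91-164 ml range
v = csf_volume_ml(csf_signal_sum=2.46e6, water_signal_per_ml=2.0e4)
```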

  13. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, is not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
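    "Optimal scaling" in this setting has a closed form: when data are known only in relative units, the least-squares scale factor between model prediction m and data d is s* = sum(m*d) / sum(m*m), and the fitness objective is the residual sum of squares after that scaling. A minimal sketch with invented values (this is one common formulation, not necessarily the exact one the review adopts):

```python
def optimal_scale(model, data):
    """Least-squares scale factor s minimizing sum((s*m - d)^2),
    used to compare a model with data measured only in relative units:
    s* = sum(m*d) / sum(m*m)."""
    num = sum(m * d for m, d in zip(model, data))
    den = sum(m * m for m in model)
    return num / den

def scaled_sse(model, data):
    """Fitness value after optimal scaling (parameter-estimation objective)."""
    s = optimal_scale(model, data)
    return sum((s * m - d) ** 2 for m, d in zip(model, data))

# model predictions vs. fluorescence-intensity data in arbitrary units
model = [1.0, 2.0, 3.0, 4.0]
data  = [2.1, 3.9, 6.2, 8.0]
s = optimal_scale(model, data)
```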

  14. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  15. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  16. Miniaturization of Fresnel lenses for solar concentration: a quantitative investigation.

    Science.gov (United States)

    Duerr, Fabian; Meuret, Youri; Thienpont, Hugo

    2010-04-20

    Sizing down the dimensions of solar concentrators for photovoltaic applications offers a number of promising advantages. It provides thinner modules and smaller solar cells, which reduces thermal issues. In this work a plane Fresnel lens design is introduced that is first analyzed with geometrical optics. Because of miniaturization, pure ray tracing may no longer be valid to determine the concentration performance. Therefore, a quantitative wave optical analysis of the miniaturization's influence on the obtained concentration performance is presented. This better quantitative understanding of the impact of diffraction in microstructured Fresnel lenses might help to optimize the design of several applications in nonimaging optics.
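    A rough criterion for when "pure ray tracing may no longer be valid" is the Fresnel number N_F = a^2 / (lambda * L): geometrical optics is a reasonable approximation only when N_F >> 1. The sketch below compares a conventional-scale and a miniaturized lens at the same f-number; the dimensions and wavelength are illustrative, not the paper's design values:

```python
def fresnel_number(aperture_radius_mm, wavelength_nm, focal_length_mm):
    """Fresnel number N_F = a^2 / (lambda * L). At fixed f-number,
    shrinking the aperture reduces N_F linearly with the lens size,
    pushing the design toward the diffraction-dominated regime."""
    a = aperture_radius_mm * 1e-3
    lam = wavelength_nm * 1e-9
    L = focal_length_mm * 1e-3
    return a * a / (lam * L)

# same f-number, two sizes, 550 nm design wavelength (illustrative)
big   = fresnel_number(aperture_radius_mm=50.0, wavelength_nm=550.0,
                       focal_length_mm=100.0)
small = fresnel_number(aperture_radius_mm=0.5, wavelength_nm=550.0,
                       focal_length_mm=1.0)
```

    The hundredfold size reduction lowers N_F by the same factor, which is why a wave-optical analysis of the concentration performance becomes necessary for the miniaturized design.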

  17. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    Science.gov (United States)

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel, improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r = 0.997). By additionally staining all proteins fluorescently immediately after their transfer to the blot membrane, it is possible to visualise the antibody binding and the total protein profile simultaneously, which allows an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting reproduces a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possibilities of employment that go far beyond this.

  18. Quantitative reconstruction from a single diffraction-enhanced image

    International Nuclear Information System (INIS)

    Paganin, D.M.; Lewis, R.A.; Kitchen, M.

    2003-01-01

    We develop an algorithm for using a single diffraction-enhanced image (DEI) to obtain a quantitative reconstruction of the projected thickness of a single-material sample embedded within a substrate of approximately constant thickness. The algorithm is used to quantitatively map inclusions in a breast phantom from a single synchrotron DEI image. In particular, the reconstructed images quantitatively represent the projected thickness in the bulk of the sample, in contrast to DEI images, which greatly emphasise sharp edges (high spatial frequencies). In the context of the ultimate aim of improved methods for breast cancer detection, the reconstructions are potentially of greater diagnostic value than the raw DEI data. Lastly, we point out that the methods of analysis presented here are also applicable to the quantitative analysis of differential interference contrast (DIC) images.

  19. Development and application of a quantitative multiplexed small GTPase activity assay using targeted proteomics.

    Science.gov (United States)

    Zhang, Cheng-Cheng; Li, Ru; Jiang, Honghui; Lin, Shujun; Rogalski, Jason C; Liu, Kate; Kast, Juergen

    2015-02-06

    Small GTPases are a family of key signaling molecules that are ubiquitously expressed in various types of cells. Their activity is often analyzed by western blot, which is limited by its multiplexing capability, the quality of isoform-specific antibodies, and the accuracy of quantification. To overcome these issues, a quantitative multiplexed small GTPase activity assay has been developed. Using four different binding domains, this assay allows the binding of up to 12 active small GTPase isoforms simultaneously in a single experiment. To accurately quantify the closely related small GTPase isoforms, a targeted proteomic approach, i.e., selected/multiple reaction monitoring, was developed, and its functionality and reproducibility were validated. This assay was successfully applied to human platelets and revealed time-resolved coactivation of multiple small GTPase isoforms in response to agonists and differential activation of these isoforms in response to inhibitor treatment. This widely applicable approach can be used for signaling pathway studies and inhibitor screening in many cellular systems.

  20. Quantitative microbiological risk assessment in food industry: Theory and practical application.

    Science.gov (United States)

    Membré, Jeanne-Marie; Boué, Géraldine

    2018-04-01

    The objective of this article is to bring scientific background as well as practical hints and tips to guide risk assessors and modelers who want to develop a quantitative Microbiological Risk Assessment (MRA) in an industrial context. MRA aims at determining the public health risk associated with biological hazards in a food. Its implementation in industry makes it possible to compare the efficiency of different risk reduction measures, and more precisely different operational settings, by predicting their effect on the final model output. The first stage in MRA is to clearly define the purpose and scope with stakeholders, risk assessors and modelers. Then, a probabilistic model is developed; this comprises, schematically, three important phases. Firstly, the model structure has to be defined, i.e. the connections between different operational processing steps. An important step in the food industry is thermal processing leading to microbial inactivation. Growth of heat-treated surviving microorganisms and/or post-process contamination during the storage phase is also important to take into account. Secondly, mathematical equations are determined to estimate the change of microbial load after each processing step. This phase includes the construction of model inputs by collecting data or eliciting expert opinion. Finally, the model outputs are obtained by simulation procedures; they have to be interpreted and communicated to targeted stakeholders. In this latter phase, tools such as what-if scenarios provide essential added value. These different MRA phases are illustrated through two examples covering important issues in industry. The first one covers process optimization in a food safety context, the second one covers shelf-life determination in a food quality context. Although both contexts required the same methodology, they do not have the same endpoint: up to human health in the foie gras case study illustrating a safety application, up to the food portion in the
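
The three phases outlined above (structure, per-step equations, simulated outputs) can be sketched as a tiny Monte Carlo model. All distributions and parameter values below are invented for illustration; they are not taken from the article.

```python
# Minimal Monte Carlo sketch of an MRA-style model: initial contamination,
# thermal inactivation, then growth during storage, all in log10 CFU/g.
# Distributions and parameters are hypothetical illustrations.
import random

def simulate_final_load(n_iter=10000):
    finals = []
    for _ in range(n_iter):
        log_n0 = random.gauss(2.0, 0.5)     # initial contamination
        log_red = random.gauss(5.0, 0.7)    # thermal inactivation (log10 reduction)
        growth = random.uniform(0.0, 1.5)   # growth during storage
        finals.append(log_n0 - log_red + growth)
    finals.sort()
    return finals[int(0.95 * n_iter)]       # 95th percentile of final load

print(simulate_final_load())
```

A what-if scenario is then just a rerun with a changed operational setting (e.g. a larger thermal log reduction) and a comparison of the output percentiles.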

  1. UK quantitative WB-DWI technical workgroup: consensus meeting recommendations on optimisation, quality control, processing and analysis of quantitative whole-body diffusion-weighted imaging for cancer.

    Science.gov (United States)

    Barnes, Anna; Alonzi, Roberto; Blackledge, Matthew; Charles-Edwards, Geoff; Collins, David J; Cook, Gary; Coutts, Glynn; Goh, Vicky; Graves, Martin; Kelly, Charles; Koh, Dow-Mu; McCallum, Hazel; Miquel, Marc E; O'Connor, James; Padhani, Anwar; Pearson, Rachel; Priest, Andrew; Rockall, Andrea; Stirling, James; Taylor, Stuart; Tunariu, Nina; van der Meulen, Jan; Walls, Darren; Winfield, Jessica; Punwani, Shonit

    2018-01-01

    Applications of whole-body diffusion-weighted MRI (WB-DWI) for oncology are rapidly increasing within both research and routine clinical domains. However, WB-DWI as a quantitative imaging biomarker (QIB) has seen significantly slower adoption. To date, challenges relating to accuracy and reproducibility, essential criteria for a good QIB, have limited widespread clinical translation. In recognition, a UK workgroup was established in 2016 to provide technical consensus guidelines (to maximise the accuracy and reproducibility of WB-MRI QIBs) and accelerate the clinical translation of quantitative WB-DWI applications for oncology. A panel of experts convened from cancer centres around the UK with subspecialty expertise in quantitative imaging and/or the use of WB-MRI with DWI. A formal consensus method was used to obtain agreement on best practice. Questions were asked about the appropriateness of scanner hardware and software, sequence optimisation, acquisition protocols, reporting, and ongoing quality control programs to monitor precision and accuracy. The panel was able to reach consensus on 73% (255/351) of items and, based on the consensus areas, made recommendations to maximise the accuracy and reproducibility of quantitative WB-DWI studies performed at 1.5T. The panel was unable to reach consensus on the majority of items related to quantitative WB-DWI performed at 3T. This UK Quantitative WB-DWI Technical Workgroup consensus provides guidance on maximising the accuracy and reproducibility of quantitative WB-DWI for oncology. The consensus guidance can be used by researchers and clinicians to harmonise WB-DWI protocols, which will accelerate the clinical translation of WB-DWI-derived QIBs.

  2. The method of quantitative X-ray microanalysis of fine inclusions in copper

    International Nuclear Information System (INIS)

    Morawiec, H.; Kubica, L.; Piszczek, J.

    1978-01-01

    The method of correction for the matrix effect in quantitative X-ray microanalysis was presented. The application of the method was discussed using the example of quantitative analysis of fine inclusions of Cu2S and Cu2O in copper. (author)

  3. Studying learning in the healthcare setting: the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples’ experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the

  4. Contribution to the elastic and inelastic scattering study polarized protons; Contribution a l'etude de la diffusion elastique et inelastique avec un faisceau de protons polarises

    Energy Technology Data Exchange (ETDEWEB)

    Swiniarski, R de [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-10-01

    The elastic and inelastic scattering of the 18.6 MeV polarized proton beam from the Saclay variable energy cyclotron has been studied for the following nuclei: {sup 48}Ti, {sup 50}Ti, {sup 52}Cr, {sup 54}Fe, {sup 56}Fe, {sup 58}Ni, {sup 62}Ni, {sup 64}Ni, {sup 63}Cu; the targets {sup 52}Cr, {sup 60}Ni and {sup 62}Ni have also been investigated at 16.5 MeV. The measured asymmetries for the strong l = 2 transitions tend to fall into two categories, distinguished by the magnitude of the asymmetries at 30 degrees and 90 degrees. Of the transitions studied, only those to the first 2+ state of the 28-neutron nuclei present large asymmetries at these angles. Strong l = 3 and l = 4 transitions also show interesting variations. When the entire optical potential is deformed, coupled-channels or DWBA calculations predict the 'small' l = 2 asymmetry reasonably well, but only an abnormal increase of the strength of the spin-orbit distortion or the introduction of an imaginary and negative spin-orbit potential can reproduce the amplitude of the large asymmetries. Calculations with a microscopic model indicate that the asymmetry is sensitive to the form factor, and no important differences were found between S=0 and S=1 predictions. (author)

  5. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    Full Text Available XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials, and it is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth groundwork was necessary to determine precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables us to perform, on a routine basis, quantitative analysis with approximately 20% relative accuracy, which is quite acceptable for an analysis of this nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that allows the sulphiding degree of molybdenum to be obtained reliably and reproducibly. The use of this method is illustrated by two examples for which XPS spectroscopy has provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
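
The response-factor quantification described above reduces to a simple calculation: each element's peak area is divided by its response factor, and the corrected values are normalised to atomic fractions. A minimal sketch, with invented peak areas and factors (not IFP's actual values):

```python
# Hypothetical sketch of routine XPS quantification with response factors:
# atomic fraction x_i = (I_i / F_i) / sum_j (I_j / F_j).

def atomic_fractions(peak_areas, response_factors):
    """peak_areas and response_factors are dicts keyed by element symbol."""
    corrected = {el: peak_areas[el] / response_factors[el] for el in peak_areas}
    total = sum(corrected.values())
    return {el: v / total for el, v in corrected.items()}

areas = {"Mo": 5000.0, "S": 9000.0, "Al": 30000.0}   # invented peak areas
factors = {"Mo": 2.5, "S": 1.0, "Al": 0.5}           # invented response factors
print(atomic_fractions(areas, factors))
```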

  6. Development and standardization of multiplexed antibody microarrays for use in quantitative proteomics

    Directory of Open Access Journals (Sweden)

    Sorette M

    2004-12-01

    Full Text Available Abstract Background Quantitative proteomics is an emerging field that encompasses multiplexed measurement of many known proteins in groups of experimental samples in order to identify differences between groups. Antibody arrays are a novel technology that is increasingly being used for quantitative proteomics studies due to highly multiplexed content, scalability, matrix flexibility and economy of sample consumption. Key applications of antibody arrays in quantitative proteomics studies are identification of novel diagnostic assays, biomarker discovery in trials of new drugs, and validation of qualitative proteomics discoveries. These applications require performance benchmarking, standardization and specification. Results Six dual-antibody, sandwich immunoassay arrays that measure 170 serum or plasma proteins were developed and experimental procedures refined in more than thirty quantitative proteomics studies. This report provides detailed information and specification for manufacture, qualification, assay automation, performance, assay validation and data processing for antibody arrays in large scale quantitative proteomics studies. Conclusion The present report describes development of first generation standards for antibody arrays in quantitative proteomics. Specifically, it describes the requirements of a comprehensive validation program to identify and minimize antibody cross reaction under highly multiplexed conditions; provides the rationale for the application of standardized statistical approaches to manage the data output of highly replicated assays; defines design requirements for controls to normalize sample replicate measurements; emphasizes the importance of stringent quality control testing of reagents and antibody microarrays; recommends the use of real-time monitors to evaluate sensitivity, dynamic range and platform precision; and presents survey procedures to reveal the significance of biomarker findings.

  7. Applicability of integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) for the simultaneous detection of the four human enteric enterovirus species in disinfection studies

    Science.gov (United States)

    A newly developed integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) method and its applicability in UV disinfection studies is described. This method utilizes a singular cell culture system coupled with four RTqPCR assays to detect infectious serotypes t...

  8. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on

  9. Quantitative and Qualitative Extensions of Event Structures

    NARCIS (Netherlands)

    Katoen, Joost P.

    1996-01-01

    An important application of formal methods is the specification, design, and analysis of functional aspects of (distributed) systems. Recently the study of quantitative aspects of such systems based on formal methods has come into focus. Several extensions of formal methods where the occurrence of

  10. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
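
As an illustration of the kind of core model such a textbook covers (not code from the book itself), the linear hillslope diffusion equation dz/dt = D d2z/dx2 can be solved with a short explicit finite-difference loop. Parameter values here are arbitrary.

```python
# Illustrative explicit finite-difference solver for linear hillslope
# diffusion, dz/dt = D * d2z/dx2, with fixed boundary elevations.
# Stable when D*dt/dx**2 <= 0.5.

def diffuse(z, D, dx, dt, steps):
    z = list(z)
    for _ in range(steps):
        new = z[:]
        for i in range(1, len(z) - 1):
            new[i] = z[i] + D * dt / dx**2 * (z[i+1] - 2*z[i] + z[i-1])
        z = new  # boundaries held fixed
    return z

profile = [0.0] * 5 + [10.0] + [0.0] * 5   # initial spike of relief
print(diffuse(profile, D=0.01, dx=1.0, dt=1.0, steps=100))
```

Over time the spike relaxes toward a smooth, symmetric profile, the hallmark of diffusive hillslope evolution.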

  11. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.

  12. Shedding quantitative fluorescence light on novel regulatory mechanisms in skeletal biomedicine and biodentistry.

    Science.gov (United States)

    Lee, Ji-Won; Iimura, Tadahiro

    2017-02-01

    Digitalized fluorescence images contain numerical information such as color (wavelength), fluorescence intensity and spatial position. However, quantitative analyses of acquired data and their validation remained to be established. Our research group has applied quantitative fluorescence imaging on tissue sections and uncovered novel findings in skeletal biomedicine and biodentistry. This review paper includes a brief background of quantitative fluorescence imaging and discusses practical applications by introducing our previous research. Finally, the future perspectives of quantitative fluorescence imaging are discussed.

  13. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed

  14. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
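
One of the simplest normalization methods of the kind discussed in such reviews is total-sum ("constant sum") normalization, which rescales each sample so its metabolite intensities sum to a common constant. A minimal sketch with invented data:

```python
# Total-sum normalization: rescale every sample's metabolite intensities
# so they sum to the same target, removing total-amount variation.

def total_sum_normalize(samples, target=100.0):
    """samples: list of metabolite-intensity lists, one list per sample."""
    out = []
    for s in samples:
        total = sum(s)
        out.append([x * target / total for x in s])
    return out

raw = [[10.0, 30.0, 60.0],    # sample 1, total 100
       [40.0, 60.0, 100.0]]   # sample 2, total 200 (e.g. more concentrated)
print(total_sum_normalize(raw))
```

After normalization, differences between samples reflect relative metabolite composition rather than total sample amount; more refined methods (e.g. quotient-based normalization) follow the same correct-then-compare logic.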

  15. Application of droplet digital PCR for quantitative detection of Spiroplasma citri in comparison with real time PCR.

    Directory of Open Access Journals (Sweden)

    Yogita Maheshwari

    Full Text Available Droplet digital polymerase chain reaction (ddPCR) is a method for performing digital PCR that is based on water-oil emulsion droplet technology. It is a unique approach to measuring the absolute copy number of nucleic acid targets without the need for external standards. This study evaluated the applicability of ddPCR as a quantitative detection tool for Spiroplasma citri, the causal agent of citrus stubborn disease (CSD) in citrus. Two sets of primers, SP1, based on the spiralin housekeeping gene, and a multicopy prophage gene, SpV1 ORF1, were used to evaluate ddPCR in comparison with real-time quantitative PCR (qPCR) for S. citri detection in citrus tissues. Standard curve analyses on tenfold dilution series showed that both ddPCR and qPCR exhibited good linearity and efficiency. However, ddPCR had a tenfold greater sensitivity than qPCR and accurately quantified down to one copy of the spiralin gene. Receiver operating characteristic analysis indicated that the ddPCR methodology was more robust for diagnosis of CSD, and the area under the curve was significantly broader compared to qPCR. Field samples were used to validate ddPCR efficacy and demonstrated that it was equal or superior to qPCR in detecting S. citri infection in fruit columella, owing to a higher pathogen titer. The ddPCR assay detected both the S. citri spiralin and SpV1 ORF1 targets quantitatively, with high precision and accuracy compared to the qPCR assay. The ddPCR was highly reproducible and repeatable for both targets and showed higher resilience to PCR inhibitors in citrus tissue extract for the quantification of S. citri compared to qPCR.
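
The standard-free absolute quantification that makes ddPCR attractive rests on Poisson statistics: from the fraction of positive droplets, the mean number of target copies per droplet, and hence the concentration, can be computed directly. The sketch below uses the commonly cited ~0.85 nL droplet volume and invented counts; it is a generic illustration, not this study's analysis pipeline.

```python
# Poisson-based absolute quantification in droplet digital PCR:
# lambda = -ln(1 - p), where p is the fraction of positive droplets,
# then concentration = lambda / droplet volume.
import math

def ddpcr_copies_per_ul(positive, total, droplet_nl=0.85):
    p = positive / total
    lam = -math.log(1.0 - p)            # mean copies per droplet
    return lam / (droplet_nl * 1e-3)    # copies per microlitre

# Invented counts: 4,000 positive out of 15,000 accepted droplets
print(ddpcr_copies_per_ul(positive=4000, total=15000))
```

Because the Poisson correction accounts for droplets receiving more than one copy, the estimate stays accurate even at high positive fractions, which is why no external standard curve is needed.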

  16. Effect of Nitrogen and Zinc Foliar Application on Quantitative Traits of Tea Roselle (Hibiscus sabdariffa) in the Jiroft Zone

    Directory of Open Access Journals (Sweden)

    abdolreza raisi sarbijan

    2017-02-01

    Full Text Available Introduction: Nitrogen is an essential element for plants and, in combination with elements such as carbon, oxygen, hydrogen and sulfur, forms even more valuable materials such as amino acids, nucleic acids and alkaloids. Roselle (Hibiscus sabdariffa), of the Malvaceae family, is known by different names in different parts of the world; in Iran it is called Maki tea, Mecca tea or red tea. As an important plant, its growth and development were investigated in Jiroft. Materials and Methods: The experiment was conducted as a factorial based on a randomized complete block design with three replications at the research farm of the Islamic Azad University of Jiroft during 2010. The first factor was nitrogen foliar application at four levels (0, 1, 2 and 3 percent) and the second factor was foliar application of zinc at two levels (0 and 1 percent). The measured quantitative characteristics were stem diameter, plant height, calycle fresh weight, calycle dry weight, plant fresh weight, plant dry weight, leaf fresh weight, leaf dry weight, mucilage percentage and mucilage yield. Results and Discussion: The results of ANOVA showed that nitrogen foliar application was effective on leaf dry weight and on calycle fresh and dry weight. Plant fresh weight, plant dry weight, stem diameter, plant height, mucilage percentage and mucilage yield also showed significant effects. Zinc foliar application significantly affected leaf fresh weight, leaf dry weight, calycle fresh weight, plant fresh weight, plant dry weight, mucilage percentage and mucilage yield. The interaction effect of nitrogen and zinc on leaf dry weight, plant fresh weight and plant dry weight was also significant. The mean comparison of the studied characteristics revealed that by increasing the amount of nitrogen up to the N2 level, stem diameter, plant height, leaf dry weight, calycle dry weight, mucilage percentage and mucilage yield increased, but there was no significant difference between the N2 and N3 levels. Plant fresh weight and plant dry weight

  17. Integrated quantitative pharmacology for treatment optimization in oncology

    NARCIS (Netherlands)

    Hasselt, J.G.C. van

    2014-01-01

    This thesis describes the development and application of quantitative pharmacological models in oncology for treatment optimization and for the design and analysis of clinical trials with respect to pharmacokinetics, toxicity, efficacy and cost-effectiveness. A recurring theme throughout this

  18. A general method for bead-enhanced quantitation by flow cytometry

    Science.gov (United States)

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that adding known quantities of polystyrene fluorescence standardization beads to samples permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
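
The arithmetic behind bead-enhanced counting as the abstract describes it is straightforward: since a known number of beads is spiked into a known sample volume, the ratio of cell events to bead events converts relative counts to absolute concentrations. The numbers below are invented for illustration.

```python
# Bead-enhanced absolute counting: cells/uL inferred from the ratio of
# cell events to bead events, given the known number of beads added.

def absolute_count(cell_events, bead_events, beads_added, sample_volume_ul):
    """Return cells per microlitre of the original sample."""
    cells_in_sample = cell_events * (beads_added / bead_events)
    return cells_in_sample / sample_volume_ul

# 50,000 beads spiked into 100 uL; 5,000 bead and 20,000 cell events acquired
print(absolute_count(20000, 5000, 50000, 100.0))
```

Note that only the event ratio matters, so the result is insensitive to how much of the sample is actually acquired on the cytometer.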

  19. Quantitative determination of phases by X-ray diffraction

    International Nuclear Information System (INIS)

    Azevedo, A.L.T.

    1979-01-01

    The internal standard method for the quantitative determination of phases by X-ray diffraction is presented. The method is applicable to multi-phase materials which may be treated as powder. A discussion on sample preparation and some examples follow. (Author) [pt
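
The internal standard method referred to above can be illustrated numerically: a known weight fraction of a standard phase is mixed into the powder, and the analyte-to-standard diffraction intensity ratio is converted to a weight fraction through a calibration constant obtained from a known mixture. All values below are invented.

```python
# Hypothetical sketch of the internal-standard method in quantitative XRD:
# I_a / I_s = k * (x_a / x_s), so x_a = (I_a / I_s) * x_s / k.

def phase_fraction(i_analyte, i_standard, x_standard, k):
    """Weight fraction of the analyte phase from the intensity ratio."""
    return (i_analyte / i_standard) * x_standard / k

# Calibration: a known 50/50 analyte/standard mixture gave ratio 1.25
k = 1.25 / (0.5 / 0.5)                          # k = 1.25
print(phase_fraction(800.0, 1000.0, 0.2, k))    # unknown with 20 % standard added
```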

  1. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system where temperature distribution of the body’s surface can contribute to the diagnosis and follow up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach in thermographic image processing is attempted based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in healthy, regarding spinal problems, subjects.
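
The indices described above are first-order statistics computed over symmetric ROIs. The sketch below shows the general idea with a simple mean-difference symmetry measure; it is an illustrative choice, not the authors' exact index definitions, and the temperatures are invented.

```python
# Illustrative first-order statistical indices for two symmetric
# thermogram ROIs; the mean-difference symmetry measure is a simple
# hypothetical choice, not the study's exact definition.
import statistics

def roi_indices(left_roi, right_roi):
    """left_roi / right_roi: lists of pixel temperatures (deg C)."""
    ml, mr = statistics.mean(left_roi), statistics.mean(right_roi)
    return {
        "mean_left": ml,
        "mean_right": mr,
        "std_left": statistics.pstdev(left_roi),
        "std_right": statistics.pstdev(right_roi),
        "mean_diff": abs(ml - mr),   # symmetry measure between body sides
    }

left = [31.2, 31.5, 31.9, 32.0]    # invented ROI temperatures
right = [30.1, 30.4, 30.6, 30.5]
print(roi_indices(left, right))
```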

  2. Risk prediction, safety analysis and quantitative probability methods - a caveat

    International Nuclear Information System (INIS)

    Critchley, O.H.

    1976-01-01

    Views are expressed on the use of quantitative techniques for the determination of value judgements in nuclear safety assessments, hazard evaluation, and risk prediction. Caution is urged when attempts are made to quantify value judgements in the field of nuclear safety. Criteria are given for the meaningful application of reliability methods, but doubts are expressed about their application to safety analysis, risk prediction and design guidance for experimental or prototype plant. Doubts are also expressed about some concomitant methods of population dose evaluation. The complexities of new designs of nuclear power plants make the problem of safety assessment more difficult, but some possible approaches are suggested as alternatives to the quantitative techniques criticized. (U.K.)

  3. Clinical application of quantitative 99Tcm-pertechnetate thyroid imaging

    International Nuclear Information System (INIS)

    Gao Yongju; Xie Jian; Yan Xinhui; Wand Jiebin; Zhu Xuanmin; Liu Lin; Sun Haizhou

    2002-01-01

    Objective: To investigate the clinical value of quantitative 99 Tc m -pertechnetate thyroid imaging for diagnosis and therapeutic evaluation in patients with thyroid disease. Methods: With the Siemens Orbit SPECT, 99 Tc m sodium pertechnetate thyroid imaging was performed on a control group and on 108 patients with Graves' disease, 58 patients with Hashimoto's disease and 41 patients with subacute thyroiditis. Three functional parameters were calculated as follows: AR = 5 min thyroid count/1 min thyroid count; UI = 20 min thyroid count/thigh count; T d = imaging interval between carotid and thyroid. Results: 1) The three functional parameters were basically concordant with serological parameters in patients with Graves' disease. Uptake was high in patients who had had Graves' disease for ≤0.5 year, and in those whose disease relapsed within 2 years the 99 Tc m thyroid uptake increased when the antithyroid medication was stopped. 2) Thyroid images of hyperthyroid patients with Hashimoto's disease showed increased perfusion and 99 Tc m uptake, a pattern similar to that found in Graves' disease. Differences in T d , AR and UI were not significant among euthyroid and subclinical hypothyroid patients with Hashimoto's disease, so uptake ratios could indicate thyroid activity. 3) Delayed thyroid images and diffuse uptake decrease were found in hyperthyroid patients with subacute thyroiditis, whereas focal damage was observed in euthyroid patients. Conclusion: Quantitative 99 Tc m -pertechnetate thyroid imaging is a helpful technique in the diagnosis and treatment of common thyroid disorders

  4. A fortran program for elastic scattering of deuterons with an optical model containing tensorial potentials; Programme fortran pour la diffusion elastique de deutons avec un modele optique contenant des termes tensoriels

    Energy Technology Data Exchange (ETDEWEB)

    Raynal, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1963-07-01

    The optical model has been applied with success to the elastic scattering of particles of spin 0 and 1/2 and, to a lesser degree, to that of deuterons. For particles of spin 1/2, an LS coupling term is ordinarily used; this term is necessary to obtain a polarization. For deuterons, this coupling has already been introduced, but the possible forms of potentials are more numerous (in this case, scalar products of a second-rank spin tensor with a tensor of the same rank in space or momentum can occur). These terms, which may be necessary, are primarily important for the tensor polarization. This problem is of particular interest at Saclay since a beam of polarized deuterons has become available. The FORTRAN program SPM 037 permits the study of the effect of tensorial potentials constructed from the distance of the deuteron from the target and its angular momentum with respect to it. This report should make possible the use and even the modification of the program. It consists of: a description of the problem and of the notation employed, a presentation of the methods adopted, an indication of the necessary data and how they should be introduced, and finally tables of symbols which are in equivalence or common statements; these tables must be considered when making any modification. (author)

  5. A fortran program for elastic scattering of deuterons with an optical model containing tensorial potentials; Programme fortran pour la diffusion elastique de deutons avec un modele optique contenant des termes tensoriels

    Energy Technology Data Exchange (ETDEWEB)

    Raynal, J. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1963-07-01

    The optical model has been applied with success to the elastic scattering of particles of spin 0 and 1/2 and to a lesser degree to that of deuterons. For particles of spin 1/2, an LS coupling term is ordinarily used; this term is necessary to obtain a polarization; for deuterons, this coupling has already been introduced, but the possible forms of potentials are more numerous (in this case, scalar products of a second-rank spin tensor with a tensor of the same rank in space or momentum can occur). These terms, which may be necessary, are primarily important for the tensor polarization. This problem is of particular interest at Saclay since a beam of polarized deuterons has become available. The FORTRAN program SPM 037 permits the study of the effect of tensorial potentials constructed from the distance of the deuteron from the target and its angular momentum with respect to it. This report should make possible the use and even the modification of the program. It consists of: a description of the problem and of the notation employed, a presentation of the methods adopted, an indication of the necessary data and how they should be introduced, and finally tables of symbols which appear in equivalence or common statements; these tables must be consulted when making any modification. (author)

  6. Maths meets myths: quantitative approaches to ancient narratives

    CERN Document Server

    MacCarron, Máirín; MacCarron, Pádraig

    2017-01-01

    With an emphasis on exploring measurable aspects of ancient narratives, Maths Meets Myths sets out to investigate age-old material with new techniques. This book collects, for the first time, novel quantitative approaches to studying sources from the past, such as chronicles, epics, folktales, and myths. It contributes significantly to recent efforts in bringing together natural scientists and humanities scholars in investigations aimed at achieving greater understanding of our cultural inheritance. Accordingly, each contribution reports on a modern quantitative approach applicable to narrative sources from the past, or describes those which would be amenable to such treatment and why they are important. This volume is a unique state-of-the-art compendium on an emerging research field which also addresses anyone with interests in quantitative approaches to humanities.

  7. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  8. Recombinant plasmid-based quantitative Real-Time PCR analysis of Salmonella enterica serotypes and its application to milk samples.

    Science.gov (United States)

    Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan

    2016-03-01

    The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy to produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared to the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10^1 CFU/ml and 10^0 CFU/ml for the invA target and the ttrRSBC target, respectively), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method to be used in routine analyses in the medical, veterinary, food and water/environmental sectors: I--The method provides fast analyses, including the simultaneous detection and determination of correct pathogen counts; II--The method is applicable to challenging samples, such as milk; III--The method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
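
The plasmid-based quantification described above rests on a standard curve relating threshold cycle (Ct) to known copy numbers of the recombinant plasmid standard. A minimal sketch of that conversion, using an idealized, hypothetical dilution series (not the paper's data or pipeline):

```python
def fit_standard_curve(log10_copies, ct_values):
    """Least-squares line Ct = slope * log10(copies) + intercept."""
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(ct_values) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, ct_values))
    slope = sxy / sxx
    intercept = my - slope * mx
    # Amplification efficiency from the slope (ideal slope ~ -3.32 gives E ~ 1.0)
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Invert the calibration line to estimate template copies in a sample."""
    return 10 ** ((ct - intercept) / slope)

# Idealized ten-fold dilution series of the plasmid standard (10^6 .. 10^1 copies)
logs = [6, 5, 4, 3, 2, 1]
cts = [15.0, 18.32, 21.64, 24.96, 28.28, 31.60]  # perfect -3.32 slope
slope, intercept, eff = fit_standard_curve(logs, cts)
```

Because the plasmid standard is reproducible in known quantities, the same fitted line can be reused across runs instead of constructing a new calibration curve each time.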

  9. Quantum formalism in gravitation: quantitative application to the Titius-Bode law

    International Nuclear Information System (INIS)

    Louise, R.

    1982-01-01

    A quantum conception of virtual energy exchange between masses leads to Newton's law. A guide wave similar to de Broglie's (1923) is able to account quantitatively for the Titius-Bode law occurring in the planetary system as well as in satellite systems. (Auth.)

  10. Improving quantitative gas chromatography-electron ionization mass spectrometry results using a modified ion source: demonstration for a pharmaceutical application.

    Science.gov (United States)

    D'Autry, Ward; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Van Schepdael, Ann

    2011-07-01

    Gas chromatography-mass spectrometry is a well-established analytical technique. However, mass spectrometers with electron ionization sources may suffer from signal drifts, thereby negatively influencing quantitative performance. To demonstrate this phenomenon for a real application, a static headspace-gas chromatography method in combination with electron ionization-quadrupole mass spectrometry was optimized for the determination of residual dichloromethane in coronary stent coatings. To validate the method, the quantitative performance of an original stainless steel ion source was compared to that of a modified ion source. Ion source modification included the application of a gold coating on the repeller and exit plate. Several validation aspects such as limit of detection, limit of quantification, linearity and precision were evaluated using both ion sources. It was found that, as expected, the stainless steel ion source suffered from signal drift. As a consequence, non-linearity and high RSD values for repeated analyses were obtained. An additional experiment was performed to check whether an internal standard compound would lead to better results. It was found that the signal drift patterns of the analyte and internal standard were different, consequently leading to high RSD values for the response factor. With the modified ion source, however, a more stable signal was observed, resulting in acceptable linearity and precision. Moreover, sensitivity improved compared to the stainless steel ion source. Finally, the optimized method with the modified ion source was applied to determine residual dichloromethane in the coating of coronary stents. The solvent was detected but found to be below the limit of quantification. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. SU-E-T-661: Quantitative MRI Assessment of a Novel Direction-Modulated Brachytherapy Tandem Applicator for Cervical Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Soliman, A; Elzibak, A; Fatemi, A; Safigholi, H; Leung, E; Ravi, A; Song, W [Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Han, D [Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); University of California, San Diego, La Jolla, CA (United States)

    2015-06-15

    Purpose: To quantitatively evaluate the MR image quality of a novel direction modulated brachytherapy (DMBT) tandem applicator for cervical cancer, using the clinical MRI scanning protocol for image guided brachytherapy. Methods: The tungsten alloy-based applicator was placed in a water phantom and the clinical imaging protocol was performed. Axial images were acquired using a 2D turbo-spin echo (TSE) T2-weighted sequence on a 1.5T GE 450w MR scanner and an 8-channel body coil. As a multi-channel receiver coil was used, inhomogeneities in the B1 receive field must be considered before performing the quantification process. Therefore, the applicator was removed from the phantom and the whole imaging session was repeated for the water phantom with the same parameters. Images from the two scans were then subtracted, resulting in a difference image that shows only the applicator with its surrounding magnetic susceptibility dipole artifact. Line profiles were drawn and plotted on the difference image at various angles and locations along the tandem. The full width at half maximum (FWHM) was measured on all the line profiles to quantify the extent of the artifact. Additionally, the extent of the artifact along the diameter of the tandem was measured at various angles and locations. Results: After removing the background inhomogeneities of the receiver coil, the FWHM of the tandem measured 5.75 ± 0.35 mm (the physical tandem diameter is 5.4 mm). The average extent of the artifact along the diameter of the tandem was 2.14 ± 0.56 mm. In contrast to CT imaging of the same applicator (not shown here), the tandem can be easily identified without additional correction algorithms. Conclusion: This work demonstrated that the novel DMBT tandem applicator has minimal susceptibility artifact in T2-weighted images employed in clinical practice for MRI-guided brachytherapy of cervical cancer.
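
The FWHM measurement on a line profile described above can be sketched as follows. This is a generic illustration on a synthetic triangular profile, not the authors' analysis code:

```python
def fwhm(xs, ys):
    """Full width at half maximum of a single-peaked profile,
    with linear interpolation at the two half-maximum crossings."""
    half = max(ys) / 2.0
    # left crossing: first sample at or above half maximum
    i = next(i for i in range(len(ys)) if ys[i] >= half)
    if i == 0:
        left = xs[0]
    else:
        t = (half - ys[i - 1]) / (ys[i] - ys[i - 1])
        left = xs[i - 1] + t * (xs[i] - xs[i - 1])
    # right crossing: last sample at or above half maximum
    j = next(j for j in range(len(ys) - 1, -1, -1) if ys[j] >= half)
    if j == len(ys) - 1:
        right = xs[-1]
    else:
        t = (half - ys[j + 1]) / (ys[j] - ys[j + 1])
        right = xs[j + 1] - t * (xs[j + 1] - xs[j])
    return right - left

# Synthetic triangular profile peaking at x = 5 mm, base half-width 4 mm,
# so the expected FWHM is 4 mm.
xs = [i * 0.5 for i in range(21)]
ys = [max(0.0, 4.0 - abs(x - 5.0)) for x in xs]
```

Applied to profiles drawn across the subtracted image at several angles, repeated calls like this yield the mean ± SD figures quoted in the record.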

  12. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    Science.gov (United States)

    Haiyang, Yu; Tian, Luo

    2016-06-01

    Target restoration space (TRS) is the most precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space to confirm the esthetics and function of the final restoration. Therefore, assisted with quantitative analysis transfer, TRS quantitative analysis is a significant improvement for minimum tooth preparation. This article presents TRS quantity-related measurement, analysis, transfer, and the internal relevance of the three TRS classifications. Results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  13. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.
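
One of the corrections the abstract alludes to, removing non-uniform illumination and sensor offset before quantifying brightness, is commonly done with a flat-field correction. A minimal sketch on a toy 1-D "image" with hypothetical numbers (the paper's actual correction procedures are more involved):

```python
def flat_field_correct(raw, dark, flat):
    """Per-pixel flat-field correction:
    corrected = (raw - dark) / (flat - dark), rescaled by the mean gain
    so corrected values stay in the original intensity range."""
    gain = [f - d for f, d in zip(flat, dark)]
    mean_gain = sum(gain) / len(gain)
    return [(r - d) / g * mean_gain for r, d, g in zip(raw, dark, gain)]

# Toy 1-D image of 4 pixels: uneven illumination halves the gain of pixel 2
dark = [10.0, 10.0, 10.0, 10.0]
flat = [110.0, 110.0, 60.0, 110.0]   # pixel 2 sees half the illumination
scene = [0.3, 0.7, 0.7, 0.3]         # true reflectance of the sample
raw = [d + s * (f - d) for d, s, f in zip(dark, scene, flat)]
corrected = flat_field_correct(raw, dark, flat)
```

After correction, pixels viewing equally bright parts of the scene report equal values regardless of the illumination falloff, which is the prerequisite for quantitative brightness comparisons.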

  14. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  15. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lies also the conditions......

  16. Quantitative Preparation in Doctoral Education Programs: A Mixed-Methods Study of Doctoral Student Perspectives on their Quantitative Training

    Directory of Open Access Journals (Sweden)

    Sarah L Ferguson

    2017-07-01

    Full Text Available Aim/Purpose: The purpose of the current study is to explore student perceptions of their own doctoral-level education and quantitative proficiency. Background: The challenges of preparing doctoral students in education have been discussed in the literature, but largely from the perspective of university faculty and program administrators. The current study directly explores the student voice on this issue. Methodology: Utilizing a sequential explanatory mixed-methods research design, the present study seeks to better understand doctoral-level education students’ perceptions of their quantitative methods training at a large public university in the southwestern United States. Findings: Results from both phases present the need for more application and consistency in doctoral-level quantitative courses. Additionally, there was a consistent theme of internal motivation in the responses, suggesting students perceive their quantitative training to be valuable beyond their personal interest in the topic. Recommendations for Practitioners: Quantitative methods instructors should emphasize practice in their quantitative courses and consider providing additional support for students through the inclusion of lab sections, tutoring, and/or differentiation. Pre-testing statistical ability at the start of a course is also suggested to better meet student needs. Impact on Society: The ultimate goal of quantitative methods in doctoral education is to produce high-quality educational researchers who are prepared to apply their knowledge to problems and research in education. Results of the present study can inform faculty and administrator decisions in doctoral education to best support this goal. Future Research: Using the student perspectives presented in the present study, future researchers should continue to explore effective instructional strategies and curriculum design within education doctoral programs. The inclusion of student voice can strengthen

  17. Implementation of the method of quantitative analysis by proton-induced X-ray analysis and application to the analysis of aerosols

    International Nuclear Information System (INIS)

    Margulis, W.

    1977-09-01

    Fundamental aspects for the implementation of the method of quantitative analysis by proton induced X-ray spectroscopy are discussed. The calibration of the system was made by determining a response coefficient for selected elements, both by irradiating known amounts of these elements as well as by the use of theoretical and experimental parameters. The results obtained by these two methods agree within 5% for the analysed elements. A computer based technique of spectrum decomposition was developed to facilitate routine analysis. Finally, aerosol samples were measured as an example of a possible application of the method, and the results are discussed. (Author) [pt
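
The response-coefficient calibration described above amounts to determining a counts-per-unit-mass factor from a standard of known composition and dividing sample counts by it. A minimal sketch with hypothetical numbers (element, counts and masses below are illustrative only):

```python
def response_coefficient(counts_std, mass_std):
    """Calibrate: characteristic X-ray counts per unit areal mass
    for one element, from a standard of known areal mass."""
    return counts_std / mass_std

def mass_from_counts(counts, k):
    """Apply the calibration to an unknown sample."""
    return counts / k

# Hypothetical calibration: 5000 counts from a 2.0 ug/cm^2 Fe standard
k_fe = response_coefficient(5000.0, 2.0)   # counts per (ug/cm^2)
# Unknown aerosol sample giving 1250 Fe counts under identical irradiation
sample_mass = mass_from_counts(1250.0, k_fe)
```

As in the record, the coefficient can equally be computed from theoretical parameters (cross sections, detector efficiency); agreement between the two routes is the consistency check reported.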

  18. Potential Application of Quantitative Prostate-specific Antigen Analysis in Forensic Examination of Seminal Stains

    Directory of Open Access Journals (Sweden)

    Zhenping Liu

    2015-01-01

    Full Text Available The aims of this study are to use quantitative analysis of the prostate-specific antigen (PSA) in seminal stain examination and to explore the practical value of this analysis in forensic science. For a comprehensive analysis, vaginal swabs from 48 rape cases were tested both by a PSA fluorescence analyzer (i-CHROMA Reader) and by a conventional PSA strip test. To confirm the results of these PSA tests, seminal DNA was tested following differential extraction. Compared to the PSA strip test, the PSA rapid quantitative fluorescence analyzer provided more accurate and sensitive results. More importantly, individualized schemes based on quantitative PSA results of samples can be developed to improve the quality and procedural efficiency of forensic seminal inspection of samples prior to DNA analysis.

  19. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    Science.gov (United States)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  20. Quantitative measurement of mixtures by terahertz time–domain ...

    Indian Academy of Sciences (India)

    Administrator

    earth and space science, quality control of food and agricultural products, and global environmental monitoring. In quantitative applications, terahertz technology has been widely used for studying different kinds of mixtures, such as amino acids [8], ternary chemical mixtures [9], pharmaceuticals [10], and racemic compounds [11].
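
Quantitative analysis of such mixtures is often posed as linear unmixing: the mixture's absorbance spectrum is modelled as a weighted sum of reference spectra of the pure components. A least-squares sketch for two components (the spectra below are hypothetical, not data from the paper):

```python
def unmix_two(spec_a, spec_b, mixture):
    """Least-squares weights (wa, wb) such that
    mixture ~= wa * spec_a + wb * spec_b, via the 2x2 normal equations."""
    aa = sum(a * a for a in spec_a)
    bb = sum(b * b for b in spec_b)
    ab = sum(a * b for a, b in zip(spec_a, spec_b))
    am = sum(a * m for a, m in zip(spec_a, mixture))
    bm = sum(b * m for b, m in zip(spec_b, mixture))
    det = aa * bb - ab * ab
    return (am * bb - bm * ab) / det, (bm * aa - am * ab) / det

# Hypothetical reference absorbances of two compounds at 5 THz frequencies
a = [0.10, 0.40, 0.80, 0.30, 0.05]
b = [0.50, 0.20, 0.10, 0.60, 0.70]
# Synthetic 30/70 mixture spectrum
mix = [0.3 * ai + 0.7 * bi for ai, bi in zip(a, b)]
wa, wb = unmix_two(a, b, mix)
```

With noise-free synthetic data the fitted weights recover the mixing fractions exactly; with measured spectra the same normal-equations fit gives the least-squares estimate.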

  1. Improving the quantitative accuracy of optical-emission computed tomography by incorporating an attenuation correction: application to HIF1 imaging

    Science.gov (United States)

    Kim, E.; Bowsher, J.; Thomas, A. S.; Sakhalkar, H.; Dewhirst, M.; Oldham, M.

    2008-10-01

    Optical computed tomography (optical-CT) and optical-emission computed tomography (optical-ECT) are new techniques for imaging the 3D structure and function (including gene expression) of whole unsectioned tissue samples. This work presents a method of improving the quantitative accuracy of optical-ECT by correcting for the 'self'-attenuation of photons emitted within the sample. The correction is analogous to a method commonly applied in single-photon-emission computed tomography reconstruction. The performance of the correction method was investigated by application to a transparent cylindrical gelatin phantom, containing a known distribution of attenuation (a central ink-doped gelatine core) and a known distribution of fluorescing fibres. Attenuation corrected and uncorrected optical-ECT images were reconstructed on the phantom to enable an evaluation of the effectiveness of the correction. Significant attenuation artefacts were observed in the uncorrected images where the central fibre appeared ~24% less intense due to greater attenuation from the surrounding ink-doped gelatin. This artefact was almost completely removed in the attenuation-corrected image, where the central fibre was within ~4% of the others. The successful phantom test enabled application of attenuation correction to optical-ECT images of an unsectioned human breast xenograft tumour grown subcutaneously on the hind leg of a nude mouse. This tumour cell line had been genetically labelled (pre-implantation) with fluorescent reporter genes such that all viable tumour cells expressed constitutive red fluorescent protein and hypoxia-inducible factor 1 transcription-produced green fluorescent protein. In addition to the fluorescent reporter labelling of gene expression, the tumour microvasculature was labelled by a light-absorbing vasculature contrast agent delivered in vivo by tail-vein injection. 
Optical-CT transmission images yielded high-resolution 3D images of the absorbing contrast agent, and
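
The 'self'-attenuation correction described above is analogous to a first-order (Chang-type) correction in SPECT: each reconstructed voxel is divided by the mean attenuation over all viewing angles. A minimal sketch, with hypothetical numbers chosen to echo the ~24% dimming reported for the central fibre (this is not the authors' implementation):

```python
import math

def chang_correction_factor(mu, path_lengths):
    """First-order (Chang) attenuation correction: the reconstructed value
    at a voxel is divided by the mean attenuation over all viewing angles;
    path_lengths[i] is the attenuating path from the voxel toward view i."""
    mean_att = sum(math.exp(-mu * L) for L in path_lengths) / len(path_lengths)
    return 1.0 / mean_att

# Central voxel of a uniform attenuating cylinder of radius 10 mm:
# the path toward every view equals the radius.
mu = 0.024                                   # hypothetical attenuation, mm^-1
factor = chang_correction_factor(mu, [10.0] * 180)
corrected = 0.76 * factor                    # an emission value dimmed ~24%
```

For off-centre voxels the path lengths differ per angle, so the averaging over views matters; at the centre of a symmetric phantom the factor reduces to exp(mu * r).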

  2. Improving the quantitative accuracy of optical-emission computed tomography by incorporating an attenuation correction: application to HIF1 imaging

    International Nuclear Information System (INIS)

    Kim, E; Bowsher, J; Thomas, A S; Sakhalkar, H; Dewhirst, M; Oldham, M

    2008-01-01

    Optical computed tomography (optical-CT) and optical-emission computed tomography (optical-ECT) are new techniques for imaging the 3D structure and function (including gene expression) of whole unsectioned tissue samples. This work presents a method of improving the quantitative accuracy of optical-ECT by correcting for the 'self'-attenuation of photons emitted within the sample. The correction is analogous to a method commonly applied in single-photon-emission computed tomography reconstruction. The performance of the correction method was investigated by application to a transparent cylindrical gelatin phantom, containing a known distribution of attenuation (a central ink-doped gelatine core) and a known distribution of fluorescing fibres. Attenuation corrected and uncorrected optical-ECT images were reconstructed on the phantom to enable an evaluation of the effectiveness of the correction. Significant attenuation artefacts were observed in the uncorrected images where the central fibre appeared ∼24% less intense due to greater attenuation from the surrounding ink-doped gelatin. This artefact was almost completely removed in the attenuation-corrected image, where the central fibre was within ∼4% of the others. The successful phantom test enabled application of attenuation correction to optical-ECT images of an unsectioned human breast xenograft tumour grown subcutaneously on the hind leg of a nude mouse. This tumour cell line had been genetically labelled (pre-implantation) with fluorescent reporter genes such that all viable tumour cells expressed constitutive red fluorescent protein and hypoxia-inducible factor 1 transcription-produced green fluorescent protein. In addition to the fluorescent reporter labelling of gene expression, the tumour microvasculature was labelled by a light-absorbing vasculature contrast agent delivered in vivo by tail-vein injection. Optical-CT transmission images yielded high-resolution 3D images of the absorbing contrast agent

  3. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has been limited to qualitative analysis; the current trend is to develop the method through the introduction of quantitative analysis, which attempts to characterize the defect examined in detail. This is a design requirement for the range of object sizes to be examined. The growing commercial demand for quantitative analysis for NDT and material characterization is driving the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are inherent in interferometers. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The difference between the measurement systems could be attributed to these error factors. (Author)

  4. An Inside View: The Utility of Quantitative Observation in Understanding College Educational Experiences

    Science.gov (United States)

    Campbell, Corbin M.

    2017-01-01

    This article describes quantitative observation as a method for understanding college educational experiences. Quantitative observation has been used widely in several fields and in K-12 education, but has had limited application to research in higher education and student affairs to date. The article describes the central tenets of quantitative…

  5. Ion-solid interaction at low energies: principles and application of quantitative ISS

    International Nuclear Information System (INIS)

    Niehus, H.; Spitzl, R.

    1991-01-01

    Quantitative surface analysis with low-energy (500-5000 eV) ion scattering spectroscopy is known to be difficult, most often because of strong charge transfer and multiple scattering effects occurring during ion-surface interaction. In order to avoid neutralization problems, either alkali primary ions or noble gas ions in combination with the detection of all scattered particles were applied. Multiple scattering occurs predominantly in forward scattering and might confound the analysis. Backward scattering (i.e. 180° impact collision ion scattering) largely bypasses the multiple scattering complication and has been used successfully for the analysis of a number of surface structures of metals, semiconductors and binary alloys. A simple triangulation concept gives access to mass-selective qualitative surface crystallography. Quantitative surface structures were determined by comparison with computer simulations. (author)
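
The mass selectivity mentioned above rests on binary-collision kinematics: the scattered ion's energy fixes the mass of the surface atom it hit, and at the 180° impact-collision geometry the ratio reduces to ((M2 - M1)/(M2 + M1))^2. A sketch of the standard kinematic formula (He+ off Ni is an illustrative choice, not taken from the record):

```python
import math

def iss_energy_ratio(m1, m2, theta_deg):
    """Elastic binary-collision energy ratio E1/E0 for a projectile of
    mass m1 scattered through theta_deg by a surface atom of mass m2
    (valid for m2 > m1)."""
    theta = math.radians(theta_deg)
    r = m2 / m1
    return ((math.cos(theta) + math.sqrt(r * r - math.sin(theta) ** 2))
            / (1 + r)) ** 2

# 180 deg impact-collision geometry, e.g. He+ (4 u) backscattered from Ni (58 u):
# the general formula collapses to ((m2 - m1)/(m2 + m1))^2.
ratio = iss_energy_ratio(4.0, 58.0, 180.0)
```

Scanning the measured energy spectrum and inverting this relation per peak is what makes the backscattering triangulation mass-selective.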

  6. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a method of nondestructive testing that furnishes quantitative information, permitting the detection and accurate localization of defects, internal dimension measurement, and the measurement and mapping of the density distribution. CT technology is very versatile, presenting no restriction in relation to the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  7. New quantitative safety standards: different techniques, different results?

    International Nuclear Information System (INIS)

    Rouvroye, J.L.; Brombacher, A.C.

    1999-01-01

    Safety Instrumented Systems (SIS) are used in the process industry to perform safety functions. Many factors can influence the safety of a SIS, like system layout, diagnostics, testing and repair. In standards like the German DIN, no quantitative analysis is demanded (DIN V 19250 Grundlegende Sicherheitsbetrachtungen fuer MSR-Schutzeinrichtungen, Berlin, 1994; DIN/VDE 0801 Grundsaetze fuer Rechner in Systemen mit Sicherheitsaufgaben, Berlin, 1990). The analysis according to these standards is based on expert opinion and qualitative analysis techniques. New standards like the IEC 61508 (IEC 61508 Functional safety of electrical/electronic/programmable electronic safety-related systems, IEC, Geneve, 1997) and the ISA-S84.01 (ISA-S84.01.1996 Application of Safety Instrumented Systems for the Process Industries, Instrument Society of America, Research Triangle Park, 1996) require quantitative risk analysis but do not prescribe how to perform the analysis. Earlier publications of the authors (Rouvroye et al., Uncertainty in safety, new techniques for the assessment and optimisation of safety in process industry, D.W. Pyatt (ed), SERA-Vol. 4, Safety engineering and risk analysis, ASME, New York, 1995; Rouvroye et al., A comparison study of qualitative and quantitative analysis techniques for the assessment of safety in industry, P.C. Cacciabue, I.A. Papazoglou (eds), Proceedings PSAM III conference, Crete, Greece, June 1996) have shown that different analysis techniques cover different aspects of system behaviour. This paper shows by means of a case study that different (quantitative) analysis techniques may lead to different results. The consequence is that the application of the standards to practical systems will not always lead to unambiguous results. The authors therefore propose a technique to overcome this major disadvantage.
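
As an illustration of the kind of quantitative analysis IEC 61508 calls for (a simplified textbook formula, not a technique prescribed by the paper), the average probability of failure on demand of a single-channel safety function can be estimated from its undetected dangerous failure rate and proof-test interval:

```python
def pfd_avg_1oo1(lambda_du, proof_test_interval):
    """Simplified IEC 61508-style PFDavg for a single channel (1oo1):
    undetected dangerous failure rate * proof-test interval / 2."""
    return lambda_du * proof_test_interval / 2.0

def sil_from_pfd(pfd):
    """Map PFDavg to a Safety Integrity Level band (low-demand mode)."""
    if 1e-5 <= pfd < 1e-4:
        return 4
    if 1e-4 <= pfd < 1e-3:
        return 3
    if 1e-3 <= pfd < 1e-2:
        return 2
    if 1e-2 <= pfd < 1e-1:
        return 1
    return 0

# Hypothetical channel: lambda_DU = 2e-6 per hour, yearly proof test (8760 h)
pfd = pfd_avg_1oo1(2e-6, 8760.0)
```

The paper's point is visible even here: a different technique (e.g. Markov analysis including repair and diagnostics) applied to the same channel generally returns a different PFDavg, and the standards do not say which to use.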

  8. A new trend to determine biochemical parameters by quantitative FRET assays.

    Science.gov (United States)

    Liao, Jia-yu; Song, Yang; Liu, Yan

    2015-12-01

    Förster resonance energy transfer (FRET) has been widely used in biological and biomedical research because it can determine molecule or particle interactions within a range of 1-10 nm. The sensitivity and efficiency of FRET strongly depend on the distance between the FRET donor and acceptor. Historically, FRET assays have been used to quantitatively deduce molecular distances. However, another major potential application of the FRET assay has not been fully exploited, that is, the use of FRET signals to quantitatively describe molecular interactive events. In this review, we discuss the use of quantitative FRET assays for the determination of biochemical parameters, such as the protein interaction dissociation constant (Kd), enzymatic velocity (kcat) and Km. We also describe fluorescent microscopy-based quantitative FRET assays for protein interaction affinity determination in cells as well as fluorimeter-based quantitative FRET assays for protein interaction and enzymatic parameter determination in solution.
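
The Kd determination described above reduces, in the simplest 1:1 binding case, to fitting a saturation curve to FRET signal versus ligand concentration. A minimal sketch, assuming synthetic data and a grid-search fit (the concentrations, signal model, and routine are illustrative, not the authors' protocol):

```python
# Sketch: estimating a dissociation constant Kd from FRET saturation data,
# assuming a simple 1:1 binding model S = Fmax * [L] / (Kd + [L]).
def fret_signal(conc, f_max, kd):
    return f_max * conc / (kd + conc)

def fit_kd(concs, signals, kd_grid):
    best = None
    for kd in kd_grid:
        # For a fixed Kd, the optimal Fmax has a closed form (linear least squares).
        x = [c / (kd + c) for c in concs]
        f_max = sum(xi * si for xi, si in zip(x, signals)) / sum(xi * xi for xi in x)
        sse = sum((f_max * xi - si) ** 2 for xi, si in zip(x, signals))
        if best is None or sse < best[0]:
            best = (sse, kd, f_max)
    return best[1], best[2]

concs = [0.1, 0.3, 1.0, 3.0, 10.0, 30.0]        # ligand concentrations (hypothetical, uM)
signals = [fret_signal(c, 100.0, 2.0) for c in concs]   # simulated: Fmax=100, Kd=2 uM
kd_est, fmax_est = fit_kd(concs, signals, [k / 100 for k in range(50, 500)])
```

In practice the fit would use noisy measured intensities and a proper nonlinear optimizer, but the saturation-binding structure of the problem is the same.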

  9. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  10. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomogram (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  11. Quantitative multimodality imaging in cancer research and therapy.

    Science.gov (United States)

    Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad

    2014-11-01

    Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.

  12. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to approximately a few lesions per 5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
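
Quantitation of lesion frequency from gel data typically assumes lesions are Poisson-distributed along molecules, so the mean number of lesions per molecule follows from the fraction of full-length DNA remaining after lesion-specific cleavage. A minimal sketch of that relation (the function names and example numbers are hypothetical, not the authors' software):

```python
import math

# Sketch: if lesions are Poisson-distributed and a fraction f of molecules
# remains full length after lesion-specific cleavage, the mean number of
# lesions per molecule is -ln(f).
def lesions_per_molecule(full_length_fraction):
    return -math.log(full_length_fraction)

def lesions_per_mb(full_length_fraction, molecule_length_kb):
    """Normalize to lesions per megabase for a given molecule length."""
    return lesions_per_molecule(full_length_fraction) * 1000.0 / molecule_length_kb

# Example: half the 5-Mb molecules uncut implies ln(2) lesions per molecule.
freq = lesions_per_mb(0.5, 5000)
```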

  13. Survey of Bayesian belief nets for quantitative reliability assessment of safety critical software used in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Sung, T.Y.; Jeong, H.S.; Park, J.H.; Kang, H.G.; Lee, K

    2001-03-01

    As part of research on the Probabilistic Safety Assessment of safety-grade digital systems used in nuclear power plants, measures and methodologies applicable to the quantitative reliability assessment of safety-critical software were surveyed. Among the techniques proposed in the literature, we selected those in wide use and investigated their limitations for quantitative software reliability assessment. One promising methodology from the survey is Bayesian Belief Nets (BBN), which has a formal basis and can combine various disparate pieces of evidence relevant to reliability into a final decision under uncertainty. We therefore analyzed BBN and its application cases in the digital systems assessment area and finally studied the possibility of applying it to the quantitative reliability assessment of safety-critical software.
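
The core appeal of a BBN is that disparate, conditionally independent pieces of evidence combine into a posterior via Bayes' rule. A toy sketch with a single "reliable?" root node and two evidence nodes (all structure and probabilities are invented for illustration; a real assessment net would be far larger):

```python
# Sketch: a two-evidence Bayesian belief net evaluated by enumeration.
# Net: Reliability -> TestResult, Reliability -> ProcessQuality (naive structure).
P_RELIABLE = 0.9                     # prior belief that the software is reliable
P_EVIDENCE = {                       # P(evidence observed | reliable?)
    "tests_pass":   {True: 0.95, False: 0.50},
    "good_process": {True: 0.80, False: 0.30},
}

def posterior_reliable(observed):
    """P(reliable | observed evidence), enumerating over the root node."""
    p_good = P_RELIABLE
    p_bad = 1.0 - P_RELIABLE
    for ev in observed:
        p_good *= P_EVIDENCE[ev][True]
        p_bad *= P_EVIDENCE[ev][False]
    return p_good / (p_good + p_bad)
```

Each added piece of favourable evidence raises the posterior, which is the mechanism the survey highlights for combining expert judgement with test data under uncertainty.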

  14. Survey of Bayesian belief nets for quantitative reliability assessment of safety critical software used in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H. S.; Sung, T. Y.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K.

    2001-03-01

    As part of research on the Probabilistic Safety Assessment of safety-grade digital systems used in nuclear power plants, measures and methodologies applicable to the quantitative reliability assessment of safety-critical software were surveyed. Among the techniques proposed in the literature, we selected those in wide use and investigated their limitations for quantitative software reliability assessment. One promising methodology from the survey is Bayesian Belief Nets (BBN), which has a formal basis and can combine various disparate pieces of evidence relevant to reliability into a final decision under uncertainty. We therefore analyzed BBN and its application cases in the digital systems assessment area and finally studied the possibility of applying it to the quantitative reliability assessment of safety-critical software.

  15. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Bruyère, D.; Ismaël, A.; Gallou, G.; Laperche, V.; Michel, K.; Canioni, L.; Bousquet, B.

    2014-01-01

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed retrieving the relative amounts of silicate, calcareous and ore matrices in soils. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. Then a series of artificial neural networks was applied to quantify lead in soil samples. More precisely, two models were designed for classification purposes, according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach allowed reaching a relative error of prediction close to 20%, considered satisfactory in the case of on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples

  16. A quantitative infrared spectral library of vapor phase chemicals: applications to environmental monitoring and homeland defense

    Science.gov (United States)

    Sharpe, Steven W.; Johnson, Timothy J.; Sams, Robert L.

    2004-12-01

    The utility of infrared spectroscopy for monitoring and early warning of accidental or deliberate chemical releases to the atmosphere is well documented. Regardless of the monitoring technique (open-path or extractive) or whether the spectrometer is passive or active (Fourier transform or lidar), a high-quality, quantitative reference library is essential for meaningful interpretation of the data. Pacific Northwest National Laboratory, with the support of the Department of Energy, has been building a library of pure, vapor-phase chemical species for the last 4 years. This infrared spectral library currently contains over 300 chemicals and is expected to grow to over 400 chemicals before completion. The library spectra are based on a statistical fit to many spectra at different concentrations, allowing for rigorous error analysis. The contents of the library are focused on atmospheric pollutants, naturally occurring chemicals, toxic industrial chemicals and chemicals specifically designed to do damage. Applications, limitations and technical details of the spectral library will be discussed.
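
With a quantitative reference library, concentrations in a measured mixture spectrum can be retrieved by linear least squares under the Beer-Lambert assumption (measured absorbance = sum of concentration-weighted reference spectra). A two-component sketch with made-up five-point "spectra" (this is an illustrative unmixing scheme, not PNNL's actual retrieval code):

```python
# Sketch: two-component spectral unmixing via the normal equations
# (solve the 2x2 system A^T A c = A^T y exactly).
def unmix_two(ref_a, ref_b, measured):
    saa = sum(a * a for a in ref_a)
    sbb = sum(b * b for b in ref_b)
    sab = sum(a * b for a, b in zip(ref_a, ref_b))
    say = sum(a * y for a, y in zip(ref_a, measured))
    sby = sum(b * y for b, y in zip(ref_b, measured))
    det = saa * sbb - sab * sab
    return (say * sbb - sby * sab) / det, (sby * saa - say * sab) / det

ref_a = [0.0, 1.0, 0.5, 0.0, 0.0]   # hypothetical library spectrum of gas A (per unit conc.)
ref_b = [0.0, 0.0, 0.2, 1.0, 0.3]   # hypothetical library spectrum of gas B
measured = [2 * a + 3 * b for a, b in zip(ref_a, ref_b)]   # simulated mixture
c_a, c_b = unmix_two(ref_a, ref_b, measured)
```

Real retrievals add baselines, instrument lineshape, and noise weighting, but the least-squares core against quantitative references is the same idea.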

  17. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in the situation where qualitative and quantitative measures are applied. Game theory was used to investigate the relation between detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using the Analytic Hierarchy Process (AHP). It then became possible to estimate the detection probability under integrated safeguards that had deterrence capability equivalent to the detection probability under conventional safeguards. In addition, the distribution of inspection effort between qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful for reconsidering the detection probabilities under integrated safeguards. (author)
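
AHP derives priority weights as the principal eigenvector of a pairwise-comparison matrix, which can be computed by power iteration. A minimal sketch with an invented 3x3 judgement matrix (this illustrates the AHP mechanics only, not the paper's actual payoff quantification):

```python
# Sketch: AHP priority weights via power iteration on a reciprocal
# pairwise-comparison matrix.
def ahp_weights(matrix, iterations=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]          # renormalize each step
    return w

# Hypothetical judgements: factor 0 is 3x as important as factor 1,
# 5x as important as factor 2, etc. (reciprocals below the diagonal).
judgements = [
    [1.0,   3.0,   5.0],
    [1 / 3, 1.0,   2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_weights(judgements)
```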

  18. Label-free quantitative cell division monitoring of endothelial cells by digital holographic microscopy

    Science.gov (United States)

    Kemper, Björn; Bauwens, Andreas; Vollmer, Angelika; Ketelhut, Steffi; Langehanenberg, Patrik; Müthing, Johannes; Karch, Helge; von Bally, Gert

    2010-05-01

    Digital holographic microscopy (DHM) enables quantitative multifocus phase contrast imaging for nondestructive technical inspection and live cell analysis. Time-lapse investigations on human brain microvascular endothelial cells demonstrate the use of DHM for label-free dynamic quantitative monitoring of cell division of mother cells into daughter cells. Cytokinetic DHM analysis provides future applications in toxicology and cancer research.

  19. Factors Influencing Students' Perceptions of Their Quantitative Skills

    Science.gov (United States)

    Matthews, Kelly E.; Hodgson, Yvonne; Varsavsky, Cristina

    2013-01-01

    There is international agreement that quantitative skills (QS) are an essential graduate competence in science. QS refer to the application of mathematical and statistical thinking and reasoning in science. This study reports on the use of the Science Students Skills Inventory to capture final year science students' perceptions of their QS across…

  20. Flipping interferometry and its application for quantitative phase microscopy in a micro-channel.

    Science.gov (United States)

    Roitshtain, Darina; Turko, Nir A; Javidi, Bahram; Shaked, Natan T

    2016-05-15

    We present a portable, off-axis interferometric module for quantitative phase microscopy of live cells, positioned at the exit port of a coherently illuminated inverted microscope. The module creates on the digital camera an interference pattern between the image of the sample and its flipped version. The proposed simplified module is based on a retro-reflector modification in an external Michelson interferometer. The module does not contain any lenses, pinholes, or gratings and its alignment is straightforward. Still, it allows full control of the off-axis angle and does not suffer from ghost images. As experimentally demonstrated, the module is useful for quantitative phase microscopy of live cells rapidly flowing in a micro-channel.

  1. Study of the reaction π⁻p → π⁻π⁰p at 2.77 GeV/c for low momentum transfer of the proton. Application of the Chew-Low extrapolation method to π⁻π⁰ elastic scattering

    Energy Technology Data Exchange (ETDEWEB)

    Baton, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-05-01

    Study of the reaction π⁻p → π⁻π⁰p at 2.77 GeV/c, carried out in the CERN 2-metre liquid-hydrogen bubble chamber exposed at the proton synchrotron, shows that 70 per cent of this reaction proceeds through the π⁻p → ρ⁻p channel. The high statistics allow us to refine the mass and the width of the ρ⁻ resonance. On the other hand, while the ρ⁻ production parameters are independent of the ρ⁻ width, the same is not true of the decay parameters. In the second part, the Chew-Low extrapolation method allows us to determine the π⁻π⁰ elastic cross section at the pole, as well as the phase shifts of the P waves in the isospin-1 state and the S waves in the isospin-2 state. (author)

  2. The quantitation of buffering action II. Applications of the formal & general approach

    Science.gov (United States)

    Schmitt, Bernhard M

    2005-01-01

    Background The paradigm of "buffering" originated in acid-base physiology, but was subsequently extended to other fields and is now used for a wide and diverse set of phenomena. In the preceding article, we have presented a formal and general approach to the quantitation of buffering action. Here, we use that buffering concept for a systematic treatment of selected classical and other buffering phenomena. Results H+ buffering by weak acids and "self-buffering" in pure water represent "conservative buffered systems" whose analysis reveals buffering properties that contrast in important aspects from classical textbook descriptions. The buffering of organ perfusion in the face of variable perfusion pressure (also termed "autoregulation") can be treated in terms of "non-conservative buffered systems", the general form of the concept. For the analysis of cytoplasmic Ca++ concentration transients (also termed "muffling"), we develop a related unit that is able to faithfully reflect the time-dependent quantitative aspect of buffering during the pre-steady state period. Steady-state buffering is shown to represent the limiting case of time-dependent muffling, namely for infinitely long time intervals and infinitely small perturbations. Finally, our buffering concept provides a stringent definition of "buffering" on the level of systems and control theory, resulting in four absolute ratio scales for control performance that are suited to measure disturbance rejection and setpoint tracking, and both their static and dynamic aspects. Conclusion Our concept of buffering provides a powerful mathematical tool for the quantitation of buffering action in all its appearances. PMID:15771784

  3. Semi-automatic quantitative measurements of intracranial internal carotid artery stenosis and calcification using CT angiography

    International Nuclear Information System (INIS)

    Bleeker, Leslie; Berg, Rene van den; Majoie, Charles B.; Marquering, Henk A.; Nederkoorn, Paul J.

    2012-01-01

    Intracranial carotid artery atherosclerotic disease is an independent predictor of recurrent stroke. However, its quantitative assessment is not routinely performed in clinical practice. In this diagnostic study, we present and evaluate a novel semi-automatic application to quantitatively measure intracranial internal carotid artery (ICA) degree of stenosis and calcium volume in CT angiography (CTA) images. In this retrospective study involving CTA images of 88 consecutive patients, intracranial ICA stenosis was quantitatively measured by two independent observers. Stenoses were categorized with cutoff values of 30% and 50%. The calcification in the intracranial ICA was qualitatively categorized as absent, mild, moderate, or severe and quantitatively measured using the semi-automatic application. Linear weighted kappa values were calculated to assess the interobserver agreement of the stenosis and calcium categorization. The average and the standard deviation of the quantitative calcium volume were calculated for the calcium categories. For the stenosis measurements, the CTA images of 162 arteries yielded an interobserver correlation of 0.78 (P < 0.001). Kappa values of the categorized stenosis measurements were moderate: 0.45 and 0.58 for cutoff values of 30% and 50%, respectively. The kappa value for the calcium categorization was 0.62, with a good agreement between the qualitative and quantitative calcium assessment. Quantitative degree-of-stenosis measurement of the intracranial ICA on CTA is feasible with good interobserver agreement. Qualitative calcium categorization agrees well with quantitative measurements. (orig.)
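
A degree-of-stenosis measurement with 30%/50% cutoffs can be expressed, for example, as a NASCET-style diameter ratio. A sketch of one plausible definition (the abstract does not state its exact formula; the function names and cutoff handling here are assumptions):

```python
# Sketch: diameter-ratio degree of stenosis and the 30%/50% categorization
# used in the study (definition assumed, NASCET-style).
def stenosis_percent(d_stenosis_mm, d_reference_mm):
    """Percent lumen narrowing relative to a reference (normal) diameter."""
    return 100.0 * (1.0 - d_stenosis_mm / d_reference_mm)

def category(pct, cutoffs=(30.0, 50.0)):
    if pct < cutoffs[0]:
        return "<30%"
    if pct < cutoffs[1]:
        return "30-50%"
    return ">=50%"
```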

  4. HERMES docking/berthing system pilot study. Quantitative assessment

    International Nuclear Information System (INIS)

    Munoz Blasco, J.; Goicoechea Sanchez, F.J.

    1993-01-01

    This study falls within the framework of the incorporation of quantitative risk assessment into the activities planned for the ESA-HERMES project (ESA/CNES). The main objective behind the study was the analysis and evaluation of the potential contribution of so-called probabilistic or quantitative safety analysis to the optimization of the safety development process for the systems carrying out the safety functions required by the new and complex HERMES Space Vehicle. For this purpose, a pilot study was considered a good start in quantitative safety assessment (QSA), as this approach has frequently been used in the past to establish a solid base in large-scale QSA application programs while avoiding considerable economic risks. It was finally decided to select the HERMES docking/berthing system with the Man-Tended Free Flyer as the case study. This report describes the different steps followed in the study, along with the main insights obtained and the general conclusions drawn from the study results. (author)

  5. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    International Nuclear Information System (INIS)

    Fernandez-Ruiz, R.; Garcia-Heras, M.

    2008-01-01

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels. A first quantitative level which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of materials. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies here presented allows the total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies
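
TXRF quantitation commonly relies on an internal standard and relative sensitivity factors, so that an analyte concentration follows from count-rate ratios. A one-function sketch of that standard relation (the sensitivities and counts are illustrative; the paper's leaching-specific calibration is not reproduced here):

```python
# Sketch: internal-standard quantitation as commonly used in TXRF:
#   C_i = C_IS * (N_i / S_i) / (N_IS / S_IS)
# where N are net peak intensities and S are relative sensitivity factors.
def txrf_concentration(n_i, s_i, n_is, s_is, c_is):
    return c_is * (n_i / s_i) / (n_is / s_is)

# Hypothetical example: analyte counts 2000 with sensitivity 2.0, internal
# standard counts 1000 with sensitivity 1.0 at 10 mg/L.
conc = txrf_concentration(2000.0, 2.0, 1000.0, 1.0, 10.0)
```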

  6. Analysis of archaeological ceramics by total-reflection X-ray fluorescence: Quantitative approaches

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez-Ruiz, R. [Servicio Interdepartamental de Investigacion, Facultad de Ciencias, Universidad Autonoma de Madrid, Modulo C-9, Laboratorio de TXRF, Crta. Colmenar, Km 15, Cantoblanco, E-28049, Madrid (Spain)], E-mail: ramon.fernandez@uam.es; Garcia-Heras, M. [Grupo de Arqueometria de Vidrios y Materiales Ceramicos, Instituto de Historia, Centro de Ciencias Humanas y Sociales, CSIC, C/ Albasanz, 26-28, 28037 Madrid (Spain)

    2008-09-15

    This paper reports the quantitative methodologies developed for the compositional characterization of archaeological ceramics by total-reflection X-ray fluorescence at two levels. A first quantitative level which comprises an acid leaching procedure, and a second selective level, which seeks to increase the number of detectable elements by eliminating the iron present in the acid leaching procedure. Total-reflection X-ray fluorescence spectrometry has been compared, at a quantitative level, with Instrumental Neutron Activation Analysis in order to test its applicability to the study of this kind of materials. The combination of a solid chemical homogenization procedure previously reported with the quantitative methodologies here presented allows the total-reflection X-ray fluorescence to analyze 29 elements with acceptable analytical recoveries and accuracies.

  7. Quantitative applications of gamma densitometry in the coal industry: a critique

    International Nuclear Information System (INIS)

    Shea, P.; Sher, R.; Gozani, T.

    1982-01-01

    This paper discusses the use of gamma densitometry to quantitatively assay bulk samples of coal on a continuous basis. Devices using these principles to determine mass flows are on the market, and work is progressing in several countries on instruments to determine ash content. The theoretical limits of applicability and inherent assumptions of these techniques are discussed, primarily as applied to dry bulk coal, but with some discussion of the more complicated problems of slurried coal. Gamma rays are generated by sources, usually a single radioactive element. These have several advantages over XRF, the main one being that no power is required to generate gammas. However, there are a limited number of gamma sources with useful energies, long enough half-lives to be economically useful, and clean spectra (that is, relatively few energies emitted by the source in question). Gamma densitometry measurements by single and multiple-energy transmission and backscatter measurements are discussed. A general formalism for analyzing multiple-energy systems is presented. While multi-energy systems can, in principle, pick out as many groups of elements as energies used, the matrices involved are ill-conditioned and thus require accurate measures of count rate (i.e., long counting times or high source intensities) to achieve acceptable errors. Changes in coal composition and profile of coal on a belt were also seen to be important sources of error. Transmission measurements are more amenable to analysis than backscatter, which are essentially transmission measurements made on a distributed source. In addition, transmission measurements are not restricted to low energy gamma sources, and can survey the entire bulk of coal rather than just the upper portion. The special problems of slurried coal measurements are briefly discussed
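
The multiple-energy transmission formalism mentioned above amounts to inverting a small linear system in the log-transmissions, ln(I0/I) = Σ_j μ_j·x_j at each energy. A dual-energy sketch that solves for coal and ash mass thicknesses (the attenuation coefficients are rough illustrative values, not calibrated data; the near-equal high-energy row hints at the ill-conditioning the abstract warns about):

```python
# Sketch: dual-energy gamma transmission. With two measured log-transmissions
# t = ln(I0/I), the coal and ash mass thicknesses (g/cm^2) follow from a
# 2x2 linear solve.
MU = {  # assumed mass attenuation coefficients (cm^2/g) at two energies
    "low":  {"coal": 0.200, "ash": 0.350},
    "high": {"coal": 0.077, "ash": 0.082},
}

def mass_thicknesses(t_low, t_high):
    a, b = MU["low"]["coal"], MU["low"]["ash"]
    c, d = MU["high"]["coal"], MU["high"]["ash"]
    det = a * d - b * c
    return (d * t_low - b * t_high) / det, (a * t_high - c * t_low) / det

# Forward-simulate 9 g/cm^2 of coal plus 1 g/cm^2 of ash, then recover them:
x_coal, x_ash = 9.0, 1.0
t_low = MU["low"]["coal"] * x_coal + MU["low"]["ash"] * x_ash
t_high = MU["high"]["coal"] * x_coal + MU["high"]["ash"] * x_ash
rec_coal, rec_ash = mass_thicknesses(t_low, t_high)
```

The small determinant of the system is exactly why accurate count rates (long counting times or strong sources) are needed for acceptable errors.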

  8. Quantitative Tools for Dissection of Hydrogen-Producing Metabolic Networks-Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Rabinowitz, Joshua D.; Dismukes, G.Charles.; Rabitz, Herschel A.; Amador-Noguez, Daniel

    2012-10-19

    During this project we have pioneered the development of integrated experimental-computational technologies for the quantitative dissection of metabolism in hydrogen and biofuel producing microorganisms (i.e. C. acetobutylicum and various cyanobacteria species). The application of these new methodologies resulted in many significant advances in the understanding of the metabolic networks and metabolism of these organisms, and has provided new strategies to enhance their hydrogen or biofuel producing capabilities. As an example, using mass spectrometry, isotope tracers, and quantitative flux-modeling we mapped the metabolic network structure in C. acetobutylicum. This resulted in a comprehensive and quantitative understanding of central carbon metabolism that could not have been obtained using genomic data alone. We discovered that biofuel production in this bacterium, which only occurs during stationary phase, requires a global remodeling of central metabolism (involving large changes in metabolite concentrations and fluxes) that has the effect of redirecting resources (carbon and reducing power) from biomass production into solvent production. This new holistic, quantitative understanding of metabolism is now being used as the basis for metabolic engineering strategies to improve solvent production in this bacterium. In another example, making use of newly developed technologies for monitoring hydrogen and NAD(P)H levels in vivo, we dissected the metabolic pathways for photobiological hydrogen production by cyanobacteria Cyanothece sp. This investigation led to the identification of multiple targets for improving hydrogen production. Importantly, the quantitative tools and approaches that we have developed are broadly applicable and we are now using them to investigate other important biofuel producers, such as cellulolytic bacteria.

  9. Software reliability for safety-critical applications

    International Nuclear Information System (INIS)

    Everett, B.; Musa, J.

    1994-01-01

    In this talk, the authors address the question "Can Software Reliability Engineering measurement and modeling techniques be applied to safety-critical applications?" Quantitative techniques have long been applied in engineering hardware components of safety-critical applications. The authors have seen a growing acceptance and use of quantitative techniques in engineering software systems, but a continuing reluctance to use such techniques in safety-critical applications. The general case posed against using quantitative techniques for software components runs along the following lines: safety-critical applications should be engineered such that catastrophic failures occur less frequently than one in a billion hours of operation; current software measurement/modeling techniques rely on failure history data collected during testing; one would have to accumulate over a billion operational hours to verify a failure rate objective of about one per billion hours.
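
The billion-hour argument can be made concrete with the standard zero-failure bound for a constant failure rate: observing no failures in T hours bounds the rate at -ln(1 − confidence)/T. A sketch of the implied test duration (a simplified textbook bound, not the authors' models):

```python
import math

# Sketch: with zero failures observed in T hours, the upper (confidence-level)
# bound on a constant failure rate lambda is -ln(1 - confidence) / T.
# So demonstrating a target rate needs roughly T = -ln(1 - confidence) / lambda.
def hours_needed(lambda_target, confidence=0.90):
    return -math.log(1.0 - confidence) / lambda_target

t = hours_needed(1e-9)   # target: one catastrophic failure per billion hours
```

At 90% confidence this gives on the order of 2.3 billion failure-free test hours, which is the infeasibility the talk's sceptics point to.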

  10. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python which provides a simple and consistent interface to multiple methods for the quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application for the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to the actually observed sequences.
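
One of the listed characteristics, the Zipf's law coefficient, is simply the log-log slope of n-gram frequency against frequency rank. A plain reimplementation for illustration (a generic sketch, not Quantiprot's actual API):

```python
import math
from collections import Counter

# Sketch: Zipf's-law coefficient of a sequence's n-gram frequency distribution,
# i.e. the least-squares slope of log(frequency) vs. log(rank).
def zipf_coefficient(sequence, n=1):
    grams = [sequence[i:i + n] for i in range(len(sequence) - n + 1)]
    freqs = sorted(Counter(grams).values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den   # typically negative; magnitude near 1 for Zipf-like data
```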

  11. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be extended beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interest in class discussion and to advance analytic skills, as well as to develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  12. A new standardless quantitative electron probe microanalysis technique applied to III-V compound semiconductors

    International Nuclear Information System (INIS)

    Zangalis, K.P.; Christou, A.

    1982-01-01

    The present paper introduces a new standardless quantitative scheme for off-line electron microprobe analysis applications. The analysis is based on standard equations of the type I_i = C_i·f_ZAF·β_i and is specifically suitable for compound semiconductors. The roots of the resulting nth-degree polynomial are the unknown concentrations. Methods for computing C_i when the coefficients β_i are unknown are also outlined. Applications of standardless analysis to GaAs and InP specimens are compared with results obtained by Auger electron spectroscopy and by quantitative electron probe analysis with standards. (Auth.)

  13. Quantitative MR imaging in fracture dating--Initial results.

    Science.gov (United States)

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determination of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and the use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods, determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11, ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture of either a long bone or the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for quantitative measurement of the relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of the quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area within defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7, ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  14. PETROGRAPHY AND APPLICATION OF THE RIETVELD METHOD TO THE QUANTITATIVE ANALYSIS OF PHASES OF NATURAL CLINKER GENERATED BY COAL SPONTANEOUS COMBUSTION

    Directory of Open Access Journals (Sweden)

    Pinilla A. Jesús Andelfo

    2010-06-01

    Fine-grained, mainly reddish, compact and slightly brecciated and vesicular pyrometamorphic rocks (natural clinker) are associated with the spontaneous combustion of coal seams of the Cerrejón Formation exploited by Carbones del Cerrejón Limited in the La Guajira Peninsula (Caribbean region of Colombia). These rocks constitute the remaining inorganic material derived from claystones, mudstones and sandstones originally associated with the coal, and are essentially a complex mixture of various amorphous and crystalline inorganic constituents. In this paper, a petrographic characterization of natural clinker, as well as the application of the X-ray diffraction (Rietveld) method to the quantitative analysis of its mineral phases, was carried out. The RIQAS program was used for the refinement of X-ray powder diffraction profiles, analyzing the importance of using the correct isostructural models for each of the existing phases, which were obtained from the Inorganic Crystal Structure Database (ICSD). The results obtained in this investigation show that the Rietveld method can be used as a powerful tool in the quantitative analysis of phases in polycrystalline samples, which has been a traditional problem in geology.

  15. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  17. New Approach to Quantitative Analysis by Laser-induced Breakdown Spectroscopy

    International Nuclear Information System (INIS)

    Lee, D. H.; Kim, T. H.; Yun, J. I.; Jung, E. C.

    2009-01-01

    Laser-induced breakdown spectroscopy (LIBS) has been studied as the technique of choice in particular situations such as screening, in situ measurement, process monitoring, and hostile environments. In particular, LIBS can provide qualitative and quantitative analysis of radioactive high-level waste (HLW) glass under restricted experimental conditions. Several ways have been suggested to obtain quantitative information from LIBS. One approach is to use the absolute intensities of each element. Another is to use elemental emission intensities relative to the intensity of an internal standard element whose concentration in the specimen is already known. These methods, however, are not applicable to unknown samples. In the present work, we introduce a new approach to quantitative LIBS analysis that uses the H α (656.28 nm) emission line as an external standard

  18. A method of quantitative prediction for sandstone type uranium deposit in Russia and its application

    International Nuclear Information System (INIS)

    Chang Shushuai; Jiang Minzhong; Li Xiaolu

    2008-01-01

    The paper presents the foundational principles of quantitative prediction for sandstone-type uranium deposits in Russia. Some key methods such as physical-mathematical model construction and deposit prediction are described. The method has been applied to deposit prediction in the Dahongshan region of the Chaoshui basin. It is concluded that the technique can strengthen the methodology of quantitative prediction for sandstone-type uranium deposits, and that it could be used as a new technique in China. (authors)

  19. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both with and without blood mixed in. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  20. The Spectra Count Label-free Quantitation in Cancer Proteomics

    OpenAIRE

    Zhou, Weidong; Liotta, Lance A.; Petricoin, Emanuel F.

    2012-01-01

    Mass spectrometry is used routinely for large-scale protein identification from complex biological mixtures. Recently, a relative quantitation approach based on spectral counts has been applied in several cancer proteomic studies. In this review, we examine the mechanism of this technique and highlight several important parameters associated with its application.

  1. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    Science.gov (United States)

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than following the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents, including triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features with transference. Qualitative approaches include ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  2. Potential application of a semi-quantitative method for mercury determination in soils, sediments and gold mining residues

    International Nuclear Information System (INIS)

    Yallouz, A.V.; Cesar, R.G.; Egler, S.G.

    2008-01-01

    An alternative, low-cost method for analyzing mercury in soil, sediment and gold mining residues was developed, optimized and applied to 30 real samples. It is semi-quantitative, performed using an acid-extraction pretreatment step followed by mercury reduction and collection on a detecting paper containing cuprous iodide. A complex is formed with a characteristic color whose intensity is proportional to the mercury concentration in the original sample. Results are reported as a concentration range, and the minimum detectable level is 100 ng/g. Method quality assurance was performed by comparing results obtained using the alternative method with the cold vapor atomic absorption spectrometry (CVAAS) technique. The average results from duplicate analysis by CVAAS were 100% coincident with the alternative method results. The method is applicable for screening tests and can be used in regions where a preliminary diagnosis is necessary, in programs of environmental surveillance, or by scientists interested in investigating mercury geochemistry. - Semi-quantitative low-cost method for mercury determination in soil, sediments and mining residues

  3. Quantitative in situ magnetization reversal studies in Lorentz microscopy and electron holography

    International Nuclear Information System (INIS)

    Rodríguez, L.A.; Magén, C.; Snoeck, E.; Gatel, C.; Marín, L.; Serrano-Ramón, L.

    2013-01-01

    A generalized procedure for the in situ application of magnetic fields by means of the excitation of the objective lens, for magnetic imaging experiments in Lorentz microscopy and electron holography, is quantitatively described. A protocol for applying magnetic fields with arbitrary in-plane magnitude and orientation is presented, and a freeware script for Digital Micrograph™ is provided to assist the operation of the microscope. Moreover, a method to accurately reconstruct hysteresis loops is detailed. We show that the out-of-plane component of the magnetic field cannot always be neglected when performing quantitative measurements of the local magnetization. Several examples are shown to demonstrate the accuracy and functionality of the methods. - Highlights: • Generalized procedure for application of magnetic fields with the TEM objective lens. • Arbitrary in-plane magnetic field magnitude and orientation can be applied. • Method to accurately reconstruct hysteresis loops by electron holography. • Out-of-plane field component should be considered in quantitative measurements. • Examples to illustrate the method in Lorentz microscopy and electron holography

  4. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  5. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope based on a smartphone, using the transport of intensity equation method. The system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that could be adopted in remote healthcare and medical diagnosis.
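The core of the transport-of-intensity approach above can be sketched with a minimal FFT-based solver: under a near-uniform background intensity I0 and periodic boundaries, the TIE ∂I/∂z = -(λ/2π)∇·(I∇φ) reduces to a Poisson equation ∇²φ = -(2π/λI0)·∂I/∂z, which is solved in Fourier space. This is a generic illustration, not the authors' Android implementation; all parameter values are assumptions:

```python
import numpy as np

def tie_phase(i_under, i_over, dz, wavelength, i0, pixel):
    """Recover phase from two defocused images via the transport of
    intensity equation, assuming a near-uniform intensity i0."""
    didz = (i_over - i_under) / (2 * dz)          # axial intensity derivative
    rhs = -(2 * np.pi / wavelength) * didz / i0   # Poisson right-hand side
    ny, nx = rhs.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel)
    k2 = kx[None, :] ** 2 + ky[:, None] ** 2
    k2[0, 0] = 1.0                                # avoid division by zero at DC
    phi_hat = np.fft.fft2(rhs) / (-k2)            # solve lap(phi) = rhs
    phi_hat[0, 0] = 0.0                           # phase defined up to a constant
    return np.real(np.fft.ifft2(phi_hat))
```

In practice the two inputs are the slightly under- and over-focused images captured by the manual focusing step, and the recovered phase is defined only up to an additive constant.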

  6. Method for quantitative assessment of nuclear safety computer codes

    International Nuclear Information System (INIS)

    Dearien, J.A.; Davis, C.B.; Matthews, L.J.

    1979-01-01

    A procedure has been developed for the quantitative assessment of nuclear safety computer codes and tested by comparison of RELAP4/MOD6 predictions with results from two Semiscale tests. This paper describes the developed procedure, the application of the procedure to the Semiscale tests, and the results obtained from the comparison

  7. Quantitation of anti-tetanus and anti-diphtheria antibodies by enzymoimmunoassay: methodology and applications.

    Science.gov (United States)

    Virella, G; Hyman, B

    1991-01-01

    We have developed enzymoimmunoassays (EIA) for the quantitation of antibodies (Ab) to tetanus and diphtheria toxoids (TT, DT) using Immulon I plates coated with the appropriate toxoid. A preparation of human tetanus immunoglobulin with a known concentration of anti-TT Ab was used as calibrator of the anti-TT antibody assay. The assay of anti-DT Ab is calibrated with a pool of human sera whose anti-DT Ab concentration was determined by quantitative immunoelectrophoresis, using a horse anti-DT with known Ab concentration as calibrator. A peroxidase-conjugated anti-human IgG was used in both assays. ABTS was used as substrate, and the reaction was stopped after 1 min incubation with citric acid and the OD measured at 414 nm on a Vmax reader. The assays have been applied to a variety of clinical situations. In patients suspected of having tetanus, the quantitation of antibodies has been helpful in establishing a diagnosis. In patients with a history of hypersensitivity to tetanus toxoid, verification of the levels of anti-TT antibody may prevent unnecessary and potentially harmful immunizations. The assays have also been used for the diagnostic evaluation of the humoral immune response to TT and DT, both in pediatric patients and in immunosuppressed patients. Several non-responders have been detected, and we have recently used the assay to monitor the effects of fish oil administration on the humoral immune response.(ABSTRACT TRUNCATED AT 250 WORDS)

  8. Probing myocardium biomechanics using quantitative optical coherence elastography

    Science.gov (United States)

    Wang, Shang; Lopez, Andrew L.; Morikawa, Yuka; Tao, Ge; Li, Jiasong; Larina, Irina V.; Martin, James F.; Larin, Kirill V.

    2015-03-01

    We present a quantitative optical coherence elastographic method for noncontact assessment of myocardium elasticity. The method is based on shear wave imaging optical coherence tomography (SWI-OCT), where a focused air-puff system is used to induce localized tissue deformation through a low-pressure, short-duration air stream, and a phase-sensitive OCT system is utilized to monitor the propagation of the induced tissue displacement with nanoscale sensitivity. The 1-D scanning of M-mode OCT imaging and the application of optical phase retrieval and mapping techniques enable the reconstruction and visualization of 2-D depth-resolved shear wave propagation in tissue at an ultra-high frame rate. The feasibility of this method for quantitative elasticity measurement is demonstrated on tissue-mimicking phantoms, with the estimated Young's modulus compared against uniaxial compression tests. We also performed pilot experiments on ex vivo mouse cardiac muscle tissues with normal and genetically altered cardiomyocytes. Our results indicate that this noncontact quantitative optical coherence elastographic method can be a useful tool for cardiac muscle research and studies.
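The elasticity estimate behind such shear-wave measurements commonly uses the relation E ≈ 3ρc² for soft, nearly incompressible tissue, with the wave speed c taken as the slope of propagation distance vs. arrival time. The sketch below is a generic illustration of that calculation, not the authors' processing pipeline; the density value is an assumption:

```python
import numpy as np

def youngs_modulus_from_shear_wave(positions_m, arrival_times_s, density=1000.0):
    """Estimate Young's modulus (Pa) from shear wave propagation.

    The group velocity is the slope of a linear fit of propagation
    distance vs. arrival time; for soft, nearly incompressible tissue
    E ~= 3 * rho * c^2.
    """
    c = np.polyfit(arrival_times_s, positions_m, 1)[0]  # wave speed (m/s)
    return 3.0 * density * c ** 2

# Example: wavefront arrivals at four lateral positions imply c = 2 m/s,
# hence E = 3 * 1000 * 2^2 = 12 kPa (soft-tissue range).
E = youngs_modulus_from_shear_wave(
    np.array([0.0, 1e-3, 2e-3, 3e-3]),       # m
    np.array([0.0, 0.5e-3, 1.0e-3, 1.5e-3])  # s
)
```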

  9. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  10. Quantitative evaluation of the enamel caries which were treated with ...

    African Journals Online (AJOL)

    Objectives: The aim of this in vivo study was to quantitatively evaluate the remineralization of the enamel caries on smooth and occlusal surfaces using DIAGNOdent, after daily application of casein phosphopeptide‑amorphous calcium fluoride phosphate (CPP‑ACFP). Materials and Methods: Thirty volunteers, aged 18–30 ...

  11. A potential quantitative method for assessing individual tree performance

    Science.gov (United States)

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  12. Propagation of error from parameter constraints in quantitative MRI: Example application of multiple spin echo T2 mapping.

    Science.gov (United States)

    Lankford, Christopher L; Does, Mark D

    2018-02-01

    Quantitative MRI may require correcting for nuisance parameters, which can or must be constrained to independently measured or assumed values. The noise and/or bias in these constraints propagate to the fitted parameters. As an example, the case of refocusing pulse flip angle constraint in multiple spin echo T2 mapping is explored. An analytical expression for the mean-squared error of a parameter of interest was derived as a function of the accuracy and precision of an independent estimate of a nuisance parameter. The expression was validated by simulations and then used to evaluate the effects of flip angle (θ) constraint on the accuracy and precision of T̂2 for a variety of multi-echo T2 mapping protocols. Constraining θ improved T̂2 precision when the θ-map signal-to-noise ratio was greater than approximately one-half that of the first spin echo image. For many practical scenarios, constrained fitting was calculated to reduce not just the variance but the full mean-squared error of T̂2, for bias in θ̂ ≲ 6%. The analytical expression derived in this work can be applied to inform experimental design in quantitative MRI. The example application to T2 mapping provided specific cases, depending on θ̂ accuracy and precision, in which θ̂ measurement and constraint would be beneficial to T̂2 variance or mean-squared error. Magn Reson Med 79:673-682, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
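The general idea, that error in a constrained nuisance parameter propagates into the fitted parameter of interest, can be illustrated with a toy Monte Carlo. Here a signal baseline stands in as the nuisance parameter (not the paper's refocusing flip angle, which requires a full echo-train signal model); the decay model, noise levels and trial counts are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
t = np.linspace(5e-3, 160e-3, 32)   # echo times (s)
T2, A, B = 60e-3, 1.0, 0.05         # ground truth: relaxation time, amplitude, baseline
sigma = 0.01                        # measurement noise (fraction of amplitude)

def mse_T2(constraint_sd, n_trials=300):
    """Monte Carlo mean-squared error of the T2 estimate when the baseline
    nuisance parameter is constrained to an independent, noisy estimate."""
    errs = []
    for _ in range(n_trials):
        y = A * np.exp(-t / T2) + B + rng.normal(0, sigma, t.size)
        b_hat = B + rng.normal(0, constraint_sd)        # independent estimate
        model = lambda tt, a, t2: a * np.exp(-tt / t2) + b_hat
        popt, _ = curve_fit(model, t, y, p0=[1.0, 50e-3], maxfev=5000)
        errs.append(popt[1] - T2)
    return float(np.mean(np.square(errs)))
```

Sweeping `constraint_sd` traces out how the full mean-squared error of the T2 estimate grows with the imprecision of the constrained nuisance parameter, which is the qualitative behaviour the paper's analytical expression captures.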

  13. Theory of quantitative trend analysis and its application to the South African elections

    CSIR Research Space (South Africa)

    Greben, JM

    2006-02-28

    In this paper the author discusses a quantitative theory of trend analysis. Often trends are based on qualitative considerations and subjective assumptions. In the current approach the author makes use of extensive data bases to optimise the so...

  14. Quantitation without Calibration: Response Profile as an Indicator of Target Amount.

    Science.gov (United States)

    Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V

    2018-06-21

    Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration that correlates a measured signal with a target amount. This step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of the response profile, rather than on an absolute signal value, for assessment of a target's amount. To enable this capability we develop a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable to a variety of biomarkers such as nucleic acids, proteins, peptides, and others.

  15. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    Science.gov (United States)

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  16. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon experts' knowledge. In some cases several hours, even days, of tedious calculations are needed. This is because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and that the procedure can be automated with minimal laboratory measurements for network training, as long as all the elements of the analysed solution appear in the training set and provided that adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In an intercomparison test between several well-known laboratories, in which unknown quantities of 57Co, 58Co, 85Sr, 88Y, 131I, 139Ce and 141Ce present in a sample had to be determined, the results yielded by our network ranked it amongst the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of network error
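The deconvolution idea, training a network to map spectra with overlapping rays directly to element amounts, can be sketched on synthetic data. The line positions, widths and network settings below are illustrative assumptions, not the authors' setup:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 64)   # channel axis

def spectrum(a1, a2):
    """Two strongly overlapping Gaussian emission lines plus noise."""
    return (a1 * np.exp(-((x - 4.8) ** 2) / 0.5)
            + a2 * np.exp(-((x - 5.2) ** 2) / 0.5)
            + rng.normal(0, 0.01, x.size))

# Training set: spectra with known "concentrations" (a1, a2)
amps = rng.uniform(0.1, 1.0, size=(800, 2))
X = np.array([spectrum(a1, a2) for a1, a2 in amps])
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
net.fit(X, amps)

# Deconvolve an unseen mixture of the two overlapping lines
pred = net.predict(np.array([spectrum(0.7, 0.3)]))
```

As in the paper, the training-set amplitudes must span the range expected in real samples, and the inputs must be scaled consistently between training and analysis.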

  17. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  18. Review of quantitative phase-digital holographic microscopy: promising novel imaging technique to resolve neuronal network activity and identify cellular biomarkers of psychiatric disorders

    KAUST Repository

    Marquet, Pierre

    2014-09-22

    Quantitative phase microscopy (QPM) has recently emerged as a powerful quantitative imaging technique well suited to noninvasively exploring a transparent specimen with nanometric axial sensitivity. In this review, we present recent developments in quantitative phase-digital holographic microscopy (QP-DHM), an important and efficient quantitative phase method for exploring cell structure and dynamics. In a second part, the most relevant QPM applications in the field of cell biology are summarized, with particular emphasis on the original biological information that can be derived from the quantitative phase signal. In a third part, recent applications obtained with QP-DHM in the field of cellular neuroscience are presented, namely the possibility to optically resolve neuronal network activity and spine dynamics. Furthermore, potential applications of QPM related to psychiatry are discussed, through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high-risk developmental trajectories for psychiatric disorders.

  19. Advances in quantitative UV-visible spectroscopy for clinical and pre-clinical application in cancer.

    Science.gov (United States)

    Brown, J Quincy; Vishwanath, Karthik; Palmer, Gregory M; Ramanujam, Nirmala

    2009-02-01

    Methods of optical spectroscopy that provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the past three years, and includes new and emerging studies that correlate optically measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies.

  20. Quantitative Indicators for Behaviour Drift Detection from Home Automation Data.

    Science.gov (United States)

    Veronese, Fabio; Masciadri, Andrea; Comai, Sara; Matteucci, Matteo; Salice, Fabio

    2017-01-01

    The diffusion of Smart Homes provides an opportunity to implement elderly monitoring, extending seniors' independence and avoiding unnecessary assistance costs. Information concerning the inhabitant's behaviour is contained in home automation data, and can be extracted by means of quantitative indicators. Application of such an approach shows that it can provide evidence of behaviour changes.

  1. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…
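At its core, a footprint exercise of this kind reduces to multiplying activity amounts by emission factors and summing. The factors below are illustrative placeholders, not values from the article; a real exercise would take them from a published inventory:

```python
# Illustrative emission factors (kg CO2e per unit) -- placeholder values,
# not taken from the article or any specific inventory.
EMISSION_FACTORS = {
    "car_km": 0.19,          # per km driven
    "flight_km": 0.15,       # per passenger-km flown
    "electricity_kwh": 0.4,  # per kWh consumed
}

def carbon_footprint(activity_amounts):
    """Sum kg CO2e over a dict of {activity: annual amount}."""
    return sum(EMISSION_FACTORS[k] * v for k, v in activity_amounts.items())

# A student's annual behaviour profile; changing any entry shows the
# impact of that behaviour choice on the total.
footprint = carbon_footprint(
    {"car_km": 10000, "flight_km": 2000, "electricity_kwh": 3000}
)  # 1900 + 300 + 1200 = 3400 kg CO2e
```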

  2. Application of Least-Squares Support Vector Machines for Quantitative Evaluation of Known Contaminant in Water Distribution System Using Online Water Quality Parameters

    Directory of Open Access Journals (Sweden)

    Kexin Wang

    2018-03-01

    In water quality monitoring, early warning and quantitative detection of contaminants remain challenging. A number of parameters need to be measured that are not entirely linearly related to pollutant concentrations, and the complex correlations between water quality parameters further impair the accuracy of quantitative detection. To address these problems, least-squares support vector machines (LS-SVM) are applied to evaluate water contamination quantitatively from various conventional water quality sensors. Different contaminants may cause different correlated sensor responses, and the degree of response is related to the concentration of the injected contaminant. Therefore, to enhance the reliability and accuracy of water contamination detection, a new method is proposed. In this method, a new relative response parameter is introduced to calculate the differences between water quality parameters and their baselines. A variety of regression models was examined; owing to its high performance, a regression model based on a genetic algorithm (GA) is combined with LS-SVM. In this paper, a practical application of the proposed method is considered: controlled experiments were designed, and data were collected from the experimental setup. The measured data were used to estimate the contaminant concentration in water. The evaluation results validated that the LS-SVM model can adapt to the local nonlinear variations between water quality parameters and contaminant concentration with excellent generalization ability and accuracy. The validity of the proposed approach in concentration evaluation is demonstrated for potassium ferricyanide above 0.5 mg/L in water distribution systems.
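LS-SVM regression is closely related to kernel ridge regression, so the sensor-to-concentration mapping can be sketched with scikit-learn's KernelRidge as a stand-in. Everything below (sensor baselines, response shapes, noise level, kernel settings) is invented for illustration and is not the authors' experimental setup:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(2)

# Synthetic stand-in data: five water quality sensors whose relative
# changes from baseline depend nonlinearly on contaminant concentration.
conc = rng.uniform(0, 5, 400)                       # contaminant level (mg/L)
baseline = np.array([7.0, 500.0, 1.2, 0.3, 20.0])   # arbitrary sensor baselines
responses = baseline * (1 + np.tanh(np.outer(conc, [0.1, 0.3, -0.2, 0.5, 0.05])))

# "Relative response parameter": difference from baseline, normalized
rel = (responses - baseline) / baseline
rel += rng.normal(0, 0.01, rel.shape)               # sensor noise

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1.0)
model.fit(rel[:300], conc[:300])
r2 = model.score(rel[300:], conc[300:])             # held-out accuracy
```

In the paper's method the kernel and regularization hyperparameters are tuned by a genetic algorithm rather than fixed by hand as here.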

  3. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    Directory of Open Access Journals (Sweden)

    Saeed Alexander I

    2008-12-01

Full Text Available Abstract Background Mass spectrometry (MS based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection, resulting in each peptide having a different detection probability. Lu et al. (2007 described a modified spectral counting technique, Absolute Protein Expression (APEX, which improves on basic spectral counting methods by including a correction factor for each protein (called the Oi value that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi. This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a
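The core APEX computation described above is a one-liner once the Oi values are in hand: each protein's observed spectral count is divided by its expected detectable peptides per molecule, and the corrected counts are normalized. A minimal sketch (the counts and Oi values are invented for illustration):

```python
# APEX-style abundance: observed spectral counts corrected by each
# protein's predicted detectability O_i, then normalized to a total.
def apex_abundance(counts, O, total=1.0):
    corrected = [n / o for n, o in zip(counts, O)]
    s = sum(corrected)
    return [total * c / s for c in corrected]

counts = [100, 20, 30]        # observed MS/MS spectral counts per protein
O      = [10.0, 2.0, 6.0]     # predicted detectable peptides per molecule
print(apex_abundance(counts, O))   # -> [0.4, 0.4, 0.2]
```

Note how the second protein, despite a low raw count, is assigned the same relative abundance as the first because few of its peptides are expected to be detected.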

  4. High performance liquid chromatographic assay for the quantitation of total glutathione in plasma

    Science.gov (United States)

    Abukhalaf, Imad K.; Silvestrov, Natalia A.; Menter, Julian M.; von Deutsch, Daniel A.; Bayorh, Mohamed A.; Socci, Robin R.; Ganafa, Agaba A.

    2002-01-01

A simple and widely used homocysteine HPLC procedure was applied to the HPLC identification and quantitation of glutathione in plasma. The method, which utilizes SBDF as a derivatizing agent, requires only 50 microl of sample volume. A linear quantitative response curve was generated for glutathione over a concentration range of 0.3125-62.50 micromol/l. Linear regression analysis of the standard curve exhibited a correlation coefficient of 0.999. Limit of detection (LOD) and limit of quantitation (LOQ) values were 5.0 and 15 pmol, respectively. Glutathione recovery using this method was nearly complete (above 96%). Intra-assay and inter-assay precision studies reflected a high level of reliability and reproducibility of the method. The applicability of the method to the quantitation of glutathione was demonstrated successfully using human and rat plasma samples.
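The quantitation step rests on an ordinary linear standard curve: fit peak area against known standards, then invert to recover unknown concentrations. A sketch (the slope, intercept, and areas below are synthetic, not the paper's values):

```python
import numpy as np

# Hypothetical calibration: peak areas measured at known GSH standards
conc = np.array([0.3125, 0.625, 1.25, 2.5, 5.0, 10.0])   # micromol/L
area = 52.0 * conc + 1.5                                  # synthetic, noiseless

slope, intercept = np.polyfit(conc, area, 1)              # standard curve
r = np.corrcoef(conc, area)[0, 1]                         # correlation coeff.

def quantitate(peak_area):
    # Invert the standard curve to recover concentration
    return (peak_area - intercept) / slope

print(round(quantitate(52.0 * 4.0 + 1.5), 3))             # -> 4.0
```

With real data the fit residuals would also feed the LOD/LOQ estimates (commonly 3.3 and 10 times the residual SD divided by the slope).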

  5. Quantitative analysis of macro-ARG using IP system

    International Nuclear Information System (INIS)

    Nakajima, Eiichi; Kawai, Kenji; Furuta, Yoshitake

    1997-01-01

Recent progress in the imaging plate (IP) system allows us to analyze autoradiographic images quantitatively. In 'whole-body autoradiography', a method that clarifies the distribution of a radioisotope or labeled compound in the tissues and organs of a freeze-dried whole-body section of small animals such as rats and mice, the sections are pressed against an IP for exposure; the IP is then scanned by a Bio-Imaging Analyzer (Fuji Photo Film Co., Ltd.) to give a digital autoradiographic image. Quantitative data concerning the activity in different tissues can be obtained using an isotope scale as a reference source. The fading effect, application of the IP system to the distribution of receptor-binding ARG, analysis of radio-spots on TLC, and radioactive concentration in liquids such as blood are also discussed. (author)

  6. Quantitation of cerebral blood flow using HMPAO tomography

    International Nuclear Information System (INIS)

    Bruyant, P.; Mallet, J.J.; Sau, J.; Teyssier, R.; Bonmartin, A.

    1997-01-01

A method has been developed to quantitate regional cerebral blood flow (rCBF) using 99mTc-HMPAO. It relies on the application of the bolus distribution principle. The rCBF is determined by compartmental analysis, by measuring the amount of tracer retained in the parenchyma and the input function. The values for the blood:brain partition coefficient and for the conversion rate from the lipophilic to the hydrophilic form of the tracer are taken from the literature. Mean rCBF values in eight patients are 41.1 ± 6.4 and 25.6 ± 5.8 mL·min⁻¹ for grey matter and white matter, respectively (mean ± standard deviation). This method allows quantitation of rCBF with one SPET scan and one venous blood sample. (authors)
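The bolus-distribution principle underlying such methods can be illustrated with a deliberately self-consistent toy calculation: the trapped tracer concentration divided by the integrated arterial input yields the flow. The published method additionally corrects for the lipophilic-to-hydrophilic conversion rate and the partition coefficient, which this sketch omits; all numbers are illustrative.

```python
import numpy as np

# Simplified bolus-distribution (microsphere-like) estimate:
#   rCBF ~ tissue tracer concentration at time T
#          / integral of the arterial input function up to T
t = np.linspace(0.0, 2.0, 201)                 # minutes
Ca = 100.0 * t * np.exp(-3.0 * t)              # gamma-variate-like input, kBq/mL

# Trapezoidal integral of the arterial input function
auc = float(np.sum(0.5 * (Ca[1:] + Ca[:-1]) * np.diff(t)))

flow = 0.4                                     # mL/min per mL tissue (ground truth)
Ct_end = flow * auc                            # tracer trapped by end of scan
rcbf = Ct_end / auc                            # recovered flow
print(round(rcbf, 3))                          # -> 0.4
```

The demo recovers the flow it was built from by construction; its point is only to show which two measured quantities enter the ratio.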

  7. A quantitative study of nanoparticle skin penetration with interactive segmentation.

    Science.gov (United States)

    Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook

    2016-10-01

In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown, and the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to a human skin equivalent model. Captured fluorescence images of this model were used to evaluate the degree of skin penetration. These images underwent interactive segmentation and image processing, followed by statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Statistical significance was reached after 2 days in the negative-charge group but only after 4 days in the positive-charge group, indicating a difference in time course. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative-charge group. Although this quantitative result is identical to that obtained by qualitative assessment, it is meaningful in that it was proven by statistical analysis of parameters quantitated by image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs and provide important guidance for the design of NPs for biomedical applications.
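The five image parameters named above are standard first-order statistics of the pixel histogram and can be computed directly; a sketch (the toy image and the zero threshold for the area fraction are assumptions, not the study's protocol):

```python
import numpy as np

def fluorescence_stats(img, threshold=0.0):
    # The five parameters used in the study, computed on a 2-D
    # fluorescence image (names follow ImageJ conventions).
    x = img.astype(float).ravel()
    mean = x.mean()
    integrated_density = x.sum()
    c = x - mean
    std = x.std()
    skewness = (c ** 3).mean() / std ** 3
    kurtosis = (c ** 4).mean() / std ** 4 - 3.0   # excess kurtosis
    area_fraction = (x > threshold).mean()
    return mean, integrated_density, skewness, kurtosis, area_fraction

img = np.zeros((4, 4))
img[0, 0] = 8.0                                   # one bright "silica" pixel
m, integ, skew, kurt, af = fluorescence_stats(img)
print(m, integ, af)                               # -> 0.5 8.0 0.0625
```

A single bright pixel yields a high skewness and kurtosis, which is exactly why those moments are informative for sparse fluorescent signal.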

  8. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, the following five generally apply to most programs: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole-body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain non-dose-equivalent indices, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs
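The first and fourth indices (MIDE overall and MIDE by job classification) are simple aggregations over personnel dose records; a sketch on made-up records:

```python
from collections import defaultdict

# Illustrative personal dose records: (worker, job class, dose in mSv)
records = [
    ("a", "operator", 1.2), ("b", "operator", 0.8),
    ("c", "maintenance", 3.0), ("d", "maintenance", 5.0),
]

# Index (1): mean individual dose equivalent across all monitored workers
mide = sum(d for _, _, d in records) / len(records)

# Index (4): MIDE evaluated by job classification
by_job = defaultdict(list)
for _, job, d in records:
    by_job[job].append(d)
mide_by_job = {job: sum(v) / len(v) for job, v in by_job.items()}

print(mide)                        # -> 2.5
print(mide_by_job["maintenance"])  # -> 4.0
```

Index (2), the statistical distribution of MIDE, would be obtained by histogramming the same per-worker doses rather than averaging them.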

  9. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

Quantitative proteomics, as a developing method for the study of proteins and identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and its flexibility in allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, yielding low cost and small size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  10. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    International Nuclear Information System (INIS)

    Waters, Michael; Jackson, Marcus

    2008-01-01

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  11. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =_ε b, which we think of as saying that "a is approximately equal to b up to an error of ε". We have four interesting examples where we have a quantitative equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...
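For context, the indexed judgment a =_ε b is governed by inference rules of roughly the following shape (reconstructed from the cited work; rule names and exact side conditions should be checked against the paper, and ε ranges over the nonnegative rationals):

```latex
\begin{align*}
\textsf{(Refl)}   &\quad \vdash t =_0 t\\
\textsf{(Symm)}   &\quad t =_\varepsilon s \;\vdash\; s =_\varepsilon t\\
\textsf{(Triang)} &\quad t =_\varepsilon s,\; s =_{\varepsilon'} u \;\vdash\; t =_{\varepsilon+\varepsilon'} u\\
\textsf{(Max)}    &\quad t =_\varepsilon s \;\vdash\; t =_{\varepsilon'} s
                   \qquad (\varepsilon' \ge \varepsilon)\\
\textsf{(NExp)}   &\quad t_1 =_\varepsilon s_1,\;\dots,\; t_n =_\varepsilon s_n
                   \;\vdash\; f(t_1,\dots,t_n) =_\varepsilon f(s_1,\dots,s_n)
\end{align*}
```

The triangle rule makes =_ε behave like a pseudometric, which is why free algebras of such theories carry metric structure.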

  12. Quantitative analysis of terahertz spectra for illicit drugs using adaptive-range micro-genetic algorithm

    Science.gov (United States)

    Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin

    2011-08-01

In the field of anti-illicit drug applications, many suspicious mixture samples might consist of various drug components (for example, a mixture of methamphetamine, heroin, and amoxicillin), which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive-range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis in this paper. The simulation results agree with previous experimental results and with results obtained using other experimental and numerical techniques, which suggests that the proposed technique has potential applications in terahertz spectral identification of drug mixture components.
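The quantitative-analysis step amounts to estimating mixture fractions such that the measured spectrum is a weighted sum of reference component spectra. The sketch below substitutes a plain least-squares solve for the paper's adaptive-range micro-genetic algorithm, on synthetic Gaussian "fingerprints"; all spectra, peak positions, and concentrations are invented for illustration.

```python
import numpy as np

# Measured THz absorption spectrum modeled as a linear mix of known
# component spectra; solving for the weights gives the concentrations.
freqs = np.linspace(0.2, 2.6, 60)                      # THz axis (illustrative)
ref = np.vstack([np.exp(-(freqs - f0) ** 2 / 0.02)     # Gaussian "fingerprints"
                 for f0 in (0.8, 1.25, 1.9)])          # 3 hypothetical components

true_c = np.array([0.5, 0.3, 0.2])
measured = true_c @ ref                                # noiseless mixture

# Least squares stands in here for the micro-GA search of the paper
c_hat, *_ = np.linalg.lstsq(ref.T, measured, rcond=None)
print(np.round(c_hat, 3))                              # -> [0.5 0.3 0.2]
```

A GA becomes attractive over least squares when the forward model is nonlinear in the concentrations or when constraints and noise make the search landscape multimodal.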

  13. Quantitative digital radiography with two dimensional flat panels

    International Nuclear Information System (INIS)

    Dinten, J.M.; Robert-Coutant, C.; Darboux, M.

    2003-01-01

Purpose: The attenuation law relates radiographic images to the thickness and chemical composition of the irradiated object. Film radiography exploits this property qualitatively for diagnosis. Digital radiographic flat panels offer a large dynamic range, reproducibility and linearity, properties which open the gate to quantification. We will present, through two applications (mammography and bone densitometry), an approach to extract quantitative information from digital 2D radiographs. Material and method: The main difficulty for quantification is X-ray scatter, which is superimposed on the acquired data. Because of multiple scattering and its dependence on 3D geometry, it cannot be handled directly through an exact analytical model. We have therefore developed an approach for its estimation and subtraction from medical radiographs, based on approximations and derivations of analytical models of scatter formation in human tissues. Results: In digital mammography, the objective is to build a map of the glandular tissue thickness. Its separation from fat tissue is based on two equations: the height of compression and the attenuation. The attenuation equation requires X-ray scatter correction. In bone densitometry, physicians look for quantitative bone mineral density. Today, clinical DEXA systems use collimated single or linear detectors to eliminate scatter; this scanning technology yields poor image quality. By applying our scatter correction approach, we have developed a bone densitometer using a digital flat panel (Lexxos, DMS). It provides accurate and reproducible measurements while retaining radiological image quality. Conclusion: These applications show how information processing, and especially X-ray scatter processing, enables the extraction of quantitative information from digital radiographs. This approach, combined with computer-aided diagnosis or reconstruction algorithms, gives access to information useful for diagnosis. (author)
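The two-equation separation of glandular and fat tissue mentioned for mammography is, per pixel, a 2x2 linear solve once scatter has been subtracted: the tissue thicknesses must sum to the compression height, and their attenuations must explain the measured log-intensity. A sketch with illustrative effective attenuation coefficients (not calibrated values):

```python
import numpy as np

# Two-equation tissue separation (after scatter subtraction):
#   h_g + h_f = H                         (compression height)
#   mu_g*h_g + mu_f*h_f = -ln(I/I0)       (attenuation law)
mu_g, mu_f = 0.80, 0.45      # cm^-1, illustrative effective coefficients
H = 4.5                      # cm, compressed breast thickness
h_g_true = 1.5               # cm of glandular tissue (ground truth)
I_over_I0 = np.exp(-(mu_g * h_g_true + mu_f * (H - h_g_true)))

A = np.array([[1.0, 1.0],
              [mu_g, mu_f]])
b = np.array([H, -np.log(I_over_I0)])
h_g, h_f = np.linalg.solve(A, b)
print(round(h_g, 3), round(h_f, 3))      # -> 1.5 3.0
```

The solve is only as good as the scatter estimate: any residual scatter biases the measured I/I0 and hence the recovered glandular thickness.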

  14. Linking descriptive geology and quantitative machine learning through an ontology of lithological concepts

    Science.gov (United States)

    Klump, J. F.; Huber, R.; Robertson, J.; Cox, S. J. D.; Woodcock, R.

    2014-12-01

Despite the recent explosion of quantitative geological data, geology remains a fundamentally qualitative science. Numerical data only constitute a certain part of data collection in the geosciences. In many cases, geological observations are compiled as text into reports and annotations on drill cores, thin sections or drawings of outcrops. The observations are classified into concepts such as lithology, stratigraphy, geological structure, etc. These descriptions are semantically rich and are generally supported by more quantitative observations using geochemical analyses, XRD, hyperspectral scanning, etc., but the goal is geological semantics. In practice it has been difficult to bring the different observations together due to differing perception or granularity of classification in human observation, or the partial observation of only some characteristics using quantitative sensors. In recent years many geological classification schemas have been transferred into ontologies and vocabularies, formalized using RDF and OWL, and published through SPARQL endpoints. Several lithological ontologies were compiled by stratigraphy.net and published through a SPARQL endpoint. This work is complemented by the development of a Python API to integrate this vocabulary into Python-based text mining applications. The applications for the lithological vocabulary and Python API are automated semantic tagging of geochemical data and descriptions of drill cores, machine learning of geochemical compositions that are diagnostic for lithological classifications, and text mining for lithological concepts in reports and geological literature. This combination of applications can be used to identify anomalies in databases, where composition and lithological classification do not match. It can also be used to identify lithological concepts in the literature and infer quantitative values. The resulting semantic tagging opens new possibilities for linking these diverse sources of data.
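A Python API over such a vocabulary would ultimately issue SPARQL queries against the published endpoint. The sketch below only constructs a query of the kind involved; the SKOS modeling assumed here is a common convention for published vocabularies, not the actual stratigraphy.net schema, and no endpoint is contacted.

```python
# Build a SPARQL query that looks up lithology concepts by label.
# The skos:prefLabel pattern is an assumption about the vocabulary's
# modeling, shown for illustration only.
def lithology_label_query(term, lang="en", limit=10):
    return f"""PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?concept ?label WHERE {{
  ?concept skos:prefLabel ?label .
  FILTER(CONTAINS(LCASE(STR(?label)), "{term.lower()}"))
  FILTER(LANG(?label) = "{lang}")
}} LIMIT {limit}"""

q = lithology_label_query("basalt")
print("skos:prefLabel" in q)     # -> True
```

In a text-mining pipeline, the returned concept URIs would become the semantic tags attached to drill-core descriptions or literature passages.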

  15. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    Science.gov (United States)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

The formation of cognitive schemes of plant anatomy concepts is performed by processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurements of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy capability data were collected with the quantitative literacy test, using the rubric from the Association of American Colleges and Universities; complex thinking in plant anatomy was tested according to Marzano, together with the questionnaire. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  16. Quantitative criteria in the application of the principle of precaution

    International Nuclear Information System (INIS)

    Touzet, Rodolfo; Ferrari, Jorge

    2008-01-01

The Principle of Precaution establishes that 'when an activity represents a threat or damage to human health or the environment, precautionary measures must be taken, even when the cause-effect relationship has not been conclusively demonstrated scientifically'. This declaration implies acting even in the presence of uncertainty, shifting responsibility and the burden of safety to whoever creates the risk, analyzing the possible alternatives, and using participative methods for taking decisions. This poses, in practice, several dilemmas: How can a cost-benefit analysis be made when the cause-effect relationship for the health of the exposed persons has not yet been established? (With regard to ionizing radiation more information does exist, and a factor α is in use that represents the economic cost of the dose received by a person.) What criteria should be used if it were demonstrated that non-ionizing radiation acts synergistically with ionizing radiation? How can a quantitative criterion of optimization be integrated with a qualitative criterion of precaution? Some temporary hypotheses will have to be introduced in order to perform the corresponding quantitative evaluations. In the case of low frequencies the situation was exactly the same in the past, but epidemiological studies, as well as in vivo and in vitro experiments, demonstrated that exposure can increase the risk of cancer in children and induce other health problems in children and adults. A possible temporary hypothesis for radio frequencies is to assume that the effects are similar in magnitude to those caused by low-frequency fields. In this case it can be shown that even if this were true, the epidemiological statistics would not allow it to be demonstrated, owing to the number of persons involved, the time devoted to the studied populations and the latency times of leukaemia.
The use of some work hypothesis to make the cost-benefit studies

  17. Quantitative magnetic resonance imaging phantoms: A review and the need for a system phantom.

    Science.gov (United States)

    Keenan, Kathryn E; Ainslie, Maureen; Barker, Alex J; Boss, Michael A; Cecil, Kim M; Charles, Cecil; Chenevert, Thomas L; Clarke, Larry; Evelhoch, Jeffrey L; Finn, Paul; Gembris, Daniel; Gunter, Jeffrey L; Hill, Derek L G; Jack, Clifford R; Jackson, Edward F; Liu, Guoying; Russek, Stephen E; Sharma, Samir D; Steckner, Michael; Stupic, Karl F; Trzasko, Joshua D; Yuan, Chun; Zheng, Jie

    2018-01-01

The MRI community is using quantitative mapping techniques to complement qualitative imaging. For quantitative imaging to reach its full potential, it is necessary to analyze measurements across systems and longitudinally. Clinical use of quantitative imaging can be facilitated through adoption and use of a standard system phantom, a calibration/standard reference object, to assess the performance of an MRI machine. The International Society of Magnetic Resonance in Medicine AdHoc Committee on Standards for Quantitative Magnetic Resonance was established in February 2007 to facilitate the expansion of MRI as a mainstream modality for multi-institutional measurements, including, among other things, multicenter trials. The goal of the Standards for Quantitative Magnetic Resonance committee was to provide a framework to ensure that quantitative measures derived from MR data are comparable over time, between subjects, between sites, and between vendors. This paper, written by members of the Standards for Quantitative Magnetic Resonance committee, reviews standardization attempts and then details the need, requirements, and implementation plan for a standard system phantom for quantitative MRI. In addition, application-specific phantoms and implementation of quantitative MRI are reviewed. Magn Reson Med 79:48-61, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  18. Quantitative investigation of two metallohydrolases by X-ray absorption spectroscopy near-edge spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, W. [Hefei National Laboratory for Physical Sciences at Microscale and School of Life Sciences, University of Science and Technology of China, Hefei, Anhui 230027 (China); Chu, W.S.; Yang, F.F.; Yu, M.J.; Chen, D.L.; Guo, X.Y. [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Zhou, D.W.; Shi, N. [Hefei National Laboratory for Physical Sciences at Microscale and School of Life Sciences, University of Science and Technology of China, Hefei, Anhui 230027 (China); Marcelli, A. [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Frascati, P.O. Box 13, Frascati 00044 (Italy); Niu, L.W.; Teng, M.K. [Hefei National Laboratory for Physical Sciences at Microscale and School of Life Sciences, University of Science and Technology of China, Hefei, Anhui 230027 (China); Gong, W.M. [Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101 (China); Benfatto, M. [Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Frascati, P.O. Box 13, Frascati 00044 (Italy); Wu, Z.Y. [Institute of High Energy Physics, Chinese Academy of Sciences, Beijing 100049 (China); Istituto Nazionale di Fisica Nucleare, Laboratori Nazionali di Frascati, P.O. Box 13, Frascati 00044 (Italy)], E-mail: wuzy@ihep.ac.cn

    2007-09-21

The last several years have witnessed a tremendous increase in biological applications of X-ray absorption spectroscopy (BioXAS), thanks to continuous advancements in synchrotron radiation (SR) sources and detector technology. However, XAS applications in many biological systems have been limited by the intrinsic limitations of the Extended X-ray Absorption Fine Structure (EXAFS) technique, e.g., its lack of sensitivity to bond angles. The application of X-ray absorption near-edge structure (XANES) spectroscopy has changed this scenario, which continues to evolve with the introduction of the first quantitative XANES packages such as Minuit XANES (MXAN). Here we present and discuss the XANES code MXAN, a novel XANES-fitting package that allows quantitative analysis of experimental data, applied to the Zn K-edge spectra of two metalloproteins: Leptospira interrogans peptide deformylase (LiPDF) and acutolysin-C, a representative of the snake venom metalloproteinases (SVMPs) from Agkistrodon acutus venom. The analysis of these two metallohydrolases reveals that their proteolytic activities are correlated with subtle conformational changes around the zinc ion. In particular, this quantitative study clarifies that the LiPDF catalytic mechanism proceeds via a two-water-molecule model, whereas in acutolysin-C we observe a different proteolytic activity correlated with structural changes around the zinc ion induced by pH variations.

  19. Quantitative investigation of two metallohydrolases by X-ray absorption spectroscopy near-edge spectroscopy

    International Nuclear Information System (INIS)

    Zhao, W.; Chu, W.S.; Yang, F.F.; Yu, M.J.; Chen, D.L.; Guo, X.Y.; Zhou, D.W.; Shi, N.; Marcelli, A.; Niu, L.W.; Teng, M.K.; Gong, W.M.; Benfatto, M.; Wu, Z.Y.

    2007-01-01

The last several years have witnessed a tremendous increase in biological applications of X-ray absorption spectroscopy (BioXAS), thanks to continuous advancements in synchrotron radiation (SR) sources and detector technology. However, XAS applications in many biological systems have been limited by the intrinsic limitations of the Extended X-ray Absorption Fine Structure (EXAFS) technique, e.g., its lack of sensitivity to bond angles. The application of X-ray absorption near-edge structure (XANES) spectroscopy has changed this scenario, which continues to evolve with the introduction of the first quantitative XANES packages such as Minuit XANES (MXAN). Here we present and discuss the XANES code MXAN, a novel XANES-fitting package that allows quantitative analysis of experimental data, applied to the Zn K-edge spectra of two metalloproteins: Leptospira interrogans peptide deformylase (LiPDF) and acutolysin-C, a representative of the snake venom metalloproteinases (SVMPs) from Agkistrodon acutus venom. The analysis of these two metallohydrolases reveals that their proteolytic activities are correlated with subtle conformational changes around the zinc ion. In particular, this quantitative study clarifies that the LiPDF catalytic mechanism proceeds via a two-water-molecule model, whereas in acutolysin-C we observe a different proteolytic activity correlated with structural changes around the zinc ion induced by pH variations.

  20. Quantitative techniques for musculoskeletal MRI at 7 Tesla.

    Science.gov (United States)

    Bangerter, Neal K; Taylor, Meredith D; Tarbox, Grayson J; Palmer, Antony J; Park, Daniel J

    2016-12-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems.
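Of the quantitative techniques listed, 3D T2 mapping is the simplest to illustrate: per voxel, a mono-exponential decay S(TE) = S0·exp(-TE/T2) is fitted across the echo train. A log-linearized, noiseless sketch (echo times and tissue values are illustrative, not protocol parameters):

```python
import numpy as np

# Mono-exponential T2 fit for one voxel across a multi-echo acquisition
TE = np.array([10.0, 20.0, 30.0, 40.0, 50.0])      # echo times, ms
T2_true, S0 = 35.0, 1000.0                          # ground-truth tissue values
S = S0 * np.exp(-TE / T2_true)                      # measured signal (noiseless)

# Linearize: log(S) = log(S0) - TE/T2, then ordinary least squares
slope, intercept = np.polyfit(TE, np.log(S), 1)
T2_fit = -1.0 / slope
print(round(T2_fit, 2))                             # -> 35.0
```

With real (noisy, Rician) data one would typically fit the exponential directly or use a noise-floor correction rather than the log-linear shortcut.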

  1. Application of myocardial perfusion quantitative imaging for the evaluation of therapeutic effect in canine with myocardial infarction

    International Nuclear Information System (INIS)

    Liang Hong; Chen Ju; Liu Sheng; Zeng Shiquan

    2000-01-01

Myocardial blood perfusion (MBP) ECT and quantitative analysis were performed in 10 canines with experimental acute myocardial infarction (AMI). The accuracy of the main quantitative myocardial indices, including defect volume (DV) and defect fraction (DF), was estimated and correlated with histochemical staining (HS) of the infarcted area. Another 21 AMI canines were divided into an Nd:YAG laser transmyocardial revascularization (LTMR) treated group and a control group. All canines underwent MBP ECT after experimental AMI. The infarcted volume (IV) measured by HS correlated well (r = 0.88) with the DV estimated by myocardial quantitative analysis, while the DF values calculated by the two methods were not significantly different (t = 1.28, P > 0.05). The DF in the LTMR group (27.5% ± 3.9%) was smaller than in the control group (32.1% ± 4.6%) (t = 2.49, P < 0.05). 99mTc-MIBI myocardial perfusion SPECT and quantitative study can accurately predict myocardial blood flow and the magnitude of injured myocardium; Nd:YAG LTMR could improve the blood perfusion of ischemic myocardium and effectively decrease the infarct area.
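Defect volume and defect fraction indices of this kind can be defined by thresholding the perfusion map at a fraction of its maximum; a sketch (the 50% threshold, toy counts, and voxel size are assumptions for illustration, not the study's criteria):

```python
import numpy as np

# DV and DF from a perfusion map: myocardial voxels whose counts fall
# below a percent-of-maximum threshold are counted as defect.
def defect_metrics(counts, myo_mask, voxel_ml, threshold=0.5):
    mx = counts[myo_mask].max()
    defect = myo_mask & (counts < threshold * mx)
    dv = defect.sum() * voxel_ml          # defect volume, mL
    df = defect.sum() / myo_mask.sum()    # defect fraction of myocardium
    return dv, df

counts = np.array([[100, 90, 40],
                   [ 80, 30, 95]], float)     # toy perfusion counts
mask = np.ones_like(counts, dtype=bool)       # all voxels myocardial here
dv, df = defect_metrics(counts, mask, voxel_ml=2.0)
print(dv, round(df, 3))                       # -> 4.0 0.333
```

Here two of six voxels fall below half-maximum, so DF is one third regardless of the voxel size, while DV scales with it.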

  2. Quantitative label-free sperm imaging by means of transport of intensity

    Science.gov (United States)

    Poola, Praveen Kumar; Pandiyan, Vimal Prabhu; Jayaraman, Varshini; John, Renu

    2016-03-01

Most living cells are optically transparent, which makes them difficult to visualize under bright-field microscopy. Contrast agents or markers and staining procedures are therefore often used to observe these cells. However, most of these staining agents are toxic and not applicable to live-cell imaging. In the last decade, quantitative phase imaging has become an indispensable tool for morphological characterization of phase objects without any markers. In this paper, we report noninterferometric quantitative phase imaging of live sperm cells by solving the transport of intensity equation with intensity measurements recorded along the optical axis on a commercial bright-field microscope.
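Under the common uniform-intensity assumption, the transport-of-intensity equation reduces to a Poisson equation for the phase, ∇²φ = -(k/I0)·∂I/∂z, which Teague's method solves with FFTs. A self-contained sketch, verified on a synthetic periodic phase (grid, wavenumber, and defocus derivative are all synthetic; real data would need boundary handling and regularization of low frequencies):

```python
import numpy as np

def tie_phase(dIdz, I0, k, dx):
    # Teague's FFT solution of laplacian(phi) = -(k/I0) * dI/dz
    n = dIdz.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    KX, KY = np.meshgrid(fx, fx, indexing="ij")
    k2 = (2 * np.pi) ** 2 * (KX ** 2 + KY ** 2)
    k2[0, 0] = 1.0                        # avoid divide-by-zero at DC
    rhs = -(k / I0) * dIdz
    phi_hat = np.fft.fft2(rhs) / (-k2)    # spectral inverse Laplacian
    phi_hat[0, 0] = 0.0                   # global phase offset unrecoverable
    return np.fft.ifft2(phi_hat).real

# Synthetic check: build dI/dz from a known periodic phase, then recover it
n, dx, k, I0 = 64, 0.5, 10.0, 1.0
x = np.arange(n) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
L = n * dx
phi_true = np.sin(2 * np.pi * X / L) * np.cos(2 * np.pi * Y / L)
lap = -(2 * np.pi / L) ** 2 * 2 * phi_true     # laplacian of this mode
dIdz = -(I0 / k) * lap                          # forward TIE, uniform I0
phi_rec = tie_phase(dIdz, I0, k, dx)
print(round(float(np.abs(phi_rec - phi_true).max()), 6))   # -> 0.0
```

In practice dI/dz is approximated by finite differences of two or more defocused bright-field images, which is what makes the method noninterferometric.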

  3. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  4. Immune chromatography: a quantitative radioimmunological assay

    International Nuclear Information System (INIS)

    Davis, J.W.; Demetriades, M.; Bowen, J.M.

    1984-01-01

    Immune chromatography, a radioimmunological binding assay, employs paper chromatography to separate immune complexes from free antigen and antibodies. During chromatography, free antigen and antibodies become distributed throughout the paper, while immune complexes remain near the bottoms of the strips. The chromatographic differences can be made quantitative by using either iodinated antigens or antibodies. Under these conditions, nanogram quantities of antigen can be detected, or antibodies in sera diluted several thousand-fold. The immune chromatography assay can also be performed as an indirect assay, since the paper strips are cut from nitrocellulose paper. In this case the immune components are absorbed by the paper during chromatography. Antigen is then detected with an iodinated second antibody. The indirect immune chromatography assay is particularly useful for identifying different sera that react with the same antigen: reaction with the first serum before chromatography reduces the amount of antigen available to the second serum following chromatography. In addition to characterizing the immune chromatography procedure, we discuss possible applications of chromatography assays for the quantitation of other types of molecular binding interactions. (Auth.)

  5. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique in analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have been used only to qualitatively map whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how exactly elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  6. Quantitative Campylobacter spp., antibiotic resistance genes, and veterinary antibiotics in surface and ground water following manure application: Influence of tile drainage control.

    Science.gov (United States)

    Frey, Steven K; Topp, Edward; Khan, Izhar U H; Ball, Bonnie R; Edwards, Mark; Gottschall, Natalie; Sunohara, Mark; Lapen, David R

    2015-11-01

    This work investigated chlortetracycline, tylosin, and tetracycline (plus transformation products), and DNA-based quantitative Campylobacter spp. and Campylobacter tetracycline antibiotic resistance genes (tet(O)) in tile drainage, groundwater, and soil before and following a liquid swine manure (LSM) application on clay loam plots under controlled (CD) and free (FD) tile drainage. Chlortetracycline/tetracycline was strongly bound to manure solids, while tylosin dominated in the liquid portion of manure. The chlortetracycline transformation product isochlortetracycline was the most persistent analyte in water. Rhodamine WT (RWT) tracer was mixed with manure and monitored in tile and groundwater. RWT and veterinary antibiotic (VA) concentrations were strongly correlated in water, which supported the use of RWT as a surrogate tracer. While CD reduced tile discharge and eliminated application-induced VA movement (via tile) to surface water, total VA mass loading to surface water was not affected by CD. At both CD and FD test plots, the biggest 'flush' of VA mass and the highest VA concentrations occurred in response to precipitation received 2 d after application, which strongly influenced the flow abatement capacity of CD on account of highly elevated water levels in the field initiating overflow drainage for CD systems. Concentrations in tile drainage and groundwater became very low within 10 d following application. Both Campylobacter spp. and Campylobacter tet(O) genes were present in groundwater and soil prior to application, and increased thereafter. Unlike the VA compounds, Campylobacter spp. and Campylobacter tet(O) gene loadings in tile drainage were reduced by CD, in relation to FD. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
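
    The surrogate-tracer argument in this record hinges on the strength of the RWT-VA correlation in paired water samples; the underlying calculation is an ordinary Pearson correlation. An illustrative sketch with made-up concentrations (not the study's data):

```python
import numpy as np

# Hypothetical paired tile-water measurements (NOT the study's data):
# rhodamine WT tracer and one veterinary antibiotic in the same samples.
rwt = np.array([0.1, 0.4, 1.2, 2.5, 3.1, 4.8, 6.0])        # ug/L
va = np.array([0.05, 0.07, 0.33, 0.48, 0.75, 1.02, 1.40])  # ug/L

# A strong Pearson correlation supports using RWT as a surrogate tracer.
r = np.corrcoef(rwt, va)[0, 1]
print(f"Pearson r = {r:.3f}")
```
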

  7. The Design of a Quantitative Western Blot Experiment

    Directory of Open Access Journals (Sweden)

    Sean C. Taylor

    2014-01-01

    Full Text Available Western blotting is a technique that has been in practice for more than three decades and began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots, and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting.
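
    The fold-change arithmetic behind quantitative densitometry is simple: each target band is first normalized to its lane's loading control, then expressed relative to a reference sample. A sketch with illustrative band volumes (not data from the article):

```python
def fold_change(target, loading, target_ref, loading_ref):
    """Densitometric fold change: normalize each target band volume to its
    lane's loading control, then divide by the reference (e.g. untreated)
    lane's normalized value."""
    return (target / loading) / (target_ref / loading_ref)

# Illustrative band volumes in arbitrary densitometry units:
# treated lane vs. untreated reference lane.
fc = fold_change(target=4.2e6, loading=1.9e6,
                 target_ref=1.5e6, loading_ref=2.0e6)
print(f"fold change vs. reference: {fc:.2f}")  # -> 2.95
```

    Normalizing within each lane first cancels loading differences between lanes, which is why the loading-control bands must themselves fall in the linear response range of the detector.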

  8. Dominance in Domestic Dogs : A Quantitative Analysis of Its Behavioural Measures

    NARCIS (Netherlands)

    van der Borg, Joanne A M; Schilder, Matthijs B H; Vinke, Claudia M; de Vries, Han

    2015-01-01

    A dominance hierarchy is an important feature of the social organisation of group living animals. Although formal and/or agonistic dominance has been found in captive wolves and free-ranging dogs, applicability of the dominance concept in domestic dogs is highly debated, and quantitative data are

  9. Quantitative targeted proteomics for understanding the blood-brain barrier: towards pharmacoproteomics.

    Science.gov (United States)

    Ohtsuki, Sumio; Hirayama, Mio; Ito, Shingo; Uchida, Yasuo; Tachikawa, Masanori; Terasaki, Tetsuya

    2014-06-01

    The blood-brain barrier (BBB) is formed by brain capillary endothelial cells linked together via complex tight junctions, and serves to prevent entry of drugs into the brain. Multiple transporters are expressed at the BBB, where they control exchange of materials between the circulating blood and brain interstitial fluid, thereby supporting and protecting the CNS. An understanding of the BBB is necessary for efficient development of CNS-acting drugs and to identify potential drug targets for treatment of CNS diseases. Quantitative targeted proteomics can provide detailed information on protein expression levels at the BBB. The present review highlights the latest applications of quantitative targeted proteomics in BBB research, specifically to evaluate species and in vivo-in vitro differences, and to reconstruct in vivo transport activity. Such a BBB quantitative proteomics approach can be considered as pharmacoproteomics.

  10. Quantitative analysis by microchip capillary electrophoresis – current limitations and problem-solving strategies

    NARCIS (Netherlands)

    Revermann, T.; Götz, S.; Künnemeyer, Jens; Karst, U.

    2008-01-01

    Obstacles and possible solutions for the application of microchip capillary electrophoresis in quantitative analysis are described and critically discussed. Differences between the phenomena occurring during conventional capillary electrophoresis and microchip-based capillary electrophoresis are

  11. Quantitative Structure-Activity Relationships and Docking Studies of Calcitonin Gene-Related Peptide Antagonists

    DEFF Research Database (Denmark)

    Jenssen, Håvard; Mehrabian, Mohadeseh; Kyani, Anahita

    2012-01-01

    Defining the role of calcitonin gene-related peptide in migraine pathogenesis could lead to the application of calcitonin gene-related peptide antagonists as novel migraine therapeutics. In this work, quantitative structure-activity relationship modeling of the biological activities of a large range of calcitonin gene-related peptide antagonists was performed using a panel of physicochemical descriptors. The computational studies evaluated different variable selection techniques and demonstrated shuffling stepwise multiple linear regression to be superior over genetic algorithm-multiple linear regression. The linear quantitative structure-activity relationship model revealed better statistical parameters of cross-validation in comparison with the non-linear support vector regression technique. Implementing only five peptide descriptors into this linear quantitative structure-activity relationship model...
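
    The modelling pipeline this record describes, stepwise descriptor selection assessed by cross-validation, can be sketched in a few lines. This is a generic forward-stepwise multiple linear regression scored by leave-one-out Q², not the authors' exact shuffling variant; all names are illustrative:

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 of an ordinary least-squares fit."""
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        A = np.c_[np.ones(n - 1), X[mask]]          # intercept + descriptors
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        press += (y[i] - np.r_[1.0, X[i]] @ coef) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

def forward_stepwise(X, y, k):
    """Greedily add the k descriptor columns that maximize LOO Q^2."""
    chosen = []
    for _ in range(k):
        best = max((j for j in range(X.shape[1]) if j not in chosen),
                   key=lambda j: loo_q2(X[:, chosen + [j]], y))
        chosen.append(best)
    return chosen
```

    Scoring candidate descriptors by cross-validated Q² rather than the training-set R² is what guards a stepwise search against overfitting the descriptor panel.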

  12. The quantitative evaluation of false colour photography with application of a red filter.

    NARCIS (Netherlands)

    Clevers, J.G.P.W.; Stokkom, van H.T.C.

    1992-01-01

    For monitoring (homogeneous) agricultural crops, a quantitative analysis of

  13. Qualitative and quantitative assessment of interior moisture buffering by enclosures

    DEFF Research Database (Denmark)

    Janssen, Hans; Roels, Staf

    2009-01-01

    The significance of interior humidity in attaining sustainable, durable, healthy and comfortable buildings is increasingly recognised. Given their significant interaction, interior humidity appraisals need a qualitative and/or quantitative assessment of interior moisture buffering. While the effective moisture penetration depth and effective capacitance models allow quantified assessment, their reliance on the ‘moisture penetration depth’ necessitates comprehensive material properties and hampers their application to multi-dimensional interior objects. On the other hand, while various recently ... an alternative basis for quantitative evaluation of interior moisture buffering by the effective moisture penetration depth and effective capacitance models. The presented methodology uses simple and fast measurements only and can also be applied to multimaterial and/or multidimensional interior elements.

  14. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analytical chemistry. It is divided into ten chapters, which deal with the basic concepts of analytical chemistry and SI units, chemical equilibrium, basic preparations for quantitative analysis, an introduction to volumetric analysis, acid-base titration with outlines and worked examples, chelate titration, oxidation-reduction titration (introduction, titration curves, and diazotization titration), precipitation titration, electrometric titration, and quantitative analysis.
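
    The volumetric-analysis chapters listed above all reduce to the same stoichiometric arithmetic: moles of titrant consumed at the end point, scaled by the reaction ratio, give moles of analyte. A worked acid-base example (the numbers are illustrative, not from the book):

```python
def analyte_molarity(titrant_molarity, titrant_ml, analyte_ml, ratio=1.0):
    """Analyte concentration from titrant consumed at the end point;
    ratio = moles of analyte per mole of titrant (1:1 for HCl/NaOH).
    The mL units cancel, so no unit conversion is needed."""
    return titrant_molarity * titrant_ml * ratio / analyte_ml

# 25.00 mL of HCl neutralized by 21.40 mL of 0.1000 M NaOH:
c_hcl = analyte_molarity(0.1000, 21.40, 25.00)
print(f"c(HCl) = {c_hcl:.4f} M")  # -> c(HCl) = 0.0856 M
```
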

  15. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today's models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to "future-proof" their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today's climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output, targeting users and policy makers as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation is provided of the length and time scales at which climate model output is likely to become uninformative, including a note on the applicability of the latest Bayesian methodology to output from current state-of-the-art general circulation models. Second, a critical evaluation is offered of the language often employed in communication of climate model output: language which accurately states that models are "better", have "improved", and now "include" and "simulate" relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  16. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  17. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  18. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    Science.gov (United States)

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-11

    Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modification changes within selected proteins. A quantitative proteomics approach offers the possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  19. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    Directory of Open Access Journals (Sweden)

    Agnieszka Gizak

    2016-01-01

    Full Text Available Molecular and cellular biology methodology is traditionally based on the reasoning called “the mechanistic explanation”. In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems’ complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modification changes within selected proteins. A quantitative proteomics approach offers the possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  20. Relating interesting quantitative time series patterns with text events and text features

    Science.gov (United States)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

    In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other

  1. Quantitative Accelerated Life Testing of MEMS Accelerometers.

    Science.gov (United States)

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-11-20

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as an application-driven stress, because the tilt movement is a natural environment for devices used in automotive and aerospace applications. Also, tilting is used by MEMS accelerometers for anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the "worst case" being smaller than 10^-7 h^-1.
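
    For context, a standard way to turn an accelerated test with zero observed failures into a failure-rate bound (under an exponential-life assumption) is the one-sided confidence limit λ ≤ −ln(1 − CL) / (AF · T). This is a generic sketch with assumed numbers, not the authors' exact calculation; the acceleration factor and sample sizes are illustrative:

```python
import math

def lambda_upper_bound(device_hours, acceleration_factor, confidence=0.6):
    """One-sided upper confidence bound on the failure rate after a
    zero-failure accelerated test: lambda <= -ln(1 - CL) / (AF * T),
    where T is accumulated device-hours under stress and AF is the
    acceleration factor of the combined stress."""
    return -math.log(1.0 - confidence) / (acceleration_factor * device_hours)

# e.g. 20 accelerometers tested for 500 h under combined stress,
# with an assumed acceleration factor of 100:
lam = lambda_upper_bound(device_hours=20 * 500, acceleration_factor=100)
print(f"lambda <= {lam:.2e} per hour")
```

    The bound tightens as accumulated device-hours, the acceleration factor, or the number of units grows, which is why combined thermal and mechanical stresses are attractive for demonstrating very low failure rates in reasonable test times.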

  2. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases set by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heating steps at average power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis happens to be very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element matrix based on uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  3. Quantitative determination of peripheral arterio-venous shunts by means of radioactively labelled microspheres

    International Nuclear Information System (INIS)

    Friese, K.H.

    1981-01-01

    In the present work a nuclear method for the quantitative measurement of peripheral arterio-venous shunts with a whole-body scanner is standardized. This method, developed at the beginning of the 1970s in Tuebingen, differs from earlier measuring methods in its application of the theory of quantitative scintiscanning: the scintigram obtained after injection of 99mTc-labelled human albumin microspheres into an artery proximal to the shunt is corrected by several factors, using a computer, for the quantitative shunt calculation, in order to avoid systematic errors. For the standardization of the method, 182 scintigrams were taken during experiments on models, animals and human beings. The method, having a deviation of at most 10%, is excellently suited for the quantitative determination of peripheral arterio-venous shunts. Already at a pulmonary activity of 3%, a peripheral shunt is demonstrated with 97.5% probability. (orig./MG)
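
    The quantitative principle of the microsphere method is simple: spheres injected proximal to the shunt lodge in the peripheral capillary bed unless they pass through an arterio-venous shunt, in which case they are trapped in the lung capillaries. The shunt fraction is therefore the lung activity over the whole-body activity. A sketch with illustrative scanner counts (the correction factors mentioned in the abstract are omitted here):

```python
def shunt_fraction(lung_counts, whole_body_counts):
    """Percentage of microspheres shunted past the peripheral capillary
    bed: lung activity divided by whole-body activity, times 100."""
    return 100.0 * lung_counts / whole_body_counts

# Illustrative (made-up) background-corrected counts:
print(f"shunt = {shunt_fraction(4.2e4, 1.05e6):.1f} %")  # -> shunt = 4.0 %
```
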

  4. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com [Airbus Group Innovations, Munich (Germany)]; Grosse, Christian, E-mail: Grosse@tum.de [Technical University Munich (Germany)]

    2015-03-31

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods will be compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  5. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    International Nuclear Information System (INIS)

    Kiefel, Denis; Stoessel, Rainer; Grosse, Christian

    2015-01-01

    In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods will be compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and will be presented.

  6. Quantitative fluorescence microscopy and image deconvolution.

    Science.gov (United States)

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches: deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used …
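    As the abstract above describes, a restoration algorithm seeks an object that, when convolved with the microscope's point-spread function (PSF), reproduces the observed image. A minimal one-dimensional Richardson-Lucy sketch can illustrate the idea; the PSF, "truth" signal, and iteration count below are hypothetical and purely for demonstration:

```python
def convolve(signal, kernel):
    """Same-length 1-D convolution with zero padding at the edges."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                acc += signal[idx] * k
        out.append(acc)
    return out

def richardson_lucy(image, psf, iterations=50):
    """Iteratively estimate an object that, convolved with the PSF,
    reproduces the observed image (Richardson-Lucy restoration)."""
    psf_mirror = psf[::-1]
    estimate = [1.0] * len(image)          # flat initial guess
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [obs / max(b, 1e-12) for obs, b in zip(image, blurred)]
        correction = convolve(ratio, psf_mirror)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

# Hypothetical example: a point source blurred by a 3-point PSF
psf = [0.25, 0.5, 0.25]
truth = [0.0, 0.0, 10.0, 0.0, 0.0]
observed = convolve(truth, psf)            # [0.0, 2.5, 5.0, 2.5, 0.0]
restored = richardson_lucy(observed, psf)
```

    Each iteration compares the observed image with the current estimate blurred by the PSF and applies a multiplicative correction; the flat starting guess progressively sharpens back toward the point source.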

  7. Quantitative analysis of iodine in thyroidin. I. Methods of "dry" and "wet" mineralization

    International Nuclear Information System (INIS)

    Listov, S.A.; Arzamastsev, A.P.

    1986-01-01

    Comparative investigations of the quantitative determination of iodine in thyroidin using different modifications of "dry" and "wet" mineralization show that these methods must take into account difficulties arising from the characteristic features of the object of investigation itself and from the mineralization method as a whole. The studies show that the most applicable method for the analysis of thyroidin is "dry" mineralization with potassium carbonate. A procedure is proposed for the quantitative determination of iodine in thyroidin

  8. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  9. Development and validation of a high throughput LC–MS/MS method for simultaneous quantitation of pioglitazone and telmisartan in rat plasma and its application to a pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Pinaki Sengupta

    2017-12-01

    Management of cardiovascular risk factors in diabetes demands special attention due to their co-existence. Pioglitazone (PIO) and telmisartan (TLM) in combination can be beneficial for effective control of cardiovascular complications in diabetes. In this research, we developed and validated a high-throughput LC–MS/MS method for simultaneous quantitation of PIO and TLM in rat plasma. The developed method is more sensitive and can quantitate the analytes in a shorter time than previously reported methods for their individual quantification. Moreover, to date, no bioanalytical method has been available to quantitate PIO and TLM simultaneously in a single run. The method was validated according to the USFDA guidelines for bioanalytical method validation. A linear response of the analytes was observed over the range of 0.005–10 µg/mL with satisfactory precision and accuracy. Accuracy at four quality control levels was within 94.27%–106.10%. The intra- and inter-day precision ranged from 2.32%–10.14% and 5.02%–8.12%, respectively. The method was reproducible and sensitive enough to quantitate PIO and TLM in rat plasma samples from a preclinical pharmacokinetic study. Given the potential of the PIO-TLM combination to be therapeutically explored, this method is expected to have significant usefulness in future. Keywords: LC–MS/MS, Rat plasma, Pharmacokinetic applicability, Telmisartan, Pioglitazone, Pharmacokinetic application
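    The accuracy and precision figures quoted above are standard bioanalytical validation metrics. A small sketch shows how they are typically computed; the QC replicate values and nominal level below are made up, not the study's data:

```python
import statistics

def accuracy_percent(measured, nominal):
    """Mean measured concentration as a percentage of the nominal value."""
    return 100.0 * statistics.mean(measured) / nominal

def precision_percent_rsd(measured):
    """Relative standard deviation (%RSD), the usual precision metric."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Hypothetical QC replicates at a nominal 0.5 ug/mL level
qc_replicates = [0.48, 0.52, 0.50, 0.49, 0.51]
print(round(accuracy_percent(qc_replicates, 0.5), 1))   # 100.0
print(round(precision_percent_rsd(qc_replicates), 1))   # 3.2
```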

  10. 75 FR 41235 - Receipt of Applications for Permit

    Science.gov (United States)

    2010-07-15

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... the species. Multiple Applicants The following applicants each request a permit to import the sport...

  11. Qualitative and quantitative combined nonlinear dynamics model and its application in analysis of price, supply–demand ratio and selling rate

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    The qualitative and quantitative combined nonlinear dynamics model proposed in this paper fills a gap in nonlinear dynamics modelling: it allows a qualitative model and a quantitative model to be combined so that each compensates for the other's weaknesses. The combined model overcomes the limitation that a qualitative model cannot be applied and verified quantitatively, as well as the high cost and long time required to repeatedly construct and verify a quantitative model. It is therefore more practical and efficient, which is of great significance for nonlinear dynamics. The combined modelling and model-analysis method proposed here is not only applicable to nonlinear dynamics but can also be adopted in the modelling and analysis of other fields. Additionally, the analytical method proposed in this paper satisfactorily resolves the problems with the existing analytical method for the price system's nonlinear dynamics model. The three-dimensional dynamics model of price, supply–demand ratio and selling rate established in this paper estimates the best commodity prices from the model results, thereby providing a theoretical basis for the government's macro-control of prices. The model also offers theoretical guidance on how to enhance people's purchasing power and consumption levels through price regulation and hence improve living standards.

  12. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the …

  14. Optimization and automation of quantitative NMR data extraction.

    Science.gov (United States)

    Bernstein, Michael A; Sýkora, Stan; Peng, Chen; Barba, Agustín; Cobas, Carlos

    2013-06-18

    NMR is routinely used to quantitate chemical species. The experimental procedures needed to acquire quantitative data are well known, but relatively little attention has been paid to data processing and analysis. We describe here a robust expert system that can be used to automatically choose the best signals in a sample for overall concentration determination and to determine analyte concentration using all accepted methods. The algorithm is based on complete deconvolution of the spectrum, which makes it tolerant of cases where signals are very close to one another, and includes robust methods for the automatic classification of NMR resonances and molecule-to-spectrum multiplet assignment. With the functionality in place and optimized, it is then a relatively simple matter to apply the same workflow to data in a fully automatic way. The procedure is desirable for both its inherent performance and its applicability to NMR data acquired for very large sample sets.
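    Quantitation by NMR rests on the fact that an integrated peak area is proportional to concentration times the number of contributing nuclei, so an analyte's concentration follows from a single internal standard. A minimal sketch of that relation, with hypothetical integrals and proton counts (not taken from the cited expert system):

```python
def molar_concentration(area_analyte, n_h_analyte,
                        area_std, n_h_std, conc_std):
    """Molar concentration of an analyte via an internal standard:
    peak area is proportional to (concentration x number of protons),
    so per-proton areas can be compared directly."""
    return conc_std * (area_analyte / n_h_analyte) / (area_std / n_h_std)

# Hypothetical integrals: analyte singlet (3H) vs. standard singlet (9H)
c = molar_concentration(area_analyte=6.0, n_h_analyte=3,
                        area_std=9.0, n_h_std=9, conc_std=10.0)
print(c)  # 20.0 (same units as conc_std)
```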

  15. Quantitative ptychographic reconstruction by applying a probe constraint

    Science.gov (United States)

    Reinhardt, J.; Schroer, C. G.

    2018-04-01

    The coherent scanning technique X-ray ptychography has become a routine tool for high-resolution imaging and nanoanalysis in various fields of research such as chemistry, biology and materials science. Often the ptychographic reconstruction results are analysed to yield absolute quantitative values for the object transmission and the illuminating probe function. In this work, we address a common ambiguity encountered in scaling the object transmission and probe intensity via the application of an additional constraint in the reconstruction algorithm. A ptychographic measurement of a model sample containing nanoparticles is used as a test data set against which to benchmark the reconstruction results obtained with each type of constraint. Achieving quantitative absolute values for the reconstructed object transmission is essential for advanced investigation of samples that change over time, e.g., during in-situ experiments, and in general when different data sets are compared.

  16. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  17. Genomic value prediction for quantitative traits under the epistatic model

    Directory of Open Access Journals (Sweden)

    Xu Shizhong

    2011-01-01

    Background: Most quantitative traits are controlled by multiple quantitative trait loci (QTL). The contribution of each locus may be negligible but the collective contribution of all loci is usually significant. Genome selection, which uses markers of the entire genome to predict the genomic values of individual plants or animals, can be more efficient than selection on phenotypic values and pedigree information alone for genetic improvement. When a quantitative trait is influenced by epistatic effects, using all markers (main effects) and marker pairs (epistatic effects) to predict the genomic values of plants can achieve the maximum efficiency for genetic improvement. Results: In this study, we created 126 recombinant inbred lines of soybean and genotyped 80 markers across the genome. We applied the genome selection technique to predict the genomic value of somatic embryo number (a quantitative trait) for each line. Cross-validation analysis showed that the squared correlation coefficient between the observed and predicted embryo numbers was 0.33 when only main (additive) effects were used for prediction. When the interaction (epistatic) effects were also included in the model, the squared correlation coefficient reached 0.78. Conclusions: This study provides an excellent example of the application of genome selection to plant breeding.
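    The epistatic model described above augments the main-effect marker predictors with all pairwise marker products. Building such a feature vector for one line can be sketched as follows; the genotype coding and marker count are illustrative, not the study's data:

```python
from itertools import combinations

def epistatic_features(markers):
    """Expand one line's marker genotypes (coded e.g. -1/1) into
    main-effect terms plus all pairwise interaction (epistatic) terms."""
    main = list(markers)
    epistasis = [markers[i] * markers[j]
                 for i, j in combinations(range(len(markers)), 2)]
    return main + epistasis

# Hypothetical genotypes for one inbred line at 4 markers
line = [1, -1, 1, 1]
features = epistatic_features(line)
print(len(features))   # 4 main effects + 6 marker pairs = 10
```

    With the study's 80 markers this expansion yields 80 main effects plus 3160 pairwise terms, which is why shrinkage-based genome selection methods are used to fit the model.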

  18. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  19. Quantitative PET imaging with the 3T MR-BrainPET

    International Nuclear Information System (INIS)

    Weirich, C.; Scheins, J.; Lohmann, P.; Tellmann, L.; Byars, L.; Michel, C.; Rota Kops, E.; Brenner, D.; Herzog, H.; Shah, N.J.

    2013-01-01

    The new hybrid imaging technology MR-PET allows simultaneous acquisition of versatile MRI contrasts and quantitative metabolic imaging with PET. In order to quantify PET images with minimal residual error, the application of several corrections is crucial. In this work we present our results on quantification with the 3T MR-BrainPET scanner

  20. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    Science.gov (United States)

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    The growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence is clearly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items or text produced in interviews or via open-ended questions have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is, a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods make it possible to respect the semantic space of notions in cultures and language groups, so that the resulting similarities or differences between cultures can be better understood and interpreted.

  1. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  2. Quantitative Morphometric Analysis of Terrestrial Glacial Valleys and the Application to Mars

    Science.gov (United States)

    Allred, Kory

    Although the current climate on Mars is very cold and dry, it is generally accepted that the past environments on the planet were very different. Paleo-environments may have been warm and wet with oceans and rivers. And there is abundant evidence of water ice and glaciers on the surface as well. However, much of that comes from visual interpretation of imagery and other remote sensing data. For example, some of the characteristics that have been utilized to distinguish glacial forms are the presence of landscape features that appear similar to terrestrial glacial landforms, constraining surrounding topography, evidence of flow, orientation, elevation and valley shape. The main purpose of this dissertation is to develop a model that uses quantitative variables extracted from elevation data that can accurately categorize a valley basin as either glacial or non-glacial. The application of this model will limit the inherent subjectivity of image analysis by human interpretation. The model developed uses hypsometric attributes (elevation-area relationship), a newly defined variable similar to the equilibrium line altitude for an alpine glacier, and two neighborhood search functions intended to describe the valley cross-sectional curvature, all based on a digital elevation model (DEM) of a region. The classification model uses data-mining techniques trained on several terrestrial mountain ranges in varied geologic and geographic settings. It was applied to a select set of previously catalogued locations on Mars that resemble terrestrial glaciers. The results suggest that the landforms do have a glacial origin, thus supporting much of the previous research that has identified the glacial landforms. This implies that the paleo-environment of Mars was at least episodically cold and wet, probably during a period of increased planetary obliquity. Furthermore, the results of this research and the implications thereof add to the body of knowledge for the current and past …
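    One of the hypsometric attributes mentioned above can be illustrated with the elevation-relief ratio, a standard approximation of the hypsometric integral. The DEM sample below is hypothetical; the dissertation's actual variables and thresholds are not reproduced here:

```python
def hypsometric_integral(elevations):
    """Elevation-relief ratio, a common approximation of the
    hypsometric integral: (mean - min) / (max - min)."""
    lo, hi = min(elevations), max(elevations)
    mean = sum(elevations) / len(elevations)
    return (mean - lo) / (hi - lo)

# Hypothetical basin sample dominated by a broad low valley floor;
# a low value indicates a concave (over-deepened) hypsometric curve
dem_sample = [100, 110, 120, 150, 400]
print(round(hypsometric_integral(dem_sample), 2))  # 0.25
```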

  3. Statistical shape analysis using 3D Poisson equation--A quantitatively validated approach.

    Science.gov (United States)

    Gao, Yi; Bouix, Sylvain

    2016-05-01

    Statistical shape analysis has been an important area of research with applications in biology, anatomy, neuroscience, agriculture, paleontology, etc. Unfortunately, the proposed methods are rarely quantitatively evaluated, and as shown in recent studies, when they are evaluated, significant discrepancies exist in their outputs. In this work, we concentrate on the problem of finding the consistent location of deformation between two populations of shapes. We propose a new shape analysis algorithm along with a framework to perform a quantitative evaluation of its performance. Specifically, the algorithm constructs a Signed Poisson Map (SPoM) by solving two Poisson equations on the volumetric shapes of arbitrary topology, and statistical analysis is then carried out on the SPoMs. The method is quantitatively evaluated on synthetic shapes and applied to real shape data sets in brain structures. Copyright © 2016 Elsevier B.V. All rights reserved.
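    The core numerical step, solving a Poisson equation on a volumetric shape, can be sketched in 2-D with a simple Jacobi iteration. This is only a minimal analogue of the paper's Signed Poisson Map construction; the grid, mask and iteration count are illustrative:

```python
def solve_poisson(mask, iterations=500):
    """Jacobi iteration for laplacian(u) = -1 inside a binary mask
    with u = 0 outside (unit grid spacing, Dirichlet boundary)."""
    rows, cols = len(mask), len(mask[0])
    u = [[0.0] * cols for _ in range(rows)]
    for _ in range(iterations):
        new = [[0.0] * cols for _ in range(rows)]
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                if mask[r][c]:
                    new[r][c] = 0.25 * (u[r - 1][c] + u[r + 1][c]
                                        + u[r][c - 1] + u[r][c + 1] + 1.0)
        u = new
    return u

# Hypothetical 5x5 "shape": the interior 3x3 block is inside the mask
mask = [[0] * 5,
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0, 1, 1, 1, 0],
        [0] * 5]
u = solve_poisson(mask)
# the solution peaks at the shape's centre, as expected for a
# source term that measures distance-like depth into the shape
```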

  4. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs.

    Science.gov (United States)

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Zel, Jana; Gruden, Kristina

    2008-10-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on microarrays. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low-copy-number DNA targets. In a first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. The second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent dye-labelled 3DNA dendrimers, allowing signal amplification, and hybridized without further purification on an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems have been applied to test maize samples containing several transgenic lines; NAIMA has proven sensitive down to two target copies and provides quantitative data on transgenic content in a range of 0.1–25%. The performance of NAIMA is comparable to singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification.

  5. Quantitative Near-field Microscopy of Heterogeneous and Correlated Electron Oxides

    Science.gov (United States)

    McLeod, Alexander Swinton

    Scanning near-field optical microscopy (SNOM) is a novel scanning probe microscopy technique capable of circumventing the conventional diffraction limit of light, affording unparalleled optical resolution (down to 10 nanometers) even for radiation in the infrared and terahertz energy regimes, with light wavelengths exceeding 10 micrometers. However, although this technique has been developed and employed for more than a decade to a qualitatively impressive effect, researchers have lacked a practically quantitative grasp of its capabilities, and its application scope has so far remained restricted by implementations limited to ambient atmospheric conditions. The two-fold objective of this dissertation work has been to address both these shortcomings. The first half of the dissertation presents a realistic, semi-analytic, and benchmarked theoretical description of probe-sample near-field interactions that form the basis of SNOM. Owing its name to the efficient nano-focusing of light at a sharp metallic apex, the "lightning rod model" of probe-sample near-field interactions is mathematically developed from a flexible and realistic scattering formalism. Powerful and practical applications are demonstrated through the accurate prediction of spectroscopic near-field optical contrasts, as well as the "inversion" of these spectroscopic contrasts into a quantitative description of material optical properties. Thus enabled, this thesis work proceeds to present quantitative applications of infrared near-field spectroscopy to investigate nano-resolved chemical compositions in a diverse host of samples, including technologically relevant lithium ion battery materials, astrophysical planetary materials, and invaluable returned extraterrestrial samples. The second half of the dissertation presents the design, construction, and demonstration of a sophisticated low-temperature scanning near-field infrared microscope. This instrument operates in an ultra-high vacuum environment …

  6. Quantitative remote sensing in thermal infrared theory and applications

    CERN Document Server

    Tang, Huajun

    2014-01-01

    This comprehensive technical overview of the core theory of thermal remote sensing and its applications in hydrology, agriculture, and forestry includes a host of illuminating examples and covers everything from the basics to likely future trends in the field.

  7. Limitations for qualitative and quantitative neutron activation analysis using reactor neutrons

    International Nuclear Information System (INIS)

    El-Abbady, W.H.; El-Tanahy, Z.H.; El-Hagg, A.A.; Hassan, A.M.

    1999-01-01

    In this work, the most important limitations for qualitative and quantitative analysis using reactor neutrons for activation are reviewed. Each limitation is discussed using different examples of activated samples. Photopeak estimation, nuclear reactions interference and neutron flux measurements are taken into consideration. Solutions for high accuracy evaluation in neutron activation analysis applications are given. (author)

  8. Quantitative assessment of breast density from digitized mammograms into Tabar's patterns

    International Nuclear Information System (INIS)

    Jamal, N; Ng, K-H; Looi, L-M; McLean, D; Zulfiqar, A; Tan, S-P; Liew, W-F; Shantini, A; Ranganathan, S

    2006-01-01

    We describe a semi-automated technique for the quantitative assessment of breast density from digitized mammograms in comparison with the patterns suggested by Tabar. It was developed using MATLAB-based graphical user interface applications. It is based on an interactive thresholding method, following a short automated step, that shows the fibroglandular tissue area, breast area and breast density each time new thresholds are placed on the image. The breast density is taken as the percentage of the fibroglandular tissue area relative to the breast tissue area. It was tested in four different ways, namely by examining: (i) correlation of the quantitative assessment results with subjective classification, (ii) classification performance using the quantitative assessment technique, (iii) interobserver agreement and (iv) intraobserver agreement. The results of the quantitative assessment correlated well (r² = 0.92) with the subjective Tabar patterns classified by the radiologist (83% of digitized mammograms were correctly classified). The average kappa coefficient for the agreement between the readers was 0.63, indicating moderate agreement between the three observers in classifying breast density using the quantitative assessment technique. The kappa coefficient of 0.75 for intraobserver agreement reflected good agreement between the two sets of readings. The technique may be useful as a supplement to the radiologist's assessment in classifying mammograms into Tabar's patterns associated with breast cancer risk
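    The density measure described above, dense-tissue area as a percentage of breast area given two thresholds, can be sketched in a few lines. The grey-level patch and threshold values below are hypothetical; the authors' MATLAB tool is not reproduced:

```python
def breast_density_percent(image, breast_thr, dense_thr):
    """Percentage of the breast area occupied by fibroglandular
    (dense) tissue, from two interactively chosen grey-level thresholds."""
    pixels = [p for row in image for p in row]
    breast = [p for p in pixels if p >= breast_thr]   # breast tissue area
    dense = [p for p in breast if p >= dense_thr]     # fibroglandular area
    return 100.0 * len(dense) / len(breast)

# Hypothetical 3x4 digitized mammogram patch (grey levels)
patch = [[10, 40,  80, 120],
         [ 5, 60, 150, 200],
         [ 0, 30,  90, 180]]
density = breast_density_percent(patch, breast_thr=30, dense_thr=100)
print(round(density, 1))  # 44.4
```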

  9. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    Science.gov (United States)

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T1) and spin-spin (T2) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods in pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.

  10. Clinical application of quantitative computed tomography in osteogenesis imperfecta-suspected cat.

    Science.gov (United States)

    Won, Sungjun; Chung, Woo-Jo; Yoon, Junghee

    2017-09-30

    A one-year-old male Persian cat presented with multiple fractures and no known traumatic history. Marked decrease of bone radiopacity and thin cortices of all long bones were identified on radiography. The tentative diagnosis was osteogenesis imperfecta, a congenital disorder characterized by fragile bone. To determine bone mineral density (BMD), quantitative computed tomography (QCT) was performed. The QCT results revealed a mean trabecular BMD of the vertebral bodies of 149.9 ± 86.5 mg/cm³. After bisphosphonate therapy, BMD at the same site increased significantly (218.5 ± 117.1 mg/cm³, p < 0.05). QCT was a useful diagnostic tool to diagnose osteopenia and quantify the response to medical treatment.

  11. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1978-01-01

    The bases for developing quantitative assessments of exposure risk in the human being, and the several problems that accompany the assessment and introduction of the risk of exposure to high- and low-LET radiation into radiation protection, will be evaluated. The extension of the pioneering radiation protection philosophies to the control of other hazardous agents that cannot be eliminated from the environment will be discussed, as will the serious misunderstandings and misuse of concepts and facts that have inevitably surrounded the application to one agent alone of the protection philosophy that must in time be applied to a broad spectrum of potentially hazardous agents.

  12. Application of microcomputed tomography for quantitative analysis of dental root canal obturations

    Directory of Open Access Journals (Sweden)

    Anna Kierklo

    2014-03-01

    Introduction: The aim of the study was to apply microcomputed tomography to the quantitative evaluation of voids and to test for any specific location of voids in root canal obturations. Materials and Methods: Twenty root canals were prepared and obturated with gutta-percha and Tubli-Seal sealer using the thermoplastic compaction method (System B + Obtura II). Roots were scanned and three-dimensional visualizations were obtained. The volume and Feret's diameter of I-voids (at the filling/dentine interface) and S-voids (surrounded by filling material) were measured. Results: None of the scanned root canal fillings were void-free. For I-voids, the volume fraction was significantly larger, but their number was lower (P = 0.0007), than for S-voids. Both types of voids occurred in characteristic regions (P < 0.001): I-voids mainly in the apical third, S-voids in the coronal third of the canal filling. Conclusions: Within the limitations of this study, our results indicate that microtomography, with the proposed semi-automatic algorithm, is a useful tool for three-dimensional quantitative evaluation of root canal fillings. In canals filled with thermoplastic gutta-percha and Tubli-Seal, voids at the interface between the filling and canal dentine deserve special attention because of their periapical location, which might promote apical microleakage. Further studies might help to elucidate the clinical relevance of these results.
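    The two void metrics reported above, volume (expressed here as a fraction of the filling) and Feret's diameter, can be computed directly from segmented voxel data. The voxel coordinates below are a toy example, not the study's scans:

```python
from math import dist

def feret_diameter(voxels):
    """Maximum caliper (Feret) diameter of a void, taken as the
    largest pairwise distance between its voxel coordinates."""
    return max(dist(a, b) for a in voxels for b in voxels)

def volume_fraction(void_voxels, filling_voxels):
    """Void volume as a percentage of the total filling volume
    (voxel counts stand in for physical volumes)."""
    return 100.0 * len(void_voxels) / len(filling_voxels)

# Hypothetical void: a 3-voxel string inside a 1000-voxel filling
void = [(0, 0, 0), (1, 0, 0), (2, 0, 0)]
print(feret_diameter(void))                          # 2.0
print(round(volume_fraction(void, range(1000)), 1))  # 0.3
```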

  13. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    Science.gov (United States)

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions and its content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the …

  14. Application of Near-Infrared Spectroscopy to Quantitatively Determine Relative Content of Puccinia striiformis f. sp. tritici DNA in Wheat Leaves in Incubation Period

    Directory of Open Access Journals (Sweden)

    Yaqiong Zhao

    2017-01-01

    Full Text Available Stripe rust caused by Puccinia striiformis f. sp. tritici (Pst) is a devastating wheat disease worldwide. The potential application of near-infrared spectroscopy (NIRS) to detecting pathogen amounts in latently Pst-infected wheat leaves was investigated for disease prediction and control. A total of 300 near-infrared spectra were acquired from Pst-infected leaf samples in the incubation period, and the relative contents of Pst DNA in the samples were obtained using duplex TaqMan real-time PCR arrays. Determination models of the relative contents of Pst DNA in the samples were built using quantitative partial least squares (QPLS), support vector regression (SVR), and a method integrating QPLS with SVR. The results showed that the best model was the kQPLS-SVR model built on the original spectra with a training-to-testing set ratio of 3:1, when the number of randomly selected wavelength points was 700, the number of principal components was 8, and the number of built QPLS models was 5. The results indicated that quantitative detection of Pst DNA in leaves in the incubation period can be implemented using NIRS, providing a novel method for determining latent infection levels of Pst and for early detection of stripe rust.

  15. Quantitative diagnosis of skeletons with demineralizing osteopathy

    International Nuclear Information System (INIS)

    Banzer, D.

    1979-01-01

    The quantitative diagnosis of bone diseases must be assessed according to the accuracy of the applied method, the expense in apparatus, personnel and financial resources, and the comparability of results. Nuclide absorptiometry, and in the future perhaps computed tomography, represent the most accurate methods for determining the mineral content of bones. Their application is the clinics' prerogative because of the costs. Morphometry provides quantitative information, in particular in course control, and enables an objective judgement of visual images. It requires little expenditure and should be combined with microradioscopy. Direct comparability of the findings of different working groups is easiest in morphometry; it depends on the equipment in computed tomography and is still hardly possible in nuclide absorptiometry. For fundamental physical reasons, it will hardly be possible to produce a low-cost, fast and easy-to-handle instrument for the determination of the mineral salt concentration in bones. Instead, there is rather a trend towards more expensive equipment, e.g. CT instruments; the universal use of these instruments, however, will help to promote quantitative diagnosis. (orig.)

  16. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
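    The series and parallel connection formulas underlying such fuzzy reliability models can be illustrated with triangular fuzzy numbers. The sketch below (plain Python; the representation and names are our own, not the paper's notation) exploits the fact that both expressions are monotone in each component reliability on [0, 1], so the three vertices of each triangular number can be propagated independently.

```python
def series_reliability(components):
    """Fuzzy reliability of a series connection: R = prod(R_i).

    Each component reliability is a triangular fuzzy number (low, mode, high)
    with values in [0, 1]; the product is monotone increasing in each factor,
    so the vertices multiply directly.
    """
    low = mode = high = 1.0
    for l, m, h in components:
        low, mode, high = low * l, mode * m, high * h
    return (low, mode, high)


def parallel_reliability(components):
    """Fuzzy reliability of a parallel connection: R = 1 - prod(1 - R_i)."""
    low = mode = high = 1.0
    for l, m, h in components:
        low, mode, high = low * (1 - l), mode * (1 - m), high * (1 - h)
    return (1 - low, 1 - mode, 1 - high)
```

A mixed connection model would compose these two operations over the process graph; the expert-graded fuzzy inputs simply replace the crisp reliabilities used in classical reliability block diagrams.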

  17. Quantitative x-ray dark-field computed tomography

    International Nuclear Information System (INIS)

    Bech, M; Pfeiffer, F; Bunk, O; Donath, T; David, C; Feidenhans'l, R

    2010-01-01

    The basic principles of x-ray image formation in radiology have remained essentially unchanged since Roentgen first discovered x-rays over a hundred years ago. The conventional approach relies on x-ray attenuation as the sole source of contrast and draws exclusively on ray or geometrical optics to describe and interpret image formation. Phase-contrast or coherent scatter imaging techniques, which can be understood using wave optics rather than ray optics, offer ways to augment or complement the conventional approach by incorporating the wave-optical interaction of x-rays with the specimen. With a recently developed approach based on x-ray optical gratings, advanced phase-contrast and dark-field scatter imaging modalities are now within reach for routine medical imaging and non-destructive testing applications. To quantitatively assess the new potential of particularly the grating-based dark-field imaging modality, here we introduce a mathematical formalism together with a material-dependent parameter, the so-called linear diffusion coefficient, and show that this description can yield quantitative dark-field computed tomography (QDFCT) images of experimental test phantoms.
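    Under the exponential visibility-loss model that motivates a linear diffusion coefficient, the coefficient can be recovered from a measured fringe-visibility reduction in direct analogy with Beer-Lambert attenuation. The one-liner below is an illustrative sketch of that relation (function name and units are our own), not code from the paper.

```python
import math

def linear_diffusion_coefficient(v_sample, v_reference, thickness):
    # Assumed dark-field signal model: V = V0 * exp(-epsilon * t), where V is
    # the interferometer fringe visibility behind the sample, V0 the reference
    # visibility and t the sample thickness, so epsilon = -ln(V / V0) / t.
    return -math.log(v_sample / v_reference) / thickness
```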

  18. Research progress and development trends of quantitative assessment techniques for urban thermal environment.

    Science.gov (United States)

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the thermal environment at the urban scale, based on historical data from weather stations, to dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, applicability and development trends of these techniques were summarized, aiming to add to the fundamental knowledge needed for urban thermal environment assessment and optimization.

  19. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    Science.gov (United States)

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated, but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials, which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials, we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration, even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration; however, there was a marked difference in the measured magnitude. TB is a disease where quantification of the pathogen could lead to better patient management, and qPCR methods offer the potential to rapidly perform such analysis.
However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification
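    The quantification underlying digital PCR is a standard Poisson correction: if a fraction p of the partitions comes up positive, the mean number of template copies per partition is λ = −ln(1 − p). A minimal sketch of that calculation (function name and units are our own):

```python
import math

def dpcr_concentration(positive, total, volume_per_partition):
    # Poisson correction used in digital PCR: a partition is positive if it
    # received at least one copy, so p = 1 - exp(-lambda) and the mean copies
    # per partition is lambda = -ln(1 - p).
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / volume_per_partition  # copies per unit volume
```

The correction matters at high occupancy: simply counting positive partitions would undercount, because a positive partition may hold more than one copy.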

  20. Quantitative Image Restoration in Bright Field Optical Microscopy.

    Science.gov (United States)

    Gutiérrez-Medina, Braulio; Sánchez Miranda, Manuel de Jesús

    2017-11-07

    Bright field (BF) optical microscopy is regarded as a poor method to observe unstained biological samples due to intrinsic low image contrast. We introduce quantitative image restoration in bright field (QRBF), a digital image processing method that restores out-of-focus BF images of unstained cells. Our procedure is based on deconvolution, using a point spread function modeled from theory. By comparing with reference images of bacteria observed in fluorescence, we show that QRBF faithfully recovers shape and enables quantifying the size of individual cells, even from a single input image. We applied QRBF in a high-throughput image cytometer to assess shape changes in Escherichia coli during hyperosmotic shock, finding size heterogeneity. We demonstrate that QRBF is also applicable to eukaryotic cells (yeast). Altogether, digital restoration emerges as a straightforward alternative to methods designed to generate contrast in BF imaging for quantitative analysis. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
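    The restoration step described, deconvolution with a theoretically modeled point spread function, can be sketched with a Wiener filter in the frequency domain. This is an illustrative stand-in (NumPy only, Gaussian PSF, arbitrarily chosen regularization constant k), not the authors' QRBF implementation:

```python
import numpy as np

def gaussian_psf(shape, sigma):
    # centred Gaussian point spread function, normalised to unit sum
    y, x = np.indices(shape)
    cy, cx = shape[0] // 2, shape[1] // 2
    psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

def wiener_deconvolve(image, psf, k=1e-2):
    # frequency-domain Wiener filter: F_hat = G * conj(H) / (|H|^2 + k)
    H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    G = np.fft.fft2(image)
    F = G * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(F))

# demo: blur a test image with the PSF, then restore it
image = np.zeros((64, 64))
image[24:40, 24:40] = 1.0
psf = gaussian_psf(image.shape, sigma=2.0)
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(np.fft.ifftshift(psf))))
restored = wiener_deconvolve(blurred, psf)
```

The regularization constant trades noise amplification against sharpness; the paper's single-image operation corresponds to applying such a filter with a PSF computed from the imaging model rather than measured.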

  1. Sustainability appraisal. Quantitative methods and mathematical techniques for environmental performance evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Erechtchoukova, Marina G.; Khaiter, Peter A. [York Univ., Toronto, ON (Canada). School of Information Technology; Golinska, Paulina (eds.) [Poznan Univ. of Technology (Poland)

    2013-06-01

    The book presents original research papers on quantitative methods and techniques for evaluating the sustainability of business operations and organizations' overall environmental performance. The contributions describe modern methods and approaches applicable to the multi-faceted problem of sustainability appraisal and help to fill the generic frameworks presented in the literature with the specific quantitative techniques needed in practice. The scope of the book is interdisciplinary in nature, making it of interest to environmental researchers, business managers and process analysts, information management professionals and environmental decision makers, who will find valuable sources of information for their work-related activities. Each chapter provides sufficient background information, a description of problems, and results, making the book useful for a wider audience. Additional software support is not required. One of the most important issues in developing sustainable management strategies and incorporating ecodesigns in production, manufacturing and operations management is the assessment of the sustainability of business operations and organizations' overall environmental performance. The book presents the results of recent studies on sustainability assessment. It provides a solid reference for researchers in academia and industrial practitioners on the state-of-the-art in sustainability appraisal, including the development and application of sustainability indices, quantitative methods, models and frameworks for the evaluation of current and future welfare outcomes, recommendations on data collection and processing for the evaluation of organizations' environmental performance, and eco-efficiency approaches leading to business process re-engineering.

  2. Improvements in fast-neutron spectroscopy methods (1961); Amelioration des methodes de spectrometrie des neutrons rapides (1961)

    Energy Technology Data Exchange (ETDEWEB)

    Cambou, F [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1961-02-15

    This research aimed at improving fast-neutron electronic detectors based on n-p elastic scattering. The first part concerns proportional counters; careful construction methods have made it possible to plot mono-energetic neutron spectra in the range 700 keV - 3 MeV with a resolution of 7 per cent. The second part concerns scintillation counters: an organic scintillator and an inorganic scintillator covered with a thin layer of a scattering agent. An exact study of the types of scintillation has made it possible to develop efficient discriminator circuits. Different neutron spectra plotted in the presence of a strong gamma background are presented. The last part deals with the development of pulse-shape discrimination methods for the study, in the actual beam, of the elastic scattering of 14.58 MeV neutrons. With hydrogen, the distribution f(φ) of the recoil protons is f(φ) = 1 + 0.034 cos φ + 0.042 cos²φ. With tritium the scattering is strongly anisotropic; the curve representing the variation of the differential cross-section for elastic scattering in the centre-of-mass system is obtained with a target containing 1 cm³ of tritium. (author)

  3. Special issues in quantitation of brain receptors and related markers by emission computed tomography

    International Nuclear Information System (INIS)

    Links, J.M.

    1998-01-01

    Emission computed tomography provides an opportunity to quantify neurotransmitter-neuroreceptor systems in vivo. In order to do so, very high image quality and quantitative accuracy are required. Quantitation of receptor systems involves considerations of physical effects (such as finite spatial resolution, scatter, and attenuation), instrumentation design (such as spatial sampling), image processing (such as filtering), and data analysis (such as kinetic modeling). Appropriate application of these considerations can lead to useful results, but emerging approaches promise even greater levels of accuracy and precision

  4. The bovine QTL viewer: a web accessible database of bovine Quantitative Trait Loci

    Directory of Open Access Journals (Sweden)

    Xavier Suresh R

    2006-06-01

    Full Text Available Abstract Background Many important agricultural traits such as weight gain, milk fat content and intramuscular fat (marbling) in cattle are quantitative traits. Most of the information on these traits has not previously been integrated into a genomic context. Without such integration, application of these data to agricultural enterprises will remain slow and inefficient. Our goal was to populate a genomic database with data mined from the bovine quantitative trait literature and to make these data available in a genomic context to researchers via a user-friendly query interface. Description The QTL (Quantitative Trait Locus) data and related information for bovine QTL are gathered from published work and from existing databases. An integrated database schema was designed and the database (MySQL) populated with the gathered data. The bovine QTL Viewer was developed for the integration of QTL data available for cattle. The tool consists of an integrated database of bovine QTL and the QTL viewer to display QTL and their chromosomal positions. Conclusion We present a web-accessible, integrated database of bovine (dairy and beef cattle) QTL for use by animal geneticists. The viewer and database are of general applicability to any livestock species for which there are public QTL data. The viewer can be accessed at http://bovineqtl.tamu.edu.

  5. Quantitative X-ray microtomography with synchrotron radiation

    Energy Technology Data Exchange (ETDEWEB)

    Donath, T. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Materialforschung

    2007-07-01

    Synchrotron-radiation-based computed microtomography (SRμCT) is an established method for the examination of volume structures. It allows the x-ray attenuation coefficient of a specimen to be measured three-dimensionally with a spatial resolution of about one micrometer. In contrast to conventional x-ray sources (x-ray tubes), the unique properties of synchrotron radiation enable quantitative measurements that do not suffer from beam-hardening artifacts. During this work the capabilities for quantitative SRμCT measurements have been further improved by enhancements that were made to the SRμCT apparatus and to the reconstruction chain. For high-resolution SRμCT an x-ray camera consisting of luminescent screen (x-ray phosphor), lens system, and CCD camera was used. A significant suppression of blur that is caused by reflections inside the luminescent screen could be achieved by application of an absorbing optical coating to the screen surface. It is shown that blur and ring artifacts in the tomographic reconstructions are thereby drastically reduced. Furthermore, a robust and objective method for the determination of the center of rotation in projection data (sinograms) is presented that achieves sub-pixel precision. By implementation of this method into the reconstruction chain, complete automation of the reconstruction process has been achieved. Examples of quantitative SRμCT studies conducted at the Hamburger Synchrotronstrahlungslabor HASYLAB at the Deutsches Elektronen-Synchrotron DESY are presented and used for the demonstration of the achieved enhancements. (orig.)
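    A common way to locate the center of rotation in parallel-beam projection data is to cross-correlate the 0° projection with the mirrored 180° projection: half the correlation shift gives the center offset. The sketch below illustrates the idea at integer-pixel precision only (the thesis reports a sub-pixel method); the function and its convention are our own.

```python
import numpy as np

def rotation_center(proj0, proj180):
    # In parallel-beam geometry the 180-degree projection is the 0-degree
    # projection mirrored about the rotation axis: p180(x) = p0(2c - x).
    # Cross-correlating p0 with the reversed p180 therefore peaks at a lag
    # of 2c - (n - 1), from which the centre c follows.
    n = len(proj0)
    a = proj0 - proj0.mean()
    b = proj180[::-1] - proj180.mean()
    c = np.correlate(a, b, mode="full")
    shift = int(np.argmax(c)) - (n - 1)
    return (n - 1) / 2 + shift / 2
```

In practice one would refine the integer estimate by interpolating around the correlation peak to reach the sub-pixel precision needed for artifact-free reconstruction.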

  6. Quantitative X-ray microtomography with synchrotron radiation

    International Nuclear Information System (INIS)

    Donath, T.

    2007-01-01

    Synchrotron-radiation-based computed microtomography (SRμCT) is an established method for the examination of volume structures. It allows the x-ray attenuation coefficient of a specimen to be measured three-dimensionally with a spatial resolution of about one micrometer. In contrast to conventional x-ray sources (x-ray tubes), the unique properties of synchrotron radiation enable quantitative measurements that do not suffer from beam-hardening artifacts. During this work the capabilities for quantitative SRμCT measurements have been further improved by enhancements that were made to the SRμCT apparatus and to the reconstruction chain. For high-resolution SRμCT an x-ray camera consisting of luminescent screen (x-ray phosphor), lens system, and CCD camera was used. A significant suppression of blur that is caused by reflections inside the luminescent screen could be achieved by application of an absorbing optical coating to the screen surface. It is shown that blur and ring artifacts in the tomographic reconstructions are thereby drastically reduced. Furthermore, a robust and objective method for the determination of the center of rotation in projection data (sinograms) is presented that achieves sub-pixel precision. By implementation of this method into the reconstruction chain, complete automation of the reconstruction process has been achieved. Examples of quantitative SRμCT studies conducted at the Hamburger Synchrotronstrahlungslabor HASYLAB at the Deutsches Elektronen-Synchrotron DESY are presented and used for the demonstration of the achieved enhancements. (orig.)

  7. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial step. The raw spectrum consists of the signal from the sample together with scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
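    The idea of DE-based wavelength selection can be sketched with SciPy's `differential_evolution`: each candidate is a continuous mask over the wavelength grid, thresholded to a binary selection, and the objective is the residual of a calibration model restricted to the selected points. Everything below (the 0.5 threshold, the penalty, the synthetic data) is an illustrative assumption, not the paper's exact scheme:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
n_samples, n_points = 40, 12                 # tiny stand-in for a THz absorption grid
X = rng.normal(size=(n_samples, n_points))
# only points 2 and 7 carry signal; the rest model scattering/noise channels
y = 2.0 * X[:, 2] - 1.0 * X[:, 7] + 0.05 * rng.normal(size=n_samples)

def objective(mask_continuous):
    mask = mask_continuous > 0.5             # threshold to a binary wavelength selection
    if not mask.any():
        return 1e6                           # penalise empty selections
    coef = np.linalg.lstsq(X[:, mask], y, rcond=None)[0]
    pred = X[:, mask] @ coef
    return float(((y - pred) ** 2).mean())

result = differential_evolution(objective, bounds=[(0.0, 1.0)] * n_points,
                                maxiter=30, seed=2, polish=False)
selected = np.flatnonzero(result.x > 0.5)
```

A production version would score candidates by cross-validated error rather than training residual, since the training residual alone never penalises selecting extra wavelengths.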

  8. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    Science.gov (United States)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  9. Oracle database performance and scalability a quantitative approach

    CERN Document Server

    Liu, Henry H

    2011-01-01

    A data-driven, fact-based, quantitative text on Oracle performance and scalability With database concepts and theories clearly explained in Oracle's context, readers quickly learn how to fully leverage Oracle's performance and scalability capabilities at every stage of designing and developing an Oracle-based enterprise application. The book is based on the author's more than ten years of experience working with Oracle, and is filled with dependable, tested, and proven performance optimization techniques. Oracle Database Performance and Scalability is divided into four parts that enable reader

  10. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms

  11. The controlled incorporation of foreign elements in metal surfaces by means of quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1977-01-01

    Quantitative ion implantation is a powerful new method for the doping of metal surfaces with accurately known quantities of an element or one of its isotopes. It can be applied for the preparation of standards for various uses in instrumental methods of surface and bulk analysis. This paper provides selected information on some theoretical and practical aspects of quantitative ion implantation with the object of promoting the application of the method and stimulating further purposeful research on the subject. (Auth.)

  12. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    Science.gov (United States)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observation. However, it requires multi-focal image capture, and the mechanical or electrical scanning involved limits its real-time capability in sample detection. Here, in order to remove this restriction, real-time quantitative phase microscopy based on a single-shot transport of intensity equation method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed the proposed method can potentially be applied in various biological and medical applications, especially live cell imaging.
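    For a uniform in-focus intensity I0, the transport of intensity equation reduces to a Poisson equation for the phase, ∇²φ = −(2π/(λ·I0)) ∂I/∂z, which a finite-difference intensity derivative plus an FFT Poisson solver can invert. The sketch below assumes periodic boundaries and uniform intensity; it is a textbook-style illustration of the TIE inversion, not the authors' single-shot implementation:

```python
import numpy as np

def poisson_solve_fft(rhs, dx=1.0):
    # invert the Laplacian spectrally; the k = 0 component (a constant
    # phase offset) is undetermined and set to zero
    n, m = rhs.shape
    kx = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(m, d=dx)
    k2 = kx[:, None] ** 2 + ky[None, :] ** 2
    k2[0, 0] = 1.0                      # avoid division by zero at k = 0
    F = np.fft.fft2(rhs) / (-k2)
    F[0, 0] = 0.0
    return np.real(np.fft.ifft2(F))

def tie_phase(i_minus, i_plus, dz, wavelength, i0):
    # uniform-intensity TIE: laplacian(phi) = -(2*pi / (wavelength * i0)) * dI/dz,
    # with dI/dz estimated by a central difference of two defocused images
    didz = (i_plus - i_minus) / (2.0 * dz)
    return poisson_solve_fft(-(2.0 * np.pi / (wavelength * i0)) * didz)
```

In the single-shot scheme, the two defocused images would come from the different diffraction orders of the phase mask rather than from sequential refocusing.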

  13. Measurement error of a simplified protocol for quantitative sensory tests in chronic pain patients

    DEFF Research Database (Denmark)

    Müller, Monika; Biurrun Manresa, José; Limacher, Andreas

    2017-01-01

    BACKGROUND AND OBJECTIVES: Large-scale application of Quantitative Sensory Tests (QST) is impaired by lacking standardized testing protocols. One unclear methodological aspect is the number of records needed to minimize measurement error. Traditionally, measurements are repeated 3 to 5 times...

  14. Realizing the quantitative potential of the radioisotope image

    International Nuclear Information System (INIS)

    Brown, N.J.G.; Britton, K.E.; Cruz, F.R.

    1977-01-01

    The sophistication and accuracy of a clinical strategy depends on the accuracy of the results of the tests used. When numerical values are given in the test report, powerful clinical strategies can be developed. The eye is well able to perceive structures in a high-quality grey-scale image. However, the degree of difference in density between two points cannot be estimated quantitatively by eye. This creates a problem particularly when there is only a small difference between the count-rate at a suspicious point or region and the count-rate to be expected there if the image were normal. To resolve this problem, methods of quantitation of the amplitude of a feature, defined as the difference between the observed and expected values at the region of the feature, have been developed. The eye can estimate the frequency of light entering it very accurately (perceived as colour). Thus, if count-rate data are transformed into colour in a systematic way, then information about relative count-rate can be perceived. A computer-driven, interactive colour display system is used in which the count-rate range of each colour is computed as a percentage of a reference count-rate value. This can be used to obtain quantitative estimates of the amplitude of an image feature. The application of two methods to normal and pathological data is described and the results discussed. (author)
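    The display principle, encoding each pixel's count-rate as a percentage of a reference value so that relative differences become perceivable hue steps, can be sketched as a simple band lookup. The thresholds and colour names here are illustrative assumptions, not those of the system described:

```python
def colour_band(count_rate, reference, bands=(80, 90, 110, 120)):
    # Map a count rate, expressed as a percentage of the reference value,
    # onto discrete colour bands (percent thresholds are illustrative).
    pct = 100.0 * count_rate / reference
    labels = ("blue", "cyan", "green", "yellow", "red")
    for threshold, label in zip(bands, labels):
        if pct < threshold:
            return label
    return labels[-1]
```

Because hue, unlike grey-level, is judged categorically by the eye, a small count-rate deviation from the expected value crosses a band boundary and becomes immediately visible.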

  15. Quantitative criteria in the application of the principle of precaution

    International Nuclear Information System (INIS)

    Touzet, Rodolfo; Ferrari, Jorge

    2008-01-01

    Full text: The Principle of Precaution establishes that 'when an activity represents a threat or damage to human health or the environment, precautionary measures must be taken even when the cause-effect relationship has not been demonstrated scientifically and conclusively'. This declaration implies acting even in the presence of uncertainty, assigning responsibility and the burden of safety to whoever creates the risk, analyzing the possible alternatives, and using participatory methods to take decisions. In practice this presents two dilemmas: 1) How can a cost-benefit analysis be performed when the cause-effect relationship for the health of the exposed persons has not even been established? (In the case of ionizing radiation a factor α is used, which represents the economic cost of the dose received by a person.) 2) Which criterion must be used when ionizing radiation acts synergistically with non-ionizing radiation? How can the quantitative optimization criterion be integrated with a qualitative precautionary criterion? Provisional hypotheses will have to be introduced in order to perform the corresponding quantitative evaluations. In the case of low frequencies the situation was exactly the same in the past; but epidemiological studies, as well as in vivo and in vitro experiments, demonstrated that exposure can increase the risk of leukaemia in children and induce other health problems in children and adults. A possible provisional hypothesis for radiofrequency is to assume that the effects are similar in magnitude to those caused by low-frequency fields. In this case it is possible to show that, even if this were really so, the statistics would not allow it to be demonstrated, owing to the number of persons and the observation times used in the studied populations and the latency times of leukaemia. The use of working hypotheses to perform the cost-benefit studies allows us to establish different alternatives for the

  16. Direct and quantitative photothermal absorption spectroscopy of individual particulates

    International Nuclear Information System (INIS)

    Tong, Jonathan K.; Hsu, Wei-Chun; Eon Han, Sang; Burg, Brian R.; Chen, Gang; Zheng, Ruiting; Shen, Sheng

    2013-01-01

    Photonic structures can exhibit significant absorption enhancement when an object's length scale is comparable to or smaller than the wavelength of light. This property has enabled photonic structures to become an integral component in many applications such as solar cells, light emitting diodes, and photothermal therapy. To characterize this enhancement at the single-particulate level, conventional methods have relied on indirect or qualitative approaches which are often limited to certain sample types. To overcome these limitations, we used a bilayer cantilever to directly and quantitatively measure the spectral absorption efficiency of a single silicon microwire in the visible wavelength range. We demonstrate an absorption enhancement on a per-unit-volume basis compared to a thin film, which shows good agreement with Mie theory calculations. This technique offers a quantitative route to broadband absorption measurements on a wide range of photonic structures of different geometric and material compositions.

  17. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Repins, Ingrid L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Hacke, Peter L [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Kempe, Michael D [National Renewable Energy Laboratory (NREL), Golden, CO (United States)]; Whitfield, Kent [Underwriters Laboratories]; Phillips, Nancy [DuPont]; Sample, Tony [European Commission]; Monokroussos, Christos [TUV Rheinland]; Hsi, Edward [Swiss RE]; Wohlgemuth, John [PowerMark Corporation]; Seidel, Peter [First Solar]; Jahn, Ulrike [TUV Rheinland]; Tanahashi, Tadanori [National Institute of Advanced Industrial Science and Technology]; Chen, Yingnan [China General Certification Center]; Jaeckel, Bengt [Underwriters Laboratories]; Yamamichi, Masaaki [RTS Corporation]

    2017-10-05

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and the prioritization of research, to provide a technical basis both for the standards and for the analysis related to the application of those standards. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.

  18. Integration of Qualitative and Quantitative Methods: Building and Interpreting Clusters from Grounded Theory and Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Aldo Merlino

    2007-01-01

    Full Text Available Qualitative methods present a wide spectrum of application possibilities as well as opportunities for combining qualitative and quantitative methods. In the social sciences, fruitful theoretical discussions and a great deal of empirical research have taken place. This article introduces an empirical investigation which demonstrates the logic of combining methodologies as well as the collection and interpretation, both sequential and simultaneous, of qualitative and quantitative data. Specifically, the investigation process is described, beginning with a grounded theory methodology and its combination with the techniques of structural semiotic discourse analysis to generate, in a first phase, an instrument for quantitative measurement and to understand, in a second phase, clusters obtained by quantitative analysis. This work illustrates how qualitative methods allow for the comprehension of the discursive and behavioral elements under study, and how they serve to support, make sense of, and give meaning to quantitative data. URN: urn:nbn:de:0114-fqs0701219

  19. Factors affecting the repeatability of gamma camera calibration for quantitative imaging applications using a sealed source

    International Nuclear Information System (INIS)

    Anizan, N; Wahl, R L; Frey, E C; Wang, H; Zhou, X C

    2015-01-01

    Several applications in nuclear medicine require absolute activity quantification of single photon emission computed tomography images. Obtaining a repeatable calibration factor that converts voxel values to activity units is essential for these applications. Because source preparation and measurement of the source activity using a radionuclide activity meter are potential sources of variability, this work investigated instrumentation and acquisition factors affecting repeatability using planar acquisition of sealed sources. The calibration factor was calculated for different acquisition and geometry conditions to evaluate the effect of the source size, lateral position of the source in the camera field-of-view (FOV), source-to-camera distance (SCD), and variability over time using sealed Ba-133 sources. A small region of interest (ROI) based on the source dimensions and collimator resolution was investigated to decrease the background effect. A statistical analysis with a mixed-effects model was used to evaluate quantitatively the effect of each variable on the global calibration factor variability. A variation of 1 cm in the measurement of the SCD from the assumed distance of 17 cm led to a variation of 1–2% in the calibration factor measurement using a small disc source (0.4 cm diameter) and less than 1% with a larger rod source (2.9 cm diameter). The lateral position of the source in the FOV and the variability over time had small impacts on calibration factor variability. The residual error component was well estimated by Poisson noise. Repeatability of better than 1% in a calibration factor measurement using a planar acquisition of a sealed source can be reasonably achieved. The best reproducibility was obtained with the largest source with a count rate much higher than the average background in the ROI, and when the SCD was positioned within 5 mm of the desired position. In this case, calibration source variability was limited by the quantum
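The calibration described in this record reduces to simple arithmetic: background-corrected ROI counts divided by the acquisition time and the activity-meter reading give the calibration factor in counts per second per unit activity. A minimal sketch of that arithmetic, with every number hypothetical rather than taken from the study:

```python
def calibration_factor(roi_counts, background_per_pixel, n_pixels,
                       acq_time_s, source_activity_mbq):
    """Planar calibration factor in counts/s per MBq: background-corrected
    ROI counts divided by acquisition time and the known source activity."""
    net_counts = roi_counts - background_per_pixel * n_pixels
    return net_counts / acq_time_s / source_activity_mbq

# Hypothetical sealed-source acquisition: 1.25e6 counts in a 2500-pixel ROI,
# 2 background counts per pixel, 600 s acquisition, 37 MBq source
cf = calibration_factor(roi_counts=1_250_000, background_per_pixel=2.0,
                        n_pixels=2500, acq_time_s=600, source_activity_mbq=37.0)
print(round(cf, 2))  # 56.08 counts/s per MBq
```

The ROI size matters here exactly as the record notes: a larger ROI accumulates more background, so the background term must be subtracted before dividing by activity.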

  20. The X-ray spectrometry Si(Li) system and its application in quantitative analysis of rare-earth elements

    International Nuclear Information System (INIS)

    Barbosa, J.B.S.

    1985-11-01

    The basic principles involved in the Si(Li) system used in X-ray spectrometry are described, and its application is demonstrated in the energy range where its resolution is better than that of conventional spectrometers. The theoretical principles underlying the interaction between electromagnetic radiation and matter, and a review of semiconductors, are presented first, with emphasis on the fluorescence phenomenon and the process of photon detection by semiconductor crystals, whose properties and characteristics allow, in the specific case of the Si crystal, the construction of detectors with the large sensitive volume useful for X-ray spectrometry. In addition, the components of the Si(Li) system are described individually, with special attention to operating aspects and to the parameters affecting the quality of the pulse-height spectrum. Finally, the spectrometer performance is experimentally evaluated through the quantitative analysis of rare-earth element oxides (La, Ce, Pr, Nd). It should be stressed that this research indicates that X-ray emission-transmission analysis is the most adequate method under the activation conditions provided by the spectrometer, where Am-241, a 60 keV gamma emitter, is the photon source for the fluorescence. Therefore, the experimental work was extended in order to include all the necessary treatment. (Author) [pt

  1. New journal selection for quantitative survey of infectious disease research: application for Asian trend analysis

    Directory of Open Access Journals (Sweden)

    Okabe Nobuhiko

    2009-10-01

    Full Text Available Abstract Background: Quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has also been applied to infectious disease research; however, previous studies were insufficient as they underestimated articles published in non-English or regional journals. Methods: Using a combination of Scopus™ and PubMed, the databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate for surveying a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded™ (SCI Infectious Disease Category) during 1998-2006. Subsequently, we applied the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. Results: One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. The journals published 14,156 original articles and reviews of Asian origin and 118,158 worldwide, more than those registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 worldwide). In the Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the area, and no noticeable increase in articles was revealed during the study period. China, India and Taiwan had relatively large numbers and a high increase rate of original articles among Asian countries. When adjusting the publication of original articles according to country population and gross domestic product (GDP), Singapore and

  2. Developments in quantitative electron probe microanalysis

    International Nuclear Information System (INIS)

    Tixier, R.

    1977-01-01

    A study of the range of validity of the correction formulae used in massive specimen analysis is made. The method used is original: we have shown that it is possible to exploit an invariance property of corrected intensity ratios for standards. This invariance property provides a test for the self-consistency of the theory. The theoretical and experimental conditions required for quantitative electron probe microanalysis of thin transmission electron microscope specimens are examined. The correction formulae for atomic number, absorption and fluorescence effects are calculated. Several examples of experimental results are given, relative to the quantitative analysis of intermetallic precipitates and carbides in steels. Advances in applications of electron probe instruments related to the use of computers and the present development of fully automated instruments are reviewed. The statistics necessary for measurements of X-ray count data are studied, and estimation procedures and tests are developed. These methods are used to perform a statistical check of electron probe microanalysis measurements and to reject rogue values. An estimator of the confidence interval of the apparent concentration is derived. Formulae are also given to optimize the counting time in order to obtain the best precision in a minimum amount of time [fr
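The counting statistics underlying the estimation procedures in this record follow the Poisson model, in which the standard deviation of N accumulated counts is sqrt(N), so relative precision improves as 1/sqrt(N). A small illustration (the count value is arbitrary):

```python
import math

def counting_precision(counts):
    """Poisson counting statistics: the standard deviation of N counts
    is sqrt(N), so the relative uncertainty is 1/sqrt(N)."""
    sigma = math.sqrt(counts)
    return sigma, sigma / counts

# 10,000 accumulated X-ray counts give 1% relative precision
sigma, rel = counting_precision(10_000)
print(sigma, rel)  # 100.0 0.01
```

This is why optimizing the counting time matters: halving the relative uncertainty requires four times the counts.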

  3. Chemical applicability domain of the local lymph node assay (LLNA) for skin sensitisation potency. Part 4. Quantitative correlation of LLNA potency with human potency.

    Science.gov (United States)

    Roberts, David W; Api, Anne Marie

    2018-07-01

    Prediction of skin sensitisation potential and potency by non-animal methods is the target of many active research programmes. Although the aim is to predict sensitisation potential and potency in humans, data from the murine local lymph node assay (LLNA) constitute much the largest source of quantitative data on in vivo skin sensitisation. The LLNA has been the preferred in vivo method for identification of skin sensitising chemicals and as such is potentially valuable as a benchmark for assessment of non-animal approaches. However, in common with all predictive test methods, the LLNA is subject to false positives and false negatives with an overall level of accuracy said variously to be approximately 80% or 90%. It is also necessary to consider the extent to which, for true positives, LLNA potency correlates with human potency. In this paper LLNA potency and human potency are compared so as to express quantitatively the correlation between them, and reasons for non-agreement between LLNA and human potency are analysed. This leads to a better definition of the applicability domain of the LLNA, within which LLNA data can be used confidently to predict human potency and as a benchmark to assess the performance of non-animal approaches. Copyright © 2018. Published by Elsevier Inc.

  4. Interdiffusion of the aluminum magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion and thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminium coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg-system are calculated with the Sauer-Freise method for the first time. To solve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varied discretization of the space coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profile. This confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit a good correlation between simulated and experimental concentration profiles. Thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.

  5. 76 FR 39432 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2011-07-06

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and.... Multiple Applicants (New Applications) The following applicants each request a permit to import the sport...

  6. Applications of FT-IR spectrophotometry in cancer diagnostics.

    Science.gov (United States)

    Bunaciu, Andrei A; Hoang, Vu Dang; Aboul-Enein, Hassan Y

    2015-01-01

    This review provides a brief background to the application of infrared spectroscopy, including Fourier transform-infrared spectroscopy, in biological fluids. It is not meant to be complete or exhaustive but to provide the reader with sufficient background for selected applications in cancer diagnostics. Fourier transform-infrared spectroscopy (FT-IR) is a fast and nondestructive analytical method. The infrared spectrum of a mixture serves as the basis to quantitate its constituents, and a number of common clinical chemistry tests have proven to be feasible using this approach. This review focuses on biomedical FT-IR applications, published in the period 2009-2013, used for early detection of cancer through qualitative and quantitative analysis.

  7. A quantitative approach to evolution of music and philosophy

    International Nuclear Information System (INIS)

    Vieira, Vilson; Fabbri, Renato; Travieso, Gonzalo; Oliveira Jr, Osvaldo N; Costa, Luciano da Fontoura

    2012-01-01

    The development of new statistical and computational methods is increasingly making it possible to bridge the gap between hard sciences and humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of humanities, from which concepts such as dialectics and opposition are formally defined mathematically. As case studies, we analyzed the temporal evolution of classical music and philosophy by obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master–apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic. (paper)
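The bootstrap step described in this record can be sketched as percentile-interval resampling, which is the standard way to avoid bias from a very small sample; the feature scores below are invented placeholders, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(data, stat=np.mean, n_resamples=10_000, alpha=0.05):
    """Percentile bootstrap confidence interval for a statistic,
    useful when the sample (e.g., 7 composers) is too small for
    parametric assumptions."""
    data = np.asarray(data, dtype=float)
    stats = np.array([stat(rng.choice(data, size=len(data), replace=True))
                      for _ in range(n_resamples)])
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# Hypothetical feature scores for 7 composers
scores = [0.62, 0.71, 0.55, 0.80, 0.66, 0.59, 0.74]
lo, hi = bootstrap_ci(scores)
print(lo < np.mean(scores) < hi)  # True: the interval brackets the sample mean
```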

  8. Nanoscale Structure of Type I Collagen Fibrils: Quantitative Measurement of D-spacing

    Science.gov (United States)

    Erickson, Blake; Fang, Ming; Wallace, Joseph M.; Orr, Bradford G.; Les, Clifford M.; Holl, Mark M. Banaszak

    2012-01-01

    This paper details a quantitative method to measure the D-periodic spacing of Type I collagen fibrils using Atomic Force Microscopy coupled with analysis using a 2D Fast Fourier Transform approach. Instrument calibration, data sampling and data analysis are all discussed and comparisons of the data to the complementary methods of electron microscopy and X-ray scattering are made. Examples of the application of this new approach to the analysis of Type I collagen morphology in disease models of estrogen depletion and Osteogenesis Imperfecta are provided. We demonstrate that it is the D-spacing distribution, not the D-spacing mean, that showed statistically significant differences in estrogen depletion associated with early stage Osteoporosis and Osteogenesis Imperfecta. The ability to quantitatively characterize nanoscale morphological features of Type I collagen fibrils will provide important structural information regarding Type I collagen in many research areas, including tissue aging and disease, tissue engineering, and gene knock out studies. Furthermore, we also envision potential clinical applications including evaluation of tissue collagen integrity under the impact of diseases or drug treatments. PMID:23027700
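The Fourier-transform measurement of a periodic repeat such as the collagen D-spacing can be illustrated on a synthetic 1D profile; the 67 nm period and 1 nm sampling step below are chosen for the example, and a real analysis would use the 2D FFT of AFM images as the paper describes:

```python
import numpy as np

def dominant_spacing(profile, sample_spacing_nm):
    """Estimate the dominant spatial period of a 1D height profile
    from the peak of its Fourier power spectrum."""
    profile = np.asarray(profile, dtype=float)
    profile -= profile.mean()          # remove the DC component
    power = np.abs(np.fft.rfft(profile)) ** 2
    freqs = np.fft.rfftfreq(len(profile), d=sample_spacing_nm)
    peak = np.argmax(power[1:]) + 1    # skip the zero-frequency bin
    return 1.0 / freqs[peak]           # period in nm

# Synthetic fibril profile with a 67 nm repeat, sampled every 1 nm
x = np.arange(0, 2010, 1.0)
profile = np.sin(2 * np.pi * x / 67.0)
print(round(dominant_spacing(profile, 1.0), 1))  # 67.0
```

The same idea extends to two dimensions: the D-periodic banding appears as a peak pair in the 2D power spectrum, and its radial position gives the spacing.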

  9. Real-time quantitative phase reconstruction in off-axis digital holography using multiplexing.

    Science.gov (United States)

    Girshovitz, Pinhas; Shaked, Natan T

    2014-04-15

    We present a new approach for obtaining significant speedup in the digital processing of extracting unwrapped phase profiles from off-axis digital holograms. The new technique digitally multiplexes two orthogonal off-axis holograms, where the digital reconstruction, including spatial filtering and two-dimensional phase unwrapping on a decreased number of pixels, can be performed on both holograms together, without redundant operations. Using this technique, we were able to reconstruct, for the first time to our knowledge, unwrapped phase profiles from off-axis holograms with 1 megapixel in more than 30 frames per second using a standard single-core personal computer on a MATLAB platform, without using graphic-processing-unit programming or parallel computing. This new technique is important for real-time quantitative visualization and measurements of highly dynamic samples and is applicable for a wide range of applications, including rapid biological cell imaging and real-time nondestructive testing. After comparing the speedups obtained by the new technique for holograms of various sizes, we present experimental results of real-time quantitative phase visualization of cells flowing rapidly through a microchannel.

  10. Development and Application of an MSALL-Based Approach for the Quantitative Analysis of Linear Polyethylene Glycols in Rat Plasma by Liquid Chromatography Triple-Quadrupole/Time-of-Flight Mass Spectrometry.

    Science.gov (United States)

    Zhou, Xiaotong; Meng, Xiangjun; Cheng, Longmei; Su, Chong; Sun, Yantong; Sun, Lingxia; Tang, Zhaohui; Fawcett, John Paul; Yang, Yan; Gu, Jingkai

    2017-05-16

    Polyethylene glycols (PEGs) are synthetic polymers composed of repeating ethylene oxide subunits. They display excellent biocompatibility and are widely used as pharmaceutical excipients. To fully understand the biological fate of PEGs requires accurate and sensitive analytical methods for their quantitation. Application of conventional liquid chromatography-tandem mass spectrometry (LC-MS/MS) is difficult because PEGs have polydisperse molecular weights (MWs) and tend to produce multicharged ions in-source resulting in innumerable precursor ions. As a result, multiple reaction monitoring (MRM) fails to scan all ion pairs so that information on the fate of unselected ions is missed. This Article addresses this problem by application of liquid chromatography-triple-quadrupole/time-of-flight mass spectrometry (LC-Q-TOF MS) based on the MS ALL technique. This technique performs information-independent acquisition by allowing all PEG precursor ions to enter the collision cell (Q2). In-quadrupole collision-induced dissociation (CID) in Q2 then effectively generates several fragments from all PEGs due to the high collision energy (CE). A particular PEG product ion (m/z 133.08592) was found to be common to all linear PEGs and allowed their total quantitation in rat plasma with high sensitivity, excellent linearity and reproducibility. Assay validation showed the method was linear for all linear PEGs over the concentration range 0.05-5.0 μg/mL. The assay was successfully applied to the pharmacokinetic study in rat involving intravenous administration of linear PEG 600, PEG 4000, and PEG 20000. It is anticipated the method will have wide ranging applications and stimulate the development of assays for other pharmaceutical polymers in the future.

  11. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    Science.gov (United States)

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known to be less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  12. Quantitative performance characterization of three-dimensional noncontact fluorescence molecular tomography

    Science.gov (United States)

    Favicchio, Rosy; Psycharakis, Stylianos; Schönig, Kai; Bartsch, Dusan; Mamalaki, Clio; Papamatheakis, Joseph; Ripoll, Jorge; Zacharakis, Giannis

    2016-02-01

    Fluorescent proteins and dyes are routine tools for biological research to describe the behavior of genes, proteins, and cells, as well as more complex physiological dynamics such as vessel permeability and pharmacokinetics. The use of these probes in whole body in vivo imaging would allow extending the range and scope of current biomedical applications and would be of great interest. In order to comply with a wide variety of application demands, in vivo imaging platform requirements span from wide spectral coverage to precise quantification capabilities. Fluorescence molecular tomography (FMT) detects and reconstructs in three dimensions the distribution of a fluorophore in vivo. Noncontact FMT allows fast scanning of an excitation source and noninvasive measurement of emitted fluorescent light using a virtual array detector operating in free space. Here, a rigorous process is defined that fully characterizes the performance of a custom-built horizontal noncontact FMT setup. Dynamic range, sensitivity, and quantitative accuracy across the visible spectrum were evaluated using fluorophores with emissions between 520 and 660 nm. These results demonstrate that high-performance quantitative three-dimensional visible light FMT allowed the detection of challenging mesenteric lymph nodes in vivo and the comparison of spectrally distinct fluorescent reporters in cell culture.

  13. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. The technique was developed as a MATLAB (Version 6.1)-based GUI application. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of the fibroglandular tissue area divided by the total breast area in the mammogram. The technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I-V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r² = 0.84, p < 0.05), and 71.3% of the subjective classifications were correctly reproduced by the computer technique. We have thus developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of large datasets of mammograms to predict breast cancer risk based on density. Furthermore, it has the potential to provide an early marker for success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
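The density measure defined in this record (fibroglandular area over total breast area, as a percentage) can be sketched with simple thresholding; the thresholds and toy image below are illustrative only, not the gradient-corrected semi-automated segmentation the authors describe:

```python
import numpy as np

def breast_density_percent(image, breast_thresh, fibro_thresh):
    """Percent density = fibroglandular pixels / breast pixels * 100.
    Thresholds are illustrative stand-ins for the paper's segmentation."""
    breast = image > breast_thresh   # separate breast from background
    fibro = image > fibro_thresh     # dense (fibroglandular) tissue
    return 100.0 * fibro.sum() / breast.sum()

# Toy 4x4 "mammogram": 0 = background, 50 = adipose, 200 = dense tissue
img = np.array([[0,   0,  50,  50],
                [0,  50, 200, 200],
                [0,  50, 200,  50],
                [0,   0,  50,  50]])
print(breast_density_percent(img, 10, 100))  # 3 dense / 10 breast pixels = 30.0
```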

  14. Optical Polarization in Biomedical Applications

    CERN Document Server

    Tuchin, Valery V; Zimnyakov, Dmitry A

    2006-01-01

    Optical Polarization in Biomedical Applications introduces key developments in optical polarization methods for quantitative studies of tissues, while presenting the theory of polarization transfer in a random medium as a basis for the quantitative description of polarized light interaction with tissues. This theory uses the modified transfer equation for Stokes parameters and predicts the polarization structure of multiple scattered optical fields. The backscattering polarization matrices (Jones matrix and Mueller matrix) important for noninvasive medical diagnostics are introduced. The text also describes a number of diagnostic techniques such as CW polarization imaging and spectroscopy, polarization microscopy and cytometry. As a new tool for medical diagnosis, optical coherent polarization tomography is analyzed. The monograph also covers a range of biomedical applications, among them cataract and glaucoma diagnostics, glucose sensing, and the detection of bacteria.

  15. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    Science.gov (United States)

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects such as living cells based on the transport of intensity equation. The experiments are performed using an inverted bright field microscope upgraded with a flipping imaging module, which enables two laterally separated images with unequal defocus distances to be created simultaneously. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by the measurement of a microlens array and human osteoblastic cells in culture, indicating its potential in applications of dynamically measuring living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
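The transport of intensity equation relates the axial intensity derivative to the phase; under a uniform-intensity assumption it reduces to a Poisson equation, laplacian(phi) = -(k/I) dI/dz, which can be inverted in Fourier space. The following is a generic sketch of that standard solver, not the authors' implementation, and all parameter values (wavelength, pixel size, input field) are hypothetical:

```python
import numpy as np

def tie_phase(d_I_dz, intensity, wavelength, pixel_size, eps=1e-6):
    """Recover phase from the transport of intensity equation assuming
    near-uniform intensity, by inverting the Laplacian in Fourier space
    (eps regularizes the singular zero-frequency term)."""
    k = 2 * np.pi / wavelength
    rhs = -k * d_I_dz / intensity
    ny, nx = rhs.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    KX, KY = np.meshgrid(2 * np.pi * fx, 2 * np.pi * fy)
    lap = -(KX**2 + KY**2)                      # Fourier symbol of the Laplacian
    phi_hat = np.fft.fft2(rhs) / (lap - eps)    # regularized inversion
    phi = np.real(np.fft.ifft2(phi_hat))
    return phi - phi.mean()                     # phase defined up to a constant

# Synthetic axial intensity derivative on a 64x64 grid
dIdz = np.random.default_rng(1).normal(size=(64, 64))
phi = tie_phase(dIdz, intensity=1.0, wavelength=532e-9, pixel_size=5e-6)
print(phi.shape, np.isfinite(phi).all())  # (64, 64) True
```

In the paper's setup, d_I_dz would come from the difference of the two unequally defocused images that the flipping module captures simultaneously.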

  16. Quantitative analysis of urine vapor and breath by gas-liquid partition chromatography.

    Science.gov (United States)

    Pauling, L; Robinson, A B; Teranishi, R; Cary, P

    1971-10-01

    When a human being is placed for several days on a completely defined diet, consisting almost entirely of small molecules that are absorbed from the stomach into the blood, intestinal flora disappear because of lack of nutrition. By this technique, the composition of body fluids can be made constant (standard deviation about 10%) after a few days, permitting significant quantitative analyses to be performed. A method of temperature-programmed gas-liquid partition chromatography has been developed for this purpose. It permits the quantitative determination of about 250 substances in a sample of breath, and of about 280 substances in a sample of urine vapor. The technique should be useful in the application of the principles of orthomolecular medicine.

  17. 76 FR 20705 - Endangered Species Receipt of Applications for Permit

    Science.gov (United States)

    2011-04-13

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable.... Multiple Applicants The following applicants each request a permit to import the sport- hunted trophy of...

  18. Gold nanoparticle immunochromatographic assay for quantitative detection of urinary RBP

    Directory of Open Access Journals (Sweden)

    XU Kuan

    2013-04-01

    Full Text Available A rapid quantitative detection method for urinary RBP was established by using nano-gold immunochromatography (sandwich method) with colloidal gold prepared by the trisodium citrate reduction method, and a rapid immunochromatographic test strip was developed. The immunochromatographic test strip can quantitatively detect RBP within 15 minutes. The detection limit was 150 ng/mL and the detection range was 150-5000 ng/mL. There were no cross-reactions with other kidney disease markers, such as urinary albumin (ALB), transferrin (TRF), β2-microglobulin (β2-MG), urinary fibronectin (FN), and lysozyme (LZM). The results indicate that it is a quick and simple method with strong specificity, high sensitivity, and a wide detection range. This rapid detection method will have extensive clinical applications in the early diagnosis of proximal tubular damage, kidney disease, and diabetic nephropathy, and in process monitoring.

  19. Marker-assisted selection for improving quantitative traits of forage crops

    International Nuclear Information System (INIS)

    Dolstra, O.; Denneboom, C.; Vos, Ab L.F. de; Loo, E.N. van

    2007-01-01

    This chapter provides an example of using marker-assisted selection (MAS) for breeding perennial ryegrass (Lolium perenne), a pasture species. A mapping study had shown the presence of quantitative trait loci (QTL) for seven component traits of nitrogen use efficiency (NUE). The NUE-related QTL clustered in five chromosomal regions. These QTL were validated through divergent marker selection in an F2 population. The criterion used for plant selection was a summation index based on the number of positive QTL alleles. The evaluation studies showed a strong indirect response of marker selection on NUE. Marker selection using a summation index such as applied here proved to be very effective for difficult and complex quantitative traits such as NUE. The strategy is easily applicable in outbreeding crops to raise the frequency of several desirable alleles simultaneously. (author)

  20. Dynamic and gated PET. Quantitative imaging of the heart revisited

    International Nuclear Information System (INIS)

    Nekolla, S.G.

    2005-01-01

    This short overview focuses on the basic implementation as well as applications of cardiac PET studies acquired in dynamic and ECG-triggered modes. Both acquisition modes are well suited for quantitative analysis, and the advantages of such an approach are discussed. An outlook on the measurement of respiratory-triggered studies and the new challenges these data present is provided. In the context of modern PET/CT tomographs, with their combination of high sensitivity and morphologic resolution, the promise of list-mode acquisition is investigated. The aforementioned acquisition modes are ideal candidates for this technology, whose utility in a clinical setting is briefly discussed. The retrospective generation of dynamic and gated image data (and any combinations thereof) is greatly facilitated with this approach. Finally, a novel presentation mode for the wealth of quantitative information generated by these systems is presented. (orig.)

  1. Controlled initiation and quantitative visualization of cell interaction dynamics - a novel hybrid microscopy method -

    NARCIS (Netherlands)

    Snijder-van As, M.I.

    2010-01-01

    This thesis describes the development, validation, and application of a hybrid microscopy technique to study cell-substrate and cell-cell interactions in a controlled and quantitative manner. We studied the spatial and temporal dynamics of the selected membrane molecules CD6 and the activated

  2. Optimized protein extraction for quantitative proteomics of yeasts.

    Directory of Open Access Journals (Sweden)

    Tobias von der Haar

    2007-10-01

    Full Text Available The absolute quantification of intracellular protein levels is technically demanding, but has recently become more prominent because novel approaches like systems biology and metabolic control analysis require knowledge of these parameters. Current protocols for the extraction of proteins from yeast cells are likely to introduce artifacts into quantification procedures because of incomplete or selective extraction. We have developed a novel procedure for protein extraction from S. cerevisiae based on chemical lysis and simultaneous solubilization in SDS and urea, which can extract the great majority of proteins to apparent completeness. The procedure can be used for different Saccharomyces yeast species and varying growth conditions, is suitable for high-throughput extraction in a 96-well format, and the resulting extracts can easily be post-processed for use in non-SDS-compatible procedures like 2D gel electrophoresis. An improved method for quantitative protein extraction has been developed that removes some of the sources of artifacts in quantitative proteomics experiments, while at the same time allowing novel types of applications.

  3. Improvement of the ID model for quantitative network data

    DEFF Research Database (Denmark)

    Sørensen, Peter Borgen; Damgaard, Christian Frølund; Dupont, Yoko Luise

    2015-01-01

    Many interactions are often poorly registered or even unobserved in empirical quantitative networks. Hence, the output of the statistical analyses may fail to differentiate between patterns that are statistical artefacts and those which are real characteristics of ecological networks. This presentation will illustrate the application of the ID method based on a data set which consists of counts of visits by 152 pollinator species to 16 plant species. The method is based on two definitions of the underlying probabilities for each combination of pollinator and plant species: (1), pi... reproduce the high number of zero-valued cells in the data set and mimic the sampling distribution. 1 Sørensen et al, Journal of Pollination Ecology, 6(18), 2011, pp 129-139

  4. Reproducibility of radionuclide gastroesophageal reflux studies using quantitative parameters and potential role of quantitative assessment in follow-up

    International Nuclear Information System (INIS)

    Fatima, S.; Khursheed, K.; Nasir, W.; Saeed, M.A.; Fatmi, S.; Jafri, S.; Asghar, S.

    2004-01-01

    scintigraphic study. Strong correlation was seen between the RI value and the severity of the clinical symptoms. It was possible to objectively evaluate and monitor response to treatment following conservative or corrective surgical therapy using RI calculation. Our results indicate that GER may be reproducibly analyzed on scintigraphy using qualitative and quantitative parameters. Nuclear medicine studies have made a major contribution to reflux studies. The ability to quantify reflux is a particular strength of this approach. Providing the materials and procedures are validated, we would anticipate further growth in both routine and research applications of this technique. (authors)

  5. Enhancing quantitative approaches for assessing community resilience

    Science.gov (United States)

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  6. A kinetic-based sigmoidal model for the polymerase chain reaction and its application to high-capacity absolute quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Stewart Don

    2008-05-01

    Full Text Available Abstract Background Based upon defining a common reference point, current real-time quantitative PCR technologies compare relative differences in amplification profile position. As such, absolute quantification requires construction of target-specific standard curves that are highly resource intensive and prone to introducing quantitative errors. Sigmoidal modeling using nonlinear regression has previously demonstrated that absolute quantification can be accomplished without standard curves; however, quantitative errors caused by distortions within the plateau phase have impeded effective implementation of this alternative approach. Results Recognition that amplification rate is linearly correlated to amplicon quantity led to the derivation of two sigmoid functions that allow target quantification via linear regression analysis. In addition to circumventing quantitative errors produced by plateau distortions, this approach allows the amplification efficiency within individual amplification reactions to be determined. Absolute quantification is accomplished by first converting individual fluorescence readings into target quantity expressed in fluorescence units, followed by conversion into the number of target molecules via optical calibration. Founded upon expressing reaction fluorescence in relation to amplicon DNA mass, a seminal element of this study was to implement optical calibration using lambda gDNA as a universal quantitative standard. Not only does this eliminate the need to prepare target-specific quantitative standards, it relegates establishment of quantitative scale to a single, highly defined entity. The quantitative competency of this approach was assessed by exploiting "limiting dilution assay" for absolute quantification, which provided an independent gold standard from which to verify quantitative accuracy. This yielded substantive corroborating evidence that absolute accuracies of ± 25% can be routinely achieved. 
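The core idea of quantification without standard curves can be illustrated with the simpler constant-efficiency exponential model; this is a hedged sketch, not the paper's sigmoid functions, and every number below is invented:

```python
import math

# In the exponential phase, fluorescence F_c = F_0 * E**c, so a linear fit of
# ln(F_c) against cycle number c recovers both the per-reaction amplification
# efficiency E and the initial target quantity F_0 in fluorescence units.
def fit_exponential_phase(cycles, fluorescence):
    xs = list(cycles)
    ys = [math.log(f) for f in fluorescence]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(slope), math.exp(intercept)  # (efficiency E, F_0)

# Synthetic readings from a hypothetical reaction with E = 1.9, F_0 = 1e-6
cycles = range(10, 20)
reads = [1e-6 * 1.9 ** c for c in cycles]
E, F0 = fit_exponential_phase(cycles, reads)
```

With real data the exponential window must be chosen carefully and efficiency varies between reactions, which is precisely the problem the paper's sigmoid-based linear-regression approach is designed to handle.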

  7. The Relationship between Student's Quantitative Skills, Application of Math, Science Courses, and Science Marks at Single-Sex Independent High Schools

    Science.gov (United States)

    Cambridge, David

    2012-01-01

    For independent secondary schools who offer rigorous curriculum to attract students, integration of quantitative skills in the science courses has become an important definition of rigor. However, there is little research examining students' quantitative skills in relation to high school science performance within the single-sex independent school…

  8. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
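The observed-scale/liability-scale relationship formalized in these papers is commonly applied through the Dempster-Lerner/Robertson transformation; in the sketch below the prevalence and observed-scale heritability are invented values for illustration:

```python
from statistics import NormalDist

# Convert heritability on the observed 0/1 disease scale to the liability
# scale: h2_liab = h2_obs * K(1-K) / z^2, where K is disease prevalence,
# t the liability threshold, and z the normal density at t.
def liability_h2(h2_obs, K):
    nd = NormalDist()
    t = nd.inv_cdf(1 - K)   # liability threshold for prevalence K
    z = nd.pdf(t)           # normal density at the threshold
    return h2_obs * K * (1 - K) / z ** 2

# Hypothetical rare disease: prevalence 1%, observed-scale h2 = 0.05
h2 = liability_h2(h2_obs=0.05, K=0.01)
```

The rarer the disease, the larger the multiplier, which is why modest observed-scale estimates can correspond to substantial liability-scale heritabilities.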

  9. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region-of-interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation.

  10. Quantitative identification and analysis of sub-seismic extensional structure system: technique schemes and processes

    International Nuclear Information System (INIS)

    Chenghua, Ou; Chen, Wei; Ma, Zhonggao

    2015-01-01

    Quantitative characterization of the complex sub-seismic extensional structure systems that essentially control petroleum exploitation is difficult to implement in seismic profile interpretation. This research, based on a case study in block M of Myanmar, established a set of quantitative treatment schemes and technique processes for identifying sub-seismic low-displacement (SSLD) extensional faults or fractures upon structural deformation restoration and geometric inversion. Firstly, the master-subsidiary inheritance relations and configuration of the seismic-scale extensional fault systems are determined by analyzing the structural pattern. In addition, the three-dimensional (3D) pattern and characteristics of the seismic-scale extensional structure are illustrated by a 3D structure model built upon seismic sections. Moreover, according to the dilatancy obtained from structural restoration on the basis of the inclined shear method, as well as the fracture-flow index, potential SSLD extensional faults or fractures have been quantitatively identified. Application of the technique processes to the SSLD extensional structures in block M in Myanmar is instructive for quantitatively interpreting such extensional structure systems in practice. (paper)

  11. Solid-phase peptide quantitation assay using labeled monoclonal antibody and glutaraldehyde fixation

    International Nuclear Information System (INIS)

    Kasprzyk, P.G.; Cuttitta, F.; Avis, I.; Nakanishi, Y.; Treston, A.; Wong, H.; Walsh, J.H.; Mulshine, J.L.

    1988-01-01

    A solid-phase radioimmunoassay utilizing iodinated peptide-specific monoclonal antibody as a detection system instead of labeled peptide has been developed. Regional specific monoclonal antibodies to either gastrin-releasing peptide or gastrin were used as models to validate the general application of our modified assay. Conditions for radioactive labeling of the monoclonal antibody were determined to minimize oxidant damage, which compromises the sensitivity of other reported peptide quantitation assays. Pretreatment of 96-well polyvinyl chloride test plates with a 5% glutaraldehyde solution resulted in consistent retention of sufficient target peptide on the solid-phase matrix to allow precise quantitation. This quantitative method is completed within 1 h of peptide solid phasing. Pretreatment of assay plates with glutaraldehyde increased binding of target peptide and maximized antibody binding by optimizing antigen presentation. The hypothesis that glutaraldehyde affects both peptide binding to the plate and orientation of the peptide was confirmed by analysis of several peptide analogs. These studies indicate that peptide binding was mediated through a free amino group, leaving the carboxy-terminal portion of the target peptide accessible for antibody binding. It was observed that the length of the peptide also affects the amount of monoclonal antibody that will bind. Under the optimal conditions, results from quantitation of gastrin-releasing peptide in relevant samples agree well with those from previously reported techniques. Thus, we report here a modified microplate assay which may be generally applied for the rapid and sensitive quantitation of peptide hormones.

  12. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  13. Data Acceptance Criteria for Standardized Human-Associated Fecal Source Identification Quantitative Real-Time PCR Methods

    Science.gov (United States)

    There is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality management. The transition from a research tool to a standardized protocol requires a high degree of confidence in data q...

  14. New apparatus for discriminating shapes - application to the study of neutron-proton elastic scattering at 14.6 MeV; Nouveau dispositif de discrimination de formes - Application a l'etude de la diffusion elastique neutron-proton a 14,6 MeV; Novyj diskriminator po forme - primenenie k izucheniyu uprugogo rasseyaniya nejtronproton s 14,6 MeV; Nuevo dispositivo para discriminacion de formas - Aplicacion al estudio de la dispersion elastica neutron-proton a 14,6 MeV

    Energy Technology Data Exchange (ETDEWEB)

    Crettez, J P; Cambou, F; Ambrosino, G [Laboratoire Maurice de Broglie, Paris (France)

    1962-04-15

    The mean life of scintillations in caesium iodide is dependent on the nature of the ionizing particle, the shortest life corresponding to the highest ionization density. This property is utilized for distinguishing different particles producing scintillations of similar amplitude. The apparatus described is a shape discriminator. It measures the time required by the scintillation to fall from its maximum to an adjustable fraction thereof. A time-amplitude converter provides a pulse the height of which is proportional to the time so measured. A comparison is then made between the shapes of the scintillations produced by alpha particles, protons and electrons, and the results obtained are shown. Application of this method to the measurement of recoil protons set in motion in a thin hydrogenated diffuser by a neutron flux is also described. Suppression of the pulses produced by gamma rays makes it possible to deduce from the spectra obtained the variation of the differential elastic scattering cross-section in terms of the angle, for d-t reaction neutrons. (author)

  15. Integration of hydrothermal-energy economics: related quantitative studies

    Energy Technology Data Exchange (ETDEWEB)

    1982-08-01

    A comparison of ten models for computing the cost of hydrothermal energy is presented. This comparison involved a detailed examination of a number of technical and economic parameters of the various quantitative models, with the objective of identifying the most important parameters in the context of accurate estimates of the cost of hydrothermal energy. Important features of the various models, such as focus of study, applications, market sectors covered, methodology, input data requirements, and output, are compared in the document. A detailed sensitivity analysis of all the important engineering and economic parameters is carried out to determine the effect of non-consideration of individual parameters.

  16. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
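The proposed units make the bookkeeping elementary; below is a toy sketch (all power figures invented) of classifying an environment as extreme when supply only barely exceeds demand:

```python
# Habitability in watts per organism: compare environmental power supply
# against organismal power demand. The specific numbers are assumptions
# chosen only to illustrate the arithmetic.
supply_w_per_org = 2.4e-12   # hypothetical chemical power supply per organism
demand_w_per_org = 2.0e-12   # hypothetical maintenance power demand per organism

surplus = supply_w_per_org - demand_w_per_org
# Habitable if supply exceeds demand; "extreme" if the margin is slim
# (here, arbitrarily, a surplus under 50% of demand).
is_habitable = surplus > 0
is_extreme = is_habitable and surplus / demand_w_per_org < 0.5
```

On this continuum a "plush" environment is simply one where the supply/demand ratio is large, so extreme and plush differ in degree, not kind.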

  17. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.

    1982-01-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses
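The underlying Beer's-law fitting step can be sketched as an ordinary least-squares problem; the two "pure spectra" below are invented four-point vectors, not xylene data, and the closed-form normal-equation solve stands in for the paper's weighted variants:

```python
# Beer's law makes mixture absorbance a linear combination of pure-component
# spectra: b = c1*a1 + c2*a2. Concentrations are the least-squares solution
# of the normal equations for this two-component case.
pure_1 = [0.10, 0.40, 0.90, 0.30]   # hypothetical pure-component spectrum
pure_2 = [0.80, 0.20, 0.10, 0.60]
c1_true, c2_true = 0.3, 0.7
mixture = [c1_true * a + c2_true * b for a, b in zip(pure_1, pure_2)]

def solve_two_component(a1, a2, b):
    s11 = sum(x * x for x in a1)
    s12 = sum(x * y for x, y in zip(a1, a2))
    s22 = sum(y * y for y in a2)
    t1 = sum(x * z for x, z in zip(a1, b))
    t2 = sum(y * z for y, z in zip(a2, b))
    det = s11 * s22 - s12 * s12
    return (t1 * s22 - t2 * s12) / det, (t2 * s11 - t1 * s12) / det

c1, c2 = solve_two_component(pure_1, pure_2, mixture)
```

The paper's refinements (fitted baselines, thresholded peak selection, per-peak weighting) all elaborate on this same linear-model core.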

  18. A Stereological Method for the Quantitative Evaluation of Cartilage Repair Tissue

    Science.gov (United States)

    Nyengaard, Jens Randel; Lind, Martin; Spector, Myron

    2015-01-01

    Objective To implement stereological principles to develop an easily applicable algorithm for unbiased and quantitative evaluation of cartilage repair. Design Design-unbiased sampling was performed by systematically sectioning the defect perpendicular to the joint surface in parallel planes providing 7 to 10 hematoxylin–eosin-stained histological sections. Counting windows were systematically selected and converted into image files (40-50 per defect). The quantification was performed by two-step point counting: (1) calculation of defect volume and (2) quantitative analysis of tissue composition. Step 2 was performed by assigning each point to one of the following categories based on validated and easily distinguishable morphological characteristics: (1) hyaline cartilage (rounded cells in lacunae in hyaline matrix), (2) fibrocartilage (rounded cells in lacunae in fibrous matrix), (3) fibrous tissue (elongated cells in fibrous tissue), (4) bone, (5) scaffold material, and (6) others. The ability to discriminate between the tissue types was determined using conventional or polarized light microscopy, and the interobserver variability was evaluated. Results We describe the application of the stereological method. In the example, we assessed the defect repair tissue volume to be 4.4 mm3 (CE = 0.01). The tissue fractions were subsequently evaluated. Polarized light illumination of the slides improved discrimination between hyaline cartilage and fibrocartilage and increased the interobserver agreement compared with conventional transmitted light. Conclusion We have applied a design-unbiased method for quantitative evaluation of cartilage repair, and we propose this algorithm as a natural supplement to existing descriptive semiquantitative scoring systems. We also propose that polarized light is effective for discrimination between hyaline cartilage and fibrocartilage. PMID:26069715
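The two-step point-counting arithmetic reduces to a Cavalieri volume estimate plus point fractions; here is a minimal sketch with invented counts and grid constants:

```python
# Step 1 (Cavalieri): volume = (total points hitting repair tissue)
#                               * area per grid point * section spacing.
# Step 2: tissue composition as point fractions per category.
# All counts and constants below are assumptions for illustration.
AREA_PER_POINT_MM2 = 0.01    # assumed grid spacing
SECTION_SPACING_MM = 0.2     # assumed distance between parallel sections

points_per_section = [50, 62, 58, 49, 40]   # points hitting repair tissue
defect_volume = sum(points_per_section) * AREA_PER_POINT_MM2 * SECTION_SPACING_MM

category_counts = {"hyaline": 120, "fibrocartilage": 90, "fibrous": 30, "bone": 60}
total = sum(category_counts.values())
fractions = {name: n / total for name, n in category_counts.items()}
```

Because both steps are simple counts, the coefficient of error (CE) reported in the paper can be estimated from the section-to-section variability of the counts.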

  19. Gas-chromatographic quantitative determination of argon in air samples, by elimination of oxygen

    International Nuclear Information System (INIS)

    Sofronie, E.

    1982-08-01

    A method for the gas-chromatographic quantitative determination of argon in air samples, by elimination of oxygen, is presented. Experiments were carried out in a static system. Conditions for the application of the method in dynamic systems are specified. Sensitivity of the method: 5 × 10⁻⁴ cm³ Ar per cm³ of air. (author)

  20. Quantitative assessment of breast density from digitized mammograms into Tabar's patterns

    Energy Technology Data Exchange (ETDEWEB)

    Jamal, N [Medical Technology Division, Malaysian Institute for Nuclear Technology Research (MINT) 43000 Kajang (Malaysia); Ng, K-H [Department of Radiology, University of Malaya, 50603 Kuala Lumpur (Malaysia); Looi, L-M [Department of Pathology, University of Malaya, 50603 Kuala Lumpur (Malaysia); McLean, D [Medical Physics Department, Westmead Hospital, Sydney, NSW 2145 (Australia); Zulfiqar, A [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Malaysia, Kuala Lumpur, Malaysia (Malaysia); Tan, S-P [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Malaysia, Kuala Lumpur, Malaysia (Malaysia); Liew, W-F [Department of Radiology, Hospital Universiti Kebangsaan Malaysia, 56000 Malaysia, Kuala Lumpur, Malaysia (Malaysia); Shantini, A [Department of Radiology, Kuala Lumpur Hospital, 50586 Kuala Lumpur (Malaysia); Ranganathan, S [Department of Radiology, University of Malaya, 50603 Kuala Lumpur (Malaysia)

    2006-11-21

    We describe a semi-automated technique for the quantitative assessment of breast density from digitized mammograms in comparison with patterns suggested by Tabar. It was developed using the MATLAB-based graphical user interface applications. It is based on an interactive thresholding method, after a short automated method that shows the fibroglandular tissue area, breast area and breast density each time new thresholds are placed on the image. The breast density is taken as a percentage of the fibroglandular tissue to the breast tissue areas. It was tested in four different ways, namely by examining: (i) correlation of the quantitative assessment results with subjective classification, (ii) classification performance using the quantitative assessment technique, (iii) interobserver agreement and (iv) intraobserver agreement. The results of the quantitative assessment correlated well (r² = 0.92) with the subjective Tabar patterns classified by the radiologist (correctly classified 83% of digitized mammograms). The average kappa coefficient for the agreement between the readers was 0.63. This indicated moderate agreement between the three observers in classifying breast density using the quantitative assessment technique. The kappa coefficient of 0.75 for intraobserver agreement reflected good agreement between two sets of readings. The technique may be useful as a supplement to the radiologist's assessment in classifying mammograms into Tabar's pattern associated with breast cancer risk.
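The percent-density arithmetic behind the interactive thresholding can be sketched on a toy "image"; the thresholds and pixel values below are assumptions for illustration only:

```python
# Percent breast density = 100 * (fibroglandular pixel count) / (breast pixel count),
# with both regions defined by intensity thresholds on the digitized mammogram.
image = [
    [0, 0, 40, 200],
    [0, 60, 220, 210],
    [0, 50, 70, 80],
]
BREAST_THRESHOLD = 30    # pixels above this belong to the breast area (assumed)
DENSE_THRESHOLD = 150    # pixels above this count as fibroglandular tissue (assumed)

breast_pixels = sum(1 for row in image for p in row if p > BREAST_THRESHOLD)
dense_pixels = sum(1 for row in image for p in row if p > DENSE_THRESHOLD)
percent_density = 100.0 * dense_pixels / breast_pixels
```

In the actual technique the thresholds are adjusted interactively by the reader, which is why interobserver agreement is a key part of the validation.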

  1. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan

  2. Quantitative in situ magnetization reversal studies in Lorentz microscopy and electron holography.

    Science.gov (United States)

    Rodríguez, L A; Magén, C; Snoeck, E; Gatel, C; Marín, L; Serrano-Ramón, L; Prieto, J L; Muñoz, M; Algarabel, P A; Morellon, L; De Teresa, J M; Ibarra, M R

    2013-11-01

    A generalized procedure for the in situ application of magnetic fields by means of the excitation of the objective lens for magnetic imaging experiments in Lorentz microscopy and electron holography is quantitatively described. A protocol for applying magnetic fields with arbitrary in-plane magnitude and orientation is presented, and a freeware script for Digital Micrograph(™) is provided to assist the operation of the microscope. Moreover, a method to accurately reconstruct hysteresis loops is detailed. We show that the out-of-plane component of the magnetic field cannot be always neglected when performing quantitative measurements of the local magnetization. Several examples are shown to demonstrate the accuracy and functionality of the methods. © 2013 Elsevier B.V. All rights reserved.
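The usual geometry for applying in-plane fields with the objective lens is to tilt the specimen in the (vertical) lens field; below is a minimal sketch of the component arithmetic, with an assumed field strength and tilt angle:

```python
import math

# Tilting the sample by angle t in the vertical objective-lens field B
# resolves it into an in-plane component B*sin(t) experienced by the film
# and an out-of-plane component B*cos(t), which, as the paper notes,
# cannot always be neglected. Values below are assumptions.
def field_components(b_obj_mT, tilt_deg):
    t = math.radians(tilt_deg)
    return b_obj_mT * math.sin(t), b_obj_mT * math.cos(t)  # (in-plane, out-of-plane)

inplane, outofplane = field_components(100.0, 30.0)  # 100 mT lens field, 30° tilt
```

Sweeping the lens excitation at fixed tilt then traces out the in-plane field axis needed to reconstruct a hysteresis loop point by point.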

  3. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    Science.gov (United States)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  4. Quantitative evaluation of photoplethysmographic artifact reduction for pulse oximetry

    Science.gov (United States)

    Hayes, Matthew J.; Smith, Peter R.

    1999-01-01

    Motion artefact corruption of pulse oximeter output, causing both measurement inaccuracies and false alarm conditions, is a primary restriction in the current clinical practice and future applications of this useful technique. Artefact reduction in photoplethysmography (PPG), and therefore by application in pulse oximetry, is demonstrated using a novel non-linear methodology recently proposed by the authors. The significance of these processed PPG signals for pulse oximetry measurement is discussed, with particular attention to the normalization inherent in the artefact reduction process. Quantitative experimental investigation of the performance of PPG artefact reduction is then utilized to evaluate this technology for application to pulse oximetry. While the successfully demonstrated reduction of severe artefacts may widen the applicability of all PPG technologies and decrease the occurrence of pulse oximeter false alarms, the observed reduction of slight artefacts suggests that many such effects may go unnoticed in clinical practice. The signal processing and output averaging used in most commercial oximeters can incorporate these artefact errors into the output, while masking the true PPG signal corruption. It is therefore suggested that PPG artefact reduction should be incorporated into conventional pulse oximetry measurement, even in the absence of end-user artefact problems.

  5. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    , the abort triggers must have low false negative rates to be sure that real crew-threatening failures are detected, and also low false positive rates to ensure that the crew does not abort from non-crew-threatening launch vehicle behaviors. The analysis process described in this paper is a compilation of over six years of lessons learned and refinements from experiences developing abort triggers for NASA's Constellation Program (Ares I Project) and the SLS Program, as well as the simultaneous development of SHM/FM theory. The paper will describe the abort analysis concepts and process, developed in conjunction with SLS Safety and Mission Assurance (S&MA) to define a common set of mission phase, failure scenario, and Loss of Mission Environment (LOME) combinations upon which the SLS Loss of Mission (LOM) Probabilistic Risk Assessment (PRA) models are built. This abort analysis also requires strong coordination with the Multi-Purpose Crew Vehicle (MPCV) and SLS Structures and Environments (STE) to formulate a series of abortability tables that encapsulate explosion dynamics over the ascent mission phase. The design and assessment of abort conditions and triggers to estimate their Loss of Crew (LOC) benefits also requires in-depth integration with other groups, including Avionics, Guidance, Navigation and Control (GN&C), the Crew Office, Mission Operations, and Ground Systems. The outputs of this analysis are a critical input to SLS S&MA's LOC PRA models. The process described here may well be the first full quantitative application of SHM/FM theory to the selection of a sensor suite for any aerospace system.

  6. Development of a Rapid Real-Time PCR Assay for Quantitation of Pneumocystis carinii f. sp. Carinii

    DEFF Research Database (Denmark)

    Larsen, Hans Henrik; Kovacs, Joseph A; Stock, Frida

    2002-01-01

    ) PCR assay for detecting P. carinii f. sp. carinii, the subspecies of P. carinii commonly used in research models of PCP. The assay was based on the single-copy dihydrofolate reductase gene and was able to detect...... (r = 0.99) over 6 log values for standards containing ≥5 copies/tube. Application of the assay to a series of 10-fold dilutions of P. carinii organisms isolated from rat lung demonstrated that it was reproducibly quantitative over 5 log values (r = 0.99). The assay was applied to a recently reported in vitro...... In conclusion, a rapid, sensitive, and reproducible quantitative PCR assay for P. carinii f. sp. carinii has been developed and is applicable to in vivo as well as in vitro systems. The assay should prove useful for conducting studies in which quantification of organism burden or growth assessment is critical...

  7. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    Science.gov (United States)

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
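The quantitation step described above — establishing analyte concentrations from reference spectra — can be illustrated with a classical least-squares fit under the Beer-Lambert assumption that a mixture's absorbance is a linear combination of unit-concentration reference spectra. This is a generic sketch with synthetic Gaussian "spectra" and arbitrary band positions, not the paper's actual reference data:

```python
import numpy as np

# Synthetic illustration: mixture absorbance modeled as a linear combination
# of unit-concentration reference spectra (Beer-Lambert law).
wavenumbers = np.linspace(800, 1800, 500)

def band(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

# Hypothetical reference spectra for two analytes plus an interferent.
refs = np.column_stack([band(1100, 30), band(1450, 40), band(1600, 25)])

true_conc = np.array([12.0, 5.0, 1.5])          # arbitrary concentration units
measured = refs @ true_conc
measured += np.random.default_rng(0).normal(0, 1e-3, measured.size)  # noise

# Classical least squares: solve refs @ c ≈ measured for concentrations c.
conc, *_ = np.linalg.lstsq(refs, measured, rcond=None)
print(np.round(conc, 2))
```

In practice the reference spectra would be the primary/secondary reference spectra the method describes, and spectral regions affected by impurities or condensation would be masked before the fit.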

  8. Detection of nonauthorized genetically modified organisms using differential quantitative polymerase chain reaction: application to 35S in maize.

    Science.gov (United States)

    Cankar, Katarina; Chauvensy-Ancel, Valérie; Fortabat, Marie-Noelle; Gruden, Kristina; Kobilinsky, André; Zel, Jana; Bertheau, Yves

    2008-05-15

    Detection of nonauthorized genetically modified organisms (GMOs) has always presented an analytical challenge because the complete sequence data needed to detect them are generally unavailable although sequence similarity to known GMOs can be expected. A new approach, differential quantitative polymerase chain reaction (PCR), for detection of nonauthorized GMOs is presented here. This method is based on the presence of several common elements (e.g., promoter, genes of interest) in different GMOs. A statistical model was developed to study the difference between the number of molecules of such a common sequence and the number of molecules identifying the approved GMO (as determined by border-fragment-based PCR) and the donor organism of the common sequence. When this difference differs statistically from zero, the presence of a nonauthorized GMO can be inferred. The interest and scope of such an approach were tested on a case study of different proportions of genetically modified maize events, with the P35S promoter as the Cauliflower Mosaic Virus common sequence. The presence of a nonauthorized GMO was successfully detected in the mixtures analyzed and in the presence of (donor organism of P35S promoter). This method could be easily transposed to other common GMO sequences and other species and is applicable to other detection areas such as microbiology.
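The differential logic above — inferring an unapproved GMO when the common-element copy number exceeds what the known, approved events account for — can be sketched as a simple two-sample comparison of qPCR quantitation replicates. The replicate values and the plain z-statistic below are illustrative only, not the statistical model developed in the paper:

```python
import math

# Hypothetical replicate copy-number estimates (copies/reaction):
# the common element (P35S) and the event-specific assay for the approved GMO.
p35s  = [1020, 980, 1005, 1010, 995]     # common-sequence quantitation
event = [810, 790, 805, 800, 795]        # approved-event quantitation

def mean(xs): return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# Difference between common-element and approved-event copy numbers;
# a difference significantly above zero suggests an unapproved source of P35S.
diff = mean(p35s) - mean(event)
se = math.sqrt(var(p35s) / len(p35s) + var(event) / len(event))
z = diff / se
print(round(diff, 1), round(z, 2))   # z >> 1.96 -> difference significant at ~5%
```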

  9. Effects of Single and Combined Application of Organic, Biological and Chemical Fertilizers on Quantitative and Qualitative Yield of Coriander (Coriandrum sativum

    Directory of Open Access Journals (Sweden)

    M. Aghhavani Shajari

    2016-07-01

    Introduction: Medicinal plants have been among the main natural resources of Iran since ancient times. Coriander (Coriandrum sativum L.), of the Apiaceae family, is cultivated extensively throughout the world. Management and environmental factors such as nutritional management have a significant impact on the quantity and quality of plants. Application of organic fertilizers in conventional farming systems is not common, and most of the nutritional needs of plants are supplied through chemical fertilizers over short periods. Excessive and unbalanced use of fertilizers over long periods reduces crop yield and soil biological activity, causes accumulation of nitrates and heavy metals, and ultimately produces negative environmental effects and increases the cost of production. The use of bio-fertilizers and organic matter is considered a way to reduce the use of chemical fertilizers and increase the quality of most crops. Stability and soil fertility through the use of organic fertilizers are important because such fertilizers contain most of the elements required by plants and have beneficial effects on physical, chemical, and biological soil fertility. Therefore, the aim of this research was to evaluate the effects of organic, biological and chemical fertilizers on the quality and quantity characteristics of coriander. Materials and Methods: In order to study the effects of single and combined applications of organic, biological and chemical fertilizers on quantitative and qualitative characteristics of coriander (Coriandrum sativum), an experiment was conducted based on a randomized complete block design with three replications and 12 treatments at the Research Station, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2011. Treatments included: (1) mycorrhizae (Glomus mosseae), (2) biosulfur (Thiobacillus sp.), (3) chemical fertilizer (NPK), (4) cow manure, (5) vermicompost, (6) mycorrhizae + chemical fertilizer, (7) mycorrhizae + cow manure, (8) mycorrhizae + vermicompost, (9) biosulfur

  10. Deep Learning for Magnetic Resonance Fingerprinting: A New Approach for Predicting Quantitative Parameter Values from Time Series.

    Science.gov (United States)

    Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas

    2017-01-01

    The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
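The dictionary matching that the CNN above is trained to replace can be sketched as a normalized inner-product search over simulated signals. The signal model below is a toy placeholder, not a real Bloch simulation, and the parameter grid is arbitrary:

```python
import numpy as np

# Toy MRF-style matching: each (T1, T2) pair yields a simulated fingerprint;
# quantitation picks the dictionary entry best correlated with the signal.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 64)

def fingerprint(T1, T2):
    # Placeholder signal model, not a Bloch simulation.
    return np.exp(-t / T2) * (1 - np.exp(-t / T1))

params = [(T1, T2) for T1 in (0.5, 1.0, 1.5) for T2 in (0.05, 0.1, 0.2)]
dictionary = np.array([fingerprint(*p) for p in params])
dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

measured = fingerprint(1.0, 0.1) + rng.normal(0, 1e-3, t.size)  # noisy voxel
measured /= np.linalg.norm(measured)

best = int(np.argmax(dictionary @ measured))   # inner-product matching
print(params[best])
```

A CNN trained on the dictionary learns the mapping from signal to parameters directly, avoiding this exhaustive search at reconstruction time.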

  11. Weighing evidence: quantitative measures of the importance of bitemark evidence.

    Science.gov (United States)

    Kittelson, J M; Kieser, J A; Buckingham, D M; Herbison, G P

    2002-12-01

    Quantitative measures of the importance of evidence such as the "likelihood ratio" have become increasingly popular in the courtroom. These measures have been used by expert witnesses formally to describe their certainty about a piece of evidence. These measures are commonly interpreted as the amount by which the evidence should revise the opinion of guilt, and thereby summarize the importance of a particular piece of evidence. Unlike DNA evidence, quantitative measures have not been widely used by forensic dentists to describe their certainty when testifying about bitemark evidence. There is, however, no inherent reason why they should not be used to evaluate bitemarks. The purpose of this paper is to describe the likelihood ratio as it might be applied to bitemark evidence. We use a simple bitemark example to define the likelihood ratio, its application, and interpretation. In particular we describe how the jury interprets the likelihood ratio from a Bayesian perspective when evaluating the impact of the evidence on the odds that the accused is guilty. We describe how the dentist would calculate the likelihood ratio based on frequentist interpretations. We also illustrate some of the limitations of the likelihood ratio, and show how those limitations apply to bitemark evidence. We conclude that the quality of bitemark evidence cannot be adequately summarized by the likelihood ratio, and argue that its application in this setting may be more misleading than helpful.
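The Bayesian interpretation described above — the jury multiplying its prior odds of guilt by the expert's likelihood ratio — reduces to a few lines of arithmetic. The prior and LR values below are purely illustrative, not real bitemark statistics:

```python
# Bayesian updating with a likelihood ratio (LR): posterior odds = LR x prior odds.

def posterior_odds(prior_odds, likelihood_ratio):
    return prior_odds * likelihood_ratio

def odds_to_prob(odds):
    return odds / (1.0 + odds)

prior_odds = 1 / 999          # juror's prior: 1 in 1000 chance of guilt
lr = 500.0                    # expert: evidence 500x more likely if the accused made the mark

post = posterior_odds(prior_odds, lr)
print(round(odds_to_prob(post), 3))   # ≈ 0.334
```

The example also shows one limitation the paper raises: the posterior depends as much on the prior (which the expert cannot supply) as on the likelihood ratio itself.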

  12. Brain microstructure mapping using quantitative and diffusion MRI

    International Nuclear Information System (INIS)

    Lebois, Alice

    2014-01-01

    ), in order to better understand their relations and to explain the observed variability along the fascicles and the interhemispheric asymmetries. The second part focused on modeling brain tissue at the cell scale to extract quantitative parameters characterizing the geometry of the cellular membranes, such as the axonal diameter and the axonal density. A diffusion MRI sequence was developed on the 3 T and 7 T Siemens clinical systems of NeuroSpin that can apply arbitrary gradient waveforms, following an approach in which the gradient waveform results from an optimization under the hypotheses of a geometrical tissue model and the hardware and time constraints imposed by clinical applications. This sequence was applied in a study of fourteen healthy subjects in order to build the first quantitative atlas of axonal diameter and local axonal density at 7 T. We also proposed a new geometrical model of the axon, dividing the axonal compartment, usually modelled as a simple cylinder, into two compartments: one near the membranes with low diffusivity, and one farther from the membranes, less restricted and with higher diffusivity. We conducted a theoretical study showing that under clinical conditions this new model partly overcomes the bias induced by the simple cylindrical model, which leads to a systematic overestimation of the smallest diameters. Finally, with the aim of going further into the pathophysiology of autism, we added the dMRI sequence developed in this thesis to the current 3 T imaging protocol in order to map axonal diameter and density. This study is ongoing and should shortly validate the contribution of these new quantitative measures of the microstructure to understanding the atrophies of the corpus callosum, initially observed using less specific diffusion parameters such as the generalized fractional anisotropy. Further clinical applications are expected in the future.

  13. Proposed quantitative approach to safety for nuclear power plants in Canada

    International Nuclear Information System (INIS)

    1995-07-01

    A set of quantitative risk and frequency limits plus required processes is proposed to help ensure that a nuclear power plant in Canada meets the qualitative safety objectives defined in ACNS-2 and in IAEA 75-INSAG-3. As emphasized in this report, risks and hence doses are to be reduced below the limits using ALARA (As Low as Reasonably Achievable, economic and social factors being taken into account) or VIA (value-impact analysis) processes unless, in general, calculated risks and hence doses are below recommended de minimis levels. An updated version of ACNS-4, which will be issued as ACNS-21, will incorporate a statement of these limits and objectives as well as assessment criteria and procedures that will facilitate their application. The quantitative approach proposed here is consistent with a growing consensus on the need for, and the elements of, a quantitative approach to risk management of all major activities in an advanced industrial society. The ACNS recommends that the Atomic Energy Control Board adopt the proposed approach as a rational and coherent basis for nuclear power plant safety policy and requirements in Canada. (author). 68 refs., 4 tabs., 1 fig

  15. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  16. Chemistry Teachers' Knowledge and Application of Models

    Science.gov (United States)

    Wang, Zuhao; Chi, Shaohui; Hu, Kaiyan; Chen, Wenting

    2014-01-01

    Teachers' knowledge and application of models play an important role in students' development of modeling ability and scientific literacy. In this study, we investigated Chinese chemistry teachers' knowledge and application of models. Data were collected through a test questionnaire and analyzed quantitatively and qualitatively. The result indicated…

  17. Application of quantitative image analysis to the investigation of macroporosity of graphitic materials

    International Nuclear Information System (INIS)

    Delle, W.; Koizlik, K.; Hoven, H.; Wallura, E.

    1978-01-01

    The essence of quantitative image analysis is that graphitic materials can be classified on the basis of the grey-value contrast between pores (dark) and carbon (bright). Macroporosity is defined as the total of all pores with diameters larger than 0.2 μm. The pore size distributions and pore shapes of graphites based on petroleum, pitch, gilsonite and fluid coke, as well as graphitic fuel matrices and pyrolytic carbons, were investigated. The relationships between maximum grain size, macroporosity and total porosity, as well as the anisotropies of macroporosity and electrical resistivity of graphite, were established. (orig./GSC)

  18. NecroQuant: quantitative assessment of radiological necrosis

    Science.gov (United States)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

    Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
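A Hounsfield-unit-based necrosis measurement of the kind described above can be emulated with a simple enhancement threshold on a synthetic region of interest. The 15 HU cutoff and all intensity values here are hypothetical, not NecroQuant's actual parameters:

```python
import numpy as np

# Illustrative necrosis quantification: voxels whose post-contrast attenuation
# change stays below an enhancement threshold are counted as necrotic.
rng = np.random.default_rng(2)
pre = rng.normal(35, 5, size=(64, 64))                 # pre-contrast HU in tumor ROI
enhancing = pre + rng.normal(40, 8, size=pre.shape)    # viable, enhancing tissue
necrotic_mask_true = np.zeros(pre.shape, dtype=bool)
necrotic_mask_true[:16, :] = True                      # ground truth: 25% necrotic
post = np.where(necrotic_mask_true,
                pre + rng.normal(2, 3, size=pre.shape),  # necrosis barely enhances
                enhancing)

enhancement = post - pre
necrotic = enhancement < 15.0          # <15 HU change -> classify as necrotic
fraction = necrotic.mean()
print(round(fraction, 2))              # ≈ 0.25
```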

  19. Iron filled carbon nanotubes as novel monopole-like sensors for quantitative magnetic force microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Wolny, F; Muehl, T; Weissker, U; Lipert, K; Schumann, J; Leonhardt, A; Buechner, B, E-mail: f.wolny@ifw-dresden.de, E-mail: t.muehl@ifw-dresden.de [Leibniz Institute for Solid State and Materials Research (IFW) Dresden, Helmholtzstrasse 20, 01069 Dresden (Germany)

    2010-10-29

    We present a novel ultrahigh stability sensor for quantitative magnetic force microscopy (MFM) based on an iron filled carbon nanotube. In contrast to the complex magnetic structure of conventional MFM probes, this sensor constitutes a nanomagnet with defined properties. The long iron nanowire can be regarded as an extended dipole of which only the monopole close to the sample surface is involved in the imaging process. We demonstrate its potential for high resolution imaging. Moreover, we present an easy routine to determine its monopole moment and prove that this calibration, unlike other approaches, is universally applicable. For the first time this enables straightforward quantitative MFM measurements.

  20. Presentation of a method for consequence modeling and quantitative risk assessment of fire and explosion in process industry (Case study: Hydrogen Production Process

    Directory of Open Access Journals (Sweden)

    M J Jafari

    2013-05-01

    Conclusion: Since the proposed method is applicable in all phases of process or system design and estimates the risk of fire and explosion through a quantitative, comprehensive, mathematically based approach, it can be used as an alternative to qualitative and semi-quantitative methods.

  1. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.
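A minimal example of the kind of calculation alluded to above — valuing a contract that references a future event by applying probabilities to it — is one-step binomial option pricing. The parameters are arbitrary:

```python
import math

# One-step binomial pricing of a European call option.
S0, K = 100.0, 100.0      # spot price and strike
u, d = 1.2, 0.8           # up/down factors over one period
r = 0.05                  # risk-free rate (continuously compounded, one period)

q = (math.exp(r) - d) / (u - d)          # risk-neutral probability of the up move
payoff_up = max(S0 * u - K, 0.0)
payoff_down = max(S0 * d - K, 0.0)
price = math.exp(-r) * (q * payoff_up + (1 - q) * payoff_down)
print(round(price, 2))   # -> 11.95
```

The risk-neutral probability q is exactly the "applied probability" the abstract mentions: it is chosen so the discounted expected stock price matches today's price, echoing the games-of-chance expectation calculations the article traces historically.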

  2. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    Science.gov (United States)

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  3. A Checklist for Successful Quantitative Live Cell Imaging in Systems Biology

    Science.gov (United States)

    Sung, Myong-Hee

    2013-01-01

    Mathematical modeling of signaling and gene regulatory networks has provided unique insights about systems behaviors for many cell biological problems of medical importance. Quantitative single cell monitoring has a crucial role in advancing systems modeling of molecular networks. However, due to the multidisciplinary techniques that are necessary for adaptation of such systems biology approaches, dissemination to a wide research community has been relatively slow. In this essay, I focus on some technical aspects that are often under-appreciated, yet critical in harnessing live cell imaging methods to achieve single-cell-level understanding and quantitative modeling of molecular networks. The importance of these technical considerations will be elaborated with examples of successes and shortcomings. Future efforts will benefit by avoiding some pitfalls and by utilizing the lessons collectively learned from recent applications of imaging in systems biology. PMID:24709701

  4. Qualitative and Quantitative Security Analyses for ZigBee Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Yuksel, Ender

    methods and techniques in different areas and brings them together to create an efficient verification system. The overall ambition is to provide a wide range of powerful techniques for analyzing models with quantitative and qualitative security information. We stated a new approach that first verifies...... applications, home automation, and traffic control. The challenges for research in this area are due to the unique features of wireless sensor devices such as low processing power and associated low energy. On top of this, wireless sensor networks need secure communication as they operate in open fields...... low-level security protocols in a qualitative manner and guarantees absolute security, and then takes these verified protocols as actions of scenarios to be verified in a quantitative manner. Working on the emerging ZigBee wireless sensor networks, we used probabilistic verification that can return

  5. Portable instrumentation for quantitatively measuring radioactive surface contaminations, including 90Sr

    International Nuclear Information System (INIS)

    Brodzinski, R.L.

    1983-10-01

    In order to measure the effectiveness of decontamination efforts, a quantitative analysis of the radiocontamination is necessary, both before and after decontamination. Since it is desirable to release the decontaminated material for unrestricted use or disposal, the assay equipment must provide adequate sensitivity to measure the radioactivity at or below the release limit. In addition, the instrumentation must be capable of measuring all kinds of radiocontaminants including fission products, activation products, and transuranic materials. Finally, the survey instrumentation must be extremely versatile in order to assay the wide variety of contaminated surfaces in many environments, some of which may be extremely hostile or remote. This communication describes the development and application of portable instrumentation capable of quantitatively measuring most transuranics, activation products, and fission products, including 90Sr, on almost any contaminated surface in nearly any location.
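The sensitivity requirement above — measuring at or below a release limit — is conventionally expressed through a Currie-style detection limit. The formula is textbook counting statistics; the count rate, time, and efficiency below are illustrative, not this instrument's specification:

```python
import math

# Currie-style minimum detectable activity (MDA): a standard way to state
# whether a survey instrument is sensitive enough for a given release limit.
background_counts = 400.0     # counts observed in the counting time
t = 60.0                      # counting time, s
efficiency = 0.15             # net counts per disintegration

ld = 2.71 + 4.65 * math.sqrt(background_counts)   # detection limit, counts
mda_bq = ld / (efficiency * t)                    # disintegrations/s = Bq
print(round(mda_bq, 1))   # -> 10.6
```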

  6. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and the end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference Risk Management Reloaded and this proceedings volume contribute to bridging the gap between academia, providing methodological advances, and practice, having a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  7. Disaster metrics: quantitative benchmarking of hospital surge capacity in trauma-related multiple casualty events.

    Science.gov (United States)

    Bayram, Jamil D; Zuabi, Shawki; Subbarao, Italo

    2011-06-01

    Hospital surge capacity in multiple casualty events (MCE) is the core of hospital medical response, and an integral part of the total medical capacity of the community affected. To date, however, there has been no consensus regarding the definition or quantification of hospital surge capacity. The first objective of this study was to quantitatively benchmark the various components of hospital surge capacity pertaining to the care of critically and moderately injured patients in trauma-related MCE. The second objective was to illustrate the applications of those quantitative parameters in local, regional, national, and international disaster planning; in the distribution of patients to various hospitals by prehospital medical services; and in the decision-making process for ambulance diversion. A 2-step approach was adopted in the methodology of this study. First, an extensive literature search was performed, followed by mathematical modeling. Quantitative studies on hospital surge capacity for trauma injuries were used as the framework for our model. The North Atlantic Treaty Organization triage categories (T1-T4) were used in the modeling process for simplicity purposes. Hospital Acute Care Surge Capacity (HACSC) was defined as the maximum number of critical (T1) and moderate (T2) casualties a hospital can adequately care for per hour, after recruiting all possible additional medical assets. HACSC was modeled to be equal to the number of emergency department beds (#EDB), divided by the emergency department time (EDT); HACSC = #EDB/EDT. In trauma-related MCE, the EDT was quantitatively benchmarked to be 2.5 (hours). Because most of the critical and moderate casualties arrive at hospitals within a 6-hour period requiring admission (by definition), the hospital bed surge capacity must match the HACSC at 6 hours to ensure coordinated care, and it was mathematically benchmarked to be 18% of the staffed hospital bed capacity. 
Defining and quantitatively benchmarking the
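The arithmetic behind the benchmarks described above is simple enough to sketch directly. In the snippet below the 2.5-hour EDT and the 18% bed fraction come from the abstract, while the bed counts are hypothetical:

```python
def hacsc(ed_beds: int, ed_time_hours: float = 2.5) -> float:
    """Hospital Acute Care Surge Capacity: T1/T2 casualties per hour.

    HACSC = #EDB / EDT, with EDT benchmarked at 2.5 h for trauma MCE."""
    return ed_beds / ed_time_hours

def bed_surge_capacity(staffed_beds: int, fraction: float = 0.18) -> float:
    """Hospital bed surge capacity, benchmarked at 18% of staffed beds."""
    return staffed_beds * fraction

# A hypothetical 20-bed emergency department in a 400-bed hospital:
print(hacsc(20))                       # 8.0 casualties per hour
print(round(bed_surge_capacity(400)))  # 72 beds
```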

  8. Application of multivariable analysis methods to the quantitative detection of gas by tin dioxide micro-sensors; Application des methodes d'analyse multivariables a la detection quantitative de gaz par microcapteurs a base de dioxyde d'etain

    Energy Technology Data Exchange (ETDEWEB)

    Perdreau, N.

    2000-01-17

    The electric conductivity of tin dioxide depends on the temperature of the material and on the nature of the surrounding gaseous environment. This work shows that treating the electric conductance signals of a single sensor with multivariable analysis methods allows the determination of concentrations in binary or ternary mixtures of ethanol (0-80 ppm), carbon monoxide (0-300 ppm) and methane (0-1000 ppm). Part of this study consisted of the design and implementation of an automatic test bench that acquires the electric conductance of four sensors through thermal and gaseous cycles; it also revealed some disturbing effects (humidity, ...) on the measurement. Two sensor fabrication techniques were used to obtain temperature-dependent conductances that are distinct for each gas, reproducible across sensors, and sufficiently stable over time to allow exploitation of the signals by multivariable analysis methods (tin dioxide in the form of thin layers obtained by reactive evaporation, or of sintered powder bars). In the last part, it is shown that quantitative gas determination by chemometric methods is possible even though the relation between the electric conductances on the one hand and the temperatures and concentrations on the other is nonlinear. Moreover, modelling with the partial least squares (PLS) method combined with a pretreatment yields performance comparable to that obtained with neural networks. (O.M.)
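The multivariable-calibration idea can be illustrated with a rough sketch: conductance readings taken at several points of a thermal cycle are mapped to gas concentrations. All numbers below are synthetic, and ordinary least squares stands in for the PLS modelling used in the thesis:

```python
import numpy as np

# Synthetic illustration: conductances of one sensor at 5 points of its
# thermal cycle are mapped to concentrations of a binary gas mixture.
rng = np.random.default_rng(0)
response = rng.uniform(0.1, 1.0, size=(5, 2))    # hypothetical sensor response
C_train = rng.uniform(0.0, 100.0, size=(40, 2))  # known concentrations (ppm)
G_train = C_train @ response.T                   # conductance patterns

# Calibration: least-squares solution B of G_train @ B ~ C_train
B, *_ = np.linalg.lstsq(G_train, C_train, rcond=None)

# Predict an "unknown" mixture from its conductance pattern:
C_true = np.array([[30.0, 70.0]])
C_pred = (C_true @ response.T) @ B
print(np.round(C_pred, 1))  # ~ [[30. 70.]]
```

With noiseless linear data the calibration recovers the mixture exactly; the point of PLS in the thesis is to cope with the nonlinearity and noise that real sensors exhibit.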

  9. 75 FR 76022 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2010-12-07

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... public display. Multiple Applicants The following applicants each request a permit to import the sport...

  10. A Direct, Competitive Enzyme-Linked Immunosorbent Assay (ELISA) as a Quantitative Technique for Small Molecules

    Science.gov (United States)

    Powers, Jennifer L.; Rippe, Karen Duda; Imarhia, Kelly; Swift, Aileen; Scholten, Melanie; Islam, Naina

    2012-01-01

    ELISA (enzyme-linked immunosorbent assay) is a widely used technique with applications in disease diagnosis, detection of contaminated foods, and screening for drugs of abuse or environmental contaminants. However, published protocols with a focus on quantitative detection of small molecules designed for teaching laboratories are limited. A…

  11. Quantum dots assisted laser desorption/ionization mass spectrometric detection of carbohydrates: qualitative and quantitative analysis.

    Science.gov (United States)

    Bibi, Aisha; Ju, Huangxian

    2016-04-01

    A quantum dots (QDs) assisted laser desorption/ionization mass spectrometric (QDA-LDI-MS) strategy was proposed for qualitative and quantitative analysis of a series of carbohydrates. The adsorption of carbohydrates on the modified surfaces of different QDs used as matrices depended mainly on the formation of hydrogen bonds, which led to higher MS intensity than with a conventional organic matrix. The effects of QD concentration and sample preparation method were explored to improve the selective ionization process and the detection sensitivity. The proposed approach offered a new dimension to the application of QDs as matrices for MALDI-MS research on carbohydrates. It could be used for quantitative measurement of glucose concentration in human serum with good performance. QDs serving as a matrix showed the advantages of low background, higher sensitivity, convenient sample preparation, and excellent stability under vacuum. The QD-assisted LDI-MS approach has promising applications in the analysis of carbohydrates in complex biological samples. Copyright © 2016 John Wiley & Sons, Ltd.
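Quantitative LDI-MS measurements of this kind rest on an external calibration curve relating signal intensity to concentration. A minimal sketch with invented intensity values (not data from the paper):

```python
import numpy as np

# Hypothetical calibration standards: glucose concentration vs. MS signal
# (arbitrary units; values invented for illustration).
conc_mM = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
intensity = np.array([120.0, 230.0, 460.0, 910.0, 1830.0])

# Fit a linear calibration line: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc_mM, intensity, 1)

def glucose_mM(signal: float) -> float:
    """Invert the linear calibration to estimate glucose concentration."""
    return (signal - intercept) / slope

print(round(glucose_mM(460.0), 2))  # ~2.0 mM
```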

  12. A semi-quantitative and thematic analysis of medical student attitudes towards M-Learning.

    Science.gov (United States)

    Green, Ben L; Kennedy, Iain; Hassanzadeh, Hadi; Sharma, Suneal; Frith, Gareth; Darling, Jonathan C

    2015-10-01

    Smartphone and mobile application technology have in recent years furthered the development of novel learning and assessment resources. 'MBChB Mobile' is a pioneering mobile learning (M-Learning) programme at the University of Leeds, United Kingdom, which provides all senior medical students with iPhone handsets complete with academic applications, assessment software and a virtual reflective environment. This study aimed to evaluate the impact of MBChB Mobile on student learning. Ethical approval was granted to invite fourth- and fifth-year medical students to participate in a semi-quantitative questionnaire: data were collected anonymously with informed consent and analysed where appropriate using the chi-squared test of association. Qualitative data generated through focus group participation were subjected to both content and thematic analysis. A total of 278 of 519 (53.6%) invited participants responded. Overall, 72.6% of students agreed that MBChB Mobile enhanced their learning experience; however, this was significantly related to overall usage and to mobile technology proficiency. Reported barriers included difficulty using mobile devices and perceived patient acceptability. As one of the largest evaluative, and the only quantitative, studies of smartphone-assisted M-Learning in undergraduate medical education, MBChB Mobile suggests that smartphone and application technology enhances students' learning experience. Barriers to implementation may be addressed through the provision of tailored learning resources, along with user-defined support systems, and appropriate means of ensuring acceptability to patients. © 2015 John Wiley & Sons, Ltd.

  13. Improved cancer risk stratification and diagnosis via quantitative phase microscopy (Conference Presentation)

    Science.gov (United States)

    Liu, Yang; Uttam, Shikhar; Pham, Hoa V.; Hartman, Douglas J.

    2017-02-01

    Pathology remains the gold standard for cancer diagnosis and in some cases prognosis, in which trained pathologists examine abnormality in tissue architecture and cell morphology characteristic of cancer cells with a bright-field microscope. The limited resolution of a conventional microscope can result in intra-observer variation, missed early-stage cancers, and indeterminate cases that often result in unnecessary invasive procedures in the absence of cancer. Assessment of nanoscale structural characteristics via quantitative phase imaging represents a promising strategy for identifying pre-cancerous or cancerous cells, owing to its nanoscale sensitivity to optical path length, simple sample preparation (i.e., label-free) and low cost. I will present the development of quantitative phase microscopy systems in transmission and reflection configurations to detect structural changes in nuclear architecture that are not easily identifiable by conventional pathology. Specifically, we will present the use of transmission-mode quantitative phase imaging to improve the diagnostic accuracy of urine cytology, where the nuclear dry mass progressively correlates with negative, atypical, suspicious and positive cytological diagnoses. In a second application, we will present the use of reflection-mode quantitative phase microscopy for depth-resolved nanoscale nuclear architecture mapping (nanoNAM) of clinically prepared formalin-fixed, paraffin-embedded tissue sections. We demonstrated that the quantitative phase microscopy system detects a gradual increase in the density alteration of nuclear architecture during malignant transformation in animal models of colon carcinogenesis and in human patients with ulcerative colitis, even in tissue that appears histologically normal according to pathologists. We evaluated the ability of nanoNAM to predict "future" cancer progression in patients with ulcerative colitis.

  14. Quantitative assessment of videolaryngostroboscopic images in patients with glottic pathologies.

    Science.gov (United States)

    Niebudek-Bogusz, Ewa; Kopczynski, Bartosz; Strumillo, Pawel; Morawska, Joanna; Wiktorowicz, Justyna; Sliwinska-Kowalska, Mariola

    2017-07-01

    Digital imaging techniques enable exploration of novel visualization modalities of the vocal folds during phonation and the definition of parameters facilitating more precise diagnosis of voice disorders. Computer vision algorithms were applied to the analysis of videolaryngostroboscopic (VLS) images, aiming at qualitative and quantitative description of phonatory vibrations. VLS examinations were conducted for 45 females, including 15 subjects with vocal nodules, 15 subjects with glottal incompetence, and 15 normophonic females. The recorded VLS images were preprocessed, the glottis area was segmented out, and the glottal cycles were identified. The glottovibrograms were built, and then the glottal area waveforms (GAW) were quantitatively described by computing the following parameters: open quotient (OQ), closing quotient (CQ), speed quotient (SQ), minimal relative glottal area (MRGA), and a new parameter termed closure difference index (CDI). Profiles of the glottal widths assessed along the glottal length significantly differentiated the study groups. Results of the performed ROC curve analysis suggest that the evaluated parameters can distinguish patients with voice disorders from normophonic subjects.
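The GAW quotients can be illustrated on a synthetic glottal cycle. The definitions below are common textbook forms (opening and closing phases relative to the cycle period) and may differ in detail from the formulas used in the paper:

```python
import numpy as np

def gaw_quotients(area):
    """Open (OQ), closing (CQ) and speed (SQ) quotients of one glottal
    cycle, from a sampled glottal area waveform."""
    open_idx = np.flatnonzero(area > 0)           # samples where glottis is open
    t_open, t_close = open_idx[0], open_idx[-1]
    t_peak = int(np.argmax(area))
    period = len(area)
    opening = t_peak - t_open    # samples from glottal opening to peak area
    closing = t_close - t_peak   # samples from peak area back to closure
    oq = (opening + closing) / period
    cq = closing / period
    sq = opening / closing
    return oq, cq, sq

# Synthetic triangular cycle (100 samples): opens near sample 10, peaks at
# sample 30, closed again after sample 60.
cycle = np.zeros(100)
cycle[10:30] = np.linspace(0.0, 1.0, 20, endpoint=False)  # opening phase
cycle[30:61] = np.linspace(1.0, 0.0, 31)                  # closing phase
oq, cq, sq = gaw_quotients(cycle)
print(round(oq, 2), round(cq, 2), round(sq, 2))  # 0.48 0.29 0.66
```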

  15. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    Science.gov (United States)

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

    The analysis of screening results (n = 3208; sexually active citizens aged from 18 to 59 years) was carried out to detect oncogenic types of human papilloma virus using qualitative (1150 females and 720 males) and quantitative (real-time polymerase chain reaction; 843 females and 115 males) techniques. Human papilloma virus of high oncogenic type was detected in 65% and 68.4% of females and in 48.6% and 53% of males, correspondingly. Among the 12 types of human papilloma virus, the most frequently diagnosed was human papilloma virus 16, independently of the gender of those examined and the technique of analysis. In females, under application of qualitative tests the rate of human papilloma virus 16 made up 18.3% (n = 280), and under application of quantitative tests the rate made up 14.9% (n = 126; p ≤ 0.05). Under examination of males using qualitative tests the rate of human papilloma virus 16 made up 8.3% (n = 60), and under application of quantitative tests it made up 12.2% (n = 14; p ≥ 0.05). Under application of qualitative tests the rate of detection of the rest of the oncogenic types of human papilloma virus varied in females from 3.4% to 8.4% and in males from 1.8% to 5.9%. Under application of quantitative tests in females the rate of human papilloma virus with high viral load made up 68.4%, with medium viral load 2.85% (n = 24), and with low viral load 0.24% (n = 2). Under application of quantitative tests in males the rate of detection of types of human papilloma virus made up 53%, and in all of these a high viral load was established. In females, most oncogenic types of human papilloma virus (except for 31, 39, 59) are detected significantly more often than in males.

  16. Laser-induced breakdown spectroscopy fundamentals and applications

    CERN Document Server

    Noll, Reinhard

    2012-01-01

    This book is a comprehensive source of the fundamentals, process parameters, instrumental components and applications of laser-induced breakdown spectroscopy (LIBS). The effect of multiple pulses on material ablation, plasma dynamics and plasma emission is presented. A heuristic plasma model allows complex experimental plasma spectra to be simulated. These methods and findings form the basis for a variety of applications to perform quantitative multi-element analysis with LIBS. The application potential of LIBS has expanded considerably in recent years, ranging from bulk analysis of metallic alloys and non-conducting materials, via spatially resolved analysis, to depth profiling, covering measuring objects in all physical states: gaseous, liquid and solid. Dedicated chapters present LIBS investigations for these tasks with special emphasis on the methodical and instrumental concepts as well as the optimization strategies for a quantitative analysis. Requirements, concepts, design and characteristic features of LI...

  17. 78 FR 59052 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2013-09-25

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and.... Multiple Applicants The following applicants each request a permit to import the sport- hunted trophy of...

  18. The quantitative imaging network: the role of quantitative imaging in radiation therapy

    International Nuclear Information System (INIS)

    Tandon, Pushpa; Nordstrom, Robert J.; Clark, Laurence

    2014-01-01

    The potential value of modern medical imaging methods has created a need for mechanisms to develop, translate and disseminate emerging imaging technologies and, ideally, to quantitatively correlate those with other related laboratory methods, such as the genomics and proteomics analyses required to support clinical decisions. One strategy to meet these needs efficiently and cost effectively is to develop an international network to share and reach consensus on best practices, imaging protocols, common databases, and open science strategies, and to collaboratively seek opportunities to leverage resources wherever possible. One such network is the Quantitative Imaging Network (QIN) started by the National Cancer Institute, USA. The mission of the QIN is to improve the role of quantitative imaging for clinical decision making in oncology by the development and validation of data acquisition, analysis methods, and other quantitative imaging tools to predict or monitor the response to drug or radiation therapy. The network currently has 24 teams (two from Canada and 22 from the USA) and several associate members, including one from Tata Memorial Centre, Mumbai, India. Each QIN team collects data from ongoing clinical trials and develops software tools for quantitation and validation to create standards for imaging research, and for use in developing models for therapy response prediction and measurement and tools for clinical decision making. The members of QIN are addressing a wide variety of cancer problems (head and neck, prostate, breast, brain, lung, liver, colon) using multiple imaging modalities (PET, CT, MRI, FMISO PET, DW-MRI, PET-CT). (author)

  19. Quantitative Evaluation of the Use of Actigraphy for Neurological and Psychiatric Disorders

    Directory of Open Access Journals (Sweden)

    Weidong Pan

    2014-01-01

    Full Text Available Quantitative and objective evaluation of disease severity and/or drug effect is necessary in clinical practice. Wearable accelerometers such as an actigraph enable long-term recording of a patient's movement during activities, and they can be used for quantitative assessment of symptoms due to various diseases. We reviewed applications of actigraphy with analytical methods that are sufficiently sensitive and reliable to determine the severity of diseases and disorders, such as motor and nonmotor disorders like Parkinson's disease, sleep disorders, depression, behavioral and psychological symptoms of dementia (BPSD) in vascular dementia (VD), seasonal affective disorder (SAD), and stroke, as well as the effects of drugs used to treat them. We believe it is possible to develop analytical methods to assess more neurological or psychiatric disorders using actigraphy records.

  20. Incorporating Quantitative Reasoning in Common Core Courses: Mathematics for The Ghost Map

    Directory of Open Access Journals (Sweden)

    John R. Jungck

    2012-01-01

    Full Text Available How can mathematics be integrated into multi-section interdisciplinary courses to enhance thematic understandings and shared common readings? As an example, four forms of quantitative reasoning are used to understand and critique one such common reading: Steven Berlin Johnson's "The Ghost Map: The Story of London's Most Terrifying Epidemic - and How it Changed Science, Cities and the Modern World" (Riverhead Books, 2006). Geometry, statistics, modeling, and networks are featured in this essay as the means of depicting, understanding, elaborating, and critiquing the public health issues raised in Johnson's book. Specific pedagogical examples and resources are included to illustrate applications and opportunities for generalization beyond this specific example. Quantitative reasoning provides a robust, yet often neglected, lens for doing literary and historical analyses in interdisciplinary education.

  1. Quantitative determination of titin and nebulin in poultry meat by SDS-PAGE with an internal standard

    NARCIS (Netherlands)

    Tomaszewska Gras, J.; Kijowski,; Schreurs, F.J.G.

    2002-01-01

    The method of quantitative determination of titin and nebulin in chicken meat by SDS-PAGE electrophoresis technique was developed by application of β-galactosidase as the internal standard. The method was tested first on marker protein samples of known concentrations (myosin, transferrin, glutamic

  2. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
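Converting a measured inter-probe distance into kilobases uses the stretching factor quoted above; a trivial but central step, sketched here:

```python
STRETCH_KB_PER_UM = 2.3  # homogeneous stretching factor from the abstract

def fiber_distance_kb(distance_um: float) -> float:
    """Convert a distance between hybridization signals on a stretched
    DNA fiber (micrometers) into kilobase pairs."""
    return distance_um * STRETCH_KB_PER_UM

# A 10 um separation between two probe signals:
print(round(fiber_distance_kb(10.0), 1))  # 23.0 kb
```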

  3. Quantitative PCR assay to determine prevalence and intensity of MSX (Haplosporidium nelsoni) in North Carolina and Rhode Island oysters Crassostrea virginica.

    Science.gov (United States)

    Wilbur, Ami E; Ford, Susan E; Gauthier, Julie D; Gomez-Chiarri, Marta

    2012-12-27

    The continuing challenges to the management of both wild and cultured eastern oyster Crassostrea virginica populations resulting from protozoan parasites have stimulated interest in the development of molecular assays for their detection and quantification. For Haplosporidium nelsoni, the causative agent of multinucleated sphere unknown (MSX) disease, diagnostic evaluations depend extensively on traditional but laborious histological approaches and, more recently, on rapid and sensitive (but not quantitative) end-point polymerase chain reaction (PCR) assays. Here, we describe the development and application of a quantitative PCR (qPCR) assay for H. nelsoni using an Applied Biosystems TaqMan® assay designed with minor groove binder (MGB) probes. The assay was highly sensitive, detecting as few as 20 copies of cloned target DNA. Histologically evaluated parasite density was significantly correlated with the quantification cycle (Cq), regardless of whether quantification was categorical (r2 = 0.696, p < 0.0001) or quantitative (r2 = 0.797, p < 0.0001). Application in field studies conducted in North Carolina, USA (7 locations), revealed widespread occurrence of the parasite, with moderate to high intensities noted in some locations. In Rhode Island, USA, application of the assay to oysters from 2 locations resulted in no positives.
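A qPCR assay like this is typically quantified against a standard curve of Cq versus log10 copy number, from which amplification efficiency and unknown copy numbers follow. A sketch with illustrative dilution-series values (not data from the study):

```python
import numpy as np

# Illustrative serial dilutions of the cloned target (values invented):
copies = np.array([2e1, 2e2, 2e3, 2e4, 2e5])
cq = np.array([35.1, 31.8, 28.4, 25.1, 21.7])

# Standard curve: Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% PCR efficiency

def copies_from_cq(cq_value: float) -> float:
    """Estimate target copy number of an unknown sample from its Cq."""
    return 10.0 ** ((cq_value - intercept) / slope)

print(round(efficiency, 2))         # close to 1.0 for a good assay
print(round(copies_from_cq(28.4)))  # ~2000 copies
```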

  4. Qupe--a Rich Internet Application to take a step forward in the analysis of mass spectrometry-based quantitative proteomics experiments.

    Science.gov (United States)

    Albaum, Stefan P; Neuweger, Heiko; Fränzel, Benjamin; Lange, Sita; Mertens, Dominik; Trötschel, Christian; Wolters, Dirk; Kalinowski, Jörn; Nattkemper, Tim W; Goesmann, Alexander

    2009-12-01

    The goal of present-day omics sciences is to understand biological systems as a whole in terms of interactions of the individual cellular components. One of the main building blocks in this field of study is proteomics, where tandem mass spectrometry (LC-MS/MS) in combination with isotopic labelling techniques provides a common way to obtain a direct insight into regulation at the protein level. Methods to identify and quantify the peptides contained in a sample are well established, and their output usually results in lists of identified proteins and calculated relative abundance values. The next step is to move ahead from these abstract lists and apply statistical inference methods to compare measurements, to identify genes that are significantly up- or down-regulated, or to detect clusters of proteins with similar expression profiles. We introduce the Rich Internet Application (RIA) Qupe, providing comprehensive data management and analysis functions for LC-MS/MS experiments. Starting with the import of mass spectra data, the system guides the experimenter through the process of protein identification by database search, the calculation of protein abundance ratios, and, in particular, the statistical evaluation of the quantification results, including multivariate analysis methods such as analysis of variance or hierarchical cluster analysis. While a data model to store these results has been developed, a well-defined programming interface facilitates the integration of novel approaches. A compute cluster is utilized to distribute computationally intensive calculations, and a web service allows information to be interchanged with other omics software applications. To demonstrate that Qupe represents a step forward in quantitative proteomics analysis, an application study on Corynebacterium glutamicum has been carried out. Qupe is implemented in Java utilizing Hibernate, Echo2, R and the Spring framework.
We encourage the usage of the RIA in the sense of the 'software as a

  5. Quantitative ultrasound imaging detects degenerative changes in articular cartilage surface and subchondral bone

    International Nuclear Information System (INIS)

    Saarakkala, Simo; Laasanen, Mikko S; Jurvelin, Jukka S; Toeyraes, Juha

    2006-01-01

    Previous studies have suggested that quantitative ultrasound imaging could sensitively diagnose degeneration of the articular surface and changes in the subchondral bone during the development of osteoarthrosis (OA). We have recently introduced a new parameter, ultrasound roughness index (URI), for the quantification of cartilage surface roughness, and successfully tested it with normal and experimentally degraded articular surfaces. In this in vitro study, the applicability of URI was tested in bovine cartilage samples with spontaneously developed tissue degeneration. Simultaneously, we studied the sensitivity of quantitative ultrasound imaging to detect degenerative changes in the cartilage-bone interface. For reference, histological degenerative grade of the cartilage samples was determined. Mechanical reference measurements were also conducted. Cartilage surface roughness (URI) was significantly (p < 0.05) higher in histologically degenerated samples with inferior mechanical properties. Ultrasound reflection at the cartilage-bone interface was also significantly (p < 0.05) increased in degenerated samples. Furthermore, it was quantitatively confirmed that ultrasound attenuation in the overlying cartilage significantly affects the measured ultrasound reflection values from the cartilage-bone interface. To conclude, the combined ultrasound measurement of the cartilage surface roughness and ultrasound reflection at the cartilage-bone interface complement each other, and may together enable more sensitive and quantitative diagnosis of early OA or follow up after surgical cartilage repair
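A surface roughness index of this kind can be illustrated as the mean absolute deviation of the detected surface profile from its mean line; the exact URI definition in the paper may differ in detail:

```python
import numpy as np

def roughness_index(profile_um):
    """Mean absolute deviation of the detected cartilage surface profile
    from its mean line (a plausible roughness measure, not necessarily
    the paper's exact URI formula)."""
    profile_um = np.asarray(profile_um, dtype=float)
    return float(np.mean(np.abs(profile_um - profile_um.mean())))

smooth = np.full(64, 10.0)  # perfectly flat surface at 10 um depth
rough = 10.0 + 2.0 * np.sin(np.linspace(0.0, 8.0 * np.pi, 64))  # undulating

print(roughness_index(smooth))                           # 0.0
print(roughness_index(rough) > roughness_index(smooth))  # True
```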

  6. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    Science.gov (United States)

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as sample injector. The accurate amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.
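The reproducibility figures quoted are relative standard deviations, which are straightforward to compute (hypothetical retention times shown):

```python
import numpy as np

def rsd_percent(values):
    """Relative standard deviation in percent: 100 * sd / mean."""
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

# Hypothetical retention times (min) from six consecutive injections:
rt = np.array([4.02, 4.03, 4.01, 4.02, 4.03, 4.02])
print(round(rsd_percent(rt), 2))  # well under the 0.8% figure quoted
```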

  7. Quantitative surface analysis using deuteron-induced nuclear reactions

    International Nuclear Information System (INIS)

    Afarideh, Hossein

    1991-01-01

    The nuclear reaction analysis (NRA) technique consists of looking at the energies of the reaction products, which uniquely define the particular elements present in the sample, and of analysing the yield/energy distribution to reveal depth profiles. A summary of the basic features of the nuclear reaction analysis technique is given; in particular, emphasis is placed on quantitative light element determination using (d,p) and (d,alpha) reactions. The experimental apparatus is also described. Finally, a set of (d,p) spectra for the elements Z=3 to Z=17 using 2 MeV incident deuterons is included, together with examples of further applications of the (d,alpha) spectra. (author)

  8. 78 FR 45954 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2013-07-30

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... Applicants The following applicants each request a permit to import the sport- hunted trophy of one male...

  9. Application of LC–MS/MS for quantitative analysis of glucocorticoids and stimulants in biological fluids

    OpenAIRE

    Haneef, Jamshed; Shaharyar, Mohammad; Husain, Asif; Rashid, Mohd; Mishra, Ravinesh; Parveen, Shama; Ahmed, Niyaz; Pal, Manoj; Kumar, Deepak

    2013-01-01

    Liquid chromatography tandem mass chromatography (LC–MS/MS) is an important hyphenated technique for quantitative analysis of drugs in biological fluids. Because of high sensitivity and selectivity, LC–MS/MS has been used for pharmacokinetic studies, metabolites identification in the plasma and urine. This manuscript gives comprehensive analytical review, focusing on chromatographic separation approaches (column packing materials, column length and mobile phase) as well as different acquisiti...

  11. Development of an SRM method for absolute quantitation of MYDGF/C19orf10 protein.

    Science.gov (United States)

    Dwivedi, Ravi C; Krokhin, Oleg V; El-Gabalawy, Hani S; Wilkins, John A

    2016-06-01

    To develop a MS-based selected reaction monitoring (SRM) assay for quantitation of myeloid-derived growth factor (MYDGF), formerly chromosome 19 open reading frame 10 (C19orf10). Candidate reporter peptides were identified in digests of recombinant MYDGF. Isotopically labeled forms of these reporter peptides were employed as internal standards for assay development. Two reference peptides were selected, SYLYFQTFFK and GAEIEYAMAYSK, with respective LOQs of 42 and 380 attomoles per injection. Application of the assay to human serum and synovial fluid determined that the assay sensitivity was reduced and quantitation was not achievable. However, the partial depletion of albumin and immunoglobulin from synovial fluids provided estimates of 300-650 femtomoles per injection (0.7-1.6 nanomolar (nM) fluid concentrations) in three of the six samples analyzed. A validated, sensitive assay for the quantitation of MYDGF in biological fluids was developed. However, the endogenous levels of MYDGF in such fluids are at or below the current levels of quantitation. The levels of MYDGF are lower than those previously reported using an ELISA. The current results suggest that additional steps may be required to remove high-abundance proteins or to enrich MYDGF for SRM-based quantitation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
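The amount-per-injection and fluid-concentration figures are linked by a unit identity (1 fmol per µL of fluid equals 1 nmol/L, i.e. 1 nM). The represented fluid volume below is an assumption chosen only to reproduce the abstract's numbers, not a value stated in the study:

```python
def fmol_per_injection_to_nM(amount_fmol: float, represented_volume_uL: float) -> float:
    """1 fmol / 1 uL = 1e-15 mol / 1e-6 L = 1e-9 mol/L = 1 nM."""
    return amount_fmol / represented_volume_uL

# Assuming each injection represents ~430 uL of the original fluid
# (hypothetical; picked so that 300 fmol maps to ~0.7 nM as reported):
print(round(fmol_per_injection_to_nM(300.0, 430.0), 2))  # ~0.7 nM
```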

  12. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    Science.gov (United States)

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  13. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    Directory of Open Access Journals (Sweden)

    Pia M. Jungmann

    2014-01-01

    Full Text Available Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair.

  14. Quantitative kinetics of proteolytic enzymes determined by a surface concentration-based assay using peptide arrays.

    Science.gov (United States)

    Jung, Se-Hui; Kong, Deok-Hoon; Park, Seoung-Woo; Kim, Young-Myeong; Ha, Kwon-Soo

    2012-08-21

    Peptide arrays have emerged as a key technology for drug discovery, diagnosis, and cell biology. Despite the promise of these arrays, applications of peptide arrays to quantitative analysis of enzyme kinetics have been limited due to the difficulty in obtaining quantitative information of enzymatic reaction products. In this study, we developed a new approach for the quantitative kinetics analysis of proteases using fluorescence-conjugated peptide arrays, a surface concentration-based assay with solid-phase peptide standards using dry-off measurements, and compared it with an applied concentration-based assay. For fabrication of the peptide arrays, substrate peptides of cMMP-3, caspase-3, caspase-9, and calpain-1 were functionalized with TAMRA and cysteine, and were immobilized onto amine-functionalized arrays using a heterobifunctional linker, N-[γ-maleimidobutyloxy]succinimide ester. The proteolytic activities of the four enzymes were quantitatively analyzed by calculating changes induced by enzymatic reactions in the concentrations of peptides bound to array surfaces. In addition, this assay was successfully applied for calculating the Michaelis constant (K(m,surf)) for the four enzymes. Thus, this new assay has a strong potential for use in the quantitative evaluation of proteases, and for drug discovery through kinetics studies including the determination of K(m) and V(max).
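
    The Michaelis constant mentioned above comes from fitting initial-rate data to the Michaelis-Menten equation v = Vmax·S/(Km + S). A minimal sketch using the classical Lineweaver-Burk linearization (the abstract does not state which fitting procedure the authors used; illustrative values only):

```python
import numpy as np

def michaelis_menten_fit(S, v):
    """Recover Km and Vmax from initial-rate data via the Lineweaver-Burk
    linearization: 1/v = (Km/Vmax)*(1/S) + 1/Vmax."""
    slope, intercept = np.polyfit(1.0 / np.asarray(S, float),
                                  1.0 / np.asarray(v, float), 1)
    vmax = 1.0 / intercept
    km = slope * vmax
    return km, vmax

# Noise-free check: rates generated with Km = 5, Vmax = 2
S = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
v = 2.0 * S / (5.0 + S)
km, vmax = michaelis_menten_fit(S, v)  # ~5, ~2
```

In practice a direct nonlinear least-squares fit is preferred with noisy data, since the double-reciprocal transform amplifies error at low substrate concentrations.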

  15. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    Science.gov (United States)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract related information to compositions from all spectral data, are widely established methods and have been applied to various fields including in-situ applications in air and for planetary explorations. Artificial neural networks (ANNs), where non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that the accuracy should be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square errors (NRMSEs), when comparing the accuracy obtained from different setups and analytical methods.
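
    The recommended figure of merit, the normalised root mean square error, can be computed as follows (the normalization denominator, range versus mean, varies between papers, so it is parameterized here):

```python
import numpy as np

def nrmse(reference, estimate, norm="range"):
    """Root mean square error normalized by the range (or mean) of the
    reference values, giving a dimensionless figure of merit."""
    ref = np.asarray(reference, float)
    est = np.asarray(estimate, float)
    rmse = np.sqrt(np.mean((est - ref) ** 2))
    denom = ref.max() - ref.min() if norm == "range" else ref.mean()
    return rmse / denom

# Example: predictions off by 1 unit over a 0-10 reference range
err = nrmse([0.0, 10.0], [1.0, 9.0])  # 0.1
```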

  16. Hydrologic applications of weather radar

    Science.gov (United States)

    Seo, Dong-Jun; Habib, Emad; Andrieu, Hervé; Morin, Efrat

    2015-12-01

    By providing high-resolution quantitative precipitation information (QPI), weather radars have revolutionized hydrology in the last two decades. With the aid of GIS technology, radar-based quantitative precipitation estimates (QPE) have enabled routine high-resolution hydrologic modeling in many parts of the world. Given the ever-increasing need for higher-resolution hydrologic and water resources information for a wide range of applications, one may expect that the use of weather radar will only grow. Despite the tremendous progress, a number of significant scientific, technological and engineering challenges remain to realize its potential. New challenges are also emerging as new areas of applications are discovered, explored and pursued. The purpose of this special issue is to provide the readership with some of the latest advances, lessons learned, experiences gained, and science issues and challenges related to hydrologic applications of weather radar. The special issue features 20 contributions on various topics which reflect the increasing diversity as well as the areas of focus in radar hydrology today. The contributions may be grouped as follows:

  17. 78 FR 21627 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2013-04-11

    ... decisions are: (1) Those supported by quantitative information or studies; and (2) Those that include... Tapiridae Applicant: Michael Tomb, Jackson, LA; PRT-01602B The applicant requests a permit to import a sport...

  18. 76 FR 60862 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2011-09-30

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... The following applicants each request a permit to import the sport- hunted trophy of one male bontebok...

  19. 76 FR 65207 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2011-10-20

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... applicants each request a permit to import the sport- hunted trophy of one male bontebok (Damaliscus pygargus...

  20. Theory and Practice in Quantitative Genetics

    DEFF Research Database (Denmark)

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C

    2003-01-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each
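
    The classical twin design mentioned above admits a simple closed-form variance decomposition, Falconer's formulas, which estimate additive genetic (A), shared environment (C) and unique environment (E) components from monozygotic and dizygotic twin correlations:

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's decomposition of trait variance from twin correlations:
    A = 2(rMZ - rDZ), C = 2rDZ - rMZ, E = 1 - rMZ."""
    a2 = 2.0 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = 2.0 * r_dz - r_mz     # shared environment
    e2 = 1.0 - r_mz            # unique environment (incl. measurement error)
    return a2, c2, e2

# Illustrative correlations, not from the cited chapter:
a2, c2, e2 = falconer_ace(0.8, 0.5)  # 0.6, 0.2, 0.2
```

Structural equation modelling of the full ACE model, as used in the statistical packages the chapter describes, generalizes this point estimate with standard errors and extensions to multivariate data.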

  1. Effect of simultaneous application of mycorrhiza with compost, vermicompost and sulfural geranole on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system

    Directory of Open Access Journals (Sweden)

    P rezvani moghaddam

    2016-03-01

    quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system was investigated. Materials and methods: In order to evaluate the effects of simultaneous application of mycorrhiza and organic fertilizers on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.), an experiment was conducted based on a randomized complete block design with three replications at the Agricultural Research Farm, Ferdowsi University of Mashhad, Iran, during the 2009-2010 growing season. Treatments were mycorrhiza (Glomus mosseae), mycorrhiza+compost, mycorrhiza+vermicompost, mycorrhiza+organic sulfural geranole, compost, vermicompost, organic sulfural geranole and control (no fertilizer). Finally, data analysis was done using SAS 9.1 and means were compared by Duncan's multiple range test at the 5% level of probability. Results and discussion: The results showed that the effect of the different organic and biological fertilizers on seed yield was significant. Seed yield increased significantly with mycorrhiza, both alone and combined with organic sulfural geranole or vermicompost, compared to the control treatment. Biological yield under simultaneous application of vermicompost and organic sulfural geranole with mycorrhiza increased significantly compared to separate use of these fertilizers. All studied organic fertilizers combined with mycorrhiza had a significant effect on increasing the oil content of sesame. Seed oil increased under simultaneous application of mycorrhiza with each of compost, vermicompost and organic sulfural geranole by 12, 13 and 10%, respectively, compared to separate application of mycorrhiza. It seems that mycorrhiza and organic fertilizers improved quantitative and qualitative characteristics of sesame by providing better conditions for absorption and transport of nutrients to the plant (Hawkes et al., 2008). Conclusion: In general, the results showed that the simultaneous use of ecological inputs can improve

  2. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    Science.gov (United States)

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

    Several approaches to performing assessment in training simulators based on virtual reality have been proposed. There are two kinds of assessment methods: offline and online. The main requirements for online training assessment methodologies applied to virtual reality systems are low computational complexity and high accuracy. Several approaches for general cases which satisfy such requirements can be found in the literature. A drawback of those approaches is that they are unsatisfactory for specific cases, as in some medical procedures where both quantitative and qualitative information is available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier which can manipulate qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and evidenced the applicability of the method.
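
    The abstract does not specify the authors' modification, but the core idea of a Naive Bayes classifier over mixed variables is to multiply a Gaussian likelihood for each continuous feature with a frequency-table likelihood for each categorical one. A generic sketch with hypothetical feature names and per-class parameters:

```python
import math

def gaussian_pdf(x, mean, std):
    """Gaussian likelihood for a continuous feature value."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def classify(sample, classes):
    """Naive Bayes over one continuous feature (Gaussian likelihood) and
    one categorical feature (frequency table), handling quantitative and
    qualitative variables simultaneously."""
    best, best_score = None, -1.0
    for name, p in classes.items():
        score = (p["prior"]
                 * gaussian_pdf(sample["time_s"], *p["time"])
                 * p["gesture"].get(sample["gesture"], 1e-9))
        if score > best_score:
            best, best_score = name, score
    return best

# Hypothetical per-class parameters for a simulated training session:
classes = {
    "correct": {"prior": 0.5, "time": (30.0, 5.0),
                "gesture": {"smooth": 0.8, "jerky": 0.2}},
    "poor":    {"prior": 0.5, "time": (60.0, 10.0),
                "gesture": {"smooth": 0.3, "jerky": 0.7}},
}
label = classify({"time_s": 32.0, "gesture": "smooth"}, classes)  # "correct"
```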

  3. Public and Public Utility Enterprises Restructuring: Statistical and Quantitative Aid for Ensuring Human Resource Sustainability

    Directory of Open Access Journals (Sweden)

    Mladen Čudanov

    2014-04-01

    Full Text Available This article presents a quantitative approach to restructuring public and public utility enterprises, particularly during downsizing requests. The large number of employees in the public sector can be one of the causes of economic instability at the country level. That is particularly visible in the context of the euro zone crisis and economic/political instability in countries like Greece, Portugal, Ireland, Spain and Italy. Our approach is based on the statistical analysis of productivity oscillation and the setting of performance standards in public and public utility enterprises based on that productivity. The data background is given through job descriptions, organizational charts, salary reports and monthly performance reports, in most cases part of organizational information systems. It is recommended that quantitative data be analyzed on a monthly basis, over a period of 30 or more months. Our method increases procedural fairness and accuracy, because a quantitative, statistical, impartial and objective approach is applied for estimating parameters which could be related to downsizing. However, the application of this method is not limited to downsizing: during its application in more than 20 public and public utility enterprises it was sometimes applied to increase output or reduce costs not necessarily connected to labour. Although it finally refers to downsizing, this method can provide a fairer and more impartial approach than a subjective estimate of employee surplus and its arbitrary distribution within the enterprise.
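
    The paper's exact statistical procedure is not given in the abstract; one common way to turn a monthly productivity series into an objective performance standard is a dispersion-based threshold (mean minus k standard deviations), sketched here with illustrative numbers:

```python
import statistics

def performance_standard(monthly_output, k=1.0):
    """Set a performance standard from a series of monthly productivity
    figures: mean minus k population standard deviations, so that
    below-standard months can be flagged objectively rather than by
    subjective estimate."""
    mean = statistics.mean(monthly_output)
    std = statistics.pstdev(monthly_output)
    return mean - k * std

months = [100, 104, 98, 102, 96, 100]  # illustrative monthly output
threshold = performance_standard(months)
flagged = [m for m in months if m < threshold]  # months below standard
```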

  4. A method for improved clustering and classification of microscopy images using quantitative co-localization coefficients

    LENUS (Irish Health Repository)

    Singan, Vasanth R

    2012-06-08

    Background: The localization of proteins to specific subcellular structures in eukaryotic cells provides important information with respect to their function. Fluorescence microscopy approaches to determine localization distribution have proved to be an essential tool in the characterization of unknown proteins, and are now particularly pertinent as a result of the wide availability of fluorescently-tagged constructs and antibodies. However, there are currently very few image analysis options able to effectively discriminate proteins with apparently similar distributions in cells, despite this information being important for protein characterization. Findings: We have developed a novel method for combining two existing image analysis approaches, which results in highly efficient and accurate discrimination of proteins with seemingly similar distributions. We have combined image texture-based analysis with quantitative co-localization coefficients, a method that has traditionally only been used to study the spatial overlap between two populations of molecules. Here we describe and present a novel application for quantitative co-localization, as applied to the study of Rab family small GTP binding proteins localizing to the endomembrane system of cultured cells. Conclusions: We show how quantitative co-localization can be used alongside texture feature analysis, resulting in improved clustering of microscopy images. The use of co-localization as an additional clustering parameter is non-biased and highly applicable to high-throughput image data sets.
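
    The standard quantitative co-localization coefficients referred to above are the Manders coefficients: M1 is the fraction of total intensity in one channel found at pixels where the other channel is present, and M2 the converse. A minimal sketch (thresholding at zero for simplicity; real pipelines use background-corrected thresholds):

```python
import numpy as np

def manders_coefficients(red, green):
    """Manders co-localization coefficients M1 and M2 for two channels."""
    red = np.asarray(red, float)
    green = np.asarray(green, float)
    m1 = red[green > 0].sum() / red.sum()     # red intensity where green present
    m2 = green[red > 0].sum() / green.sum()   # green intensity where red present
    return m1, m2

# Tiny illustrative "images" flattened to 1D:
red = np.array([1.0, 2.0, 3.0, 4.0])
green = np.array([0.0, 1.0, 0.0, 2.0])
m1, m2 = manders_coefficients(red, green)  # 0.6, 1.0
```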

  5. Application of sensitivity analysis to a quantitative assessment of neutron cross-section requirements for the TFTR: an interim report

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; Dudziak, D.J.; Muir, D.W.

    1975-09-01

    A computational method to determine cross-section requirements quantitatively is described and applied to the Tokamak Fusion Test Reactor (TFTR). In order to provide a rational basis for the priorities assigned to new cross-section measurements or evaluations, this method includes quantitative estimates of the uncertainty of currently available data, the sensitivity of important nuclear design parameters to selected cross sections, and the accuracy desired in predicting nuclear design parameters. Perturbation theory is used to combine estimated cross-section uncertainties with calculated sensitivities to determine the variance of any nuclear design parameter of interest
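
    The perturbation-theory combination described above is the first-order "sandwich rule": the relative variance of a design parameter R is S C Sᵀ, where S is the vector of relative sensitivities and C the relative covariance matrix of the cross sections. A minimal sketch with hypothetical numbers:

```python
import numpy as np

def response_uncertainty(sensitivities, covariance):
    """First-order propagation of cross-section uncertainties to a nuclear
    design parameter: relative variance = S C S^T."""
    S = np.atleast_1d(np.asarray(sensitivities, float))
    C = np.asarray(covariance, float)
    return float(S @ C @ S)

# Two cross sections with 20% and 10% relative standard deviation,
# assumed uncorrelated (hypothetical values):
S = [0.5, 1.0]                      # relative sensitivities dR/R per dσ/σ
C = np.diag([0.2 ** 2, 0.1 ** 2])   # relative covariance matrix
rel_var = response_uncertainty(S, C)   # 0.02
rel_std = rel_var ** 0.5               # ~14% uncertainty on the parameter
```

Comparing `rel_std` against the accuracy desired for the design parameter is what turns the calculation into a quantitative cross-section requirement.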

  6. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    Science.gov (United States)

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  7. Quantitative isotopes miction cystoureterography (QIMCU)

    International Nuclear Information System (INIS)

    Szy, D.A.G.; Stroetges, M.W.; Funke-Voelkers, R.

    1982-01-01

    A simple method for the quantitative evaluation of vesicoureteral reflux was developed. It allows determination of a) the volume of reflux and b) the volume of the bladder at each point in time during the examination. The QIMCU gives an insight into the dynamics of reflux, reflux volume, and actual bladder volume. Clinical application in 37 patients with 53 insufficient ureteral orifices (i.e. reflux) showed that the onset of reflux occurred in 60% as early as the first five minutes of the examination, but later in the remaining 40%. The maximal reflux was found in only 26% during the first five minutes. The reflux volume exceeded 3.5 ml in more than 50% of cases. The international grading corresponds with the reflux volume determined by this method. Radionuclide cystoureterography can be used in childhood as well as in adulthood. Because the radiation exposure is low, the method can be recommended for the initial examination and for follow-up studies. (Author)
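
    The abstract does not give the authors' formula, but indirect radionuclide volumetry generally rests on one proportionality: with a well-mixed tracer, a region's volume scales with its share of the total counts. A hedged sketch of that arithmetic with made-up numbers:

```python
def volume_from_counts(counts_region, counts_total, instilled_volume_ml):
    """Indirect radionuclide volumetry: if the instilled tracer is well
    mixed, a region's fluid volume is proportional to its share of the
    total counts."""
    return instilled_volume_ml * counts_region / counts_total

# 500 of 10000 total counts seen over the ureter/kidney region,
# with 200 mL of labelled fluid instilled (hypothetical values):
reflux_ml = volume_from_counts(500, 10000, 200.0)  # 10.0 mL
```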

  8. Quantitative imaging of subcellular metabolism with stable isotopes and multi-isotope imaging mass spectrometry

    Science.gov (United States)

    Steinhauser, Matthew L.; Lechene, Claude P.

    2014-01-01

    Multi-isotope imaging mass spectrometry (MIMS) is the quantitative imaging of stable isotope labels in cells with a new type of secondary ion mass spectrometer (NanoSIMS). The power of the methodology is attributable to (i) the immense advantage of using non-toxic stable isotope labels, (ii) high resolution imaging that approaches the resolution of usual transmission electron microscopy and (iii) the precise quantification of label down to 1 part-per-million and spanning several orders of magnitude. Here we review the basic elements of MIMS and describe new applications of MIMS to the quantitative study of metabolic processes including protein and nucleic acid synthesis in model organisms ranging from microbes to humans. PMID:23660233

  9. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    International Nuclear Information System (INIS)

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired

  10. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation occur, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation is achieved by calculating the failure probability of human performance related to the cognitive activities.
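
    The abstract does not detail the calculation, but a common first-order model for the failure probability of a sequence of cognitive activities assumes independence: the task fails if any single activity fails. A hedged sketch with hypothetical error probabilities:

```python
def human_task_failure_probability(error_probs):
    """Failure probability of a sequence of cognitive activities, assuming
    independent failures: P_fail = 1 - prod(1 - p_i)."""
    p_success = 1.0
    for p in error_probs:
        p_success *= (1.0 - p)
    return 1.0 - p_success

# Hypothetical error probabilities for monitoring, diagnosis and action:
p_fail = human_task_failure_probability([0.01, 0.02, 0.005])  # ~0.0347
```

Comparing such totals across different Levels of Automation (each level removing or altering activities in the sequence) is one way to rank their effectiveness quantitatively.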

  11. Safety culture management and quantitative indicator evaluation

    International Nuclear Information System (INIS)

    Mandula, J.

    2002-01-01

    This report discuses a relationship between safety culture and evaluation of quantitative indicators. It shows how a systematic use of generally shared operational safety indicators may contribute to formation and reinforcement of safety culture characteristics in routine plant operation. The report also briefly describes the system of operational safety indicators used at the Dukovany plant. It is a PC database application enabling an effective work with the indicators and providing all users with an efficient tool for making synoptic overviews of indicator values in their links and hierarchical structure. Using color coding, the system allows quick indicator evaluation against predefined limits considering indicator value trends. The system, which has resulted from several-year development, was completely established at the plant during the years 2001 and 2002. (author)

  12. Determination of quantitative tissue composition by iterative reconstruction on 3D DECT volumes

    Energy Technology Data Exchange (ETDEWEB)

    Magnusson, Maria [Linkoeping Univ. (Sweden). Dept. of Electrical Engineering; Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Linkoeping Univ. (Sweden). Center for Medical Image Science and Visualization (CMIV); Malusek, Alexandr [Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Linkoeping Univ. (Sweden). Center for Medical Image Science and Visualization (CMIV); Nuclear Physics Institute AS CR, Prague (Czech Republic). Dept. of Radiation Dosimetry; Muhammad, Arif [Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Carlsson, Gudrun Alm [Linkoeping Univ. (Sweden). Dept. of Medical and Health Sciences, Radiation Physics; Linkoeping Univ. (Sweden). Center for Medical Image Science and Visualization (CMIV)

    2011-07-01

    Quantitative tissue classification using dual-energy CT has the potential to improve accuracy in radiation therapy dose planning as it provides more information about material composition of scanned objects than the currently used methods based on single-energy CT. One problem that hinders successful application of both single- and dual-energy CT is the presence of beam hardening and scatter artifacts in reconstructed data. Current pre- and post-correction methods used for image reconstruction often bias CT attenuation values and thus limit their applicability for quantitative tissue classification. Here we demonstrate simulation studies with a novel iterative algorithm that decomposes every soft tissue voxel into three base materials: water, protein, and adipose. The results demonstrate that beam hardening artifacts can effectively be removed and accurate estimation of mass fractions of each base material can be achieved. Our iterative algorithm starts by calculating parallel projections on two DECT volumes previously reconstructed from fan-beam or helical projections with a small cone-beam angle. The parallel projections are then used in an iterative loop. Future developments include segmentation of soft and bone tissue and subsequent determination of bone composition. (orig.)
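
    At its core, a three-material decomposition like the one above solves a small linear system per voxel: two measured attenuation values (low- and high-energy scans) plus the constraint that the fractions sum to one. A sketch with hypothetical attenuation coefficients (the real method works with mass fractions and iterative artifact correction, which this ignores):

```python
import numpy as np

# Hypothetical linear attenuation coefficients (1/cm) of the base
# materials at the two tube voltages (illustrative values only):
MU = np.array([
    # water  protein  adipose
    [0.20,   0.24,    0.18],   # low-energy scan
    [0.18,   0.20,    0.16],   # high-energy scan
    [1.00,   1.00,    1.00],   # fractions must sum to one
])

def decompose(mu_low, mu_high):
    """Decompose a DECT voxel into water/protein/adipose fractions by
    solving the 3x3 linear system MU @ w = [mu_low, mu_high, 1]."""
    return np.linalg.solve(MU, np.array([mu_low, mu_high, 1.0]))

# A voxel mixed as 70% water, 20% protein, 10% adipose:
w = decompose(0.206, 0.182)  # ~[0.7, 0.2, 0.1]
```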

  13. Quantitative stress measurement of elastic deformation using mechanoluminescent sensor: An intensity ratio model

    Science.gov (United States)

    Cai, Tao; Guo, Songtao; Li, Yongzeng; Peng, Di; Zhao, Xiaofeng; Liu, Yingzheng

    2018-04-01

    The mechanoluminescent (ML) sensor is a newly developed non-invasive technique for stress/strain measurement. However, its application has been mostly restricted to qualitative measurement due to the lack of a well-defined relationship between ML intensity and stress. To achieve accurate stress measurement, an intensity ratio model was proposed in this study to establish a quantitative relationship between the stress condition and its ML intensity in elastic deformation. To verify the proposed model, experiments were carried out on an ML measurement system using resin samples mixed with the sensor material SrAl2O4:Eu2+, Dy3+. The ML intensity ratio was found to be dependent on the applied stress and strain rate, and the relationship acquired from the experimental results agreed well with the proposed model. The current study provides a physical explanation for the relationship between ML intensity and the stress condition. The proposed model is applicable to various SrAl2O4:Eu2+, Dy3+-based ML measurements in elastic deformation, and could provide a useful reference for quantitative stress measurement using the ML sensor in general.

  14. Quantitative imaging features: extension of the oncology medical image database

    Science.gov (United States)

    Patel, M. N.; Looney, P. T.; Young, K. C.; Halling-Brown, M. D.

    2015-03-01

    Radiological imaging is fundamental within the healthcare industry and has become routinely adopted for diagnosis, disease monitoring and treatment planning. With the advent of digital imaging modalities and the rapid growth in both diagnostic and therapeutic imaging, the ability to harness this large influx of data is of paramount importance. The Oncology Medical Image Database (OMI-DB) was created to provide a centralized, fully annotated dataset for research. The database contains both processed and unprocessed images, associated data, and annotations and, where applicable, expert-determined ground truths describing features of interest. Medical imaging provides the ability to detect and localize many changes that are important to determine whether a disease is present or a therapy is effective by depicting alterations in anatomic, physiologic, biochemical or molecular processes. Quantitative imaging features are sensitive, specific, accurate and reproducible imaging measures of these changes. Here, we describe an extension to the OMI-DB whereby a range of imaging features and descriptors are pre-calculated using a high-throughput approach. The ability to calculate multiple imaging features and data from the acquired images is valuable and facilitates further research applications investigating detection, prognosis, and classification. The resultant data store contains more than 10 million quantitative features as well as features derived from CAD predictions. These data can be used to build predictive models to aid image classification and treatment response assessment, as well as to identify prognostic imaging biomarkers.

  15. Application of LC–MS/MS for quantitative analysis of glucocorticoids and stimulants in biological fluids

    Directory of Open Access Journals (Sweden)

    Jamshed Haneef

    2013-10-01

    Full Text Available Liquid chromatography-tandem mass spectrometry (LC–MS/MS) is an important hyphenated technique for quantitative analysis of drugs in biological fluids. Because of its high sensitivity and selectivity, LC–MS/MS has been used for pharmacokinetic studies and metabolite identification in plasma and urine. This manuscript gives a comprehensive analytical review, focusing on chromatographic separation approaches (column packing materials, column length and mobile phase) as well as different acquisition modes (SIM, MRM) for quantitative analysis of glucocorticoids and stimulants. This review is not meant to be exhaustive but rather to provide a general overview for detection and confirmation of target drugs using LC–MS/MS, and is thus useful in doping analysis and toxicological studies as well as in pharmaceutical analysis. Keywords: LC–MS/MS, Ionization techniques, Glucocorticoids, Stimulants, Hyphenated techniques, Biological fluid

  16. Quantitative sub-surface and non-contact imaging using scanning microwave microscopy

    International Nuclear Information System (INIS)

    Gramse, Georg; Kasper, Manuel; Hinterdorfer, Peter; Brinciotti, Enrico; Rankl, Christian; Kienberger, Ferry; Lucibello, Andrea; Marcelli, Romolo; Patil, Samadhan B.; Giridharagopal, Rajiv

    2015-01-01

    The capability of scanning microwave microscopy for calibrated sub-surface and non-contact capacitance imaging of silicon (Si) samples is quantitatively studied at broadband frequencies ranging from 1 to 20 GHz. Calibrated capacitance images of flat Si test samples with varying dopant density (10^15–10^19 atoms cm^-3) and covered with dielectric thin films of SiO2 (100–400 nm thickness) are measured to demonstrate the sensitivity of scanning microwave microscopy (SMM) for sub-surface imaging. Using standard SMM imaging conditions the dopant areas could still be sensed under a 400 nm thick oxide layer. Non-contact SMM imaging in lift-mode and constant height mode is quantitatively demonstrated on a 50 nm thick SiO2 test pad. The differences between non-contact and contact mode capacitances are studied with respect to the main parameters influencing the imaging contrast, namely the probe tip diameter and the tip–sample distance. Finite element modelling was used to further analyse the influence of the tip radius and the tip–sample distance on the SMM sensitivity. The understanding of how the two key parameters determine the SMM sensitivity and quantitative capacitances represents an important step towards its routine application for non-contact and sub-surface imaging. (paper)

  17. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    Science.gov (United States)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.
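
    Converting a measured fluorescence change back into a dose-dependent concentration readout, as described above, amounts to inverting a calibrated binding curve. A generic sketch using a one-site (Langmuir) response; the K_d value is hypothetical and the paper's actual calibration model is not stated in the abstract:

```python
def concentration_from_signal(F, F_max, K_d):
    """Invert a one-site binding response F = F_max * C / (K_d + C)
    to recover the analyte concentration from a measured normalized
    fluorescence change."""
    return K_d * F / (F_max - F)

# With a hypothetical K_d of 100 uM, a half-maximal response
# corresponds to C = K_d:
c = concentration_from_signal(0.5, 1.0, 100.0)  # 100.0 uM
```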

  18. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.
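One workhorse of quantitative MS is stable-isotope dilution: the analyte is quantified from its peak-area ratio to a spiked isotope-labelled internal standard of known concentration. A minimal sketch of the arithmetic; the areas, concentration, and response factor below are invented for illustration:

```python
def quantify_by_internal_standard(area_analyte, area_istd, istd_conc,
                                  response_factor=1.0):
    """Analyte concentration from the peak-area ratio to an isotope-labelled
    internal standard; response_factor corrects for unequal ionization."""
    return (area_analyte / area_istd) * istd_conc / response_factor

# Sample spiked with 10 ng/mL of labelled standard; measured peak areas:
conc = quantify_by_internal_standard(area_analyte=5.2e5,
                                     area_istd=2.6e5,
                                     istd_conc=10.0)
print(f"analyte ~ {conc:.1f} ng/mL")  # area ratio 2.0 -> 20.0 ng/mL
```

Because analyte and labelled standard co-elute and ionize nearly identically, the ratio cancels most matrix and instrument drift, which is what makes the method quantitative.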

  19. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
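BMD from DXA is conventionally reported as a T-score, the number of standard deviations from a young-adult reference mean, with WHO cut-offs at −1.0 and −2.5. A sketch of the computation; the reference mean and SD below are placeholders, not actual population values:

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """DXA T-score: standard deviations from the young-adult reference mean."""
    return (bmd - young_adult_mean) / young_adult_sd

def who_category(t):
    """WHO classification: T >= -1 normal, -2.5 < T < -1 osteopenia,
    T <= -2.5 osteoporosis."""
    if t >= -1.0:
        return "normal"
    if t > -2.5:
        return "osteopenia"
    return "osteoporosis"

# Hypothetical lumbar-spine measurement and reference values (g/cm^2)
t = t_score(bmd=0.80, young_adult_mean=1.00, young_adult_sd=0.11)
print(f"T-score = {t:.2f} -> {who_category(t)}")
```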

  20. The application of high-speed cinematography for the quantitative analysis of equine locomotion.

    Science.gov (United States)

    Fredricson, I; Drevemo, S; Dalin, G; Hjertën, G; Björne, K

    1980-04-01

    Locomotive disorders constitute a serious problem in horse racing which will only be rectified by a better understanding of the causative factors associated with disturbances of gait. This study describes a system for the quantitative analysis of the locomotion of horses at speed. The method is based on high-speed cinematography with a semi-automatic system of analysis of the films. The recordings are made with a 16 mm high-speed camera run at 500 frames per second (fps) and the films are analysed by special film-reading equipment and a mini-computer. The time and linear gait variables are presented in tabular form and the angles and trajectories of the joints and body segments are presented graphically.
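At 500 fps, the temporal gait variables described above follow directly from the frame numbers at which events occur. A minimal sketch; the hoof-contact frames and stride length are invented, not data from the study:

```python
FPS = 500  # camera frame rate used in the study

def stride_variables(contact_frames, stride_length_m):
    """Temporal gait variables from the frame numbers at which a hoof
    contacts the ground on successive strides."""
    stride_times = [(b - a) / FPS
                    for a, b in zip(contact_frames, contact_frames[1:])]
    mean_time = sum(stride_times) / len(stride_times)
    speed = stride_length_m / mean_time
    return stride_times, mean_time, speed

# Hypothetical hoof-contact frames for four successive contacts, 5 m stride
times, mean_t, speed = stride_variables([0, 240, 485, 727], 5.0)
print(f"stride times {times}, mean {mean_t:.3f} s, speed {speed:.1f} m/s")
```

The 2 ms frame interval sets the timing resolution, which is why high-speed (rather than standard 24–30 fps) cinematography is needed for gait at racing speed.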

  1. 78 FR 76171 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2013-12-16

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and...: Geoffrey Ridder; Utopia, TX; PRT-00030B The applicant requests a permit to export the sport-hunted trophy... period. Multiple Applicants The following applicants each request a permit to import the sport- hunted...

  2. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  3. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review

    Directory of Open Access Journals (Sweden)

    Prasanna A. Datar

    2015-08-01

    Full Text Available Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and quantitative analysis of carbamazepine in biological samples throughout all phases of clinical research and quality control. The article incorporates various reported methods developed to help analysts in choosing crucial parameters for new method development of carbamazepine and its derivatives, and also enumerates the metabolites and impurities reported so far. Keywords: Carbamazepine, HPLC, LC–MS/MS, HPTLC, RP-UFLC, Micellar electrokinetic chromatography
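Quantitative bioanalysis of a drug like carbamazepine ultimately rests on a calibration curve: standards of known concentration establish a response line, and unknowns are back-calculated from it. A sketch of least-squares calibration; the standard concentrations and peak areas are invented for illustration:

```python
def linear_calibration(concs, responses):
    """Least-squares slope and intercept of response = m * conc + b."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(responses) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(concs, responses))
         / sum((x - mx) ** 2 for x in concs))
    b = my - m * mx
    return m, b

def back_calculate(response, m, b):
    """Concentration of an unknown sample from its measured response."""
    return (response - b) / m

# Hypothetical carbamazepine standards (ug/mL) and their peak areas
concs = [0.5, 1.0, 2.0, 5.0, 10.0]
areas = [12.1, 24.3, 48.0, 121.5, 242.2]
m, b = linear_calibration(concs, areas)
unknown = back_calculate(60.0, m, b)
print(f"slope {m:.2f}, intercept {b:.2f}, unknown ~ {unknown:.2f} ug/mL")
```

Method validation then characterizes this curve's linearity, accuracy, and precision across the expected concentration range.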

  4. Four Popular Books on Consumer Debt: A Context for Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Andrew J. Miller

    2011-01-01

    Full Text Available The topics of credit cards, mortgages, subprime lending, and fringe banking are rich sources of problems and discussions for classes focused on quantitative literacy. In this theme book review, we look at four recent books on the consumer debt industry: Credit Card Nation, by Robert Manning; Maxed Out, by James Scurlock; Collateral Damaged, by Charles Geisst; and Broke, USA, by Gary Rivlin. Credit Card Nation takes a scholarly look at the history of credit in America with a focus on the genesis and growth of the credit card industry up to the turn of the twenty-first century. Maxed Out also examines the credit card industry, but its approach is to highlight the stories of individuals struggling with debt and thereby examine some of the damaging effects of credit card debt in the United States. Collateral Damaged is a timely exploration of the root causes at the institutional level of the credit crisis that began in 2008. Broke, USA focuses on high-cost financing (pawn shops, payday loans, title loans), describing the history of what Rivlin calls the "poverty industry" and the political and legal challenges critics have mounted against the industry. Each of these books has something to offer a wide variety of quantitative literacy classes, providing scenarios, statistics, and problems worthy of examination. After reviewing each of the four books, we provide several examples of such quantitative literacy applications and close with some thoughts on the relationship between financial literacy and quantitative literacy.
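A classic quantitative-literacy exercise drawn from this material: how long small, fixed payments take to retire a credit card balance, and how much interest accrues along the way. A minimal simulation sketch with hypothetical terms (balance, APR, and payments are invented):

```python
def months_to_payoff(balance, apr, payment):
    """Simulate month-by-month repayment with a fixed payment and monthly
    compounding; returns (months to payoff, total interest paid)."""
    monthly_rate = apr / 12.0
    months, total_interest = 0, 0.0
    while balance > 0:
        interest = balance * monthly_rate
        if payment <= interest:
            raise ValueError("payment never retires the debt")
        total_interest += interest
        balance = balance + interest - payment
        months += 1
    return months, total_interest

# $3,000 at 18% APR: a $60/month payment vs. a $150/month payment
m1, i1 = months_to_payoff(3000.0, 0.18, 60.0)
m2, i2 = months_to_payoff(3000.0, 0.18, 150.0)
print(f"$60/mo:  {m1} months, ${i1:,.0f} interest")
print(f"$150/mo: {m2} months, ${i2:,.0f} interest")
```

The near-order-of-magnitude gap between the two schedules is exactly the kind of result these books use to motivate quantitative reasoning about debt.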

  5. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    SHM/FM theory has been successfully applied to the selection of the baseline set of Abort Triggers for the NASA SLS:
    - Quantitative assessment played a useful role in the decision process.
    - M&FM, which is new within NASA MSFC, required the most "new" work, as this quantitative analysis had never been done before; it required development of the methodology and a tool to mechanize the process, and established new relationships to the other groups.
    - The process is now an accepted part of the SLS design process, and will likely be applied to similar programs in the future at NASA MSFC.
    - Future improvements include improving technical accuracy (differentiating crew survivability due to an abort from survivability when no immediate abort occurs, such as a small explosion with little debris; accounting for the contingent dependence of secondary triggers on primary triggers; and allocating the "LOC benefit" of each trigger when added to the previously selected triggers) and reducing future costs through the development of a specialized tool.
    - The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.
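Selecting triggers by their marginal loss-of-crew benefit resembles a weighted set-cover problem, for which a greedy heuristic is the natural sketch. The trigger names, coverage sets, and probabilities below are invented for illustration and are not SLS data:

```python
def marginal_benefit(trigger_cover, covered, scenario_prob):
    """Probability of scenarios this trigger would newly cover."""
    return sum(scenario_prob[s] for s in trigger_cover - covered)

def select_triggers(triggers, scenario_prob, max_triggers):
    """Greedily add the trigger with the largest marginal reduction in
    P(loss of crew), i.e. newly covered scenario probability."""
    covered, chosen = set(), []
    for _ in range(max_triggers):
        best = max(triggers, key=lambda t: marginal_benefit(
            triggers[t], covered, scenario_prob))
        gain = marginal_benefit(triggers[best], covered, scenario_prob)
        if gain <= 0:
            break
        chosen.append((best, gain))
        covered |= triggers[best]
    return chosen

# Hypothetical failure scenarios with per-mission probabilities
p = {"engine_out": 1e-3, "tank_overpressure": 4e-4, "loss_of_control": 2e-4}
triggers = {
    "chamber_pressure_low": {"engine_out"},
    "ullage_pressure_high": {"tank_overpressure"},
    "attitude_rate_high": {"loss_of_control", "engine_out"},
}
for name, gain in select_triggers(triggers, p, 2):
    print(f"{name}: marginal P(LOC) benefit {gain:.1e}")
```

The greedy structure also shows why the benefit of a trigger must be computed relative to those already selected, as the abstract's "contingent dependence" point notes.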

  6. 75 FR 52971 - Receipt of Applications for Permit

    Science.gov (United States)

    2010-08-30

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... to import the sport- hunted trophy of one male bontebok (Damaliscus pygargus pygargus) culled from a... applicant requests a permit to re-export a sport- hunted trophy of one male bontebok (Damaliscus pygargus...

  7. Quantitative Decision Making Model for Carbon Reduction in Road Construction Projects Using Green Technologies

    Directory of Open Access Journals (Sweden)

    Woosik Jang

    2015-08-01

    Full Text Available Numerous countries have established policies for reducing greenhouse gas emissions and have suggested goals pertaining to these reductions. To reach the target reduction amounts, studies on the reduction of carbon emissions have been conducted with regard to all stages and processes in construction projects. According to a study on carbon emissions, the carbon emissions generated during the construction stage of road projects account for approximately 76 to 86% of the total carbon emissions, far exceeding those of the other stages, such as maintenance or demolition. Therefore, this study aims to develop a quantitative decision making model that supports the application of green technologies (GTs) to reduce carbon emissions during the construction stage of road construction projects. First, the authors selected environmental soundness, economic feasibility and constructability as the key assessment indices for evaluating 20 GTs. Second, a fuzzy set/qualitative comparative analysis (FS/QCA) was used to establish an objective decision-making model for the assessment of both the quantitative and qualitative characteristics of the key indices. To support the developed model, an expert survey was performed to assess the applicability of each GT from a practical perspective, which was verified with a case study using two additional GTs. The proposed model is expected to support practitioners in the application of suitable GTs to road projects and reduce carbon emissions, resulting in better decision making during road construction projects.
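The aggregation of the three indices can be sketched with simple linear fuzzy memberships and a weighted sum. This is a simplification of FS/QCA, which also involves calibration and Boolean minimization; all thresholds, weights, and technology scores below are invented:

```python
def fuzzy_membership(x, lo, hi):
    """Linear membership: 0 at or below lo, 1 at or above hi, linear between."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def gt_score(co2_reduction_pct, cost_index, constructability_index,
             weights=(0.5, 0.3, 0.2)):
    """Aggregate environmental soundness, economic feasibility, and
    constructability into a single fuzzy score in [0, 1]."""
    env = fuzzy_membership(co2_reduction_pct, 0.0, 20.0)
    econ = fuzzy_membership(cost_index, 0.0, 1.0)
    con = fuzzy_membership(constructability_index, 0.0, 1.0)
    w_env, w_econ, w_con = weights
    return w_env * env + w_econ * econ + w_con * con

# Two hypothetical green technologies for a road project
score_a = gt_score(co2_reduction_pct=15.0, cost_index=0.8,
                   constructability_index=0.6)
score_b = gt_score(co2_reduction_pct=5.0, cost_index=0.9,
                   constructability_index=0.9)
print(f"GT-A: {score_a:.3f}, GT-B: {score_b:.3f}")
```

Ranking GTs by such a score is one way the qualitative expert-survey inputs and the quantitative emission figures can land on a common scale.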

  8. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data was pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
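The pooling step can be illustrated with a naive fixed-effect aggregation of the per-study 2×2 counts. The study counts below are invented, and real diagnostic meta-analyses (including this one) typically use bivariate random-effects models rather than simple count pooling:

```python
def pooled_accuracy(studies):
    """Pooled sensitivity and specificity from per-study (TP, FP, FN, TN)
    counts by summing the 2x2 tables (naive fixed-effect illustration)."""
    tp = sum(s[0] for s in studies)
    fp = sum(s[1] for s in studies)
    fn = sum(s[2] for s in studies)
    tn = sum(s[3] for s in studies)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Three hypothetical CMR perfusion studies: (TP, FP, FN, TN)
studies = [(45, 8, 5, 42), (30, 6, 6, 28), (60, 12, 8, 50)]
sens, spec = pooled_accuracy(studies)
print(f"pooled sensitivity {sens:.2f}, specificity {spec:.2f}")
```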

  9. A review of state-of-the-art stereology for better quantitative 3D morphology in cardiac research.

    Science.gov (United States)

    Mühlfeld, Christian; Nyengaard, Jens Randel; Mayhew, Terry M

    2010-01-01

    The aim of stereological methods in biomedical research is to obtain quantitative information about three-dimensional (3D) features of tissues, cells, or organelles from two-dimensional physical or optical sections. With immunogold labeling, stereology can even be used for the quantitative analysis of the distribution of molecules within tissues and cells. Nowadays, a large number of design-based stereological methods offer an efficient quantitative approach to intriguing questions in cardiac research, such as "Is there a significant loss of cardiomyocytes during progression from ventricular hypertrophy to heart failure?" or "Does a specific treatment reduce the degree of fibrosis in the heart?" Nevertheless, the use of stereological methods in cardiac research is rare. The present review article demonstrates how some of the potential pitfalls in quantitative microscopy may be avoided. To this end, we outline the concepts of design-based stereology and illustrate their practical applications to a wide range of biological questions in cardiac research. We hope that the present article will stimulate researchers in cardiac research to incorporate design-based stereology into their study designs, thus promoting an unbiased quantitative 3D microscopy.
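One of the simplest design-based stereological estimators is point counting for volume fraction (e.g. the degree of fibrosis asked about above): throw an unbiased grid of test points on sections and take the hit fraction. A sketch with a simulated section; the true fraction and point count are invented:

```python
import random

def volume_fraction_by_point_counting(hits, total_points):
    """Delesse/Cavalieri point-counting estimator: Vv = P_hit / P_total."""
    return hits / total_points

def simulate_point_count(true_fraction, n_points, seed=1):
    """Simulate throwing a random test-point grid on a section and counting
    hits on the phase of interest (e.g. fibrotic tissue)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_points) if rng.random() < true_fraction)
    return volume_fraction_by_point_counting(hits, n_points)

est = simulate_point_count(true_fraction=0.25, n_points=2000)
print(f"estimated volume fraction ~ {est:.3f} (true 0.250)")
```

The estimator is unbiased by design, which is the central claim of design-based stereology; counting more points only narrows the variance.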

  10. The evolution of medical imaging from qualitative to quantitative: opportunities, challenges, and approaches (Conference Presentation)

    Science.gov (United States)

    Jackson, Edward F.

    2016-04-01

    Over the past decade, there has been an increasing focus on quantitative imaging biomarkers (QIBs), which are defined as "objectively measured characteristics derived from in vivo images as indicators of normal biological processes, pathogenic processes, or response to a therapeutic intervention" [1]. To evolve qualitative imaging assessments to the use of QIBs requires the development and standardization of data acquisition, data analysis, and data display techniques, as well as appropriate reporting structures. As such, successful implementation of QIB applications relies heavily on expertise from the fields of medical physics, radiology, statistics, and informatics as well as collaboration from vendors of imaging acquisition, analysis, and reporting systems. When successfully implemented, QIBs will provide image-derived metrics with known bias and variance that can be validated with anatomically and physiologically relevant measures, including treatment response (and the heterogeneity of that response) and outcome. Such non-invasive quantitative measures can then be used effectively in clinical and translational research and will contribute significantly to the goals of precision medicine. This presentation will focus on 1) outlining the opportunities for QIB applications, with examples to demonstrate applications in both research and patient care, 2) discussing key challenges in the implementation of QIB applications, and 3) providing overviews of efforts to address such challenges from federal, scientific, and professional organizations, including, but not limited to, the RSNA, NCI, FDA, and NIST. [1] Sullivan, Obuchowski, Kessler, et al. Radiology, epub August 2015.

  11. Quantitative modeling of gene networks of biological systems using fuzzy Petri nets and fuzzy sets

    Directory of Open Access Journals (Sweden)

    Raed I. Hamed

    2018-01-01

    Full Text Available Quantitative modelling of biological systems has become an essential computational approach in the design of novel, and the analysis of existing, biological systems. However, the kinetic data that describe the system's dynamics need to be known in order to obtain relevant results with conventional modelling techniques, and such data are frequently hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modelling approach that can cope with unknown kinetic data and thus produce relevant results even though the kinetic data are incomplete or only vaguely defined. Moreover, the approach can be combined with existing state-of-the-art quantitative modelling methods, applying the fuzzy treatment only in those parts of the system where information is missing. The case study of the approach suggested in this paper is performed on a nine-gene network model. We propose a fuzzy Petri net (FPN) model based on fuzzy sets to handle the quantitative modelling of biological systems. Tests of our model show that it is practical and effective for data imitation and for reasoning in fuzzy expert systems.
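A common fuzzy Petri net convention propagates truth degrees by taking the minimum over a transition's input places, scaling by the rule's certainty factor, and combining alternative transitions into a place with the maximum. A minimal sketch; the rules, genes, and certainty factors below are hypothetical, not the paper's nine-gene model:

```python
def fire_fuzzy_rule(antecedent_truths, rule_certainty):
    """Fuzzy Petri net transition: output token value is
    min(antecedent truth degrees) scaled by the rule certainty factor."""
    return min(antecedent_truths) * rule_certainty

def place_value(incoming):
    """A place fed by several transitions takes the max (fuzzy OR)."""
    return max(incoming)

# Hypothetical two-rule fragment of a gene network:
#   R1: IF geneA high AND geneB high THEN geneC expressed (certainty 0.9)
#   R2: IF geneD high THEN geneC expressed (certainty 0.6)
geneA, geneB, geneD = 0.8, 0.7, 0.5
geneC = place_value([
    fire_fuzzy_rule([geneA, geneB], 0.9),  # min(0.8, 0.7) * 0.9 = 0.63
    fire_fuzzy_rule([geneD], 0.6),         # 0.5 * 0.6 = 0.30
])
print(f"geneC truth degree = {geneC:.2f}")
```

Because only truth degrees and certainty factors are needed, the net can still reason where kinetic rate constants are unknown, which is the point of the approach.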

  12. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    Science.gov (United States)

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment methods for urban natural gas pipeline network are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline network, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both of the two methods can be applied to practical application, and the choice of the methods depends on the actual basic data of the gas pipelines and the precision requirements of risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
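Individual risk at a fixed location is typically computed by summing, over accident scenarios, the scenario frequency times the conditional probability of fatality at that location. A minimal sketch; the scenario frequencies and fatality probabilities below are invented, not values from the paper:

```python
def individual_risk(scenarios):
    """Location-specific individual risk per year: sum over scenarios of
    (frequency per year) * (conditional probability of fatality there)."""
    return sum(freq * p_fatal for freq, p_fatal in scenarios)

# Hypothetical scenarios at one location near an urban gas pipeline:
# (frequency / yr, P(fatality at this location | scenario))
scenarios = [
    (1e-5, 0.8),   # jet flame
    (5e-6, 0.5),   # fireball combustion
    (2e-6, 0.9),   # UVCE
    (3e-5, 0.05),  # toxic gas diffusion
]
ir = individual_risk(scenarios)
print(f"individual risk ~ {ir:.2e} per year")
```

Social risk extends the same ingredients to an F-N curve by also counting the number of people exposed in each scenario.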

  13. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental study in wet laboratory. In this way, natural biochemical systems can be better understood.
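The quantitative half of the framework, optimising kinetic rates with simulated annealing, can be sketched on a one-parameter toy problem. The decay model, rate bounds, and annealing schedule below are invented for illustration, not the paper's setup:

```python
import math
import random

def anneal_rate(observed, model, lo, hi, steps=5000, seed=0):
    """Simulated annealing over one kinetic rate k, minimizing the squared
    error between the model trajectory and observed data."""
    rng = random.Random(seed)

    def cost(k):
        return sum((o - m) ** 2 for o, m in zip(observed, model(k)))

    k = rng.uniform(lo, hi)
    c = cost(k)
    best_k, best_c = k, c
    for step in range(steps):
        temp = max(1e-6, 1.0 - step / steps)  # linear cooling schedule
        # Gaussian proposal whose width shrinks with the temperature
        cand = min(hi, max(lo, k + rng.gauss(0.0, 0.1 * (hi - lo) * temp)))
        cc = cost(cand)
        # accept improvements always; worse moves with Boltzmann probability
        if cc < c or rng.random() < math.exp(-(cc - c) / temp):
            k, c = cand, cc
            if c < best_c:
                best_k, best_c = k, c
    return best_k

# Target: first-order decay x(t) = exp(-k t), sampled at t = 0..5, true k = 0.7
ts = [0.5 * i for i in range(11)]
model = lambda k: [math.exp(-k * t) for t in ts]
observed = model(0.7)
k_hat = anneal_rate(observed, model, 0.0, 5.0)
print(f"recovered k ~ {k_hat:.3f} (true 0.7)")
```

In the paper's framework the structure found by qualitative learning fixes which rates exist; the annealer then only searches this continuous space.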

  14. Quantitative Examination of Piezoelectric/Seismoelectric Anomalies from Near-Surface Targets

    Directory of Open Access Journals (Sweden)

    Lev Eppelbaum

    2017-09-01

    Full Text Available The piezoelectric and seismo-electrokinetic phenomena are manifested by electrical and electromagnetic processes that occur in rocks under the influence of elastic oscillations triggered by shots or mechanical impacts. Differences in piezoelectric properties between the studied targets and the host media determine the applicability of the piezoelectric/seismoelectric method. For a long time, interpretation of the acquired data has been carried out using methods developed in seismic prospecting. Examination of the nature of piezoelectric/seismoelectric anomalies observed in the subsurface indicates that these may be related (mainly) to an electric potential field. In this paper, it is shown that quantitative analysis of piezoelectric/seismoelectric anomalies may be performed with the advanced and reliable methodologies developed in magnetic prospecting. Examples from mining geophysics (Russia) and an ancient metallurgical site (Israel) confirm the applicability of the suggested approach.

  15. 77 FR 54604 - Endangered Species; Receipt of Applications for Permit

    Science.gov (United States)

    2012-09-05

    ...) Those supported by quantitative information or studies; and (2) Those that include citations to, and... applicant requests a permit to export sport hunted trophies of one male addax (Addax nasomaculatus), one.... Applicant: John Fry, Carson City, NV; PRT-82592A The applicant requests a permit to import a sport-hunted...

  16. Infusion of Quantitative and Statistical Concepts into Biology Courses Does Not Improve Quantitative Literacy

    Science.gov (United States)

    Beck, Christopher W.

    2018-01-01

    Multiple national reports have pushed for the integration of quantitative concepts into the context of disciplinary science courses. The aim of this study was to evaluate the quantitative and statistical literacy of biology students and explore learning gains when those skills were taught implicitly in the context of biology. I examined gains in…

  17. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    Science.gov (United States)

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. 
All rights reserved.
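Of the three quantitative methods assessed above, multi-criteria decision analysis is the easiest to sketch: criteria are normalized to a common scale and aggregated with stakeholder weights. A minimal weighted-sum illustration; the criteria, weights, and scores below are hypothetical, not from the article:

```python
def mcda_score(performance, weights):
    """Weighted-sum multi-criteria score. Criteria are pre-scaled to [0, 1]
    with higher = better (risks entered as 1 - normalized severity)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * performance[c] for c in weights)

# Hypothetical benefit-risk profile of an imaging device
weights = {"diagnostic_accuracy": 0.4, "patient_comfort": 0.1,
           "radiation_safety": 0.3, "device_reliability": 0.2}
device = {"diagnostic_accuracy": 0.85, "patient_comfort": 0.7,
          "radiation_safety": 0.9, "device_reliability": 0.95}
print(f"benefit-risk score = {mcda_score(device, weights):.3f}")
```

The article's criteria for regulatory use (aggregating multiple benefits and risks, making preferences explicit) map directly onto the weights dictionary; handling uncertainty would require extending this with, e.g., distributions over the scores.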

  18. Metallurgical applications of the Moessbauer effect

    International Nuclear Information System (INIS)

    Flinn, P.A.

    1975-01-01

    Recent developments and practical applications of the Moessbauer effect are reviewed. Moessbauer studies into solid solutions, phase transformations in certain alloy systems and steels, deformation-induced transformations in and corrosion of steels are discussed. Also discussed are the applications of Moessbauer spectroscopy in process metallurgy for diffusion measurements in solids and in an accurate quantitative analysis. The use of backscatter geometry is dealt with. (Z.S.)

  19. Joint analysis of binary and quantitative traits with data sharing and outcome-dependent sampling.

    Science.gov (United States)

    Zheng, Gang; Wu, Colin O; Kwak, Minjung; Jiang, Wenhua; Joo, Jungnam; Lima, Joao A C

    2012-04-01

    We study the analysis of a joint association between a genetic marker with both binary (case-control) and quantitative (continuous) traits, where the quantitative trait values are only available for the cases due to data sharing and outcome-dependent sampling. Data sharing becomes common in genetic association studies, and the outcome-dependent sampling is the consequence of data sharing, under which a phenotype of interest is not measured for some subgroup. The trend test (or Pearson's test) and F-test are often, respectively, used to analyze the binary and quantitative traits. Because of the outcome-dependent sampling, the usual F-test can be applied using the subgroup with the observed quantitative traits. We propose a modified F-test by also incorporating the genotype frequencies of the subgroup whose traits are not observed. Further, a combination of this modified F-test and Pearson's test is proposed by Fisher's combination of their P-values as a joint analysis. Because of the correlation of the two analyses, we propose to use a Gamma (scaled chi-squared) distribution to fit the asymptotic null distribution for the joint analysis. The proposed modified F-test and the joint analysis can also be applied to test single trait association (either binary or quantitative trait). Through simulations, we identify the situations under which the proposed tests are more powerful than the existing ones. Application to a real dataset of rheumatoid arthritis is presented. © 2012 Wiley Periodicals, Inc.
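Fisher's combination of the two P-values can be sketched directly; for two tests the null statistic under independence is chi-squared with 4 df, whose survival function has a closed form. Note the sketch assumes independence, whereas the paper fits a Gamma (scaled chi-squared) null precisely because the trend test and F-test are correlated; the example P-values are invented:

```python
import math

def fisher_statistic(p_values):
    """Fisher's combination statistic T = -2 * sum(ln p_i)."""
    return -2.0 * sum(math.log(p) for p in p_values)

def chi2_sf_4df(x):
    """Survival function of chi-squared with 4 df (closed form):
    P(X > x) = exp(-x/2) * (1 + x/2)."""
    return math.exp(-x / 2.0) * (1.0 + x / 2.0)

# Combine a trend-test p-value (binary trait) with an F-test p-value
# (quantitative trait) for one marker
p_trend, p_f = 0.01, 0.04
t = fisher_statistic([p_trend, p_f])
p_joint = chi2_sf_4df(t)
print(f"T = {t:.2f}, joint p ~ {p_joint:.4f}")
```

Replacing the chi-squared-4 reference distribution with a moment-matched Gamma is what absorbs the correlation induced by sharing the same genotype data.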

  20. Quantitative Literacy and the Common Core State Standards in Mathematics

    Directory of Open Access Journals (Sweden)

    Bernard L. Madison

    2015-01-01

    Full Text Available How supportive of quantitative literacy (QL) are the Common Core State Standards in Mathematics (CCSSM)? The answer is tentative and conditional. There are some QL-supportive features, including a strong probability and statistics strand in grade 6 through high school; a measurements and data strand in K-5; ratio and proportional reasoning standards in grades 6 and 7; and a comprehensive and coherent approach to algebraic reasoning and logical argument. However, the standards are weak in supporting reasoning and interpretation, and there are indications that the applications in CCSSM – mostly unspecified – will not include many QL contextual situations. Early indicators of assessment items follow a similar path. Except for statistics, most of the high school standards are aimed at development of algebra and precalculus topics, and there will likely be little room for more sophisticated applications of the QL-friendly mathematics of grades 6-8. The experience with CCSSM is limited at this point, leaving several crucial results uncertain, including assessments, emphases on statistics, and kinds of modeling and other applications.

  1. Particle induced X-ray emission for quantitative trace-element analysis using the Eindhoven cyclotron

    International Nuclear Information System (INIS)

    Kivits, H.

    1980-01-01

    Development of a multi-elemental trace analysis technique using PIXE (Particle Induced X-ray Emission), was started almost five years ago at the Eindhoven University of Technology, in the Cyclotron Applications Group of the Physics Department. The aim of the work presented is to improve the quantitative aspects of trace-element analysis with PIXE, as well as versatility, speed and simplicity. (Auth.)

  2. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  3. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.
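The core arithmetic of quantitative EPR is the double integral: spectra are recorded as first derivatives, and the spin count is proportional to the area obtained by integrating twice, referenced against a standard of known spin number measured under identical conditions (Q, B1, Bm, gain). A sketch on synthetic Gaussian-derivative lines; the spectra and reference spin count are invented:

```python
import math

def double_integral(derivative_spectrum, field_step):
    """Double integral of a first-derivative EPR spectrum via two
    trapezoid passes; proportional to the number of spins."""
    # first integration: derivative -> absorption line
    absorption, acc, prev = [0.0], 0.0, derivative_spectrum[0]
    for y in derivative_spectrum[1:]:
        acc += 0.5 * (prev + y) * field_step
        absorption.append(acc)
        prev = y
    # second integration: absorption -> area
    return sum(0.5 * (a + b) * field_step
               for a, b in zip(absorption, absorption[1:]))

def spins_in_sample(sample_spectrum, ref_spectrum, ref_spins, field_step):
    """Spin count by ratio to a reference standard measured under
    identical instrument conditions."""
    return (ref_spins * double_integral(sample_spectrum, field_step)
            / double_integral(ref_spectrum, field_step))

# Synthetic Gaussian-derivative lines; sample has twice the ref amplitude
xs = [i * 0.1 - 5.0 for i in range(101)]
deriv = lambda a: [-a * x * math.exp(-x * x / 2.0) for x in xs]
n = spins_in_sample(deriv(2.0), deriv(1.0), ref_spins=1e15, field_step=0.1)
print(f"sample spins ~ {n:.2e}")
```

In practice the baseline correction before each integration pass dominates the error budget, which is one reason the book devotes attention to it.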

  4. The Quantitative Preparation of Future Geoscience Graduate Students

    Science.gov (United States)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways.

  5. Quantitative diffusion characteristics of the human brain depend on MRI sequence parameters

    International Nuclear Information System (INIS)

    Wilson, M.; Blumhardt, L.D.; Morgan, P.S.

    2002-01-01

    Quantitative diffusion-weighted MRI has been applied to the study of neurological diseases, including multiple sclerosis, where the molecular self-diffusion coefficient D has been measured in both lesions and normal-appearing white matter. Histograms of D have been used as a novel measure of the ''lesion load'', with potential applications that include the monitoring of efficacy in new treatment trials. However, different ways of measuring D may affect its value, making comparison between different centres and research groups impossible. We aimed to assess the effect, if any, of using two different MRI sequences on the value of D. We studied 13 healthy volunteers, using two different quantitative diffusion sequences (including different b_max values and gradient applications). Maps of D were analysed using both regions of interest (ROI) in white matter and ''whole brain'' histograms, and compared between the two sequences. In addition, we studied three standardised test liquids (with known values of D) using both sequences. Histograms from the two sequences had different distributions, with a greater spread and higher peak position from the sequence with lower b_max. This greater spread of D was also evident in the white matter and test liquid ROI. ''Limits of agreement'' analysis demonstrated that the differences could be clinically relevant, despite significant correlations between the sequences obtained using simple rank methods. We conclude that different quantitative diffusion sequences are unlikely to produce directly comparable values of D, particularly if different b_max values are used. In addition, the use of inappropriate statistical tests may give false impressions of close agreement. Standardisation of methods for the measurement of D is required if these techniques are to become useful tools, for example in monitoring changes in the disease burden of multiple sclerosis. (orig.)
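
    The monoexponential signal model behind the diffusion coefficient D can be made concrete with a short sketch. This is an illustrative calculation only; the signal values and b-values below are hypothetical, not taken from the study:

```python
import math

def diffusion_coefficient(s_low, s_high, b_low, b_high):
    """Self-diffusion coefficient D from two diffusion-weighted signals,
    assuming the monoexponential model S(b) = S0 * exp(-b * D)."""
    return math.log(s_low / s_high) / (b_high - b_low)

# Hypothetical signals: S = 1000 at b = 0 and S = 368 at b = 1000 s/mm^2
D = diffusion_coefficient(1000.0, 368.0, 0.0, 1000.0)
print(round(D * 1e3, 2))  # D in units of 10^-3 mm^2/s
```

    Because D is derived from the ratio of two signals, choosing different b-values (as the two sequences in the record did) changes the noise sensitivity of the estimate even when the model is identical.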

  6. Quantitative diffusion characteristics of the human brain depend on MRI sequence parameters

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, M.; Blumhardt, L.D. [University of Nottingham, Department of Neurology, Royal Preston Hospital, Preston (United Kingdom); Morgan, P.S. [Division of Academic Radiology, Queens Medical Centre, Nottingham (United Kingdom)

    2002-07-01

    Quantitative diffusion-weighted MRI has been applied to the study of neurological diseases, including multiple sclerosis, where the molecular self-diffusion coefficient D has been measured in both lesions and normal-appearing white matter. Histograms of D have been used as a novel measure of the ''lesion load'', with potential applications that include the monitoring of efficacy in new treatment trials. However, different ways of measuring D may affect its value, making comparison between different centres and research groups impossible. We aimed to assess the effect, if any, of using two different MRI sequences on the value of D. We studied 13 healthy volunteers, using two different quantitative diffusion sequences (including different b_max values and gradient applications). Maps of D were analysed using both regions of interest (ROI) in white matter and ''whole brain'' histograms, and compared between the two sequences. In addition, we studied three standardised test liquids (with known values of D) using both sequences. Histograms from the two sequences had different distributions, with a greater spread and higher peak position from the sequence with lower b_max. This greater spread of D was also evident in the white matter and test liquid ROI. ''Limits of agreement'' analysis demonstrated that the differences could be clinically relevant, despite significant correlations between the sequences obtained using simple rank methods. We conclude that different quantitative diffusion sequences are unlikely to produce directly comparable values of D, particularly if different b_max values are used. In addition, the use of inappropriate statistical tests may give false impressions of close agreement. Standardisation of methods for the measurement of D is required if these techniques are to become useful tools, for example in monitoring changes in the disease burden of multiple sclerosis. (orig.)

  7. La PCR quantitative en temps réel : application à la quantification des OGM

    Directory of Open Access Journals (Sweden)

    Alary Rémi

    2002-11-01

    Full Text Available Following the labelling requirement, at a 1% threshold, for foods containing authorised GMOs, reliable quantification methods are needed. To meet this requirement, real-time quantitative PCR currently appears to be the best-suited technique. Its principle, its advantages and its implementation for determining the GMO content of soy flours are presented. Simplex and duplex PCR are compared.

  8. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    Science.gov (United States)

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from the urine samples of the patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not have this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  9. Proteus mirabilis biofilm - Qualitative and quantitative colorimetric methods-based evaluation

    Directory of Open Access Journals (Sweden)

    Joanna Kwiecinska-Piróg

    2014-12-01

    Full Text Available The ability of Proteus mirabilis strains to form biofilm is a current topic of research worldwide. In this study the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients has been investigated. A total of 39 P. mirabilis strains isolated from the urine samples of the patients of the dr Antoni Jurasz University Hospital No. 1 in Bydgoszcz clinics between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not have this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  10. Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.

    Science.gov (United States)

    Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P

    2017-01-11

    Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
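
    The connection between a surface-bound voltammogram and the formal potential E0' can be sketched in the ideal reversible limit (the fast-kinetics limit of the Laviron model), where the current peaks exactly at E0'. All numeric parameter values below are hypothetical and the units merely illustrative; this is not the fitting procedure of the study:

```python
import math

F, R, T = 96485.0, 8.314, 298.0  # Faraday (C/mol), gas constant (J/mol/K), temperature (K)

def surface_cv_current(E, E0, nu=0.01, A=1e-4, gamma=1e-11, n=1):
    """Ideal reversible voltammetric response of a surface-confined redox
    couple; the current peaks exactly at the formal potential E0.
    nu: scan rate (V/s), A: electrode area, gamma: surface coverage."""
    f = n * F / (R * T)
    x = math.exp(f * (E - E0))
    return (n * F) ** 2 * nu * A * gamma / (R * T) * x / (1.0 + x) ** 2

# Sweep a hypothetical potential window and read off the peak position
potentials = [i * 1e-3 - 0.6 for i in range(401)]          # -0.60 .. -0.20 V
currents = [surface_cv_current(E, E0=-0.40) for E in potentials]
E_peak = potentials[currents.index(max(currents))]
print(round(E_peak, 3))  # peak sits at the formal potential, -0.4 V
```

    Fitting measured TERS CVs to such a model, point by point across the electrode, is what yields a spatial map of E0'.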

  11. A probe-based quantitative PCR assay for detecting Tetracapsuloides bryosalmonae in fish tissue and environmental DNA water samples

    Science.gov (United States)

    Hutchins, Patrick; Sepulveda, Adam; Martin, Renee; Hopper, Lacey

    2017-01-01

    A probe-based quantitative real-time PCR assay was developed to detect Tetracapsuloides bryosalmonae, which causes proliferative kidney disease in salmonid fish, in kidney tissue and environmental DNA (eDNA) water samples. The limits of detection and quantification were 7 and 100 DNA copies for calibration standards and T. bryosalmonae was reliably detected down to 100 copies in tissue and eDNA samples. The assay presented here is a highly sensitive and quantitative tool for detecting T. bryosalmonae with potential applications for tissue diagnostics and environmental detection.
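
    Quantification in assays like this typically runs through a standard curve relating the quantification cycle (Cq) to copy number. A minimal sketch with hypothetical calibration constants (not the assay's actual curve):

```python
def copies_from_cq(cq, slope, intercept):
    """Convert a quantification cycle (Cq) to a copy number via a
    standard curve of the form Cq = slope * log10(copies) + intercept."""
    return 10.0 ** ((cq - intercept) / slope)

# Hypothetical calibration: slope -3.32 (~100% efficiency), intercept 38.0
copies = copies_from_cq(31.36, slope=-3.32, intercept=38.0)
print(round(copies))  # -> 100
```

    The limits of detection and quantification quoted in the record (7 and 100 copies) correspond to the lowest points on such a curve that can still be reliably detected or read back.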

  12. From inverse problems in mathematical physiology to quantitative differential diagnoses.

    Directory of Open Access Journals (Sweden)

    Sven Zenker

    2007-11-01

    Full Text Available The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses.

  13. From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses

    Science.gov (United States)

    Zenker, Sven; Rubin, Jonathan; Clermont, Gilles

    2007-01-01

    The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses.
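
    The multimodal-posterior phenomenon can be reproduced with a toy grid-based Bayesian inversion: a non-injective forward model (here simply y = x**2, standing in for a far richer cardiovascular model) plus one noisy observation yields a bimodal posterior, analogous to competing differential diagnoses. The model, prior, and noise level are all hypothetical:

```python
import math

def posterior_grid(y_obs, sigma=0.5, lo=-4.0, hi=4.0, n=801):
    """Grid approximation of p(x | y_obs) for the toy forward model
    y = x**2 with Gaussian noise and a flat prior on [lo, hi].  Because
    the model is non-injective, one observation leaves two candidate
    parameter values, so the posterior is bimodal."""
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    w = [math.exp(-0.5 * ((y_obs - x * x) / sigma) ** 2) for x in xs]
    z = sum(w)
    return xs, [wi / z for wi in w]

xs, p = posterior_grid(y_obs=4.0)
# Local maxima of the discretised posterior play the role of competing diagnoses
modes = [xs[i] for i in range(1, len(p) - 1) if p[i] > p[i - 1] and p[i] > p[i + 1]]
print(modes)  # two peaks, near x = -2 and x = +2
```

    An additional observation that distinguishes the two branches (the analogue of a fluid challenge) would collapse the posterior to a single mode.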

  14. Using ISOS consensus test protocols for development of quantitative life test models in ageing of organic solar cells

    DEFF Research Database (Denmark)

    Kettle, J.; Stoichkov, V.; Kumar, D.

    2017-01-01

    As Organic Photovoltaic (OPV) development matures, the demand grows for rapid characterisation of degradation and application of Quantitative Accelerated Life Tests (QALT) models to predict and improve reliability. To date, most accelerated testing on OPVs has been conducted using ISOS consensus...

  15. Quantitative fluorescence nanoscopy for cancer biomedicine

    Science.gov (United States)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in large part due to our incomplete understanding of how key processes including oncogenesis and drug response are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  16. Quantitation of esophageal transit and gastroesophageal reflux

    International Nuclear Information System (INIS)

    Malmud, L.S.; Fisher, R.S.

    1986-01-01

    Scintigraphic techniques are the only quantitative methods for the evaluation of esophageal transit and gastroesophageal reflux. By comparison, other techniques are not quantitative and are either indirect, inconvenient, or less sensitive. Methods, such as perfusion techniques, which measure flow require the introduction of a tube assembly into the gastrointestinal tract, with the possible introduction of artifacts into the measurements due to the indwelling tubes. Earlier authors, using radionuclide markers, introduced a method for measuring gastric emptying which was both tubeless and quantitative in comparison to other techniques. More recently, a number of scintigraphic methods have been introduced for the quantitation of esophageal transit and clearance, the detection and quantitation of gastroesophageal reflux, the measurement of gastric emptying using a mixed solid-liquid meal, and the quantitation of enterogastric reflux. This chapter reviews current techniques for the evaluation of esophageal transit and gastroesophageal reflux.
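
    A common scintigraphic quantitation expresses emptying at time t as a percentage of the maximal count rate in the region of interest. The sketch below uses hypothetical counts, and the exact formula used by individual protocols varies:

```python
def percent_emptying(counts_max, counts_t):
    """Scintigraphic emptying at time t as a percentage of the maximal
    count rate: C(t) = (Emax - Et) / Emax * 100."""
    return (counts_max - counts_t) * 100.0 / counts_max

# Hypothetical region-of-interest counts: 12000 at peak, 1800 at time t
print(percent_emptying(12000, 1800))  # -> 85.0
```

    Because the measurement is a ratio of counts from the same detector, it needs no intubation, which is the tubeless advantage the record emphasises.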

  17. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  18. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  19. Quantitative bioanalytical and analytical method development of dibenzazepine derivative, carbamazepine: A review ☆

    OpenAIRE

    Datar, Prasanna A.

    2015-01-01

    Bioanalytical methods are widely used for quantitative estimation of drugs and their metabolites in physiological matrices. These methods could be applied to studies in areas of human clinical pharmacology and toxicology. The major bioanalytical services are method development, method validation and sample analysis (method application). Various methods such as GC, LC–MS/MS, HPLC, HPTLC, micellar electrokinetic chromatography, and UFLC have been used in laboratories for the qualitative and qua...

  20. 77 FR 58405 - Endangered Species; Marine Mammals; Receipt of Applications for Permit

    Science.gov (United States)

    2012-09-20

    ... decisions are: (1) Those supported by quantitative information or studies; and (2) Those that include... species. Multiple Applicants The following applicants each request a permit to import the sport- hunted...

  1. 77 FR 12870 - Endangered Species; Marine Mammals; Receipt of Applications for Permit

    Science.gov (United States)

    2012-03-02

    ... decisions are: (1) Those supported by quantitative information or studies; and (2) Those that include... propagation. Multiple Applicants The following applicants each request a permit to import the sport- hunted...

  2. New approaches for the analysis of confluent cell layers with quantitative phase digital holographic microscopy

    Science.gov (United States)

    Pohl, L.; Kaiser, M.; Ketelhut, S.; Pereira, S.; Goycoolea, F.; Kemper, Björn

    2016-03-01

    Digital holographic microscopy (DHM) enables high resolution non-destructive inspection of technical surfaces and minimally-invasive label-free live cell imaging. However, the analysis of confluent cell layers represents a challenge as quantitative DHM phase images in this case do not provide sufficient information for image segmentation, determination of the cellular dry mass or calculation of the cell thickness. We present novel strategies for the analysis of confluent cell layers with quantitative DHM phase contrast utilizing a histogram-based evaluation procedure. The applicability of our approach is illustrated by quantification of drug induced cell morphology changes and it is shown that the method is capable of reliably quantifying global morphology changes of confluent cell layers.

  3. Application of quantitative autoradiography to the measurement of biochemical processes in vivo

    International Nuclear Information System (INIS)

    Sokoloff, L.

    1985-01-01

    Quantitative autoradiography makes it possible to measure the concentrations of isotopes in tissues of animals labeled in vivo. In a few cases, the administration of a judiciously selected labeled chemical compound and a properly designed procedure have made it possible to use this capability to measure the rate of a chemical process in animals in vivo. Emission tomography, and particularly positron emission tomography, provides a means to extend this capability to man and to assay the rates of biochemical processes in human tissues in vivo. It does not, however, obviate the need to adhere to established principles of chemical and enzyme kinetics and tracer theory. Generally, all such methods, whether to be used in man with positron emission tomography or in animals with autoradiography, must first be developed by research in animals with autoradiography, because it is only in animals that the measurements needed to validate the basic assumptions of the methods can be tested and evaluated.

  4. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  5. New generation quantitative x-ray microscopy encompassing phase-contrast

    International Nuclear Information System (INIS)

    Wilkins, S.W.; Mayo, S.C.; Gureyev, T.E.; Miller, P.R.; Pogany, A.; Stevenson, A.W.; Gao, D.; Davis, T.J.; Parry, D.J.; Paganin, D.

    2000-01-01

    Full text: We briefly outline a new approach to X-ray ultramicroscopy using projection imaging in a scanning electron microscope (SEM). Compared to earlier approaches, the new approach offers spatial resolution of ≤0.1 micron and includes novel features such as: i) phase contrast to give additional sample information over a wide energy range, and ii) rapid phase/amplitude extraction algorithms to enable new real-time modes of microscopic imaging. Widespread applications are envisaged in fields such as materials science, biomedical research, and microelectronics device inspection. Some illustrative examples are presented. The quantitative methods described here are also very relevant to X-ray projection microscopy using synchrotron sources.

  6. Prediction of quantitative phenotypes based on genetic networks: a case study in yeast sporulation

    Directory of Open Access Journals (Sweden)

    Shen Li

    2010-09-01

    Full Text Available Abstract Background An exciting application of genetic networks is to predict phenotypic consequences of environmental cues or genetic perturbations. However, de novo prediction of quantitative phenotypes based on network topology is always a challenging task. Results Using yeast sporulation as a model system, we have assembled a genetic network from the literature and exploited a Boolean network to predict sporulation efficiency change upon deleting individual genes. We observe that predictions based on the curated network correlate well with the experimentally measured values. In addition, computational analysis reveals the robustness and hysteresis of the yeast sporulation network and uncovers several patterns of sporulation efficiency change caused by double gene deletion. These discoveries may guide future investigation of underlying mechanisms. We have also shown that a hybridized genetic network reconstructed from both temporal microarray data and the literature is able to achieve a satisfactory prediction accuracy of the same quantitative phenotypes. Conclusions This case study illustrates the value of predicting quantitative phenotypes based on genetic networks and provides a generic approach.
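
    The core mechanic of a Boolean-network prediction, evaluating the network to a steady state with and without a gene, can be sketched in a few lines. The three-node cascade below is a deliberately tiny, hypothetical wiring loosely inspired by yeast sporulation signalling (IME1 -> IME2 -> sporulation); it is not the curated network of the study:

```python
def step(state, rules):
    """One synchronous update of a Boolean network: every node is
    recomputed from the previous state."""
    return {node: f(state) for node, f in rules.items()}

# Hypothetical three-node cascade; the wiring is illustrative only.
rules = {
    "IME1": lambda s: s["IME1"],   # upstream signal held at its input value
    "IME2": lambda s: s["IME1"],
    "SPO":  lambda s: s["IME2"],
}

def attractor_spo(ime1_on, n_steps=5):
    """Iterate to a fixed point and report the sporulation output,
    mimicking an in-silico deletion of IME1 when ime1_on is False."""
    s = {"IME1": ime1_on, "IME2": False, "SPO": False}
    for _ in range(n_steps):
        s = step(s, rules)
    return s["SPO"]

print(attractor_spo(True), attractor_spo(False))  # True False
```

    Comparing the attractor reached with and without each gene is, in miniature, how deletion phenotypes are read off a Boolean model.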

  7. Investigation of quantitative separation of thorium, uranium, neptunium and plutonium from complex radiochemical mixtures

    International Nuclear Information System (INIS)

    Ushatskij, V.N.; Preobrazhenskaya, L.D.; Kolychev, V.B.; Gugel', E.S.

    1979-01-01

    Quantitative separation of actinides and their radiochemical purification with the aid of TBP, with subsequent separation of thorium and quantitative separation of U, Np and Pu with the aid of D2EHPA, have been studied. The method has been developed for quantitative extraction-chromatographic separation and radiochemical purification of nanogram amounts of U and Pu and microgram amounts of Th and Np from complex radiochemical mixtures containing both fragment radioisotopes and non-radioactive macrocomponents (Fe, Al, Mg, Mn, Na and others). The method calls for application of one extraction-chromatographic column with TBP and one column with D2EHPA. Thorium is separated at the first stage since it does not form complexes in a chloride solution during washing of the sorption column with 6.0 M HCl. Np(IV) and Pu(III) required for separation are stabilized with the aid of a hydrazine and hydroxylamine mixture. The yield of each of the above-cited actinide elements during the complete two-stage separation and at the stage of their separation varies within the range of 98.5-99.3%.

  8. Novel Quantitative Real-Time LCR for the Sensitive Detection of SNP Frequencies in Pooled DNA: Method Development, Evaluation and Application

    Science.gov (United States)

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-01

    Background Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808
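
    The arithmetic behind reading a low SNP frequency out of a real-time amplification curve can be sketched as follows. The sketch assumes both the mutant-specific and total-DNA reactions are read against the same standard curve with a hypothetical slope of -3.32 cycles per decade; this is an illustration of the general Cq-based approach, not the exact calibration of the study:

```python
def snp_frequency(cq_mut, cq_total, slope=-3.32):
    """Mutant-allele frequency in a pooled DNA sample, assuming the
    mutant-specific and total-DNA reactions share one standard curve:
    frequency = 10 ** ((cq_mut - cq_total) / slope)."""
    return 10.0 ** ((cq_mut - cq_total) / slope)

# Hypothetical cycles: the mutant signal crosses ~3.32 cycles after total DNA
freq = snp_frequency(31.64, 28.32)
print(round(freq, 3))  # about a 10% mutant fraction
```

    Frequencies near the quoted quantification limit (0.35%) correspond to mutant signals appearing roughly eight cycles after the total-DNA signal, which is why assay specificity at late cycles dominates the achievable sensitivity.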

  9. Fast automatic quantitative cell replication with fluorescent live cell imaging

    Directory of Open Access Journals (Sweden)

    Wang Ching-Wei

    2012-01-01

    Full Text Available Abstract Background Live cell imaging is a useful tool to monitor cellular activities in living systems. In cancer research and other experimental work it is often necessary to quantify the dividing capability of cells, or the cell proliferation level, when investigating manipulations of the cells or their environment. Manual quantification of fluorescence microscopic images is difficult because humans are neither sensitive to fine differences in color intensity nor effective at counting and averaging fluorescence levels among cells. However, automatic quantification is not a straightforward problem to solve. As the sampling location of the microscope changes, the number of cells in individual microscopic images varies, which makes simple measures such as the sum of stain intensity values or the total number of positive stains within each image inapplicable. Thus, automated quantification with robust cell segmentation techniques is required. Results An automated quantification system with a robust cell segmentation technique is presented. Experimental results in monitoring cellular replication activities show that the quantitative score is a promising representation of the cell replication level, and scores for images from different cell replication groups are demonstrated to be statistically significantly different using ANOVA, LSD and Tukey HSD tests. Conclusion A robust automated quantification method for live cell imaging is built to measure the cell replication level, providing a robust quantitative analysis system for fluorescent live cell imaging. In addition, the presented unsupervised entropy-based cell segmentation for live cell images is demonstrated to be also applicable to nuclear segmentation of IHC tissue images.
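The "unsupervised entropy-based cell segmentation" mentioned in the conclusion can be illustrated with a generic maximum-entropy (Kapur) threshold. This is a textbook reimplementation on a toy 8-level histogram, not the paper's actual algorithm:

```python
# Generic sketch of entropy-based thresholding (Kapur's method), the
# classic unsupervised approach for separating stained cells from
# background in a fluorescence intensity histogram. Toy data only.
import math

def kapur_threshold(hist):
    """Return the gray level maximizing the summed entropies of the
    background (below threshold) and foreground (at/above threshold)."""
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_h = 0, float("-inf")
    for t in range(1, len(p)):
        w0 = sum(p[:t])
        w1 = 1.0 - w0
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(q / w0 * math.log(q / w0) for q in p[:t] if q > 0)
        h1 = -sum(q / w1 * math.log(q / w1) for q in p[t:] if q > 0)
        if h0 + h1 > best_h:
            best_t, best_h = t, h0 + h1
    return best_t

# Tiny synthetic histogram: dim background peak plus bright "cell" peak.
hist = [40, 30, 10, 2, 1, 5, 20, 12]
t = kapur_threshold(hist)
print(t)  # pixels with gray level >= t would be counted as positive stain
```

Once thresholded, per-cell fluorescence can be averaged over the segmented regions, which is what makes a location-independent quantitative score possible.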

  10. Development of monoclonal antibodies and quantitative ELISAs targeting insulin-degrading enzyme

    Directory of Open Access Journals (Sweden)

    Dickson Dennis W

    2009-10-01

    Full Text Available Abstract Background Insulin-degrading enzyme (IDE) is a widely studied zinc-metalloprotease implicated in the pathogenesis of type 2 diabetes mellitus, Alzheimer disease (AD) and varicella zoster virus infection. Despite more than six decades of research on IDE, progress has been hampered by the lack of well-characterized reagents targeting this biomedically important protease. To address this important need, we generated and characterized new mouse monoclonal antibodies (mAbs) targeting natively folded human and rodent IDE. Results Eight monoclonal hybridoma cell lines were derived in house from mice immunized with full-length, natively folded, recombinant human IDE. The mAbs derived from these lines were shown to detect IDE selectively and sensitively by a wide range of methods. Two mAbs in particular, designated 6A1 and 6H9, proved especially selective for IDE in immunocytochemical and immunohistochemical applications. Using a variety of methods, we show that 6A1 selectively detects both human and rodent IDE, while 6H9 selectively detects human, but not rodent, IDE, with both mAbs showing essentially no cross-reactivity with other proteins in these applications. Using these novel anti-IDE mAbs, we also developed sensitive and quantitative sandwich ELISAs capable of quantifying IDE levels present in human brain extracts. Conclusion We succeeded in developing novel mAbs that selectively detect rodent and/or human IDE, which we have shown to be suitable for a wide range of applications, including western blotting, immunoprecipitation, immunocytochemistry, immunohistochemistry, and quantitative sandwich ELISAs. These novel anti-IDE mAbs and the assays derived from them constitute important new tools for addressing many unresolved questions about the basic biology of IDE and its role in multiple highly prevalent human diseases.

  11. Quantitative comparison of the in situ microbial communities in different biomes

    Energy Technology Data Exchange (ETDEWEB)

    White, D.C. [Tennessee Univ., Knoxville, TN (United States)]|[Oak Ridge National Lab., TN (United States); Ringelberg, D.B.; Palmer, R.J. [Tennessee Univ., Knoxville, TN (United States). Center for Environmental Biotechnology

    1995-12-31

    A system to define microbial communities in different biomes requires the application of non-traditional methodology. Classical microbiological methods have severe limitations for the analysis of environmental samples. Pure-culture isolation, biochemical testing, and/or enumeration by direct microscopic counting are not well suited to the estimation of total biomass or the assessment of community composition within environmental samples. Such methods provide little insight into the in situ phenotypic activity of the extant microbiota, since these techniques depend on microbial growth and thus select against the many environmental microorganisms which are non-culturable under a wide range of conditions. It has been repeatedly documented in the literature that viable counts or direct counts of bacteria attached to sediment grains are difficult to quantify and may grossly underestimate the extent of the existing community. The traditional tests provide little indication of the in situ nutritional status or of toxicity within the microbial community. A more recent development, the MIDI Microbial Identification System, measures free and ester-linked fatty acids from isolated microorganisms. Bacterial isolates are identified by comparing their fatty acid profiles to the MIDI database, which contains over 8000 entries. The application of the MIDI system to the analysis of environmental samples, however, has significant drawbacks. The MIDI system was developed to identify clinical microorganisms and requires their isolation and culture on trypticase soy agar at 27°C. Since many isolates are unable to grow under these restrictive growth conditions, the system does not lend itself to identification of some environmental organisms. A more applicable methodology for environmental microbial analysis is based on the liquid extraction and separation of microbial lipids from environmental samples, followed by quantitative analysis using gas chromatography/

  12. A unique charge-coupled device/xenon arc lamp based imaging system for the accurate detection and quantitation of multicolour fluorescence.

    Science.gov (United States)

    Spibey, C A; Jackson, P; Herick, K

    2001-03-01

    In recent years the use of fluorescent dyes in biological applications has dramatically increased. The continual improvement in the capabilities of these fluorescent dyes demands increasingly sensitive detection systems that provide accurate quantitation over a wide linear dynamic range. In the field of proteomics, the detection, quantitation and identification of very low abundance proteins are of extreme importance in understanding cellular processes. Therefore, the instrumentation used to acquire an image of such samples, for spot picking and identification by mass spectrometry, must be sensitive enough not only to maximise the sensitivity and dynamic range of the staining dyes but, as importantly, to adapt to the ever-changing portfolio of fluorescent dyes as they become available. Just as the available fluorescent probes are improving and evolving, so are users' application requirements. Therefore, the instrumentation chosen must be flexible enough to address and adapt to those changing needs. As a result, a highly competitive market for the supply and production of such dyes, and for the instrumentation for their detection and quantitation, has emerged. The instrumentation currently available is based on either laser/photomultiplier tube (PMT) scanning or lamp/charge-coupled device (CCD) based mechanisms. This review briefly discusses the advantages and disadvantages of both system types for fluorescence imaging, gives a technical overview of CCD technology and describes in detail a unique xenon arc lamp/CCD based instrument from PerkinElmer Life Sciences. The Wallac-1442 ARTHUR is unique in its ability to scan large areas at high resolution and give accurate, selectable excitation over the whole of the UV/visible range. It operates by filtering both the excitation and emission wavelengths, providing optimal and accurate measurement and quantitation of virtually any available dye, and allows excellent spectral resolution between different fluorophores.

  13. Radioisotope studies for quantitative measurement of manganese absorption

    International Nuclear Information System (INIS)

    Helbig, U.

    1981-01-01

    The purpose of the present study was to quantitatively determine manganese absorption in growing rats by means of radioisotopes. First, the following factors, significant for this determination, had to be investigated: the measurability of stable and radioactive Mn in rat tissues; the labelling of stable Mn and the distribution of stable and radioactive Mn in the organism; and the verification of the isotope dilution method and the comparative balance method with regard to their applicability for determining true Mn absorption. We used male and female Sprague-Dawley rats. The most important results are summarized as follows: in some individual tissues the measurement of stable Mn was accompanied by difficulties, whereas the measurement of radioactive Mn could be performed without any problems. 10 d after i.m. injection of 54Mn, only 17% of the administered Mn was still detectable in the organism. However, no uniform tissue labelling was found; it is therefore possible only to a restricted extent to draw quantitative conclusions about the content of stable Mn. A high percentage of stable and radioactive Mn was found above all in the liver. The isotope dilution method permits, through feces analysis, differentiation between unabsorbed Mn coming from the food and endogenous Mn coming from the organism itself. The effective Mn absorption was also determined by means of the comparative balance method. By means of the isotope dilution method we determined the quantitative Mn absorption under graded Mn administration and the contribution of absorption and excretion to the homeostatic regulation of Mn. We found that absorption and excretion help the organism to keep the Mn concentration almost constant even with a differing Mn supply. (orig./MG) [de]
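The bookkeeping behind the isotope dilution method can be sketched as follows: once the body pool is labelled with 54Mn, the endogenous (re-excreted) fraction of fecal Mn can be estimated from its radioactivity, separating it from unabsorbed dietary Mn. Variable names and all numerical values here are illustrative, not data from the study.

```python
# Hedged sketch of the isotope-dilution calculation of true Mn absorption.
# Masses in micrograms; activity in counts; specific activity in counts
# per microgram of stable Mn in the labelled body pool. Values invented.
def true_absorption(intake_mn, fecal_mn_total,
                    fecal_activity, body_specific_activity):
    # Fecal Mn of endogenous origin carries the body pool's label.
    fecal_endogenous = fecal_activity / body_specific_activity
    # The remainder of fecal Mn is unabsorbed dietary Mn.
    fecal_unabsorbed = fecal_mn_total - fecal_endogenous
    apparent = intake_mn - fecal_mn_total   # classic comparative balance
    true = intake_mn - fecal_unabsorbed     # isotope dilution estimate
    return apparent, true

apparent, true = true_absorption(
    intake_mn=500.0, fecal_mn_total=480.0,
    fecal_activity=3000.0, body_specific_activity=100.0)
print(apparent, true)  # → 20.0 50.0
```

The gap between the two numbers is exactly the endogenous fecal loss, which is why the simple balance method understates true absorption when endogenous excretion is substantial.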

  14. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    Science.gov (United States)

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerotic plaque. A total of 224 plaques were divided into groups by degree of stenosis. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine its relationship with plaque echo and thickness. Correlation analysis revealed no relationship between neovascularization and plaque echo in any group using either quantitative or semi-quantitative methods. Furthermore, there was no correlation between neovascularization and plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness in the mild stenosis group was divided into four subgroups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002) and between the 2.3-2.8 mm and ≥3.5 mm groups. The semi-quantitative and quantitative CEUS methods characterizing neovascularization of plaque are thus equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method can fail for plaques <3.5 mm because of motion artifacts. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
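The core of the analysis reported here is a Pearson correlation between a CEUS perfusion index (such as RAUC) and plaque thickness. A minimal sketch of that computation, on synthetic data chosen only to show the mechanics of an inverse relationship:

```python
# Minimal Pearson-correlation sketch; thickness/RAUC values are synthetic
# and chosen to mimic the reported negative correlation, not real data.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

thickness = [1.6, 2.0, 2.5, 2.9, 3.4, 3.8]  # plaque thickness, mm (synthetic)
rauc      = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]  # perfusion index (synthetic)
r = pearson_r(thickness, rauc)
print(round(r, 3))  # negative r mirrors the reported inverse relationship
```

With real measurements the correlation is of course far weaker (the study reports r = -0.317), but the sign and interpretation are the same.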

  15. A human fecal contamination index for ranking impaired recreational waters using the HF183 quantitative real-time PCR method

    Science.gov (United States)

    Human fecal pollution of surface water remains a public health concern worldwide. As a result, there is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for recreational water quality risk management.

  16. Measurement of {pi}{sup {+-}} p forward elastic differential cross sections at 410 and 490 MeV; Mesures des sections efficaces differentielles {pi}{sup {+-}} p a 410 MeV et 490 MeV vers l'avant

    Energy Technology Data Exchange (ETDEWEB)

    Banner, M [Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]

    1967-01-01

    Measurements of the {pi}{sup {+-}} p forward elastic differential cross sections were performed with a missing mass spectrometer using spark chambers. Photographs were automatically scanned. Phase shift analysis of experimental results leads to three solutions. Experiments are indicated which ought to permit overcoming ambiguity. (author) [French] Des mesures de sections efficaces differentielles elastiques {pi}{sup {+-}} p vers l'avant ont ete effectuees a l'aide d'un spectrometre de masse manquante a chambres a etincelles. Les cliches ont ete depouilles par un appareillage automatique. Les resultats experimentaux sont analyses en dephasages et conduisent a trois solutions. Des experiences sont indiquees qui devraient permettre de lever l'ambiguite. (auteur)

  17. Quantitative Reasoning in Environmental Science: A Learning Progression

    Science.gov (United States)

    Mayes, Robert Lee; Forrester, Jennifer Harris; Christus, Jennifer Schuttlefield; Peterson, Franziska Isabel; Bonilla, Rachel; Yestness, Nissa

    2014-01-01

    The ability of middle and high school students to reason quantitatively within the context of environmental science was investigated. A quantitative reasoning (QR) learning progression was created with three progress variables: quantification act, quantitative interpretation, and quantitative modeling. An iterative research design was used as it…

  18. Culture Sustainability: Culture Quotient (CQ) and Its Quantitative Empirical Application to Chinese Cities

    Directory of Open Access Journals (Sweden)

    Jing Lin

    2016-11-01

    Full Text Available Culture sustainability is one of the indispensable components of sustainability. Culture has likely always been an important element in promoting urban and rural sustainable development. It is now playing an increasingly significant role in sparking and incubating innovation, which is becoming the main driver of economic growth and competitiveness. Unfortunately, little research has been conducted on how much culture matters to economic performance in a quantitative way. Therefore, in this paper, based on an intensive literature review, we try to quantify the importance of culture to urban development in general, and urban economic performance in particular, by proposing an index system dubbed the Culture Quotient (CQ). An integrated database of 297 prefectural-level cities in China is accordingly established. By manipulating the database, the CQ value for each city is then calculated using principal component analysis with SPSS (19.0). The spatial pattern by CQ value tier is presented and illustrates urban China's "winner-take-all" phenomenon, dominated by the three giant urban clusters in the coastal area, i.e., the Jing (Beijing)-Jin (Tianjin)-Ji (Hebei) Bohai rim region, the Yangtze River delta and the Pearl River delta, as well as some mega-cities such as Chengdu and Wuhan in other parts of China. More precisely, the regression analysis shows a strong positive relationship between CQ and gross domestic product (GDP), with the striking result that every increase of one percentage point in CQ induces a five percentage point increment in GDP. Although this finding makes an impressive and convincing case that culture does exert a great impact on urban economic development and can be measured quantitatively in Chinese cases, more cases from other countries need to be included for further verification and confirmation. We therefore urgently call for
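The construction of a composite index like CQ, followed by a regression of economic output on it, can be sketched as below. The indicators, cities, weights and GDP figures are all invented for illustration; the paper itself derives the weights via principal component analysis in SPSS rather than fixing them by hand.

```python
# Illustrative composite-index + regression sketch (not the paper's data).
# Indicators are standardized per row, combined with stand-in weights
# (a proxy for PCA loadings), then regressed against hypothetical GDP.
import math

def zscores(xs):
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [(x - m) / s for x in xs]

def ols_slope(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# Three cultural indicators for five hypothetical cities (columns).
indicators = [
    [120, 80, 60, 40, 20],    # e.g. number of cultural venues
    [300, 210, 150, 90, 50],  # e.g. creative-industry employment
    [9, 7, 5, 3, 2],          # e.g. protected heritage sites
]
weights = [0.4, 0.4, 0.2]  # stand-in for PCA loadings
z = [zscores(row) for row in indicators]
cq = [sum(w * z[i][c] for i, w in enumerate(weights)) for c in range(5)]

gdp = [980, 610, 400, 230, 130]  # hypothetical GDP figures
slope = ols_slope(cq, gdp)
print([round(v, 2) for v in cq], round(slope, 1))
```

A positive slope in such a regression is what underpins the paper's claim that cities with higher CQ tend to have higher GDP; the five-to-one elasticity itself comes from the authors' full 297-city dataset.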

  19. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    Science.gov (United States)

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
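The sub-model idea can be illustrated with a deliberately simplified 1-D stand-in: fit separate regressions on low- and high-concentration training sets, then use a full-range model's first-pass estimate to weight (blend) the sub-model predictions. The real ChemCam calibration uses multivariate PLS on full LIBS spectra; every number below is synthetic.

```python
# Conceptual sketch of the "sub-model" blending scheme. A 1-D linear fit
# stands in for each regression model; synthetic data mimic a matrix
# effect that changes the intensity-composition slope at high values.
def linfit(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

lo_x, lo_y = [1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8]      # low-range targets
hi_x, hi_y = [6, 7, 8, 9], [14.5, 17.5, 20.4, 23.6]  # high-range targets
full = linfit(lo_x + hi_x, lo_y + hi_y)  # full-range reference model
lo, hi = linfit(lo_x, lo_y), linfit(hi_x, hi_y)      # the two sub-models

def blended_predict(x, cut=11.0, width=3.0):
    """Weight the sub-models by the full model's first-pass composition
    estimate; predictions blend linearly across the overlap region."""
    first = full(x)
    w = min(1.0, max(0.0, (first - (cut - width)) / (2 * width)))
    return (1 - w) * lo(x) + w * hi(x)

print(round(blended_predict(2), 2), round(blended_predict(8), 2))
```

Samples whose first-pass estimate falls well inside one sub-model's range are handled entirely by that sub-model; only borderline samples receive a mixture, which avoids discontinuities at the range boundaries.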

  20. 76 FR 48880 - Endangered Species; Marine Mammals; Receipt of Applications for Permit

    Science.gov (United States)

    2011-08-09

    ... quantitative information or studies; and (2) Those that include citations to, and analyses of, the applicable... following applicants each request a permit to import the sport-hunted trophy of one male bontebok...