WorldWideScience

Sample records for elastique quantitative applications

  1. Quantitative elastic migration. Applications to 3D borehole seismic surveys; Migration elastique quantitative. Applications a la sismique de puits 3D

    Energy Technology Data Exchange (ETDEWEB)

    Clochard, V.

    1998-12-02

    3D VSP imaging is nowadays a strategic requirement for petroleum companies. It is used to determine in detail the geology close to the well. Because of the lack of redundancy and the limited coverage of the data, this kind of technology is more restrictive than surface seismic, which allows investigation at a larger scale. Our contribution was to develop an elastic quantitative imaging method (GRT migration) which can be applied to 3-component borehole datasets. The method is similar to Kirchhoff migration but uses sophisticated weighting of the seismic amplitudes. In practice, GRT migration uses pre-calculated Green functions (travel time, amplitude, polarization). The maps are obtained by 3D ray tracing (wavefront construction) in the velocity model. The migration algorithm works with elementary and independent tasks, which is useful for processing different kinds of datasets (fixed or moving geophone antenna). The study was validated against asymptotic analytical solutions. The ability to reconstruct a 3D borehole survey was tested on the Overthrust synthetic model. The application to a real circular 3D VSP raises various problems, such as velocity model building, the anisotropy factor and the preprocessing (deconvolution, wave-mode separation), which can destroy seismic amplitudes. An isotropic 3-component preprocessing of the whole dataset allows a better lateral reconstruction. The choice of a large migration aperture can help the reconstruction of strong geological dips, in spite of migration smiles. Finally, the methodology can be applied to PS converted waves. (author)
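    The abstract describes an amplitude-weighted (Kirchhoff/GRT-type) diffraction summation driven by pre-calculated Green-function tables obtained from 3D ray tracing. As a rough illustration only (not the author's code; all array names are hypothetical and the data are reduced to a single scalar component), such a weighted stack could be organized as follows:

    ```python
    import numpy as np

    def grt_migrate(data, t_src, t_rec, w_src, w_rec, dt):
        """Toy amplitude-weighted (GRT/Kirchhoff-style) migration sketch.

        data         : (n_shots, n_receivers, n_samples) recorded traces
        t_src, t_rec : hypothetical travel-time tables, source -> image point
                       (n_shots, n_img) and receiver -> image point
                       (n_receivers, n_img), assumed to come from a separate
                       3D ray tracer (wavefront construction)
        w_src, w_rec : matching amplitude (geometrical-spreading) weights
        dt           : sample interval in seconds
        """
        n_shots, n_recs, n_samp = data.shape
        image = np.zeros(t_src.shape[1])
        for s in range(n_shots):
            for r in range(n_recs):
                t_total = t_src[s] + t_rec[r]            # two-way time per image point
                idx = np.round(t_total / dt).astype(int)
                ok = idx < n_samp                        # discard arrivals outside the trace
                # weighted diffraction stack; true-amplitude weights would be
                # built from the Green-function amplitudes and polarizations
                image[ok] += w_src[s, ok] * w_rec[r, ok] * data[s, r, idx[ok]]
        return image
    ```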

  2. Quantitative multi-waves migration in elastic anisotropic media; Migration quantitative multi-ondes en milieu elastique anisotrope

    Energy Technology Data Exchange (ETDEWEB)

    Borgne, H.

    2004-12-01

    modelling of wave propagation in anisotropic media. Within the approximations of ray theory, I develop an expression for the geometrical spreading, the amplitude, and their reciprocity relations. I set up imaging formulas in order to reconstruct the reflection coefficients of the subsurface in elastic anisotropic media. First, I solve the direct problem by expressing the integral relation between the scattered wave field recorded by the receivers and the subsurface reflection coefficients. Second, I apply an elastic anisotropic quantitative migration method, based on the properties of inverse Radon transforms (Beylkin's approach), in order to express the reflection coefficient in 2D, 2.5D and 3D media. I implemented these formulas in a new preserved-amplitude migration algorithm, where the images are sorted by angle classes. Finally, I apply these theoretical results to synthetic and real datasets. I show that migration is able to reconstruct the correct AVA behavior of anisotropic reflection coefficients if both modifications are applied. Then, I degrade the process by keeping an anisotropic ray tracing but using the classical isotropic imaging formula. For this commonly used configuration, I evaluate the error that can be expected in the AVA response of the migrated reflection coefficient. Methodological applications show the sensitivity of the migration results to the smoothing of the velocity model and to an error in the anisotropy axis. (author)

  3. High temperature elastic constant measurements: application to plutonium; Mesure des constantes elastiques a haute temperature application au plutonium

    Energy Technology Data Exchange (ETDEWEB)

    Bouchet, J M [Commissariat a l' Energie Atomique, Fontenay-aux-Roses (France). Centre d' Etudes Nucleaires

    1969-03-01

    We present an apparatus with which we have measured the Young's modulus and the Poisson's ratio of several compounds from the resonance frequency of cylinders in the temperature range 0 deg. C to 700 deg. C. We especially studied the elastic constants of plutonium and measured, for the first time to our knowledge, the Young's modulus of Pu{sub {delta}} and Pu{sub {epsilon}}: E{sub {delta}} (360 deg. C) = 1.6 10{sup 11} dyn/cm{sup 2}; E{sub {epsilon}} (490 deg. C) = 1.1 10{sup 11} dyn/cm{sup 2}; {sigma}{sub {epsilon}} = 0.25 {+-} 0.03. Using our results, we have calculated the compressibility, the Debye temperature, the Grueneisen constant and the electronic specific heat of Pu{sub {epsilon}}. (author)
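    As a hedged illustration of the measurement principle (a thin-rod approximation, not the exact apparatus or formulas of the paper): for the fundamental longitudinal resonance of a rod of length L and density rho, E is approximately 4*rho*L^2*f^2, and the bulk modulus follows from E and Poisson's ratio sigma as K = E/(3(1 - 2*sigma)). A short numerical check using the delta-phase value quoted above and an assumed rod geometry:

    ```python
    # Thin-rod, fundamental longitudinal mode assumed here (an illustration,
    # not the paper's exact apparatus): E = 4 * rho * L**2 * f**2
    rho = 15.9e3       # kg/m^3, approximate density of delta-Pu (assumption)
    L = 0.05           # m, hypothetical cylinder length
    E_delta = 1.6e10   # Pa, i.e. 1.6e11 dyn/cm^2 as quoted in the abstract

    f = (E_delta / (4 * rho * L**2)) ** 0.5    # expected resonance frequency
    sigma = 0.25
    K = E_delta / (3 * (1 - 2 * sigma))        # bulk modulus; compressibility = 1/K
    print(f"f = {f:.0f} Hz, K = {K:.2e} Pa")
    ```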

  4. Quantitation: clinical applications

    International Nuclear Information System (INIS)

    Britton, K.E.

    1982-01-01

    Single photon emission tomography may be used quantitatively if its limitations are recognized and quantitation is made in relation to some reference area on the image. Relative quantitation is discussed in outline in relation to the liver, brain and pituitary, thyroid, adrenals, and heart. (U.K.)

  5. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs, such as: comparative approaches (graph similarity or distance); graph measures to characterize graphs quantitat

  6. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  7. Development and applications of quantitative NMR spectroscopy

    International Nuclear Information System (INIS)

    Yamazaki, Taichi

    2016-01-01

    Recently, quantitative NMR spectroscopy has attracted attention as an analytical method which can easily ensure traceability to the SI unit system, and discussions about its accuracy and uncertainty have also begun. This paper focuses on the literature on the advancement of quantitative NMR spectroscopy reported between 2009 and 2016, and introduces both NMR measurement conditions and actual analysis cases in quantitative NMR. Quantitative NMR spectroscopy using an internal reference method generally enables accurate quantitative analysis in a quick and versatile way, and it can achieve a precision sufficient for the evaluation of pure substances and standard solutions. Since the external reference method can easily prevent contamination of the sample and allows the sample to be recovered, there are many reported cases related to the quantitative analysis of biologically related samples and highly scarce natural products whose NMR spectra are complicated. In terms of precision, the internal reference method is superior. As quantitative NMR spectroscopy spreads more widely, discussions are also progressing on how to adopt this analytical method as an official method in various countries around the world. In Japan, the method is listed in the Pharmacopoeia and the Japanese Standards of Food Additives, and it is also used as an official method for purity evaluation. In the future, this method is expected to spread as a general-purpose analysis method that can ensure traceability to the SI unit system. (A.O.)

  8. Reactor applications of quantitative diffraction analysis

    International Nuclear Information System (INIS)

    Feguson, I.F.

    1976-09-01

    Current work in quantitative diffraction analysis was presented under the main headings of: thermal systems, fast reactor systems, SGHWR applications and irradiation damage. Preliminary results are included on a comparison of various new instrumental methods of boron analysis as well as preliminary new results on Zircaloy corrosion, and materials transfer in liquid sodium. (author)

  9. Applications of quantitative remote sensing to hydrology

    NARCIS (Netherlands)

    Su, Z.; Troch, P.A.A.

    2003-01-01

    In order to quantify the rates of the exchanges of energy and matter among the hydrosphere, biosphere and atmosphere, quantitative description of land surface processes by means of measurements at different scales is essential. Quantitative remote sensing plays an important role in this respect. The

  10. Some properties of the Boltzmann elastic collision operator; Quelques proprietes particulieres de l'operateur de collision elastique de Boltzmann

    Energy Technology Data Exchange (ETDEWEB)

    Delcroix, J. L. [Ecole Normale Superieure (France); Salmon, J. [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1959-07-01

    The authors point out some properties (an important one is a variational property) of the Boltzmann elastic collision operator, valid in a more general framework than that of the Lorentz gas. Reprint of a paper published in 'Le journal de physique et le radium', tome 20, Jun 1959, p. 594-596.

  11. Application of an image processing software for quantitative autoradiography

    International Nuclear Information System (INIS)

    Sobeslavsky, E.; Bergmann, R.; Kretzschmar, M.; Wenzel, U.

    1993-01-01

    The present communication deals with the utilization of an image processing device for quantitative whole-body autoradiography, cell counting and also for interpretation of chromatograms. It is shown that the system parameters allow an adequate and precise determination of optical density values. Also shown are the main error sources limiting the applicability of the system. (orig.)

  12. New apparatus for discriminating shapes - application to the study of neutron-proton elastic scattering at 14.6 MeV; Nouveau dispositif de discrimination de formes - Application a l'etude de la diffusion elastique neutron-proton a 14,6 MeV; Novyj diskriminator po forme - primenenie k izucheniyu uprugogo rasseyaniya nejtronproton s 14,6 MeV; Nuevo dispositivo para discriminacion de formas - Aplicacion al estudio de la dispersion elastica neutron-proton a 14,6 MeV

    Energy Technology Data Exchange (ETDEWEB)

    Crettez, J P; Cambou, F; Ambrosino, G [Laboratoire Maurice de Broglie, Paris (France)

    1962-04-15

    The mean life of scintillations in caesium iodide depends on the nature of the ionizing particle, the shortest life corresponding to the highest ionization density. This property is used to distinguish different particles producing scintillations of similar amplitude. The apparatus described is a shape discriminator: it measures the time required by the scintillation to fall from its maximum to an adjustable fraction thereof. A time-amplitude converter provides a pulse whose height is proportional to the time so measured. A comparison is then made between the shapes of the scintillations produced by alpha particles, protons and electrons, and the results obtained are shown. The application of this method to the measurement of recoil protons set in motion in a thin hydrogenated scatterer by a neutron flux is also described. Suppression of the pulses produced by gamma rays makes it possible to deduce from the spectra obtained the variation of the differential elastic scattering cross-section as a function of angle, for d-t reaction neutrons. (author)

  13. Quantitative Security Risk Assessment of Android Permissions and Applications

    OpenAIRE

    Wang , Yang; Zheng , Jun; Sun , Chen; Mukkamala , Srinivas

    2013-01-01

    Part 6: Mobile Computing; The booming of the Android platform in recent years has attracted the attention of malware developers. However, the permission-based model used in the Android system to prevent the spread of malware has been shown to be ineffective. In this paper, we propose DroidRisk, a framework for quantitative security risk assessment of both Android permissions and applications (apps) based on permission request patterns from benign apps and malware, which aims ...

  14. Quantitative application study on the control system of contract progress

    International Nuclear Information System (INIS)

    Hu Xiaocong; Kang Rujie; Zhan Li

    2012-01-01

    This quantitative application study of a contract progress control system, based on project management theory and PDCA cycle methods, provides a new approach to enterprise contract management that is in line with the current situation and the performance management needs of nuclear power enterprises. The system concept, system development, program design and ERP development (VBA design), which grew out of the working experience of business managers, are convenient and feasible in practical application. Through its application to overhaul contract management in the three years 2009, 2010 and 2011, and through continuous adjustment, the system has become an important business management tool which not only effectively guarantees contract schedule and efficiency but also combines performance management with contract progress management. This study provides a useful reference for enterprise management. (authors)

  15. Application of harmonic analysis in quantitative heart scintigraphy

    International Nuclear Information System (INIS)

    Fischer, P.; Knopp, R.; Breuel, H.P.

    1979-01-01

    Quantitative scintigraphy of the heart after equilibrium distribution of a radioactive tracer permits the measurement of time-activity curves in the left ventricle during a representative heart cycle with great statistical accuracy. Applying Fourier analysis additionally yields criteria for evaluating the volume curve as a whole; the entire information contained in the volume curve is thus completely described by its Fourier spectrum. Resynthesis after Fourier transformation appears to be an ideal method of smoothing because it converges with minimum quadratic error for the type of function concerned. (orig./MG)
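    A minimal sketch of the smoothing described above (an assumed implementation, not the authors' code): Fourier-transform the left-ventricular time-activity curve over one representative cycle, keep only the first few harmonics, and resynthesize.

    ```python
    import numpy as np

    def fourier_smooth(volume_curve, n_harmonics=4):
        """Smooth a cyclic LV time-activity curve by truncated Fourier resynthesis."""
        coeffs = np.fft.rfft(volume_curve)
        coeffs[n_harmonics + 1:] = 0.0            # keep DC plus the first n harmonics
        return np.fft.irfft(coeffs, n=len(volume_curve))

    # hypothetical 32-frame gated cycle with noise
    t = np.linspace(0, 1, 32, endpoint=False)
    noisy = 100 - 35 * np.sin(np.pi * t) ** 2 + np.random.normal(0, 3, t.size)
    smooth = fourier_smooth(noisy)
    ```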

  16. Novel applications of quantitative MRI for the fetal brain

    Energy Technology Data Exchange (ETDEWEB)

    Clouchoux, Cedric [Children' s National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); Limperopoulos, Catherine [Children' s National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); McGill University, McConnell Brain Imaging Center, Montreal Neurological Institute, Montreal (Canada); McGill University, Department of Neurology and Neurosurgery, Montreal (Canada); Children' s National Medical Center, Division of Fetal and Transitional Medicine, Washington, DC (United States)

    2012-01-15

    The advent of ultrafast MRI acquisitions is offering vital insights into the critical maturational events that occur throughout pregnancy. Concurrent with the ongoing enhancement of ultrafast imaging has been the development of innovative image-processing techniques that are enabling us to capture and quantify the exuberant growth, and organizational and remodeling processes that occur during fetal brain development. This paper provides an overview of the role of advanced neuroimaging techniques to study in vivo brain maturation and explores the application of a range of new quantitative imaging biomarkers that can be used clinically to monitor high-risk pregnancies. (orig.)

  17. Automated quantitative micro-mineralogical characterization for environmental applications

    Science.gov (United States)

    Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.

    2013-01-01

    Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.

  18. Quantitative Susceptibility Mapping: Contrast Mechanisms and Clinical Applications

    Science.gov (United States)

    Liu, Chunlei; Wei, Hongjiang; Gong, Nan-Jie; Cronin, Matthew; Dibb, Russel; Decker, Kyle

    2016-01-01

    Quantitative susceptibility mapping (QSM) is a recently developed MRI technique for quantifying the spatial distribution of magnetic susceptibility within biological tissues. It first uses the frequency shift in the MRI signal to map the magnetic field profile within the tissue. The resulting field map is then used to determine the spatial distribution of the underlying magnetic susceptibility by solving an inverse problem. The solution is achieved by deconvolving the field map with a dipole field, under the assumption that the magnetic field is a result of the superposition of the dipole fields generated by all voxels and that each voxel has its unique magnetic susceptibility. QSM provides improved contrast to noise ratio for certain tissues and structures compared to its magnitude counterpart. More importantly, magnetic susceptibility is a direct reflection of the molecular composition and cellular architecture of the tissue. Consequently, by quantifying magnetic susceptibility, QSM is becoming a quantitative imaging approach for characterizing normal and pathological tissue properties. This article reviews the mechanism generating susceptibility contrast within tissues and some associated applications. PMID:26844301
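    As a hedged illustration of the deconvolution step described above, the sketch below uses thresholded k-space division (TKD), one of several published inversion strategies and not necessarily the one reviewed by the authors. The dipole kernel in k-space is D(k) = 1/3 - kz^2/|k|^2, assuming B0 along z.

    ```python
    import numpy as np

    def qsm_tkd(field_map, voxel_size=(1.0, 1.0, 1.0), thresh=0.2):
        """Thresholded k-space division (TKD) sketch for QSM.

        field_map : 3-D local field map (already background-corrected and
                    normalized), with B0 assumed along the z axis.
        Returns an estimate of the susceptibility map.
        """
        nx, ny, nz = field_map.shape
        kx = np.fft.fftfreq(nx, voxel_size[0])
        ky = np.fft.fftfreq(ny, voxel_size[1])
        kz = np.fft.fftfreq(nz, voxel_size[2])
        KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing="ij")
        k2 = KX**2 + KY**2 + KZ**2
        k2[0, 0, 0] = np.inf                       # avoid division by zero at DC
        D = 1.0 / 3.0 - KZ**2 / k2                 # unit dipole kernel in k-space
        D_inv = np.zeros_like(D)
        mask = np.abs(D) > thresh
        D_inv[mask] = 1.0 / D[mask]                # invert where well conditioned
        D_inv[~mask] = np.sign(D[~mask]) / thresh  # clip near the dipole zero cone
        return np.real(np.fft.ifftn(np.fft.fftn(field_map) * D_inv))
    ```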

  19. Applications of phosphorus/silicon standards in quantitative autoradiography

    International Nuclear Information System (INIS)

    Treutler, H.Ch.; Freyer, K.

    1983-01-01

    Quantitative autoradiography requires a careful selection of suitable standard preparations. After several basic comments related to the problems of standardization in autoradiography an example is given of the autoradiographic study of semiconductor materials and it is used for describing the system of standardization using silicon discs with diffused phosphorus. These standardized samples are processed in the same manner as the evaluated samples, i.e., from activation to exposure to sensitive material whereby optimal comparability is obtained. All failures of the processing cycle caused by the fluctuation of the neutron flux in the reactor, deviations at the time of activation, afterglow, etc. are eliminated by this standardization procedure. Experience is presented obtained with the application of this procedure. (author)

  20. Laser ablation ICP-MS for quantitative biomedical applications

    International Nuclear Information System (INIS)

    Konz, Ioana; Fernandez, Beatriz; Fernandez, M.L.; Pereiro, Rosario; Sanz-Medel, Alfredo

    2012-01-01

    LA-ICP-MS allows precise, relatively fast, and spatially resolved measurements of elements and isotope ratios at trace and ultratrace concentration levels with minimal sample preparation. Over the past few years this technique has undergone rapid development, and it has been increasingly applied in many different fields, including biological and medical research. The analysis of essential, toxic, and therapeutic metals, metalloids, and nonmetals in biomedical tissues is a key task in the life sciences today, and LA-ICP-MS has proven to be an excellent complement to the organic MS techniques that are much more commonly employed in the biomedical field. In order to provide an appraisal of the fast progress that is occurring in this field, this review critically describes new developments for LA-ICP-MS as well as the most important applications of LA-ICP-MS, with particular emphasis placed on the quantitative imaging of elements in biological tissues, the analysis of heteroatom-tagged proteins after their separation and purification by gel electrophoresis, and the analysis of proteins that do not naturally have ICP-MS-detectable elements in their structures, thus necessitating the use of labelling strategies. (orig.)

  1. Study of the reaction {pi}{sup -}p {yields} {pi}{sup -}{pi}{sup 0} p at 2.77 GeV/c for low momentum transfer of the proton. Application to the Chew-Low extrapolation method for the {pi}{sup -}{pi}{sup 0} elastic scattering; Etude de la reaction {pi}{sup -}p {yields} {pi}{sup -}{pi}{sup 0} p a 2.77 GeV/c pour de faibles impulsions du proton diffuse. Application de la methode d'extrapolation de Chew et Low a la diffusion elastiques {pi}{sup -}{pi}{sup 0}

    Energy Technology Data Exchange (ETDEWEB)

    Baton, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-05-01

    Study of the reaction {pi}{sup -}p {yields} {pi}{sup -}{pi}{sup 0} p at 2.77 GeV/c, carried out in the CERN 2-metre liquid hydrogen bubble chamber exposed at the proton synchrotron, shows that 70 per cent of this reaction goes through the {pi}{sup -}p {yields} {rho}{sup -}p channel. The high statistics allow us to determine precisely the mass and the width of the {rho}{sup -} resonance. On the other hand, while the {rho}{sup -} production parameters are independent of the {rho}{sup -} width, the same is not true of the decay parameters. In the second part, the Chew-Low extrapolation method allows us to determine the {pi}{sup -}{pi}{sup 0} elastic cross section at the pole, as well as the phase shifts of the P waves in the isospin 1 state and of the S waves in the isospin 2 state. (author)

  2. Approaches to quantitative risk assessment with applications to PP

    International Nuclear Information System (INIS)

    Geiger, G.; Schaefer, A.

    2002-01-01

    Full text: Experience with accidents such as Goiania in Brazil and indications of a considerable number of orphan sources suggest that improved protection would be desirable for some types of radioactive material in widespread use, such as radiation sources for civil purposes. Given the large potential health and economic consequences (in particular, if terrorist attacks cannot be excluded), the significant costs of preventive actions, and the large uncertainties about both the likelihood of occurrence and the potential consequences of PP safety and security incidents, an optimum relationship between preventive and mitigative efforts is likely to be a key issue for successful risk management in this field. Thus, possible violations of physical protection combined with threats of misuse of nuclear materials, including terrorist attack, pose considerable challenges to global security from various perspectives. In view of these challenges, recent advances in applied risk and decision analysis suggest methodological and procedural improvements in quantitative risk assessment, the demarcation of acceptable risk, and risk management. These advances are based on a recently developed model of optimal risky choice suitable for assessing and comparing the cumulative probability distribution functions attached to safety and security risks. Besides quantification of risk (e.g., in economic terms), the standardization of various risk assessment models frequently used in operations research can be approached on this basis. The paper explores possible applications of these improved methods to the safety and security management of nuclear materials, the cost efficiency of risk management measures, and the establishment of international safety and security standards for PP. Examples will be presented that are based on selected scenarios of misuse involving typical radioactive sources. (author)

  3. Application of quantitative and qualitative methods for determination ...

    African Journals Online (AJOL)

    This article covers the integration of qualitative and quantitative methods applied when justifying management decision-making in companies implementing lean manufacturing. The authors defined goals and subgoals and justified the evaluation criteria which, if achieved, lead to increased company value.

  4. Applications and limitations of quantitative sacroiliac joint scintigraphy

    International Nuclear Information System (INIS)

    Goldberg, R.P.; Genant, H.K.; Shimshak, R.; Shames, D.

    1978-01-01

    Evaluation of sacroiliac joint pathology by quantitative analysis of radionuclide bone scanning has been advocated as a useful technique. We have examined this technique in 61 patients and controls. The procedure was useful in detecting early sacroiliitis but was of limited value in patients with advanced sacroiliac joint findings radiographically. False positive values were found in patients with metabolic bone disease or structural abnormalities in the low back. Normative data must be determined for each laboratory.

  5. Analytical applications of a recycled flow nuclear magnetic resonance system: quantitative analysis of slowly relaxing nuclei

    International Nuclear Information System (INIS)

    Laude, D.A. Jr.; Lee, R.W.K.; Wilkins, C.L.

    1985-01-01

    The utility of a recycled flow system for the efficient quantitative analysis of NMR spectra is demonstrated. Requisite conditions are first established for the quantitative flow experiment and then applied to a variety of compounds. An application of the technique to determination of the average polymer chain length for a silicone polymer by quantitative flow 29 Si NMR is also presented. 10 references, 4 figures, 3 tables

  6. Application of quantitative DTI metrics in sporadic CJD

    Directory of Open Access Journals (Sweden)

    E. Caverzasi

    2014-01-01

    Diffusion Weighted Imaging is extremely important for the diagnosis of probable sporadic Jakob–Creutzfeldt disease, the most common human prion disease. Although visual assessment of DWI MRI is critical diagnostically, a more objective, quantifiable approach might more precisely identify the precise pattern of brain involvement. Furthermore, a quantitative, systematic tracking of MRI changes occurring over time might provide insights regarding the underlying histopathological mechanisms of human prion disease and provide information useful for clinical trials. The purposes of this study were: 1) to describe quantitatively the average cross-sectional pattern of reduced mean diffusivity, fractional anisotropy, atrophy and T1 relaxation in the gray matter (GM) in sporadic Jakob–Creutzfeldt disease, 2) to study changes in mean diffusivity and atrophy over time and 3) to explore their relationship with clinical scales. Twenty-six sporadic Jakob–Creutzfeldt disease and nine control subjects had MRIs on the same scanner; seven sCJD subjects had a second scan after approximately two months. Cortical and subcortical gray matter regions were parcellated with FreeSurfer. Average cortical thickness (or subcortical volume), T1 relaxation and mean diffusivity from co-registered diffusion maps were calculated in each region for each subject. Quantitatively, on cross-sectional analysis, certain brain regions were preferentially affected by reduced mean diffusivity (parietal and temporal lobes, posterior cingulate, thalamus and deep nuclei), but with relative sparing of the frontal and occipital lobes. Serial imaging surprisingly showed that mean diffusivity did not have a linear or unidirectional reduction over time, but tended to decrease initially and then reverse and increase towards normalization. Furthermore, there was a strong correlation between worsening of patient clinical function (based on modified Barthel score) and increasing mean diffusivity.
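    A minimal sketch of the regional averaging step described above (assumed implementation; it presumes a mean-diffusivity volume already co-registered to a label volume such as a FreeSurfer parcellation):

    ```python
    import numpy as np

    def regional_mean_md(md_map, labels, region_ids):
        """Mean diffusivity averaged over each parcellated region.

        md_map     : 3-D mean-diffusivity volume, co-registered to the parcellation
        labels     : 3-D integer label volume (e.g. a FreeSurfer parcellation)
        region_ids : iterable of label values to summarize
        """
        return {r: float(md_map[labels == r].mean()) for r in region_ids}
    ```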

  7. Quantitative 3-D imaging topogrammetry for telemedicine applications

    Science.gov (United States)

    Altschuler, Bruce R.

    1994-01-01

    The technology to reliably transmit high-resolution visual imagery over short to medium distances in real time has led to the serious considerations of the use of telemedicine, telepresence, and telerobotics in the delivery of health care. These concepts may involve, and evolve toward: consultation from remote expert teaching centers; diagnosis; triage; real-time remote advice to the surgeon; and real-time remote surgical instrument manipulation (telerobotics with virtual reality). Further extrapolation leads to teledesign and telereplication of spare surgical parts through quantitative teleimaging of 3-D surfaces tied to CAD/CAM devices and an artificially intelligent archival data base of 'normal' shapes. The ability to generate 'topogrames' or 3-D surface numerical tables of coordinate values capable of creating computer-generated virtual holographic-like displays, machine part replication, and statistical diagnostic shape assessment is critical to the progression of telemedicine. Any virtual reality simulation will remain in 'video-game' realm until realistic dimensional and spatial relational inputs from real measurements in vivo during surgeries are added to an ever-growing statistical data archive. The challenges of managing and interpreting this 3-D data base, which would include radiographic and surface quantitative data, are considerable. As technology drives toward dynamic and continuous 3-D surface measurements, presenting millions of X, Y, Z data points per second of flexing, stretching, moving human organs, the knowledge base and interpretive capabilities of 'brilliant robots' to work as a surgeon's tireless assistants becomes imaginable. The brilliant robot would 'see' what the surgeon sees--and more, for the robot could quantify its 3-D sensing and would 'see' in a wider spectral range than humans, and could zoom its 'eyes' from the macro world to long-distance microscopy. Unerring robot hands could rapidly perform machine-aided suturing with

  8. Quantitative estimation of seafloor features from photographs and their application to nodule mining

    Digital Repository Service at National Institute of Oceanography (India)

    Sharma, R.

    Methods developed for quantitative estimation of seafloor features from seabed photographs and their application for estimation of nodule sizes, coverage, abundance, burial, sediment thickness, extent of rock exposure, density of benthic organisms...

  9. Clinical application of quantitative 99Tcm-pertechnetate thyroid imaging

    International Nuclear Information System (INIS)

    Gao Yongju; Xie Jian; Yan Xinhui; Wand Jiebin; Zhu Xuanmin; Liu Lin; Sun Haizhou

    2002-01-01

    Objective: To investigate the clinical value of quantitative 99Tcm-pertechnetate thyroid imaging for the diagnosis and therapeutic evaluation of patients with thyroid disease. Methods: With a Siemens Orbit SPECT, 99Tcm sodium pertechnetate thyroid imaging was performed on a control group and on 108 patients with Graves' disease, 58 patients with Hashimoto's disease and 41 patients with subacute thyroiditis. Three functional parameters were calculated as follows: AR = 5 min thyroid count/1 min thyroid count; UI = 20 min thyroid count/thigh count; Td = imaging interval between carotid and thyroid. Results: 1) The three functional parameters were basically concordant with serological parameters in patients with Graves' disease. While uptake was high in patients who had had Graves' disease for ≤0.5 year, in those whose disease relapsed within 2 years the 99Tcm thyroid uptake increased when the antithyroid medication was stopped. 2) Thyroid images of hyperthyroid patients with Hashimoto's disease showed increased perfusion and 99Tcm uptake, a pattern similar to that found in Graves' disease. Differences in Td, AR and UI were not significant among euthyroid and subclinically hypothyroid patients with Hashimoto's disease, so the uptake ratios could indicate thyroid activity. 3) Delayed thyroid imaging and diffusely decreased uptake were found in hyperthyroid patients with SAT, whereas focal damage was observed in euthyroid patients. Conclusion: Quantitative 99Tcm-pertechnetate thyroid imaging is a significantly helpful technique in the diagnosis and treatment of common thyroid disorders.
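    A minimal sketch of the three functional parameters exactly as defined in the abstract, with hypothetical count and timing values for illustration:

    ```python
    def thyroid_parameters(c_1min, c_5min, c_20min, thigh_20min, t_carotid, t_thyroid):
        """Uptake parameters as defined in the abstract.

        AR = 5-min thyroid count / 1-min thyroid count
        UI = 20-min thyroid count / thigh count
        Td = imaging interval between carotid and thyroid appearance
        """
        AR = c_5min / c_1min
        UI = c_20min / thigh_20min
        Td = t_thyroid - t_carotid
        return AR, UI, Td

    # hypothetical count and timing values, for illustration only
    print(thyroid_parameters(1.2e4, 4.1e4, 9.0e4, 6.0e3, 12.0, 16.0))
    ```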

  10. Quantitative remote sensing in thermal infrared theory and applications

    CERN Document Server

    Tang, Huajun

    2014-01-01

    This comprehensive technical overview of the core theory of thermal remote sensing and its applications in hydrology, agriculture, and forestry includes a host of illuminating examples and covers everything from the basics to likely future trends in the field.

  11. Quantitative criteria in the application of the principle of precaution

    International Nuclear Information System (INIS)

    Touzet, Rodolfo; Ferrari, Jorge

    2008-01-01

    The Principle of Precaution establishes that 'when an activity represents a threat or damage to human health or the environment, precautionary measures must be taken, even when the cause-effect relationship has not been scientifically demonstrated in a conclusive way'. This declaration implies acting even in the presence of uncertainty, assigning responsibility and safety to whoever creates the risk, analyzing the possible alternatives, and using participative methods for taking decisions. In practice this presents two dilemmas: How can a cost-benefit analysis be made when the cause-effect relationship for the health of the exposed persons has not yet been established? (With regard to ionizing radiation more information does exist, and a factor α is in use that represents the economic cost of the dose received by a person.) What criterion should be used if it were demonstrated that non-ionizing radiation acts synergically with ionizing radiation? How can a quantitative optimization criterion be integrated with a qualitative criterion of precaution? Some provisional hypotheses will have to be introduced in order to perform the corresponding quantitative evaluations. In the case of low frequencies the situation was exactly the same in the past, but epidemiological studies, as well as experiments in vivo and in vitro, demonstrated that exposure can increase the risk of cancer in children and induce other health problems in children and adults. A possible provisional hypothesis for radio frequencies is to assume that the effects are similar in magnitude to those caused by low-frequency fields. In this case it is possible to show that, if this were really true, the epidemiological statistics would not even allow it to be demonstrated, owing to the number of persons involved, the time devoted to the studied populations and the latency times of leukaemia. The use of some working hypotheses to make the cost-benefit studies

  12. Comparison of different surface quantitative analysis methods. Application to corium

    International Nuclear Information System (INIS)

    Guilbaud, N.; Blin, D.; Perodeaud, Ph.; Dugne, O.; Gueneau, Ch.

    2000-01-01

    In case of a severe hypothetical accident in a pressurized water reactor, the reactor assembly melts partially or completely. The material formed, called corium, flows out and spreads at the bottom of the reactor. To limit and control the consequences of such an accident, the specifications of the O-U-Zr basic system must be known accurately. To achieve this goal, the corium mix was melted by electron bombardment at very high temperature (3000 K), followed by quenching of the ingot in the Isabel 1 evaporator. Metallographic analyses were then required to validate the thermodynamic databases used by the Thermo-Calc software. The study consists of defining an overall surface quantitative analysis method that is fast and reliable, in order to determine the overall corium composition. The analyzed ingot originated from a (U + Fe + Y + UO2 + ZrO2) mix, with a total mass of 2253.7 grams. Several successive heatings at moderate power were performed before a very brief plateau at very high temperature, so that the ingot was formed progressively and without any evaporation liable to modify its initial composition. The central zone of the ingot was then analyzed by qualitative and quantitative global surface methods, to yield the volume composition of the analyzed zone. Corium sample analysis is very complex because of the variety and number of elements present, and also because of the presence of oxygen in a heavy-element matrix based on uranium. Three different global quantitative surface analysis methods were used: global EDS analysis (Energy Dispersive Spectrometry) with SEM, global WDS analysis (Wavelength Dispersive Spectrometry) with EPMA, and the coupling of image analysis with EDS or WDS point spectroscopic analyses. The difficulties encountered during the study arose from sample preparation (corium is very sensitive to oxidation) and from the choice of acquisition parameters for the images and analyses. The corium sample studied consisted of two zones displaying

  13. Quantitative criteria in the application of the principle of precaution

    International Nuclear Information System (INIS)

    Touzet, Rodolfo; Ferrari, Jorge

    2008-01-01

    Full text: The Principle of Precaution establishes that 'when an activity represents a threat or damage to human health or the environment, precautionary measures must be taken even when the cause-effect relationship has not been demonstrated in a scientific and conclusive way'. This declaration implies acting even in the presence of uncertainty, assigning responsibility and safety to whoever creates the risk, analyzing the possible alternatives and using participative methods to take decisions. In practice this presents two dilemmas: 1) How can a cost-benefit analysis be performed when the cause-effect relation is not even established for the health of the exposed persons? (In the case of ionizing radiation a factor α is in use, representing the economic cost of the dose received by a person); 2) Which criterion must be used in case ionizing radiation acts synergically with non-ionizing radiation? How can the quantitative optimization criterion be integrated with a qualitative criterion of precaution? Some provisional hypotheses will have to be introduced in order to perform the corresponding quantitative evaluations. In the case of low frequencies the situation was exactly the same in the past; but epidemiological studies as well as experiments in vivo and in vitro demonstrated that exposure can increase the risk of leukaemia in children and induce other health problems in children and adults. A possible provisional hypothesis for radio frequencies is to assume that the effects are similar in magnitude to those caused by low-frequency fields. In this case it is possible to show that, if this were really so, the statistics would not even allow it to be demonstrated, owing to the number of persons involved, the observation times of the studied populations and the latency times of leukaemia. The use of some working hypotheses to perform the cost-benefit studies allows us to establish different alternatives for the

  14. Application of neural networks to quantitative spectrometry analysis

    International Nuclear Information System (INIS)

    Pilato, V.; Tola, F.; Martinez, J.M.; Huver, M.

    1999-01-01

    Accurate quantitative analysis of complex spectra (fission and activation products) relies upon expert knowledge. In some cases several hours, or even days, of tedious calculations are needed, because current software is unable to solve deconvolution problems when several rays overlap. We have shown that such analysis can be correctly handled by a neural network, and the procedure can be automated with a minimum of laboratory measurements for network training, as long as all the elements of the analysed solution appear in the training set and provided that adequate scaling of the input data is performed. Once the network has been trained, analysis is carried out in a few seconds. In an intercomparison test between several well-known laboratories, where unknown quantities of 57 Co, 58 Co, 85 Sr, 88 Y, 131 I, 139 Ce and 141 Ce present in a sample had to be determined, the results yielded by our network placed it among the best. The method is described, including the experimental device and measurements, training set design, definition of the relevant input parameters, input data scaling and network training. The main results are presented together with a statistical model allowing prediction of the network error.
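    A minimal sketch of the idea (a small feed-forward network mapping scaled spectral channels to activities), using scikit-learn as an assumed stand-in for the unspecified network implementation and fully synthetic spectra in place of laboratory measurements:

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler

    # Fully synthetic training set: spectra of mixtures with known activities.
    rng = np.random.default_rng(0)
    n_train, n_channels, n_isotopes = 200, 256, 3
    responses = rng.random((n_isotopes, n_channels))          # stand-in detector responses
    activities = rng.uniform(0, 100, (n_train, n_isotopes))   # known activities
    spectra = activities @ responses + rng.normal(0, 0.5, (n_train, n_channels))

    scaler = StandardScaler().fit(spectra)                    # input scaling matters (see abstract)
    model = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
    model.fit(scaler.transform(spectra), activities)

    unknown = activities[:1] @ responses                      # pretend measurement
    print(model.predict(scaler.transform(unknown)))           # estimated activities
    ```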

  15. Application of probabilistic quantitative ecological risk assessment to radiological dose

    International Nuclear Information System (INIS)

    Twining, J.; Ferris, J.; Copplestone, D.; Zinger, I.

    2004-01-01

    Probabilistic ERA is becoming more accepted and applied in evaluations of environmental impacts worldwide. In a previous paper we have shown that the process can be applied in practice to routine effluent releases from a nuclear facility. However, there are practical issues that need to be addressed prior to its regulatory application for criteria setting or for site-specific ERA. Among these issues are a) appropriate data selection for both exposure and dose-response input, because the available dose-response data need to be carefully characterised and filtered for ecological relevance, b) the need for a coherent approach to the choice of exposure scenarios, and c) various questions associated with the treatment of exposure to mixed nuclides. In this paper we will evaluate and discuss aspects of these issues, using an illustrative case study approach. (author)

  16. Quantitative and qualitative applications of the neutron-gamma borehole logging

    International Nuclear Information System (INIS)

    Charbucinski, J.; Aylmer, J.A.; Eisler, P.L.; Borsaru, M.

    1989-01-01

    Two neutron-γ borehole logging applications are described. In a quantitative application of the prompt-gamma neutron-activation analysis (PGNAA) technique, research was carried out both in the laboratory and at a mine to establish a suitable borehole logging technology for manganese-grade predictions. As an example of the qualitative application of PGNAA, the use of this method has been demonstrated for the determination of lithology. (author)

  17. Quantitative and qualitative applications of the neutron-gamma borehole logging

    International Nuclear Information System (INIS)

    Charbucinski, J.; Eisler, P.L.; Borsaru, M.; Aylmer, J.A.

    1990-01-01

    Two examples of neutron-gamma borehole logging application are described. In the quantitative application of the PGNAA technique, research was carried out both in the laboratory and at a mine to establish a suitable borehole logging technology for Mn-grade predictions. As an example of qualitative application of PGNAA, use of this method has been demonstrated for determination of lithology. (author). 4 refs, 10 figs, 7 tabs

  18. Digital Holography, a metrological tool for quantitative analysis: Trends and future applications

    Science.gov (United States)

    Paturzo, Melania; Pagliarulo, Vito; Bianco, Vittorio; Memmolo, Pasquale; Miccio, Lisa; Merola, Francesco; Ferraro, Pietro

    2018-05-01

    A review of the latest achievements of Digital Holography is reported in this paper, showing that this powerful method can be a key metrological tool for the quantitative analysis and non-invasive inspection of a variety of materials, devices and processes. Nowadays, its range of applications has been greatly extended, including the study of live biological matter and biomedical applications. This paper overviews the main progress and future perspectives of digital holography, showing new optical configurations and investigating the numerical issues to be tackled for the processing and display of quantitative data.

  19. Mathematics of quantitative kinetic PCR and the application of standard curves.

    Science.gov (United States)

    Rutledge, R G; Côté, C

    2003-08-15

    Fluorescent monitoring of DNA amplification is the basis of real-time PCR, from which target DNA concentration can be determined from the fractional cycle at which a threshold amount of amplicon DNA is produced. Absolute quantification can be achieved using a standard curve constructed by amplifying known amounts of target DNA. In this study, the mathematics of quantitative PCR are examined in detail, from which several fundamental aspects of the threshold method and the application of standard curves are illustrated. The construction of five replicate standard curves for two pairs of nested primers was used to examine the reproducibility and degree of quantitative variation using SYBR Green I fluorescence. Based upon this analysis, the application of a single, well-constructed standard curve could provide an estimated precision of +/-6-21%, depending on the number of cycles required to reach threshold. A simplified method for absolute quantification is also proposed, in which quantitative scale is determined by DNA mass at threshold.
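    A hedged worked example of the standard-curve arithmetic referred to above: exponential amplification N_c = N_0 * E^c makes the threshold cycle linear in log10 of the starting amount, Ct = m*log10(N_0) + b with E = 10^(-1/m), so an unknown is quantified by inverting the fitted line. The numbers below are illustrative, not taken from the paper.

    ```python
    import numpy as np

    # Illustrative standard curve: known starting quantities and threshold cycles.
    log_q = np.log10([1e7, 1e6, 1e5, 1e4, 1e3])
    ct = np.array([14.9, 18.3, 21.6, 25.0, 28.4])

    m, b = np.polyfit(log_q, ct, 1)            # fit Ct = m*log10(N0) + b
    efficiency = 10 ** (-1.0 / m)              # per-cycle amplification factor E

    ct_unknown = 23.1
    copies = 10 ** ((ct_unknown - b) / m)      # invert the standard curve
    print(f"E = {efficiency:.2f}, unknown = {copies:.2e} copies")
    ```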

  20. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    NARCIS (Netherlands)

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from

  1. MO-E-12A-01: Quantitative Imaging: Techniques, Applications, and Challenges

    International Nuclear Information System (INIS)

    Jackson, E; Jeraj, R; McNitt-Gray, M; Cao, Y

    2014-01-01

    The first symposium in the Quantitative Imaging Track focused on the introduction of quantitative imaging (QI) by illustrating the potential of QI in diagnostic and therapeutic applications in research and patient care, highlighting key challenges in implementation of such QI applications, and reviewing QI efforts of selected national and international agencies and organizations, including the FDA, NCI, NIST, and RSNA. This second QI symposium will focus more specifically on the techniques, applications, and challenges of QI. The first talk of the session will focus on modality-agnostic challenges of QI, beginning with challenges of the development and implementation of QI applications in single-center, single-vendor settings and progressing to the challenges encountered in the most general setting of multi-center, multi-vendor settings. The subsequent three talks will focus on specific QI challenges and opportunities in the modality-specific settings of CT, PET/CT, and MR. Each talk will provide information on modality-specific QI techniques, applications, and challenges, including current efforts focused on solutions to such challenges. Learning Objectives: Understand key general challenges of QI application development and implementation, regardless of modality. Understand selected QI techniques and applications in CT, PET/CT, and MR. Understand challenges, and potential solutions for such challenges, for the applications presented for each modality

  2. Quantitative carbon-14 autoradiography at the cellular level: principles and application for cell kinetic studies. [Review

    Energy Technology Data Exchange (ETDEWEB)

    Doermer, P [Gesellschaft fuer Strahlen- und Umweltforschung m.b.H., Muenchen (Germany, F.R.). Inst. fuer Haematologie

    1981-03-01

    Amounts of radio-labelled substances as low as 10/sup -18/ moles incorporated into individual cells can be measured by utilizing techniques of quantitative autoradiography. The principles and application of quantitative carbon-14 autoradiography are reviewed. Silver grain densities can be counted by automated microphotometry allowing on-line data processing by an interfaced computer. Rate measurements of /sup 14/C-thymidine incorporation into individual cells yield values of the DNA synthesis rate, and the DNA synthesis time of a cell compartment can be derived. This is an essential time parameter for the evaluation of kinetic events in proliferating cell populations. This method is applicable to human cells without radiation hazard to man and provides an optimal source of detailed information on the kinetics of normal and diseased human haematopoiesis. Examples of application include thalassaemia, malaria infection, iron deficiency anaemia and acute myelogenous leukaemia.

  3. Quantitative carbon-14 autoradiography at the cellular level: principles and application for cell kinetic studies

    International Nuclear Information System (INIS)

    Doermer, P.

    1981-01-01

    Amounts of radio-labelled substances as low as 10 -18 moles incorporated into individual cells can be measured by utilizing techniques of quantitative autoradiography. The principles and application of quantitative carbon-14 autoradiography are reviewed. Silver grain densities can be counted by automated microphotometry allowing on-line data processing by an interfaced computer. Rate measurements of 14 C-thymidine incorporation into individual cells yield values of the DNA synthesis rate, and the DNA synthesis time of a cell compartment can be derived. This is an essential time parameter for the evaluation of kinetic events in proliferating cell populations. This method is applicable to human cells without radiation hazard to man and provides an optimal source of detailed information on the kinetics of normal and diseased human haematopoiesis. Examples of application include thalassaemia, malaria infection, iron deficiency anaemia and acute myelogenous leukaemia. (author)

  4. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of Abort Triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of Abort Triggers.

  5. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    OpenAIRE

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from smartphones and survey (perception) data. We find that there are dimensions of domestication that explain how the use of smartphones affects our daily routines. Contributions are stronger for downloaded a...

  6. The Application of the Photographic Plate to the Quantitative Determination of Activities by Track Counts

    Energy Technology Data Exchange (ETDEWEB)

    Broda, E.

    1946-07-01

    This report was written by E. Broda at the Cavendish Laboratory (Cambridge) in August 1946 and is about the application of the photographic plate to the quantitative determination of activities by track counts. This report includes the experiment description and the discussion of the results and consists of 4 parts: 1) Introduction 2) Estimation of Concentrations 3) The uptake of U in different conditions 4) The upper limits of the fission Cross sections of Bi and Pb. (nowak)

  7. The Application of the Photographic Plate to the Quantitative Determination of Activities by Track Counts

    International Nuclear Information System (INIS)

    Broda, E.

    1946-01-01

    This report was written by E. Broda at the Cavendish Laboratory (Cambridge) in August 1946 and is about the application of the photographic plate to the quantitative determination of activities by track counts. This report includes the experiment description and the discussion of the results and consists of 4 parts: 1) Introduction 2) Estimation of Concentrations 3) The uptake of U in different conditions 4) The upper limits of the fission Cross sections of Bi and Pb. (nowak)

  8. Quantitative Phase Imaging Techniques for the Study of Cell Pathophysiology: From Principles to Applications

    Directory of Open Access Journals (Sweden)

    Hyunjoo Park

    2013-03-01

    A cellular-level study of pathophysiology is crucial for understanding the mechanisms behind human diseases. Recent advances in quantitative phase imaging (QPI) techniques show promise for the cellular-level understanding of the pathophysiology of diseases. To provide insight into how QPI techniques can potentially improve the study of cell pathophysiology, here we present the principles of QPI and highlight some of its recent applications, ranging from cell homeostasis to infectious diseases and cancer.

  9. Brittleness and elastic limit of iron-aluminium 40 at high strain rates; Fragilite et limite elastique du fer-aluminium 40 aux grandes vitesses de deformation

    Energy Technology Data Exchange (ETDEWEB)

    Cottu, J P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-07-01

    Iron-aluminium 40, a B2 ordered solid solution, was tensile tested to provide information on the brittleness of this alloy and its dependence on strain rate and temperature. For slow strain rates (0.34 per cent s{sup -1}), cleaved fracture prevails when the temperature is kept below 400 deg. C, while ductile rupture, with necking of almost 100 per cent, is observed at higher temperatures; in this case, recrystallization occurs during the deformation. For higher strain rates (335 per cent s{sup -1}), a ductility reduction, owing to intergranular fracture, precedes the brittle-ductile transition. This property may be related to the peak in the yield stress-temperature curve, which is itself connected to the ordered structure of this alloy. (author) [French] Les essais de traction que nous avons effectues sur le fer-aluminium 40, solution solide ordonnee de type B2, ont pour but de preciser l'influence de la vitesse de deformation et de la temperature sur la fragilite de l'alliage. Pour les faibles vitesses (0,34 pour cent s{sup -1}), la rupture est surtout clivee si la temperature est inferieure a 400 deg. C, puis ductile avec une striction voisine de 100 pour cent aux temperatures superieures; la recristallisation intervient alors ou cours meme de la deformation. Aux vitesses elevees (335 pour cent s{sup -1}) la transition fragile-ductile est precedee d'une chute de ductilite liee a une decohesion intergranulaire. Nous avons associe cette derniere propriete a la presence d'un pic de limite elastique apparaissant a chaud, a vitesse elevee et pouvant etre relie au caractere ordonne de l'alliage. (auteur)

  10. A collimator optimization method for quantitative imaging: application to Y-90 bremsstrahlung SPECT.

    Science.gov (United States)

    Rong, Xing; Frey, Eric C

    2013-08-01

    Post-therapy quantitative 90Y bremsstrahlung single photon emission computed tomography (SPECT) has shown great potential to provide reliable activity estimates, which are essential for dose verification. Typically, 90Y imaging is performed with high- or medium-energy collimators. However, the energy spectrum of 90Y bremsstrahlung photons is substantially different from that typically assumed for these collimators. In addition, dosimetry requires quantitative images, and collimators are not typically optimized for such tasks. Optimizing a collimator for 90Y imaging is both novel and potentially important. Conventional optimization methods are not appropriate for 90Y bremsstrahlung photons, which have a continuous and broad energy distribution. In this work, the authors developed a parallel-hole collimator optimization method for quantitative tasks that is particularly applicable to radionuclides with complex emission energy spectra. The authors applied the proposed method to develop an optimal collimator for quantitative 90Y bremsstrahlung SPECT in the context of microsphere radioembolization. To account for the effects of the collimator on both the bias and the variance of the activity estimates, the authors used the root mean squared error (RMSE) of the volume of interest activity estimates as the figure of merit (FOM). In the FOM, the bias due to the null space of the image formation process was taken into account. The RMSE was weighted by the inverse mass to reflect the application to dosimetry; for a different application, more relevant weighting could easily be adopted. The authors proposed a parameterization for the collimator that facilitates the incorporation of the important factors (geometric sensitivity, geometric resolution, and septal penetration fraction) determining collimator performance, while keeping the number of free parameters describing the collimator small (i.e., two parameters). To make the optimization results for quantitative 90Y bremsstrahlung SPECT more
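
    As a compact sketch of the figure of merit described here, the snippet below combines a made-up bias and variance for each volume of interest into an inverse-mass-weighted RMSE; the numbers and units are purely illustrative.

        import numpy as np

        # Hypothetical per-VOI activity-estimation errors for one candidate collimator.
        bias = np.array([0.8, -0.5, 1.2])       # MBq, systematic error (incl. null-space bias)
        std = np.array([1.0, 0.7, 1.5])         # MBq, statistical uncertainty
        mass = np.array([1200.0, 300.0, 50.0])  # g, VOI masses

        # Inverse-mass weighting reflects the dosimetry application.
        weights = 1.0 / mass
        fom = np.sqrt(np.sum(weights * (bias**2 + std**2)) / np.sum(weights))
        print(f"Weighted RMSE figure of merit: {fom:.3f} MBq")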

  11. Phase-shift analysis of pion-nucleon elastic scattering below 1.6 GeV; Analyse en ondes partielles de la diffusion elastique meson {pi} - nucleon au-dessous de 1.6 GeV

    Energy Technology Data Exchange (ETDEWEB)

    Bareyre, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1968-06-01

    Experimental results of pion-nucleon elastic scattering below 1.6 GeV (total cross sections, angular distributions of elastic scattering and recoil nucleon polarizations) have been described by a partial wave analysis. This analysis has been carried out one energy at a time, using a least-squares fitting method. A single solution is extracted by requiring continuity of the solutions with energy. Resonant behaviour has been clearly established for several partial waves. In addition to these important effects, some phase shifts show rapid variations with energy. The present experimental situation does not make it possible to say whether these variations are due to experimental biases or to physical effects. (author) [French] Les resultats experimentaux de la diffusion elastique meson {pi} - nucleon au-dessous de 1.6 GeV (sections efficaces totales, distributions angulaires de diffusion elastique et de polarisation du nucleon de recul) sont decrits a l'aide d'une analyse en ondes partielles. Cette analyse est developpee energie par energie au moyen d'une methode d'ajustement en moindres carres. Un critere empirique de continuite des solutions en fonction de l'energie a permis d'isoler une solution unique. Des resonances sont clairement etablies pour plusieurs ondes partielles, ainsi que certains petits effets moins caracteristiques. Pour ceux-ci, la situation experimentale presente ne permet pas d'affirmer s'ils sont dus a des effets physiques ou a des biais experimentaux. (auteur)

  12. 3.55 GeV/c Kp elastic scattering near 180 deg; Diffusion elastique des K de 3.55 GeV/C par les protons, au voisinage de 180 deg

    Energy Technology Data Exchange (ETDEWEB)

    Duflo, J. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-07-01

    Backward elastic K{sup +}p and K{sup -}p scattering has been measured in the angular interval 168 deg. < {theta}c.m. < 177 deg. The experimental apparatus included optical spark chambers to measure the three tracks of the scattered event, a set of scintillation counters and Cerenkov counters to select events and trigger the chambers, and a magnet to determine the momentum of the recoil proton. 106000 photographs were taken. Of these, 22 satisfied all requirements for elastic K{sup +}p scattering events. No event was found to satisfy the kinematical criteria for K{sup -}p elastic scattering. The corresponding values of the differential cross sections are: (d{sigma}/d{omega})K{sup +}p {yields} pK{sup +} = 17 {+-} 4 {mu}b/ster and (d{sigma}/d{omega})K{sup -}p {yields} pK{sup -} {<=} 0.6 {mu}b/ster. Contaminations by {pi}p backward scattering and inelastic scattering were estimated. Elastic K{sup +}p scattering exhibits a backward peak. A reasonably satisfactory interpretation of our results is obtained with exchange models. There is, in fact, no definitely established particle which could mediate the K{sup -}p {yields} pK{sup -} process in the exchange channel, which is in good agreement with our small value of the K{sup -}p backward elastic scattering cross section. Our results lend support to the conclusions of the interference model developed by Barger and Cline for {pi}p backward scattering, and to the qualitative predictions of quark models. (author) [French] La diffusion elastique en arriere des K{sup +} et K{sup -} par les protons a ete mesuree dans l'intervalle angulaire 168 deg. < {theta}cm < 177 deg. Le dispositif experimental comprenait des chambres a etincelles 'optiques' pour mesurer les trois trajectoires de l'evenement diffuse, un ensemble de compteurs a scintillations et de compteurs Cerenkov pour selectionner les evenements et declencher les chambres, et un aimant pour determiner le moment du proton de recul. 106000 photographies ont ete prises

  13. 3.55 GeV/c Kp elastic scattering near 180 deg; Diffusion elastique des K de 3.55 GeV/C par les protons, au voisinage de 180 deg

    Energy Technology Data Exchange (ETDEWEB)

    Duflo, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-07-01

    Backward elastic K{sup +}p and K{sup -}p scattering has been measured in the angular interval 168 deg. < {theta}c.m. < 177 deg. The experimental apparatus included optical spark chambers to measure the three tracks of the scattered event, a set of scintillation counters and Cerenkov counters to select events and trigger the chambers, and a magnet to determine the momentum of the recoil proton. 106000 photographs were taken. Of these, 22 satisfied all requirements for elastic K{sup +}p scattering events. No event was found to satisfy the kinematical criteria for K{sup -}p elastic scattering. The corresponding values of the differential cross sections are: (d{sigma}/d{omega})K{sup +}p {yields} pK{sup +} = 17 {+-} 4 {mu}b/ster and (d{sigma}/d{omega})K{sup -}p {yields} pK{sup -} {<=} 0.6 {mu}b/ster. Contaminations by {pi}p backward scattering and inelastic scattering were estimated. Elastic K{sup +}p scattering exhibits a backward peak. A reasonably satisfactory interpretation of our results is obtained with exchange models. There is, in fact, no definitely established particle which could mediate the K{sup -}p {yields} pK{sup -} process in the exchange channel, which is in good agreement with our small value of the K{sup -}p backward elastic scattering cross section. Our results lend support to the conclusions of the interference model developed by Barger and Cline for {pi}p backward scattering, and to the qualitative predictions of quark models. (author) [French] La diffusion elastique en arriere des K{sup +} et K{sup -} par les protons a ete mesuree dans l'intervalle angulaire 168 deg. < {theta}cm < 177 deg. Le dispositif experimental comprenait des chambres a etincelles 'optiques' pour mesurer les trois trajectoires de l'evenement diffuse, un ensemble de compteurs a scintillations et de compteurs Cerenkov pour selectionner les evenements et declencher les chambres, et un aimant pour determiner le moment du proton de recul. 106000 photographies ont ete prises, dont 22 ont satisfait a

  14. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.
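
    Taking the abstract's definition literally (velocity imparted per unit of magnetic ponderomotive force), magnetophoretic mobility reduces to a simple ratio; the values below are hypothetical velocimeter readings, not data from the record.

        # Hypothetical particle-tracking velocimetry readings.
        velocity = 25e-6             # m/s, velocity of the magnetically labeled cell
        ponderomotive_force = 4e-12  # N, magnetic force acting on the cell

        # Magnetophoretic mobility: velocity imparted per unit force
        mobility = velocity / ponderomotive_force
        print(f"Magnetophoretic mobility: {mobility:.2e} m s^-1 N^-1")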

  15. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that measures patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  16. [The study of tomato fruit weight quantitative trait locus and its application in genetics teaching].

    Science.gov (United States)

    Wang, Hai-yan

    2015-08-01

    Classical research cases, which have greatly promoted the development of genetics throughout its history, can be combined with course content in genetics teaching to train students' abilities in scientific thinking and genetic analysis. The localization and cloning of the gene controlling tomato fruit weight was pioneering work in quantitative trait locus (QTL) studies and represents a complete process of QTL research in plants. Application of this integrated case in genetics teaching, which illustrates a remarkable process of scientific discovery and the fascination of genetic research, has inspired students' interest in genetics and achieved a good teaching effect.

  17. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  18. Application of a nitrocellulose immunoassay for quantitation of proteins secreted in cultured media

    International Nuclear Information System (INIS)

    LaDuca, F.M.; Dang, C.V.; Bell, W.R.

    1986-01-01

    A macro immunoassay was developed to quantitate proteins (antigens) secreted in the culture media of primary rat hepatocytes. Dilutions of protein standards and undiluted spent culture media were applied to numbered sheets of nitrocellulose (NC) paper by vacuum filtration (in volumes up to 1 ml) through a specially designed macrofiltration apparatus constructed of plexiglas. Sequential incubation of the NC with bovine serum albumin blocking buffer, monospecific antibody, and 125I-Protein A enabled quantitation of protein concentration by determination of NC-bound radioactivity. Linear and reproducible standard curves were obtained with fibrinogen, albumin, transferrin, and haptoglobin. A high coefficient of correlation between radioactivity (cpm) and protein concentration was found. Intra- and inter-test reproducibility was excellent. By using monospecific antibodies, single proteins (i.e., fibrinogen), at concentrations as low as 32 ng/ml, could be quantified in heterogeneous protein mixtures and in spent culture media. The assay was sensitive to the difference in fibrinogen secretion between nonstimulatory (serum-free hormonally defined medium, SFHD) and stimulatory (SFHD plus hydrocortisone) culture conditions. The procedure and techniques described are applicable to the quantitation of any protein in a suitable buffer
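
    An illustrative sketch of the standard-curve step: fit bound radioactivity (cpm) against known fibrinogen standards and invert the fit for an unknown spent-medium sample. All counts and concentrations are invented.

        import numpy as np

        # Hypothetical standard curve: fibrinogen (ng/ml) vs bound 125I-Protein A counts (cpm).
        conc_std = np.array([32.0, 64.0, 125.0, 250.0, 500.0, 1000.0])
        cpm_std = np.array([410.0, 790.0, 1540.0, 3050.0, 6080.0, 12100.0])

        # Linear least-squares fit: cpm = slope * conc + intercept
        slope, intercept = np.polyfit(conc_std, cpm_std, 1)
        r = np.corrcoef(conc_std, cpm_std)[0, 1]

        # Invert the curve for an unknown spent-medium sample.
        cpm_unknown = 2400.0
        conc_unknown = (cpm_unknown - intercept) / slope
        print(f"r = {r:.4f}, estimated fibrinogen = {conc_unknown:.0f} ng/ml")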

  19. Quantitative nuclear medicine imaging: application of computers to the gamma camera and whole-body scanner

    International Nuclear Information System (INIS)

    Budinger, T.F.

    1974-01-01

    The following topics are reviewed: properties of computer systems for nuclear medicine quantitation; quantitative information concerning the relation between organ isotope concentration and detected projections of the isotope distribution; quantitation using two conjugate views; three-dimensional reconstruction from projections; quantitative cardiac radioangiography; and recent advances leading to quantitative nuclear medicine of clinical importance. (U.S.)

  20. Digital Holographic Microscopy: Quantitative Phase Imaging and Applications in Live Cell Analysis

    Science.gov (United States)

    Kemper, Björn; Langehanenberg, Patrik; Kosmeier, Sebastian; Schlichthaber, Frank; Remmersmann, Christian; von Bally, Gert; Rommel, Christina; Dierker, Christian; Schnekenburger, Jürgen

    The analysis of complex processes in living cells creates a high demand for fast and label-free methods for online monitoring. Widely used fluorescence methods require specific labeling and are often restricted to chemically fixated samples. Thus, methods that offer label-free and minimally invasive detection of live cell processes and cell state alterations are of particular interest. In combination with light microscopy, digital holography provides label-free, multi-focus quantitative phase imaging of living cells. In overview, several methods for digital holographic microscopy (DHM) are presented. First, different experimental setups for the recording of digital holograms and the modular integration of DHM into common microscopes are described. Then the numerical processing of digitally captured holograms is explained. This includes the description of spatial and temporal phase shifting techniques, spatial filtering based reconstruction, holographic autofocusing, and the evaluation of self-interference holograms. Furthermore, the usage of partial coherent light and multi-wavelength approaches is discussed. Finally, potentials of digital holographic microscopy for quantitative cell imaging are illustrated by results from selected applications. It is shown that DHM can be used for automated tracking of migrating cells and cell thickness monitoring as well as for refractive index determination of cells and particles. Moreover, the use of DHM for label-free analysis in fluidics and micro-injection monitoring is demonstrated. The results show that DHM is a highly relevant method that allows novel insights in dynamic cell biology, with applications in cancer research and for drugs and toxicity testing.

  1. Applicability of a set of tomographic reconstruction algorithms for quantitative SPECT on irradiated nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Jacobsson Svärd, Staffan, E-mail: staffan.jacobsson_svard@physics.uu.se; Holcombe, Scott; Grape, Sophie

    2015-05-21

    A fuel assembly operated in a nuclear power plant typically contains 100–300 fuel rods, depending on fuel type, which become strongly radioactive during irradiation in the reactor core. For operational and security reasons, it is of interest to experimentally deduce rod-wise information from the fuel, preferably by means of non-destructive measurements. The tomographic SPECT technique offers such possibilities through its two-step application: (1) recording the gamma-ray flux distribution around the fuel assembly, and (2) reconstructing the assembly's internal source distribution, based on the recorded radiation field. In this paper, algorithms for performing the latter step and extracting quantitative relative rod-by-rod data are accounted for. As compared to application of SPECT in nuclear medicine, nuclear fuel assemblies present a much more heterogeneous distribution of internal attenuation to gamma radiation than the human body, typically with rods containing pellets of heavy uranium dioxide surrounded by cladding of a zirconium alloy placed in water or air. This inhomogeneity severely complicates the tomographic quantification of the rod-wise relative source content, and the deduction of conclusive data requires detailed modelling of the attenuation to be introduced in the reconstructions. However, as shown in this paper, simplified models may still produce valuable information about the fuel. Here, a set of reconstruction algorithms for SPECT on nuclear fuel assemblies is described and discussed in terms of their quantitative performance for two applications: verification of fuel assemblies' completeness in nuclear safeguards, and rod-wise fuel characterization. It is argued that a requirement not to base the former assessment on any a priori information constrains which reconstruction methods may be used in that case, whereas the use of a priori information on geometry and material content enables highly accurate quantitative

  2. Quantitative imaging of magnetic nanoparticles by magneto-relaxometric tomography for biomedical applications

    International Nuclear Information System (INIS)

    Liebl, Maik

    2016-01-01

    Current biomedical research focuses on the development of novel biomedical applications based on magnetic nanoparticles (MNPs), e.g. for local cancer treatment. These therapy approaches employ MNPs as remotely controlled drug carriers or local heat generators. Since location and quantity of MNPs determine drug enrichment and heat production, quantitative knowledge of the MNP distribution inside a body is essential for the development and success of these therapies. Magnetorelaxometry (MRX) is capable of providing such quantitative information based on the specific response of the MNPs after switching off an applied magnetic field. Applying a uniform (homogeneous) magnetic field to a MNP distribution and measuring the MNP response by multiple sensors at different locations allows for spatially resolved MNP quantification. However, to reconstruct the MNP distribution from this spatially resolved MRX data, an ill-posed inverse problem has to be solved. So far, the solution of this problem has been stabilized by incorporating a priori knowledge in the forward model, e.g. by setting priors on the vertical position of the distribution using a 2D reconstruction grid or setting priors on the number and geometry of the MNP sources inside the body. MRX tomography represents a novel approach for quantitative 3D imaging of MNPs, where the inverse solution is stabilized by a series of MRX measurements. In MRX tomography, only parts of the MNP distribution are sequentially magnetized by the use of inhomogeneous magnetic fields. Each magnetization step is followed by detection of the response of the corresponding part of the distribution by multiple sensors. The 3D reconstruction of the MNP distribution is then accomplished by a common evaluation of the distinct MRX measurement series. In this thesis, the first experimental setup for MRX tomography was developed for quantitative 3D imaging of biomedical MNP distributions. It is based on a multi-channel magnetizing unit which has been engineered to
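
    A minimal sketch of the kind of ill-posed linear inversion described here: the forward models of several partial-magnetization MRX measurements are stacked into one system and solved with Tikhonov regularization. The geometry, sensitivities and particle distribution are all invented and stand in for the thesis' actual forward model.

        import numpy as np

        rng = np.random.default_rng(0)
        n_voxels, n_series, n_sensors = 50, 8, 30

        # Invented forward model: sensor response per unit MNP amount,
        # one block of rows per magnetizing-coil configuration.
        A = np.vstack([rng.normal(size=(n_sensors, n_voxels)) for _ in range(n_series)])

        x_true = np.zeros(n_voxels)
        x_true[[10, 11, 30]] = [5.0, 3.0, 2.0]                     # localized MNP accumulations (a.u.)
        b = A @ x_true + rng.normal(scale=0.05, size=A.shape[0])   # noisy MRX amplitudes

        # Tikhonov-regularized least squares: argmin ||Ax - b||^2 + lam * ||x||^2
        lam = 1.0
        x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n_voxels), A.T @ b)
        print("voxels with the largest reconstructed content:", np.argsort(x_hat)[-3:])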

  3. ANALYSIS AND QUANTITATIVE ASSESSMENT FOR RESULTS OF EDUCATIONAL PROGRAMS APPLICATION BY MEANS OF DIAGNOSTIC TESTS

    Directory of Open Access Journals (Sweden)

    E. L. Kon

    2015-07-01

    Subject of Research. The relevance of the problem of creating, controlling and estimating the results of competence-oriented educational programs is formulated and substantiated. Elements and components of competences, assembled into modules, course units and parts of an educational program, are defined as the objects of control. Specific tasks of proficiency examination for competences and their components are stated, and the subject matter of the paper is formulated. Methods of Research. Adapted statements and methods from the technical sciences are proposed for solving the control tasks and for decoding and estimating education results. An approach to the quantitative estimation of testing results using an additive integrated differential estimation criterion is proposed. Main Results. Statements defining the conditions for certain and uncertain (indeterminate) decision-making about proficiency in the elements of discipline components controlled by a test, based on the test results, are formulated and proved. Probabilistic characteristics of both decision-making variants are estimated. Variants of applying deterministic and fuzzy-logic mathematical methods to decrease decision-making indeterminacy are proposed; a further research direction is selected for the development of methods and algorithms for decoding the results of a set of diagnostic tests. Practical Relevance. It is shown that the proposed approach to the quantitative estimation of testing results will make it possible to automate the procedure of forming and analysing education results specified in the competence format.

  4. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    Science.gov (United States)

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  5. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment of web-based applications, while providing users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The method builds on the work of the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork; it was tested in vulnerability assessment activities on real production systems and, in theory, by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
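
    A bare-bones Monte Carlo sketch of the approach described: each placeholder likelihood and impact attribute is drawn from a discrete uniform distribution and the resulting OWASP-style risk scores are aggregated (100 000 rounds here rather than the 1.5 million quoted, to keep the example quick). The attribute names and scoring rule are assumptions, not the author's model.

        import numpy as np

        rng = np.random.default_rng(42)
        n_rounds = 100_000

        # Discrete uniform assumptions (0-9 scale) for placeholder risk attributes.
        likelihood_attrs = rng.integers(0, 10, size=(n_rounds, 4))  # e.g. skill, motive, opportunity, size
        impact_attrs = rng.integers(0, 10, size=(n_rounds, 4))      # e.g. confidentiality, integrity, ...

        # OWASP-style overall score: mean likelihood times mean impact per round.
        risk = likelihood_attrs.mean(axis=1) * impact_attrs.mean(axis=1)

        print(f"median risk score: {np.median(risk):.1f}")
        print(f"95th percentile:   {np.percentile(risk, 95):.1f}")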

  6. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    Science.gov (United States)

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, using qualitative methods special benefits may arise if researchers strive to identify and organize unknown information aspects (inductive purpose). Particularly, quantitative research methods require a high degree of standardization and transparency of the research process. Furthermore, a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be regarded if researchers aim to select or combine those approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  7. A quantitative infrared spectral library of vapor phase chemicals: applications to environmental monitoring and homeland defense

    Science.gov (United States)

    Sharpe, Steven W.; Johnson, Timothy J.; Sams, Robert L.

    2004-12-01

    The utility of infrared spectroscopy for monitoring and early warning of accidental or deliberate chemical releases to the atmosphere is well documented. Regardless of the monitoring technique (open-path or extractive) or whether the spectrometer is passive or active (Fourier transform or lidar), a high-quality, quantitative reference library is essential for meaningful interpretation of the data. Pacific Northwest National Laboratory, through the support of the Department of Energy, has been building a library of pure, vapor phase chemical species for the last 4 years. This infrared spectral library currently contains over 300 chemicals and is expected to grow to over 400 chemicals before completion. The library spectra are based on a statistical fit to many spectra at different concentrations, allowing for rigorous error analysis. The contents of the library are focused on atmospheric pollutants, naturally occurring chemicals, toxic industrial chemicals and chemicals specifically designed to do damage. Applications, limitations and technical details of the spectral library will be discussed.

  8. Quantitative analysis of the pendulum test: application to multiple sclerosis patients treated with botulinum toxin.

    Science.gov (United States)

    Bianchi, L; Monaldi, F; Paolucci, S; Iani, C; Lacquaniti, F

    1999-01-01

    The aim of this study was to develop quantitative analytical methods in the application of the pendulum test to both normal and spastic subjects. The lower leg was released by a torque motor from different starting positions. The resulting changes in the knee angle were fitted by means of a time-varying model. Stiffness and viscosity coefficients were derived for each half-cycle oscillation in both flexion and extension, and for all knee starting positions. This method was applied to the assessment of the effects of Botulinum toxin A (BTX) in progressive multiple sclerosis patients in a follow-up study. About half of the patients showed a significant decrement in stiffness and viscosity coefficients.
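
    As a rough sketch of this kind of parameter extraction (not the authors' actual time-varying model), a damped-oscillator fit to a synthetic knee-angle trace yields a natural frequency and damping ratio, from which lumped stiffness and viscosity coefficients follow once a lower-leg moment of inertia is assumed.

        import numpy as np
        from scipy.optimize import curve_fit

        # Synthetic knee-angle trace after release (radians about the resting angle).
        t = np.linspace(0.0, 3.0, 300)
        theta = 0.6 * np.exp(-0.25 * 5.0 * t) * np.cos(5.0 * np.sqrt(1 - 0.25**2) * t)
        theta += np.random.default_rng(1).normal(scale=0.01, size=t.size)

        def damped(t, a, wn, zeta):
            # Damped cosine: amplitude a, natural frequency wn, damping ratio zeta
            wd = wn * np.sqrt(1.0 - zeta**2)
            return a * np.exp(-zeta * wn * t) * np.cos(wd * t)

        (a, wn, zeta), _ = curve_fit(damped, t, theta, p0=[0.5, 4.0, 0.2],
                                     bounds=([0.0, 0.1, 0.0], [2.0, 20.0, 0.99]))

        inertia = 0.35                         # kg m^2, assumed lower-leg moment of inertia
        stiffness = inertia * wn**2            # N m / rad
        viscosity = 2.0 * inertia * zeta * wn  # N m s / rad
        print(f"stiffness ~ {stiffness:.2f} N m/rad, viscosity ~ {viscosity:.3f} N m s/rad")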

  9. In-focal-plane characterization of excitation distribution for quantitative fluorescence microscopy applications

    Science.gov (United States)

    Dietrich, Klaus; Brülisauer, Martina; ćaǧin, Emine; Bertsch, Dietmar; Lüthi, Stefan; Heeb, Peter; Stärker, Ulrich; Bernard, André

    2017-06-01

    The applications of fluorescence microscopy span medical diagnostics, bioengineering and biomaterial analytics. Full exploitation of fluorescent microscopy is hampered by imperfections in illumination, detection and filtering. Mainly, errors stem from deviations induced by real-world components inducing spatial or angular variations of propagation properties along the optical path, and they can be addressed through consistent and accurate calibration. For many applications, uniform signal to noise ratio (SNR) over the imaging area is required. Homogeneous SNR can be achieved by quantifying and compensating for the signal bias. We present a method to quantitatively characterize novel reference materials as a calibration reference for biomaterials analytics. The reference materials under investigation comprise thin layers of fluorophores embedded in polymer matrices. These layers are highly homogeneous in their fluorescence response, where cumulative variations do not exceed 1% over the field of view (1.5 x 1.1 mm). An automated and reproducible measurement methodology, enabling sufficient correction for measurement artefacts, is reported. The measurement setup is equipped with an autofocus system, ensuring that the measured film quality is not artificially increased by out-of-focus reduction of the system modulation transfer function. The quantitative characterization method is suitable for analysis of modified bio-materials, especially through patterned protein decoration. The imaging method presented here can be used to statistically analyze protein patterns, thereby increasing both precision and throughput. Further, the method can be developed to include a reference emitter and detector pair on the image surface of the reference object, in order to provide traceable measurements.

  10. Effect of Biofertilizers Application on the Quantitative and Qualitative Characteristics of Linseed (Linum usitatissimum L. Lines

    Directory of Open Access Journals (Sweden)

    B. Motalebizadeh

    2015-09-01

    In order to investigate the effect of bio-fertilizers on the yield and yield components of flax lines, a study was conducted during the 2010 growing season at the Agricultural Research Station of Saatlo in Urmia. A split-plot design based on randomized complete blocks with four replications was used in this study. The main factor (a) consisted of the fertilizer application form (a1 = control without nitrogen fertilizer, a2 = nitrogen fertilizer, a3 = nitroxin + N, a4 = phosphate barvar 2 + N, and a5 = nitroxin + phosphate barvar 2 + N) and the sub factor (b) consisted of five lines of oily flax (b1 = 97-26, b2 = 97-14, b3 = 97-3, b4 = 97-21, b5 = 97-19). Quantitative and qualitative traits such as number of sub stems, leaf weight, capsule weight per main stem and sub stems, seed yield, and oil and protein content were calculated or estimated. Results showed that the main factor (fertilizer form) had a significant effect (at the α=0.01 probability level) on all the parameters studied in this experiment. The sub factor (linseed lines) and the interaction between the two factors had statistically significant effects on all traits. The highest seed yield (4781 kg h-1) and the highest seed oil content (36.5%) were obtained by applying nitroxin + phosphateye barvare 2 + N to the 97-14 and 97-3 lines. Results showed that the use of Nitroxin and Phosphateye barvare 2 biofertilizers could be effective in increasing the grain yield of linseed. Therefore, application of Nitroxin and Phosphateye barvare 2 biofertilizers could be used to improve soil physio-chemical properties and to increase quantitative and qualitative yield parameters of linseed.

  11. AUTOMATED ANALYSIS OF QUANTITATIVE IMAGE DATA USING ISOMORPHIC FUNCTIONAL MIXED MODELS, WITH APPLICATION TO PROTEOMICS DATA.

    Science.gov (United States)

    Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard

    2011-01-01

    Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method

  12. Contribution to the elastic and inelastic scattering study polarized protons; Contribution a l'etude de la diffusion elastique et inelastique avec un faisceau de protons polarises

    Energy Technology Data Exchange (ETDEWEB)

    Swiniarski, R de [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-10-01

    The elastic and inelastic scattering of the 18.6 MeV polarized proton beam from the Saclay variable energy cyclotron has been studied for the following nuclei: {sup 48}Ti, {sup 50}Ti, {sup 52}C, {sup 54}Fe, {sup 56}Fe, {sup 58}Ni, {sup 62}Ni, {sup 64}Ni, {sup 63}Cu: the targets {sup 52}Cr, {sup 60}Ni and {sup 62}Ni have also been investigated at 16.5 MeV. The measured asymmetries for the strong l = 2 transitions tend to fall into two categories, distinguished by the magnitude of the asymmetries at 30 degrees and 90 degrees. For the transitions studied, only those to the first 2+ state of the 28-neutron nuclei present large asymmetries at these angles. Strong l = 3 and l = 4 transitions show also interesting variations. When the entire optical potential is deformed, coupled channels or DWBA calculations predict the 'small' l = 2 asymmetry reasonably well, but only an abnormal increase of the strength of the spin-orbit distortion or the introduction of an imaginary and negative spin-orbit potential can reproduce the amplitude of the large asymmetries. Calculations with a microscopic model indicate that the asymmetry is sensitive to the form factor and no important differences were found between S=0 and S=1 predictions. (author) [French] Nous avons etudie la diffusion elastique et inelastique a l'aide du faisceau de protons polarises du cyclotron a energie variable de Saclay a 18.6 MeV pour les cibles suivantes: {sup 48}Ti, {sup 50}Ti, {sup 52}Cr, {sup 54}Fe, {sup 56}Fe, {sup 58}Ni, {sup 62}Ni, {sup 63}Cu et {sup 64}Ni: les cibles {sup 52}Cr, {sup 60}Ni et {sup 62}Ni ont egalement ete etudiees a 16.5 MeV. Les asymetries mesurees pour les transitions fortement excitees l = 2 se divisent en deux groupes differant par l'amplitude de l'asymetrie a 30 degres et 90 degres. Seules les asymetries mesurees pour les premiers niveaux 2+ des noyaux a couche complete en neutrons (N=28) sont tres grandes a ces angles. Les asymetries mesurees pour les niveaux 3{sup -} et 4{sup

  13. Quantitative phase imaging using quadri-wave lateral shearing interferometry. Application to X-ray domain

    International Nuclear Information System (INIS)

    Rizzi, Julien

    2013-01-01

    Since Roentgen discovered X-rays, X-ray imaging systems have been based on absorption contrast. This technique is inefficient for weakly absorbing objects. As a result, standard X-ray radiography can detect bone lesions but cannot detect ligament lesions. Phase contrast imaging, however, can overcome this limitation. Since the 2000s, building on earlier work by opticians, X-ray scientists have been developing phase-sensitive devices compatible with industrial applications such as medical imaging or non-destructive testing. Standard interferometry architectures are challenging to implement in the X-ray domain. This is the reason why grating-based interferometers became the most promising devices for industrial applications; they provided the first X-ray phase contrast images of living human samples. Nevertheless, current grating-based architectures require the use of at least two gratings and are challenging to adapt to an industrial product. The aim of my thesis was therefore to develop a single-phase-grating interferometer. I demonstrated that such a device can provide achromatic and propagation-invariant interference patterns. I used this interferometer to perform quantitative phase contrast imaging of a biological fossil sample and X-ray mirror metrology. (author)

  14. Potential application of microfocus X-ray techniques for quantitative analysis of bone structure

    International Nuclear Information System (INIS)

    Takahashi, Kenta

    2006-01-01

    With the progress of micro-focus X-ray computed tomography (micro-CT), it has become possible to evaluate bone structure quantitatively and three-dimensionally. The advantages of micro-CT are that sample preparation is not required and that it provides not only two-dimensional parameters but also three-dimensional stereological indices. This study was carried out to evaluate the potential application of micro-focus X-ray techniques for quantitative analysis of the new bone produced inside the hollow chamber of an experimental titanium miniature implant. Twenty-five male Wistar rats (9 weeks of age) received an experimental titanium miniature implant, containing a hollow chamber, in the left femur. The rats were sacrificed and the femurs were excised at 4 weeks or 8 weeks after implantation. Micro-CT analysis was performed on the femur samples and the volume of the new bone induced in the hollow chamber of the implant was calculated. Percentages of new bone area on the undecalcified histological slides were also measured, and linear regression analysis was carried out in order to evaluate the correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. New bone formation occurred in the experimental titanium miniature implant with a hollow chamber. The volume of new bone was measured by micro-CT, and the area percentage of new bone relative to the hollow chamber was calculated on the undecalcified slides. Linear regression analysis showed a high correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. Consequently, the new bone produced inside the hollow chamber of the experimental titanium miniature implant could be quantified three-dimensionally and stereologically by micro-CT, and its precision was supported by the high correlation between the micro-CT measurement and the conventional two-dimensional measurement of the histological slides. (author)

  15. Quantitative imaging biomarkers: the application of advanced image processing and analysis to clinical and preclinical decision making.

    Science.gov (United States)

    Prescott, Jeffrey William

    2013-02-01

    The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.

  16. Quantitative microbiological risk assessment in food industry: Theory and practical application.

    Science.gov (United States)

    Membré, Jeanne-Marie; Boué, Géraldine

    2018-04-01

    The objective of this article is to provide scientific background as well as practical hints and tips to guide risk assessors and modelers who want to develop a quantitative Microbiological Risk Assessment (MRA) in an industrial context. MRA aims at determining the public health risk associated with biological hazards in a food. Its implementation in industry makes it possible to compare the efficiency of different risk reduction measures, and more precisely of different operational settings, by predicting their effect on the final model output. The first stage in MRA is to clearly define the purpose and scope with stakeholders, risk assessors and modelers. Then, a probabilistic model is developed; schematically, this includes three important phases. Firstly, the model structure has to be defined, i.e. the connections between the different operational processing steps. An important step in the food industry is the thermal processing leading to microbial inactivation. Growth of surviving microorganisms after heat treatment and/or post-process contamination during the storage phase is also important to take into account. Secondly, mathematical equations are determined to estimate the change in microbial load after each processing step. This phase includes the construction of model inputs by collecting data or eliciting expert opinion. Finally, the model outputs are obtained by simulation procedures; they have to be interpreted and communicated to targeted stakeholders. In this latter phase, tools such as what-if scenarios provide essential added value. These different MRA phases are illustrated through two examples covering important issues in industry. The first covers process optimization in a food safety context, the second shelf-life determination in a food quality context. Although both contexts require the same methodology, they do not have the same endpoint: up to human health in the foie gras case study, illustrating a safety application, and up to the food portion in the
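
    A schematic Monte Carlo exposure chain of the kind outlined above, with invented numbers and distributions: initial contamination, a log10 reduction from the thermal processing step, and growth during storage are combined into a microbial dose per portion.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 50_000  # Monte Carlo iterations

        # Invented model inputs (log10 CFU/g unless stated otherwise).
        initial = rng.normal(loc=2.0, scale=0.5, size=n)        # raw-material contamination
        log_reduction = rng.uniform(4.0, 6.0, size=n)           # thermal inactivation
        growth_storage = rng.triangular(0.0, 0.5, 2.0, size=n)  # growth during storage
        portion_g = 50.0

        log_conc = initial - log_reduction + growth_storage     # at consumption
        dose_cfu = (10.0 ** log_conc) * portion_g

        print(f"mean dose per portion: {dose_cfu.mean():.2f} CFU")
        print(f"fraction of portions above 1 CFU: {(dose_cfu > 1).mean():.3%}")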

  17. Factors affecting the repeatability of gamma camera calibration for quantitative imaging applications using a sealed source

    International Nuclear Information System (INIS)

    Anizan, N; Wahl, R L; Frey, E C; Wang, H; Zhou, X C

    2015-01-01

    Several applications in nuclear medicine require absolute activity quantification of single photon emission computed tomography images. Obtaining a repeatable calibration factor that converts voxel values to activity units is essential for these applications. Because source preparation and measurement of the source activity using a radionuclide activity meter are potential sources of variability, this work investigated instrumentation and acquisition factors affecting repeatability using planar acquisition of sealed sources. The calibration factor was calculated for different acquisition and geometry conditions to evaluate the effect of the source size, lateral position of the source in the camera field-of-view (FOV), source-to-camera distance (SCD), and variability over time using sealed Ba-133 sources. A small region of interest (ROI) based on the source dimensions and collimator resolution was investigated to decrease the background effect. A statistical analysis with a mixed-effects model was used to evaluate quantitatively the effect of each variable on the global calibration factor variability. A variation of 1 cm in the measurement of the SCD from the assumed distance of 17 cm led to a variation of 1–2% in the calibration factor measurement using a small disc source (0.4 cm diameter) and less than 1% with a larger rod source (2.9 cm diameter). The lateral position of the source in the FOV and the variability over time had small impacts on calibration factor variability. The residual error component was well estimated by Poisson noise. Repeatability of better than 1% in a calibration factor measurement using a planar acquisition of a sealed source can be reasonably achieved. The best reproducibility was obtained with the largest source with a count rate much higher than the average background in the ROI, and when the SCD was positioned within 5 mm of the desired position. In this case, calibration source variability was limited by the quantum
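
    The planar calibration itself reduces to a simple ratio; the short sketch below, with invented numbers, converts background-corrected counts from a sealed Ba-133 source into a counts-per-second-per-MBq calibration factor.

        # Hypothetical planar acquisition of a sealed Ba-133 source (values invented).
        roi_counts = 1.25e6          # counts inside the source region of interest
        bkg_counts_per_pixel = 4.0   # mean background counts per pixel
        roi_pixels = 900
        acq_time_s = 600.0
        source_activity_mbq = 35.2   # decay-corrected activity from the source certificate

        net_counts = roi_counts - bkg_counts_per_pixel * roi_pixels
        calibration_factor = net_counts / (acq_time_s * source_activity_mbq)  # cps per MBq
        print(f"calibration factor: {calibration_factor:.1f} cps/MBq")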

  18. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems, using annular sources, are widely used in view of their simplicity, wide availability, relatively low price for the complete system and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental parameter techniques for quantitative analysis is attempted. These problems are due to the fact that the systems operate with large solid angles for incoming and emerging radiation and both the incident and take-off angles are not trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and prediction of the response of inhomogeneous samples. A study of the sensitivity of the results due to sample characteristics and a comparison of the results with experimentally determined values for incident and take-off angles is also presented. A flexible and user-friendly computer program was developed in order to perform the lengthy calculations involved efficiently. (author). 14 refs. 5 figs
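
    In the spirit of the Monte Carlo integration mentioned above, the sketch below samples rays from an annular source to a sample disc and averages the incidence angle with a crude inverse-square weight; the geometry is invented and no fluorescence physics is included.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000

        # Annular source (inner radius 1 cm, outer 2 cm) placed 2.5 cm above the sample plane.
        r_src = np.sqrt(rng.uniform(1.0**2, 2.0**2, n))    # uniform over the annulus area
        phi_src = rng.uniform(0.0, 2.0 * np.pi, n)
        src = np.stack([r_src * np.cos(phi_src), r_src * np.sin(phi_src), np.full(n, 2.5)], axis=1)

        # Sample: disc of 1 cm radius in the z = 0 plane.
        r_smp = np.sqrt(rng.uniform(0.0, 1.0**2, n))
        phi_smp = rng.uniform(0.0, 2.0 * np.pi, n)
        smp = np.stack([r_smp * np.cos(phi_smp), r_smp * np.sin(phi_smp), np.zeros(n)], axis=1)

        d = src - smp
        dist = np.linalg.norm(d, axis=1)
        cos_inc = d[:, 2] / dist        # cosine of the incidence angle w.r.t. the sample normal
        weight = 1.0 / dist**2          # crude solid-angle weighting

        eff_angle = np.degrees(np.arccos(np.average(cos_inc, weights=weight)))
        print(f"effective incident angle: {eff_angle:.1f} deg from the sample normal")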

  19. Development and application of a quantitative multiplexed small GTPase activity assay using targeted proteomics.

    Science.gov (United States)

    Zhang, Cheng-Cheng; Li, Ru; Jiang, Honghui; Lin, Shujun; Rogalski, Jason C; Liu, Kate; Kast, Juergen

    2015-02-06

    Small GTPases are a family of key signaling molecules that are ubiquitously expressed in various types of cells. Their activity is often analyzed by western blot, which is limited by its multiplexing capability, the quality of isoform-specific antibodies, and the accuracy of quantification. To overcome these issues, a quantitative multiplexed small GTPase activity assay has been developed. Using four different binding domains, this assay allows the binding of up to 12 active small GTPase isoforms simultaneously in a single experiment. To accurately quantify the closely related small GTPase isoforms, a targeted proteomic approach, i.e., selected/multiple reaction monitoring, was developed, and its functionality and reproducibility were validated. This assay was successfully applied to human platelets and revealed time-resolved coactivation of multiple small GTPase isoforms in response to agonists and differential activation of these isoforms in response to inhibitor treatment. This widely applicable approach can be used for signaling pathway studies and inhibitor screening in many cellular systems.

  20. Proceedings First Workshop on Quantitative Formal Methods : theory and applications (QFM'09, Eindhoven, The Netherlands, November 3, 2009)

    NARCIS (Netherlands)

    Andova, S.; McIver, A.; D'Argenio, P.R.; Cuijpers, P.J.L.; Markovski, J.; Morgan, C.; Núñez, M.

    2009-01-01

    This volume contains the papers presented at the 1st workshop on Quantitative Formal Methods: Theory and Applications, which was held in Eindhoven on 3 November 2009 as part of the International Symposium on Formal Methods 2009. This volume contains the final versions of all contributions accepted

  1. Quantitative imaging of magnetic nanoparticles by magneto-relaxometric tomography for biomedical applications; Quantitative Bildgebung magnetischer Nanopartikel mittels magnetrelaxometrischer Tomographie fuer biomedizinische Anwendungen

    Energy Technology Data Exchange (ETDEWEB)

    Liebl, Maik

    2016-11-18

    Current biomedical research focuses on the development of novel biomedical applications based on magnetic nanoparticles (MNPs), e.g. for local cancer treatment. These therapy approaches employ MNPs as remotely controlled drug carriers or local heat generators. Since location and quantity of MNPs determine drug enrichment and heat production, quantitative knowledge of the MNP distribution inside a body is essential for the development and success of these therapies. Magnetorelaxometry (MRX) is capable of providing such quantitative information based on the specific response of the MNPs after switching off an applied magnetic field. Applying a uniform (homogeneous) magnetic field to a MNP distribution and measuring the MNP response by multiple sensors at different locations allows for spatially resolved MNP quantification. However, to reconstruct the MNP distribution from this spatially resolved MRX data, an ill-posed inverse problem has to be solved. So far, the solution of this problem has been stabilized by incorporating a priori knowledge in the forward model, e.g. by setting priors on the vertical position of the distribution using a 2D reconstruction grid or setting priors on the number and geometry of the MNP sources inside the body. MRX tomography represents a novel approach for quantitative 3D imaging of MNPs, where the inverse solution is stabilized by a series of MRX measurements. In MRX tomography, only parts of the MNP distribution are sequentially magnetized by the use of inhomogeneous magnetic fields. Each magnetization step is followed by detection of the response of the corresponding part of the distribution by multiple sensors. The 3D reconstruction of the MNP distribution is then accomplished by a common evaluation of the distinct MRX measurement series. In this thesis, the first experimental setup for MRX tomography was developed for quantitative 3D imaging of biomedical MNP distributions. It is based on a multi-channel magnetizing unit which has been engineered to

  2. UV SPECTROPHOTOMETRY APPLICATION FOR QUANTITATIVE DETERMINATION OF VINPOCETINE IN DRUG FORMULATIONS

    Directory of Open Access Journals (Sweden)

    J. V. Monaykina

    2014-12-01

    procedure was successfully applied for the analysis of two new pharmaceutical formulations. The results obtained by applying the proposed procedure were statistically analyzed. Validation studies of the methods confirmed their proper precision and recovery (for cream: 99.73%, RSD% = 0.924, n = 9; for suppositories: 100.3%, RSD% = 0.378, n = 9) and linearity (for cream: r = 0.9999, n = 6; for suppositories: r = 0.9998, n = 6). The received parameters enable the use of the developed methods in quantitative pharmaceutical analysis. Conclusions. The applicability of the new procedure is well established by the vinpocetine assay in the new drug formulations, namely 0.01 suppositories and 0.5% nasal cream. The developed UV spectrophotometric methods are potentially useful because of their simplicity, rapidity and accuracy. The methods are valid according to the validation requirements of the Ukrainian Pharmacopeia.

  3. APPLICATION OF UV-SPECTROPHOTOMETRY FOR THE QUANTITATIVE DETERMINATION OF CAPTOPRIL IN DRUG

    Directory of Open Access Journals (Sweden)

    Yu. V. Monaykina

    2015-04-01

    formulations. The results obtained by applying the proposed method were statistically analyzed. Validation of the method confirmed its proper precision and recovery (for gel: 100.2%, RSD% = 0.572, n = 9; for suppositories: 99.87%, RSD% = 0.420, n = 9) and linearity (for gel: r = 0.9978, n = 6; for suppositories: r = 0.9982, n = 6). The received parameters enable the developed procedure to be used in quantitative pharmaceutical analysis. Conclusions. The applicability of the new procedure is well established by the assay of the new drug formulations of captopril: 0.05 suppositories and 2.5% nasal gel. The developed UV spectrophotometric method is potentially useful because of its simplicity, rapidity and accuracy. The procedure is valid according to the validation requirements of the Ukrainian Pharmacopeia.

  4. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    , the abort triggers must have low false negative rates to be sure that real crew-threatening failures are detected, and also low false positive rates to ensure that the crew does not abort from non-crew-threatening launch vehicle behaviors. The analysis process described in this paper is a compilation of over six years of lessons learned and refinements from experiences developing abort triggers for NASA's Constellation Program (Ares I Project) and the SLS Program, as well as the simultaneous development of SHM/FM theory. The paper will describe the abort analysis concepts and process, developed in conjunction with SLS Safety and Mission Assurance (S&MA) to define a common set of mission phase, failure scenario, and Loss of Mission Environment (LOME) combinations upon which the SLS Loss of Mission (LOM) Probabilistic Risk Assessment (PRA) models are built. This abort analysis also requires strong coordination with the Multi-Purpose Crew Vehicle (MPCV) and SLS Structures and Environments (STE) to formulate a series of abortability tables that encapsulate explosion dynamics over the ascent mission phase. The design and assessment of abort conditions and triggers to estimate their Loss of Crew (LOC) Benefits also requires in-depth integration with other groups, including Avionics, Guidance, Navigation and Control (GN&C), the Crew Office, Mission Operations, and Ground Systems. The outputs of this analysis are a critical input to SLS S&MA's LOC PRA models. The process described here may well be the first full quantitative application of SHM/FM theory to the selection of a sensor suite for any aerospace system.

  5. Validity of spherical quantitative refractometry: application to laser-produced plasmas

    International Nuclear Information System (INIS)

    Benattar, R.; Popovics, C.

    1983-01-01

    We report an experimental laser technique of quantitative Schlieren imaging of spherical plasmas combined with streak camera recording. We show that quantitative refractometry applies for small values of refraction angles, i.e., when the law giving the refraction angle versus the impact parameter of rays passing through the plasma is a linearly decreasing function

  6. A New Apparatus for Inelastic, Quasi-Elastic and Elastic Cold Neutron Measurements; Un nouvel appareil pour les mesures de diffusion inelastique, quasi-elastique et elastique des neutrons lents; Novyj pribor dlya izmereniya neuprugogo, kvaziuprugogo i uprugogo rasseyaniya kholodnykh nejtronov; Nuevo aparato para mediciones inelasticas, cuasi elasticas y elasticas de neutrones frios

    Energy Technology Data Exchange (ETDEWEB)

    Otnes, K; Palevsky, H [Brookhaven National Laboratory, Upton, NY (United States)

    1963-01-15

    A new mechanical chopper is being built for the Brookhaven high flux reactor. The apparatus is of the phased three-rotor type. The rotors, 80 cm in diameter, spin at a maximum speed of 15,000 rev/min and are designed to emit three bursts of monochromatic neutrons per revolution. Two of the rotors turn about a horizontal axis, while the third turns about a vertical one. The system can be operated with one, two or three chopper elements, depending on the type of measurement to be made. For inelastic measurements in which the neutrons gain energy, a two-rotor configuration is the most suitable; the burst duration and the wavelength spread (full width at half maximum) will then be 16 {mu}s and 0.16 A respectively for incident neutrons of 4 A wavelength, and the burst intensity on the sample (4 x 1.6 cm) will be 2 x 10{sup 6} n/s. For quasi-elastic and elastic scattering measurements the three-rotor configuration is more appropriate: the burst duration and the corresponding wavelength spread can be made as small as 8 {mu}s and 0.04 A, giving an intensity of 10{sup 4} n/s on a 4 x 0.8 cm sample. The wavelength and the time resolution can be adjusted between these two limits so as to obtain the maximum flux intensity for a given experiment. (author)

  7. Quantitative applications of gamma densitometry in the coal industry: a critique

    International Nuclear Information System (INIS)

    Shea, P.; Sher, R.; Gozani, T.

    1982-01-01

    This paper discusses the use of gamma densitometry to quantitatively assay bulk samples of coal on a continuous basis. Devices using these principles to determine mass flows are on the market, and work is progressing in several countries on instruments to determine ash content. The theoretical limits of applicability and inherent assumptions of these techniques are discussed, primarily as applied to dry bulk coal, but with some discussion of the more complicated problems of slurried coal. Gamma rays are generated by sources, usually a single radioactive element. These have several advantages over XRF, the main one being that no power is required to generate gammas. However, there are a limited number of gamma sources with useful energies, long enough half-lives to be economically useful, and clean spectra (that is, relatively few energies emitted by the source in question). Gamma densitometry measurements by single and multiple-energy transmission and backscatter measurements are discussed. A general formalism for analyzing multiple-energy systems is presented. While multi-energy systems can, in principle, pick out as many groups of elements as energies used, the matrices involved are ill-conditioned and thus require accurate measures of count rate (i.e., long counting times or high source intensities) to achieve acceptable errors. Changes in coal composition and profile of coal on a belt were also seen to be important sources of error. Transmission measurements are more amenable to analysis than backscatter, which are essentially transmission measurements made on a distributed source. In addition, transmission measurements are not restricted to low energy gamma sources, and can survey the entire bulk of coal rather than just the upper portion. The special problems of slurried coal measurements are briefly discussed
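
    The general multiple-energy formalism mentioned in this record reduces, for two gamma energies and two components (combustible matter and ash, say), to a small linear system in the areal densities. The hedged Python sketch below uses invented mass attenuation coefficients to show both the solution and the ill-conditioning that forces long counting times or strong sources; it illustrates the principle only, not the paper's formalism.

        import numpy as np

        # Dual-energy transmission: ln(I0/I) at energy j = sum_i mu_ij * x_i, where
        # mu_ij is the mass attenuation coefficient of component i at energy j and
        # x_i its areal density (g/cm^2).  All coefficients here are illustrative.
        mu = np.array([[0.20, 0.35],    # coal organic matter at energies E1, E2
                       [0.25, 0.60]])   # ash-forming minerals at energies E1, E2
        x_true = np.array([5.0, 1.0])   # g/cm^2 of coal matter and ash
        atten = mu.T @ x_true           # noise-free ln(I0/I) readings at E1, E2

        x_est = np.linalg.solve(mu.T, atten)
        print("recovered areal densities:", x_est)
        print("condition number:", np.linalg.cond(mu.T))   # large -> noise-sensitive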

  8. Quantitative Morphometric Analysis of Terrestrial Glacial Valleys and the Application to Mars

    Science.gov (United States)

    Allred, Kory

    Although the current climate on Mars is very cold and dry, it is generally accepted that the past environments on the planet were very different. Paleo-environments may have been warm and wet with oceans and rivers. And there is abundant evidence of water ice and glaciers on the surface as well. However, much of that comes from visual interpretation of imagery and other remote sensing data. For example, some of the characteristics that have been utilized to distinguish glacial forms are the presence of landscape features that appear similar to terrestrial glacial landforms, constraining surrounding topography, evidence of flow, orientation, elevation and valley shape. The main purpose of this dissertation is to develop a model that uses quantitative variables extracted from elevation data that can accurately categorize a valley basin as either glacial or non-glacial. The application of this model will limit the inherent subjectivity of image analysis by human interpretation. The model developed uses hypsometric attributes (elevation-area relationship), a newly defined variable similar to the equilibrium line altitude for an alpine glacier, and two neighborhood search functions intended to describe the valley cross-sectional curvature, all based on a digital elevation model (DEM) of a region. The classification model uses data-mining techniques trained on several terrestrial mountain ranges in varied geologic and geographic settings. It was applied to a select set of previously catalogued locations on Mars that resemble terrestrial glaciers. The results suggest that the landforms do have a glacial origin, thus supporting much of the previous research that has identified the glacial landforms. This implies that the paleo-environment of Mars was at least episodically cold and wet, probably during a period of increased planetary obliquity. Furthermore, the results of this research and the implications thereof add to the body of knowledge for the current and past
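
    One of the elevation-derived variables named in this record, the hypsometric (elevation-area) relationship, is commonly summarized by a hypsometric integral; a minimal sketch of that computation on a basin's DEM cells is given below. The shortcut formula and the sample elevations are chosen for illustration and are not taken from the dissertation.

        import numpy as np

        def hypsometric_integral(elevations):
            """Approximate hypsometric integral of a basin from its DEM cell
            elevations, using the common shortcut (mean - min) / (max - min)."""
            z = np.asarray(elevations, dtype=float)
            return (z.mean() - z.min()) / (z.max() - z.min())

        # Hypothetical basin: elevations (m) sampled from a DEM clipped to the valley.
        dem_cells = np.array([1200, 1250, 1300, 1420, 1500, 1650, 1800, 2100])
        print("hypsometric integral =", round(hypsometric_integral(dem_cells), 3))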

  9. New journal selection for quantitative survey of infectious disease research: application for Asian trend analysis

    Directory of Open Access Journals (Sweden)

    Okabe Nobuhiko

    2009-10-01

    Full Text Available Abstract Background Quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has also been applied to infectious disease research; however, previous studies were insufficient as they underestimated articles published in non-English or regional journals. Methods Using a combination of Scopus™ and PubMed, the databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate to survey a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded™ (SCI Infectious Disease Category) during 1998-2006. Subsequently, we applied the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. Results One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. The journals published 14,156 original articles and reviews of Asian origin and 118,158 throughout the world, more than those registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 of the world in the category). In Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the area, and no noticeable increase in articles was revealed during the study period. China, India and Taiwan had relatively large numbers and a high increase rate of original articles among Asian countries. When adjusting the publication of original articles according to the country population and the gross domestic product (GDP), Singapore and

  10. Application of non-quantitative modelling in the analysis of a network warfare environment

    CSIR Research Space (South Africa)

    Veerasamy, N

    2008-07-01

    Full Text Available based on the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods through a morphological analysis to better explore and define...

  11. A method of quantitative prediction for sandstone type uranium deposit in Russia and its application

    International Nuclear Information System (INIS)

    Chang Shushuai; Jiang Minzhong; Li Xiaolu

    2008-01-01

    The paper presents the foundational principle of quantitative prediction for sandstone type uranium deposits in Russia. Some key methods such as physical-mathematical model construction and deposit prediction are described. The method has been applied to deposit prediction in the Dahongshan region of the Chaoshui basin. It is concluded that the technique can strengthen the method of quantitative prediction for sandstone type uranium deposits, and it could be used as a new technique in China. (authors)

  12. Influence of the anisotropy of expansion coefficients on the elastic properties of uranium, of zirconium and of zinc; Influence de l'anisotropie des coefficients de dilatation sur les proprietes elastiques de l'uranium, du zirconium et du zinc

    Energy Technology Data Exchange (ETDEWEB)

    Calais, Daniel; Saada, Georges; Simenel, Nicole [Commissariat a l' energie atomique et aux energies alternatives - CEA (France)

    1959-07-01

    The anisotropy of the expansion coefficients of uranium, zirconium and zinc provoke internal tensions in the course of cooling these metals. These tensions are eliminated in the case of zinc by restoration to room temperature, but persist in uranium and zirconium and are responsible for the absence of an elastic limit in these two metals. Reprint of a paper published in Comptes rendus des seances de l'Academie des Sciences, t. 249, p. 1225-1227, sitting of 5 October 1959 [French] L'anisotropie des coefficients de dilatation de l'uranium, du zirconium et du zinc provoque au cours du refroidissement de ces metaux des tensions internes. Eliminees par restauration a la temperature ambiante dans le cas du zinc, ces tensions persistent pour l'uranium et le zirconium et sont responsable de l'absence de limite elastique dans ces deux metaux. Reproduction d'un article publie dans les Comptes rendus des seances de l'Academie des Sciences, t. 249, p. 1225-1227, seance du 5 octobre 1959.

  13. Cellular Phone-Based Image Acquisition and Quantitative Ratiometric Method for Detecting Cocaine and Benzoylecgonine for Biological and Forensic Applications

    OpenAIRE

    Cadle, Brian A.; Rasmus, Kristin C.; Varela, Juan A.; Leverich, Leah S.; O’Neill, Casey E.; Bachtell, Ryan K.; Cooper, Donald C.

    2010-01-01

    Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA) is an automated method requiring end-users to utilize inexpensive (~ $1 USD/each) immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is descri...
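
    The record leaves the image-analysis details out; as a hedged sketch of the general idea behind a ratiometric pixel-density readout of a lateral-flow strip, the Python code below normalizes the test-line band intensity by the control-line intensity in a grayscale image. The region coordinates, image values and function names are hypothetical, not the authors' QRPDA implementation.

        import numpy as np

        def band_intensity(gray, rows, cols):
            """Mean darkness of a band region in a grayscale image
            (0 = black, 255 = white); darker bands give larger values."""
            region = gray[rows[0]:rows[1], cols[0]:cols[1]]
            return 255.0 - region.mean()

        # Hypothetical 100x40 strip image: darker pixels where dye accumulated.
        gray = np.full((100, 40), 240.0)
        gray[20:25, 5:35] = 120.0   # control line
        gray[60:65, 5:35] = 180.0   # test line (weaker)

        control = band_intensity(gray, (18, 27), (5, 35))
        test = band_intensity(gray, (58, 67), (5, 35))
        ratio = test / control      # ratiometric readout, read off a calibration curve
        print("test/control intensity ratio:", round(ratio, 3))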

  14. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    Science.gov (United States)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled application of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support the large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.

  15. Potential Application of Quantitative Prostate-specific Antigen Analysis in Forensic Examination of Seminal Stains

    Directory of Open Access Journals (Sweden)

    Zhenping Liu

    2015-01-01

    Full Text Available The aims of this study are to use quantitative analysis of the prostate-specific antigen (PSA) in the seminal stain examination and to explore the practical value of this analysis in forensic science. For a comprehensive analysis, vaginal swabs from 48 rape cases were tested both by a PSA fluorescence analyzer (i-CHROMA Reader) and by a conventional PSA strip test. To confirm the results of these PSA tests, seminal DNA was tested following differential extraction. Compared to the PSA strip test, the PSA rapid quantitative fluorescence analyzer provided more accurate and sensitive results. More importantly, individualized schemes based on quantitative PSA results can be developed to improve the quality and procedural efficiency in the forensic seminal inspection of samples prior to DNA analysis.

  16. Water volume quantitation using nuclear magnetic resonance imaging: application to cerebrospinal fluid

    International Nuclear Information System (INIS)

    Lecouffe, P.; Huglo, D.; Dubois, P.; Rousseau, J.; Marchandise, X.

    1990-01-01

    Quantitation in proton NMR imaging is applied to cerebrospinal fluid (CSF). Total intracranial CSF volume was measured using Condon's method: the CSF signal was compared with a distilled water standard signal in a single thick sagittal slice. The brain signal was reduced to a minimum using a 5000/360/400 sequence. Software constraints did not permit easy implementation on the imager, and uniformity correction was performed on a microcomputer. Accuracy was better than 4%. Total intracranial CSF was found to be between 91 and 164 ml in 5 healthy volunteers. Extraventricular CSF quantitation appears greatly improved by this method, but planimetric methods seem better for quantifying ventricular CSF. This technique is compared to total lung water measurement from proton density according to Mac Lennan's method. Water volume quantitation confirms the ability of NMR imaging to quantify biologic parameters, but image defects have to be known through strict quality control. [fr]

  17. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    Science.gov (United States)

    Haiyang, Yu; Tian, Luo

    2016-06-01

    Target restoration space (TRS) is the most precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space to confirm the esthetics and function of the final restoration. Therefore, assisted by quantitative analysis and transfer, TRS quantitative analysis is a significant improvement for minimum tooth preparation. This article presents TRS quantity-related measurement, analysis, transfer, and the internal relevance of three TRS classifications. Results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  18. Effects of Single and Combined Application of Organic and Biological Fertilizers on Quantitative and Qualitative Yield of Anisum (Pimpinella anisum

    Directory of Open Access Journals (Sweden)

    N Kamayestani

    2015-07-01

    Full Text Available In order to study the effects of single and combined applications of biofertilizer and organic fertilizers on quantitative and qualitative characteristics of anisum (Pimpinella anisum), an experiment was conducted based on a Randomized Complete Block Design with three replications and fifteen treatments at the Research Station, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2011. Treatments were: (1) mycorrhiza (Glomus intraradices), (2) mycorrhiza + cow manure, (3) mycorrhiza + vermicompost, (4) mycorrhiza + compost, (5) mycorrhiza + chemical fertilizer, (6) biosulfur (Thiobacillus sp. + bentonite), (7) biosulfur + chemical fertilizer, (8) biosulfur + cow manure, (9) biosulfur + vermicompost, (10) biosulfur + compost, (11) cow manure, (12) vermicompost, (13) chemical fertilizer (NPK), (14) compost and (15) control. The results showed that application of the fertilizer treatments had a significant effect on most characteristics of anisum. The highest number of seeds per umbelet (7.24) and economic yield (1263.4 kg/ha) were obtained from the biosulfur treatment. The highest dry matter yield (4504.1 kg/ha) resulted from the combined application of biosulfur + chemical fertilizer, and the highest harvest index (25.97%) was observed with biosulfur + cow manure. The combined application of mycorrhiza affected some qualitative traits, as the highest number of umbels per plant (65.7), 1000-seed weight (3.24 g) and essential oil percentage (5.3%) resulted from the combined application of mycorrhiza + chemical fertilizer. In general, it can be concluded that the application of organic and biological fertilizers, particularly mycorrhiza and biosulfur, had a significant effect on improving the quantitative and qualitative characteristics of anisum. Furthermore, the combined application of organic and biological fertilizers had higher positive effects than their single application.

  19. The quantitative evaluation of false colour photography with application of a red filter.

    NARCIS (Netherlands)

    Clevers, J.G.P.W.; Stokkom, van H.T.C.

    1992-01-01

    For monitoring (homogeneous) agricultural crops, a quantitative analysis of

  20. Theory of quantitative trend analysis and its application to the South African elections

    CSIR Research Space (South Africa)

    Greben, JM

    2006-02-28

    Full Text Available In this paper the author discusses a quantitative theory of trend analysis. Often trends are based on qualitative considerations and subjective assumptions. In the current approach the author makes use of extensive data bases to optimise the so...

  1. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data, however, much of the data available or even acquirable are not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
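
    As a rough illustration of the kind of transformation this review discusses, the Python sketch below min-max normalizes a model prediction and a relative (unitless) data profile onto a common scale before computing a sum-of-squares fitness value. The profiles are invented, and this scaling is only one of several choices the review covers.

        import numpy as np

        def minmax(x):
            """Rescale a profile to [0, 1] so that shapes can be compared even when
            the data carry no absolute units."""
            x = np.asarray(x, dtype=float)
            return (x - x.min()) / (x.max() - x.min())

        # Hypothetical spatial expression profile: model output vs. relative intensity data.
        model = np.array([0.0, 2.0, 8.0, 10.0, 6.0, 1.0])
        data = np.array([5.0, 20.0, 70.0, 100.0, 55.0, 12.0])   # arbitrary units

        residual = minmax(model) - minmax(data)
        fitness = np.sum(residual ** 2)   # quantity a parameter-estimation routine would minimize
        print("sum-of-squares fitness:", round(fitness, 4))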

  2. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    Full Text Available XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials, and it is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth groundwork was necessary to know precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables a quantitative analysis to be performed, on a routine basis, with approximately 20% relative accuracy, which is quite acceptable for an analysis of this nature. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that allows the sulphiding degree of molybdenum to be obtained reliably and reproducibly. The use of this method is illustrated by two examples for which XPS spectroscopy has provided information sufficiently accurate and quantitative to help understand the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
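
    The quantitative approach described here rests on dividing each element's peak area by a response (sensitivity) factor; the standard relative atomic-concentration formula this implies is sketched below in Python. The peak areas and factors are invented for illustration, not IFP's calibration.

        # Relative surface composition from XPS peak areas I_i and response factors S_i:
        #   C_i = (I_i / S_i) / sum_j (I_j / S_j)
        # Areas and sensitivity factors below are illustrative only.
        peaks = {"Mo3d": (12000.0, 9.5), "S2p": (4000.0, 1.7), "Al2p": (9000.0, 0.54)}

        normalized = {el: area / s for el, (area, s) in peaks.items()}
        total = sum(normalized.values())
        for el, n in normalized.items():
            print(f"{el}: {100 * n / total:.1f} at.%")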

  3. Real-time quantitative PCR of Staphylococcus aureus and application in restaurant meals.

    Science.gov (United States)

    Berrada, H; Soriano, J M; Mañes, J; Picó, Y

    2006-01-01

    Staphylococcus aureus is considered the second most common pathogen to cause outbreaks of food poisoning, exceeded only by Campylobacter. Consumption of foods containing this microorganism is often identified as the cause of illness. In this study, a rapid, reliable, and sensitive real-time quantitative PCR was developed and compared with conventional culture methods. Real-time quantitative PCR was carried out by purifying DNA extracts of S. aureus with a Staphylococcus sample preparation kit and quantifying it in the LightCycler system with hybridization probes. The assay was linear over a range of 10 to 10(6) S. aureus cells (r2 > 0.997). The PCR reaction presented an efficiency of >85%. Accuracy of the PCR-based assay, expressed as percent bias, was around 13%, and the precision, expressed as a percentage of the coefficient of variation, was 7 to 10%. Intraday and interday variability were studied at 10(2) CFU/g and were 12 and 14%, respectively. The proposed method was applied to the analysis of 77 samples of restaurant meals in Valencia (Spain). In 11.6% of samples S. aureus was detected by real-time quantitative PCR, as well as by the conventional microbiological method. An excellent correspondence between real-time quantitative PCR and microbiological numbers (CFU/g) was observed with deviations of < 28%.
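
    The reported linearity and >85% amplification efficiency are the usual outputs of a standard-curve fit of quantification cycle (Cq) against the logarithm of the starting quantity; a minimal Python sketch of that calculation, with invented Cq values, follows.

        import numpy as np

        # Hypothetical dilution series: starting S. aureus cell numbers and measured Cq values.
        cells = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
        cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])

        slope, intercept = np.polyfit(np.log10(cells), cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0        # 1.0 would mean perfect doubling per cycle
        print(f"slope = {slope:.2f}, efficiency = {100 * efficiency:.0f}%")

        # Quantify an unknown sample from its Cq using the fitted curve.
        cq_unknown = 24.9
        print("estimated cells:", round(10 ** ((cq_unknown - intercept) / slope)))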

  4. Evaluating quantitative and qualitative models: An application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  5. Evaluating quantitative and qualitative models: an application for nationwide water erosion assessment in Ethiopia

    NARCIS (Netherlands)

    Sonneveld, B.G.J.S.; Keyzer, M.A.; Stroosnijder, L.

    2011-01-01

    This paper tests the candidacy of one qualitative response model and two quantitative models for a nationwide water erosion hazard assessment in Ethiopia. After a descriptive comparison of model characteristics the study conducts a statistical comparison to evaluate the explanatory power of the

  6. Quantum formalism in gravitation: quantitative application to the Titius-Bode law

    International Nuclear Information System (INIS)

    Louise, R.

    1982-01-01

    A quantum conception of virtual energy exchange between masses leads to Newton's law. A guide wave similar to De Broglie's (1923) is able to account quantitatively for the Titius-Bode law occurring in the planetary system as well as in the satellite systems. (Auth.)

  7. Model development for quantitative evaluation of nuclear fuel cycle alternatives and its application

    International Nuclear Information System (INIS)

    Ko, Won Il

    2000-02-01

    This study addresses the quantitative evaluation of the proliferation resistance and the economics, which are important factors of alternative nuclear fuel cycle systems. In this study, a model was developed to quantitatively evaluate the proliferation resistance of the nuclear fuel cycles, and a fuel cycle cost analysis model was suggested to incorporate various uncertainties in the fuel cycle cost calculation. The proposed models were then applied to the Korean environment as a sample study to provide better references for the determination of a future nuclear fuel cycle system in Korea. In order to quantify the proliferation resistance of the nuclear fuel cycle, the proliferation resistance index was defined in imitation of an electrical circuit with an electromotive force and various electrical resistance components. In this model, the proliferation resistance was described as the relative size of the barrier that must be overcome in order to acquire nuclear weapons. Therefore, a larger barrier means that the risk of failure is greater, the expenditure of resources is larger and the time scale for implementation is longer. The electromotive force was expressed as the political motivation of potential proliferators, such as an unauthorized party or a national group, to acquire nuclear weapons. The electrical current was then defined as a proliferation resistance index. There are two electrical circuit models used in the evaluation of the proliferation resistance: the series and the parallel circuits. In the series circuit model of the proliferation resistance, a potential proliferator has to overcome all resistance barriers to achieve the manufacture of nuclear weapons. This phenomenon could be explained by the fact that the IAEA (International Atomic Energy Agency)'s safeguards philosophy relies on the defense-in-depth principle against nuclear proliferation at a specific facility. The parallel circuit model was also used to imitate the risk of proliferation for
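
    The circuit analogy maps onto elementary resistance algebra: barriers along one acquisition path add in series, while independent paths combine like parallel resistors, so the weakest path dominates the resulting "current" (the proliferation resistance index). The Python sketch below uses arbitrary barrier values purely to illustrate the arithmetic, not the thesis' actual scoring.

        # Toy version of the circuit analogy: motivation plays the role of an
        # electromotive force, barriers are resistances, and the resulting "current"
        # is read as a proliferation resistance index.  All numbers are arbitrary.
        motivation = 1.0

        def series(barriers):
            return sum(barriers)

        def parallel(path_resistances):
            return 1.0 / sum(1.0 / r for r in path_resistances)

        # One path: safeguards -> material handling -> weaponization barriers in series.
        path_a = series([4.0, 3.0, 5.0])
        # Alternative path with weaker barriers.
        path_b = series([2.0, 2.0, 1.5])

        overall_resistance = parallel([path_a, path_b])
        print("index via path A only:", motivation / path_a)
        print("index with both paths:", motivation / overall_resistance)  # dominated by the weak path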

  8. Proficiency testing as a basis for estimating uncertainty of measurement: application to forensic alcohol and toxicology quantitations.

    Science.gov (United States)

    Wallace, Jack

    2010-05-01

    While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
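
    Approach (ii) above amounts to pooling the laboratory's deviations from the participant means over many proficiency rounds; a minimal Python sketch of such an estimate, with invented blood-alcohol results, is shown below. The RMS-of-relative-deviations form is one common choice, not necessarily the author's exact formula.

        import numpy as np

        # Hypothetical proficiency-test history: this laboratory's result and the
        # participant mean for each round (blood alcohol, g/100 mL).
        lab_result = np.array([0.081, 0.152, 0.102, 0.249, 0.078, 0.119])
        group_mean = np.array([0.080, 0.150, 0.105, 0.245, 0.080, 0.120])

        relative_dev = (lab_result - group_mean) / group_mean
        # Pooled relative standard uncertainty; expanded with k = 2 for ~95 % coverage.
        u_rel = np.sqrt(np.mean(relative_dev ** 2))
        print(f"relative standard uncertainty: {100 * u_rel:.1f}%")
        print(f"expanded (k=2): {200 * u_rel:.1f}%")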

  9. A combined usage of stochastic and quantitative risk assessment methods in the worksites: Application on an electric power provider

    International Nuclear Information System (INIS)

    Marhavilas, P.K.; Koulouriotis, D.E.

    2012-01-01

    An individual method alone cannot build a realistic forecasting model or a risk assessment process for worksites, and future perspectives should focus on a combined forecasting/estimation approach. The main purpose of this paper is to gain insight into a risk prediction and estimation methodological framework, using the combination of three different methods, including the proportional quantitative-risk-assessment technique (PRAT), the time-series stochastic process (TSP), and the method of estimating societal risk (SRE) by F–N curves. In order to prove the usefulness of the combined usage of stochastic and quantitative risk assessment methods, an application to an electric power provider is presented, using empirical data.
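
    Of the three combined methods, the societal-risk (F–N) estimate is the most mechanical to reproduce: for each fatality count N one plots the cumulative annual frequency F of events causing N or more fatalities. A toy Python construction with invented scenario data is sketched below.

        import numpy as np

        # Hypothetical accident records: (fatalities, annual frequency of that scenario).
        scenarios = [(1, 2e-2), (2, 8e-3), (5, 1e-3), (10, 2e-4)]

        ns = np.array([n for n, _ in scenarios])
        freqs = np.array([f for _, f in scenarios])

        # F(N) = summed frequency of all scenarios with at least N fatalities.
        for n in sorted(set(ns)):
            F = freqs[ns >= n].sum()
            print(f"N >= {n:2d}: F = {F:.1e} per year")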

  10. Flipping interferometry and its application for quantitative phase microscopy in a micro-channel.

    Science.gov (United States)

    Roitshtain, Darina; Turko, Nir A; Javidi, Bahram; Shaked, Natan T

    2016-05-15

    We present a portable, off-axis interferometric module for quantitative phase microscopy of live cells, positioned at the exit port of a coherently illuminated inverted microscope. The module creates on the digital camera an interference pattern between the image of the sample and its flipped version. The proposed simplified module is based on a retro-reflector modification in an external Michelson interferometer. The module does not contain any lenses, pinholes, or gratings and its alignment is straightforward. Still, it allows full control of the off-axis angle and does not suffer from ghost images. As experimentally demonstrated, the module is useful for quantitative phase microscopy of live cells rapidly flowing in a micro-channel.

  11. Advances in quantitative UV-visible spectroscopy for clinical and pre-clinical application in cancer.

    Science.gov (United States)

    Brown, J Quincy; Vishwanath, Karthik; Palmer, Gregory M; Ramanujam, Nirmala

    2009-02-01

    Methods of optical spectroscopy that provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the past three years, and includes new and emerging studies that correlate optically measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies.

  12. Ion-solid interaction at low energies: principles and application of quantitative ISS

    International Nuclear Information System (INIS)

    Niehus, H.; Spitzl, R.

    1991-01-01

    Quantitative surface analysis with low-energy (500-5000 eV) ion scattering spectroscopy is known to be difficult, most often because of strong charge transfer and multiple scattering effects occurring during ion-surface interaction. In order to avoid neutralization problems, either alkali primary ions or noble gas ions in combination with the detection of all scattered particles were applied. Multiple scattering occurs predominantly at forward scattering and might confound the analysis. Backward scattering (i.e. 180° impact-collision ion scattering) largely bypasses the multiple-scattering complication and has been used successfully for the analysis of a number of surface structures for metals, semiconductors and binary alloys. A simple triangulation concept gives access to mass-selective qualitative surface crystallography. Quantitative surface structures were determined by comparison with computer simulations. (author)

  13. Application of LC–MS/MS for quantitative analysis of glucocorticoids and stimulants in biological fluids

    OpenAIRE

    Haneef, Jamshed; Shaharyar, Mohammad; Husain, Asif; Rashid, Mohd; Mishra, Ravinesh; Parveen, Shama; Ahmed, Niyaz; Pal, Manoj; Kumar, Deepak

    2013-01-01

    Liquid chromatography tandem mass spectrometry (LC–MS/MS) is an important hyphenated technique for the quantitative analysis of drugs in biological fluids. Because of its high sensitivity and selectivity, LC–MS/MS has been used for pharmacokinetic studies and metabolite identification in plasma and urine. This manuscript gives a comprehensive analytical review, focusing on chromatographic separation approaches (column packing materials, column length and mobile phase) as well as different acquisiti...

  14. Application of LC–MS/MS for quantitative analysis of glucocorticoids and stimulants in biological fluids

    OpenAIRE

    Haneef, Jamshed; Shaharyar, Mohammad; Husain, Asif; Rashid, Mohd; Mishra, Ravinesh; Parveen, Shama; Ahmed, Niyaz; Pal, Manoj; Kumar, Deepak

    2013-01-01

    Liquid chromatography tandem mass spectrometry (LC–MS/MS) is an important hyphenated technique for the quantitative analysis of drugs in biological fluids. Because of its high sensitivity and selectivity, LC–MS/MS has been used for pharmacokinetic studies and metabolite identification in plasma and urine. This manuscript gives a comprehensive analytical review, focusing on chromatographic separation approaches (column packing materials, column length and mobile phase) as well as different acquisiti...

  15. Quantitation of anti-tetanus and anti-diphtheria antibodies by enzymoimmunoassay: methodology and applications.

    Science.gov (United States)

    Virella, G; Hyman, B

    1991-01-01

    We have developed enzymoimmunoassays (EIA) for the quantitation of antibodies (Ab) to tetanus and diphtheria toxoids (TT, DT) using Immulon I plates coated with the appropriate toxoid. A preparation of human tetanus immunoglobulin with a known concentration of anti-TT Ab was used as calibrator of the anti-TT antibody assay. The assay of anti-DT Ab is calibrated with a pool of human sera whose anti-DT Ab concentration was determined by quantitative immunoelectrophoresis, using a horse anti-DT with known Ab concentration as calibrator. A peroxidase-conjugated anti-human IgG was used in both assays. ABTS was used as substrate, and the reaction was stopped after 1 min incubation with citric acid and the OD measured at 414 nm on a Vmax reader. The assays have been applied to a variety of clinical situations. In patients suspected of having tetanus, the quantitation of antibodies has been helpful in establishing a diagnosis. In patients with a history of hypersensitivity to tetanus toxoid, verification of the levels of anti-TT antibody may prevent unnecessary and potentially harmful immunizations. The assays have also been used for the diagnostic evaluation of the humoral immune response to TT and DT, both in pediatric patients and in immunosuppressed patients. Several non-responders have been detected, and we have recently used the assay to monitor the effects of fish oil administration on the humoral immune response.(ABSTRACT TRUNCATED AT 250 WORDS)

  16. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    Science.gov (United States)

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification
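
    dPCR's claim to calibration-free quantification rests on Poisson statistics over the partitions: the mean number of target copies per partition is recovered from the fraction of negative partitions as -ln(p_negative). A hedged Python sketch with invented partition counts and volume follows.

        import math

        # Hypothetical dPCR run: 20,000 partitions of 0.85 nL each, 6,000 of them negative.
        partitions = 20000
        negatives = 6000
        partition_volume_ul = 0.85e-3          # partition volume in microlitres

        lam = -math.log(negatives / partitions)     # mean copies per partition (Poisson)
        copies_per_ul = lam / partition_volume_ul   # concentration in the reaction mix
        print(f"mean copies per partition: {lam:.3f}")
        print(f"estimated concentration: {copies_per_ul:.0f} copies/uL")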

  17. The quantitation of buffering action II. Applications of the formal & general approach

    Science.gov (United States)

    Schmitt, Bernhard M

    2005-01-01

    Background The paradigm of "buffering" originated in acid-base physiology, but was subsequently extended to other fields and is now used for a wide and diverse set of phenomena. In the preceding article, we have presented a formal and general approach to the quantitation of buffering action. Here, we use that buffering concept for a systematic treatment of selected classical and other buffering phenomena. Results H+ buffering by weak acids and "self-buffering" in pure water represent "conservative buffered systems" whose analysis reveals buffering properties that contrast in important aspects from classical textbook descriptions. The buffering of organ perfusion in the face of variable perfusion pressure (also termed "autoregulation") can be treated in terms of "non-conservative buffered systems", the general form of the concept. For the analysis of cytoplasmic Ca++ concentration transients (also termed "muffling"), we develop a related unit that is able to faithfully reflect the time-dependent quantitative aspect of buffering during the pre-steady state period. Steady-state buffering is shown to represent the limiting case of time-dependent muffling, namely for infinitely long time intervals and infinitely small perturbations. Finally, our buffering concept provides a stringent definition of "buffering" on the level of systems and control theory, resulting in four absolute ratio scales for control performance that are suited to measure disturbance rejection and setpoint tracking, and both their static and dynamic aspects. Conclusion Our concept of buffering provides a powerful mathematical tool for the quantitation of buffering action in all its appearances. PMID:15771784
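
    The article's generalized buffering formalism is not reproduced here; as a purely classical point of reference, the Python sketch below evaluates the textbook (Van Slyke) buffer capacity of a monoprotic weak acid solution, including the water self-buffering terms that the article revisits. The acetic-acid example and all values are illustrative.

        import math

        def buffer_capacity(ph, c_total, pka):
            """Classical buffer capacity beta = dC_base/dpH of a monoprotic weak acid
            solution, including the water terms; illustrative only."""
            h = 10.0 ** (-ph)
            oh = 1e-14 / h
            ka = 10.0 ** (-pka)
            return math.log(10) * (h + oh + c_total * ka * h / (ka + h) ** 2)

        # 0.1 M acetic-acid/acetate system: capacity peaks near pH = pKa.
        for ph in (3.0, 4.76, 7.0, 9.0):
            print(f"pH {ph:4.2f}: beta = {buffer_capacity(ph, 0.1, 4.76):.4f} mol/(L*pH)")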

  18. La PCR quantitative en temps réel : application à la quantification des OGM

    Directory of Open Access Journals (Sweden)

    Alary Rémi

    2002-11-01

    Full Text Available Following the requirement to label, above a 1% threshold, foods containing authorized GMOs, reliable quantification methods are needed. To meet this requirement, real-time quantitative PCR currently appears to be the best-suited technique. Its principle, its advantages and its implementation for determining the GMO content of soy flours are presented. Simplex and duplex PCR assays are compared.

  19. Application of quantitative image analysis to the investigation of macroporosity of graphitic materials

    International Nuclear Information System (INIS)

    Delle, W.; Koizlik, K.; Hoven, H.; Wallura, E.

    1978-01-01

    The essence of quantitative image analysis is that the classification of graphitic materials to be inspected is possible on the basis of the grey value contrast between pores (dark) and carbon (bright). Macroporosity is defined as the total of all pores with diameters larger than 0.2 μm. The pore size distributions and pore shapes of graphites based on petroleum, pitch, gilsonite and fluid coke as well as graphitic fuel matrices and pyrolytic carbons were investigated. The relationships between maximum grain size, macroporosity and total porosity as well as the anisotropies of macroporosity and electrical resistivity of graphite were established. (orig./GSC) [de]

  20. Quantitative application of positron annihilation lifetime spectroscopy to chemical systems in liquid solutions: typical examples

    International Nuclear Information System (INIS)

    Duplatre, G.

    2007-01-01

    The published works refer only to a few, although large, classes of applications. Nevertheless, the potential applications of the Positron Annihilation Lifetime Spectroscopy (PALS) technique are essentially limited by imagination. In the present contribution, the bases of applications of the positronium (Ps) probe particle will be illustrated in two cases: first, it will be explained how equilibrium constants can be derived through PALS experiments; next, some more elaborate approaches, developed in recent years in Strasbourg, will be shown to characterize various properties of direct micellar systems.

  1. Application of new least-squares methods for the quantitative infrared analysis of multicomponent samples

    International Nuclear Information System (INIS)

    Haaland, D.M.; Easterling, R.G.

    1982-01-01

    Improvements have been made in previous least-squares regression analyses of infrared spectra for the quantitative estimation of concentrations of multicomponent mixtures. Spectral baselines are fitted by least-squares methods, and overlapping spectral features are accounted for in the fitting procedure. Selection of peaks above a threshold value reduces computation time and data storage requirements. Four weighted least-squares methods incorporating different baseline assumptions were investigated using FT-IR spectra of the three pure xylene isomers and their mixtures. By fitting only regions of the spectra that follow Beer's Law, accurate results can be obtained using three of the fitting methods even when baselines are not corrected to zero. Accurate results can also be obtained using one of the fits even in the presence of Beer's Law deviations. This is a consequence of pooling the weighted results for each spectral peak such that the greatest weighting is automatically given to those peaks that adhere to Beer's Law. It has been shown with the xylene spectra that semiquantitative results can be obtained even when all the major components are not known or when expected components are not present. This improvement over previous methods greatly expands the utility of quantitative least-squares analyses
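
    The core of such an analysis is a linear least-squares fit of mixture absorbances to pure-component spectra under Beer's law, optionally fitting a baseline at the same time; a compact Python sketch on synthetic spectra (not the paper's xylene data or its four weighting schemes) is given below.

        import numpy as np

        rng = np.random.default_rng(1)
        n_points = 200                                   # spectral points in the fitted region

        # Synthetic pure-component absorptivity spectra for three components.
        pure = np.abs(rng.normal(size=(n_points, 3)))
        c_true = np.array([0.5, 0.3, 0.2])               # true concentrations

        # Mixture spectrum: Beer's law sum plus a constant baseline offset and noise.
        baseline = 0.05
        mixture = pure @ c_true + baseline + 0.01 * rng.normal(size=n_points)

        # Fit concentrations and the baseline together by linear least squares.
        design = np.column_stack([pure, np.ones(n_points)])
        coeffs, *_ = np.linalg.lstsq(design, mixture, rcond=None)
        print("estimated concentrations:", np.round(coeffs[:3], 3))
        print("estimated baseline:", round(coeffs[3], 3))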

  2. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    Science.gov (United States)

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    Growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within the countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence clearly is highly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items or text produced in interviews or via open-ended questions have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods allow to respect the semantic space of notions in cultures and language groups and the resulting similarities or differences between cultures can be better understood and interpreted.

  3. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    SHM/FM theory has been successfully applied to the selection of the baseline set of abort triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most "new" work, as this quantitative analysis had never been done before: it required development of the methodology and tool to mechanize the process, and it established new relationships to the other groups. The process is now an accepted part of the SLS design process, and will likely be applied to similar programs in the future at NASA MSFC. Future improvements aim to improve technical accuracy (differentiate crew survivability due to an abort from survivability even when no immediate abort occurs, such as a small explosion with little debris; account for contingent dependence of secondary triggers on primary triggers; allocate the delta LOC benefit of each trigger when added to the previously selected triggers) and to reduce future costs through the development of a specialized tool. The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.

  4. Application of microcomputed tomography for quantitative analysis of dental root canal obturations

    Directory of Open Access Journals (Sweden)

    Anna Kierklo

    2014-03-01

    Full Text Available Introduction: The aim of the study was to apply microcomputed tomography to the quantitative evaluation of voids and to test for any specific location of voids in root canal obturations. Materials and Methods: Twenty root canals were prepared and obturated with gutta-percha and Tubli-Seal sealer using the thermoplastic compaction method (System B + Obtura II). Roots were scanned and three-dimensional visualization was obtained. The volume and Feret's diameter of I-voids (at the filling/dentine interface) and S-voids (surrounded by filling material) were measured. Results: The results revealed that none of the scanned root canal fillings were void-free. For I-voids, the volume fraction was significantly larger, but their number was lower (P = 0.0007) than for S-voids. Both types of voids occurred in characteristic regions (P < 0.001): I-voids occurred mainly in the apical third, while S-voids occurred in the coronal third of the canal filling. Conclusions: Within the limitations of this study, our results indicate that microtomography, with the proposed semi-automatic algorithm, is a useful tool for the three-dimensional quantitative evaluation of dental root canal fillings. In canals filled with thermoplastic gutta-percha and Tubli-Seal, voids at the interface between the filling and canal dentine deserve special attention due to their periapical location, which might promote apical microleakage. Further studies might help to elucidate the clinical relevance of these results.

  5. Elastic and inelastic scattering of 2 to 10 MeV protons by lithium isotopes; Diffusion elastique et inelastique des protons de 2 a 10 MeV par les isotopes du lithium

    Energy Technology Data Exchange (ETDEWEB)

    Laurat, M [Commissariat a l' Energie Atomique, Bruyeres-le-Chatel (France). Centre d' Etudes

    1969-07-01

    A description is given of the experimental set-up which has been devised for carrying out spectrometric and absolute cross-section measurements on the reactions induced by protons accelerated in a 12 MeV Van de Graaff Tandem. The particles are detected by silicon junctions; the weight of the targets (about ten {mu}g/cm{sup 2}) is determined by the quartz method. The experimental equipment has been controlled by a study of proton scattering by lithium-6, and has made it possible to evaluate the elastic and inelastic scattering (1. level excitation) by lithium 7 of 2 to 9 MeV protons. The most probable spin and parity values for the six levels of {sup 8}Be between 19 and 25 MeV excitation energy have been determined from a knowledge of the observed structure. (author) [French] Nous decrivons le dispositif experimental mis au point pour effectuer les mesures de spectrometrie et de section efficace absolue pour les reactions induites par des protons acceleres par un Van de Graaff Tandem 12 MeV. Les particules sont detectees par des jonctions au silicium, le poids des cibles (de l'ordre d'une dizaine de {mu}g/cm{sup 2}), mesure par la methode du quartz. L'ensemble de l'appareillage a ete controle par l'etude de la diffusion des protons par le lithium 6, et nous a permis de preciser les diffusions elastiques et inelastiques (excitation du 1er niveau) des protons de 2 a 9 MeV par le lithium 7. La structure observee a permis de determiner les spin et parite les plus probables de six niveaux du {sup 8}Be entre 19 et 25 MeV d'energie d'excitation. (auteur)

  6. Contribution to the study of proton elastic and inelastic scattering on {sup 12}C; Contribution a l'etude des diffusions elastiques et inelastiques des protons sur le carbone 12

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, A

    1966-07-01

    The results of absolute measurements of cross sections for the scattering of protons by {sup 12}C to the first two excited levels are given. The measurements were made from 4.6 to 11.4 MeV at 17 angles for (p,p) and at 15 angles for (p,p') (1. excited level) as well as 8 angles for (p,p'') (2. excited level). A gaseous target with differential pumping was used. The elastic scattering was analyzed using the R-matrix theory with the optical model. Then a new analysis of both (p,p) and (p,p') was achieved using the coupled-wave formalism. The information on the levels of the compound nucleus was completed and was confirmed. (author) [French] Cette these rapporte le resultat de mesures absolues des sections efficaces de diffusion p,p et pp' (conduisant aux deux premiers niveaux excites) de protons par {sup 12}C. Ces mesures ont ete faites de 4,6 a 11,4 MeV, a 17 angles pour (p,p), a 15 angles pour pp' (1er niveau excite) et a 8 angles pour pp'' (2eme niveau excite). Une chambre a cible gazeuse avec pompage differentiel a ete utilisee. La diffusion elastique a ete analysee au moyen de la theorie de la matrice R avec modele optique pour (p,p). Cette analyse a ete reprise en meme temps que celle de la diffusion inelastique par l'emploi d'equations couplees. Les resultats anterieurs sur les niveaux du noyau compose ont ete confirmes et completes. (auteur)

  8. Effects of ROI definition and reconstruction method on quantitative outcome and applicability in a response monitoring trial

    International Nuclear Information System (INIS)

    Krak, Nanda C.; Boellaard, R.; Hoekstra, Otto S.; Hoekstra, Corneline J.; Twisk, Jos W.R.; Lammertsma, Adriaan A.

    2005-01-01

    Quantitative measurement of tracer uptake in a tumour can be influenced by a number of factors, including the method of defining regions of interest (ROIs) and the reconstruction parameters used. The main purpose of this study was to determine the effects of different ROI methods on quantitative outcome, using two reconstruction methods and the standard uptake value (SUV) as a simple quantitative measure of FDG uptake. Four commonly used methods of ROI definition (manual placement, fixed dimensions, threshold based and maximum pixel value) were used to calculate SUV (SUV_MAN, SUV_15mm, SUV_50, SUV_75 and SUV_max, respectively) and to generate "metabolic" tumour volumes. Test-retest reproducibility of SUVs and of "metabolic" tumour volumes and the applicability of ROI methods during chemotherapy were assessed. In addition, SUVs calculated on ordered subsets expectation maximisation (OSEM) and filtered back-projection (FBP) images were compared. ROI definition had a direct effect on quantitative outcome. On average, SUV_MAN, SUV_15mm, SUV_50 and SUV_75 were respectively 48%, 27%, 34% and 15% lower than SUV_max when calculated on OSEM images. No statistically significant differences were found between SUVs calculated on OSEM and FBP reconstructed images. Highest reproducibility was found for SUV_15mm and SUV_MAN (ICC 0.95 and 0.94, respectively) and for "metabolic" volumes measured with the manual and 50% threshold ROIs (ICC 0.99 for both). Manual, 75% threshold and maximum pixel ROIs could be used throughout therapy, regardless of changes in tumour uptake or geometry. SUVs showed the same trend in relative change in FDG uptake after chemotherapy, irrespective of the ROI method used. The method of ROI definition has a direct influence on quantitative outcome. In terms of simplicity, user-independence, reproducibility and general applicability the threshold-based and fixed dimension methods are the best ROI methods. Threshold methods are in
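
    For readers less familiar with the quantities compared above, a minimal sketch of how an SUV volume and a threshold-based ROI might be computed from a reconstructed PET image follows; the array, injected dose, body weight and 50% threshold are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def suv_image(activity_bq_per_ml, injected_dose_bq, body_weight_g):
            # SUV = tissue activity concentration / (injected dose / body weight)
            return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

        def threshold_roi(suv, threshold_fraction=0.5):
            # ROI = all voxels above a fraction of the hottest voxel (e.g. the 50% threshold)
            mask = suv >= threshold_fraction * suv.max()
            return suv[mask].mean(), int(mask.sum())

        rng = np.random.default_rng(1)
        suv = suv_image(rng.gamma(2.0, 500.0, (32, 32, 32)), injected_dose_bq=370e6, body_weight_g=75_000)
        suv_50, n_voxels = threshold_roi(suv, 0.5)
        print(f"SUV_max = {suv.max():.2f}, SUV_50 = {suv_50:.2f} over {n_voxels} voxels")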

  9. The development and application of a quantitative peptide microarray platform to SH2 domain specificity space

    Science.gov (United States)

    Engelmann, Brett Warren

    The Src homology 2 (SH2) domains evolved alongside protein tyrosine kinases (PTKs) and phosphatases (PTPs) in metazoans to recognize the phosphotyrosine (pY) post-translational modification. The human genome encodes 121 SH2 domains within 111 SH2 domain containing proteins that represent the primary mechanism for cellular signal transduction immediately downstream of PTKs. Despite pY recognition contributing to roughly half of the binding energy, SH2 domains possess substantial binding specificity, or affinity discrimination between phosphopeptide ligands. This specificity is largely imparted by amino acids (AAs) adjacent to the pY, typically from positions +1 to +4 C-terminal to the pY. Much experimental effort has been undertaken to construct preferred binding motifs for many SH2 domains. However, due to limitations in previous experimental methodologies these motifs do not account for the interplay between AAs. It was therefore not known how AAs within the context of individual peptides function to impart SH2 domain specificity. In this work we identified the critical role context plays in defining SH2 domain specificity for physiological ligands. We also constructed a high quality interactome using 50 SH2 domains and 192 physiological ligands. We next developed a quantitative high-throughput (Q-HTP) peptide microarray platform to assess the affinities four SH2 domains have for 124 physiological ligands. We demonstrated the superior characteristics of our platform relative to preceding approaches and validated our results using established biophysical techniques, literature corroboration, and predictive algorithms. The quantitative information provided by the arrays was leveraged to investigate SH2 domain binding distributions and identify points of binding overlap. Our microarray derived affinity estimates were integrated to produce quantitative interaction motifs capable of predicting interactions. Furthermore, our microarrays proved capable of resolving

  10. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    International Nuclear Information System (INIS)

    Waters, Michael; Jackson, Marcus

    2008-01-01

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  11. Quantitative High Resolution Transmission Electron Microscopy (HRTEM): a novel approach towards application oriented basic research

    International Nuclear Information System (INIS)

    Kisielowski, Christian; Weber, Eicke R.; Liliental-Weber, Zuzanna

    1996-01-01

    This paper reviews recent developments of microscopic methods that are based on a quantitative analysis of electron micrographs to access subsurface systems at the atomic scale. It focuses on non-equilibrium diffusion processes that are observed in nanostructured MBE-grown materials when a low growth temperature is used, and on local deviations from a stoichiometric composition of materials. As examples we investigate GaAs/AlAs and Si/GeSi heterostructures and GaN single crystals. The purpose of the research is twofold. On the one hand, it helps in understanding physical processes at the atomic scale. On the other hand, we can use the results to link basic physical knowledge with the performance of semiconductor devices made from nanostructured materials. (author). 28 refs., 15 figs

  12. Quantitative x-ray microanalysis in an AEM: instrumental considerations and applications to materials science

    International Nuclear Information System (INIS)

    Zaluzec, N.J.

    1979-01-01

    There are a wide variety of instrumental problems which are present to some degree in all AEM instruments. The nature and magnitude of these artifacts can in some instances preclude the simple quantitative interpretation of the recorded x-ray emission spectrum using a thin-film electron excitation model; however, by judicious modifications to the instrument these complications can be effectively eliminated. The specific operating conditions of the microscope necessarily vary from one analysis to another depending on the type of specimen and experiment being performed. In general, however, the overall performance of the AEM system during x-ray analysis is optimized using the highest attainable incident electron energy; selecting the maximum probe diameter and probe current consistent with experimental limitations; and positioning the x-ray detector in a geometry such that it records information from the electron entrance surface of the specimen

  13. Future Applications in Quantitative Isotopic Tracing using Homogeneously Carbon-13 Labelled Plant Material

    International Nuclear Information System (INIS)

    Slaets, Johanna I.F.; Chen, Janet; Resch, Christian; Mayr, Leopold; Weltin, Georg; Heiling, Maria; Gruber, Roman; Dercon, Gerd

    2017-01-01

    Carbon-13 ({sup 13}C) and nitrogen-15 ({sup 15}N) labelled plant material is increasingly being used to trace the fate of plant-derived C and N into the atmosphere, soil, water and organisms in many studies, including those investigating the potential of soils to store greenhouse gases belowground. Storage of C in soils can offset and even reduce atmospheric levels of the greenhouse gas, CO{sub 2}, and interest in such studies is growing due to problems associated with anthropogenic greenhouse gas emissions impacting climate change. Reduction of N loss in soils is also of great interest, as it reduces release of the greenhouse gas, N{sub 2}O, into the atmosphere. However, accurate quantitative tracing of plant-derived C and N in such research is only possible if plant material is labelled both homogeneously and in sufficient quantities.

  14. Synthesis, quantitative structure-property relationship study of novel fluorescence active 2-pyrazolines and application

    Science.gov (United States)

    Girgis, Adel S.; Basta, Altaf H.; El-Saied, Houssni; Mohamed, Mohamed A.; Bedair, Ahmad H.; Salim, Ahmad S.

    2018-03-01

    A variety of fluorescence-active fluorinated pyrazolines 13-33 was synthesized in good yields through the cyclocondensation reaction of propenones 1-9 with aryl hydrazines 10-12. Some of the synthesized compounds exhibited promising fluorescence properties, with quantum yields (Φ) higher than that of quinine sulfate (the standard reference). Quantitative structure-property relationship studies were undertaken to support the observed fluorescence properties and to estimate the parameters governing them. Five synthesized fluorescence-active pyrazolines (13, 15, 18, 19 and 23) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These fluorescence compounds, especially compounds 19 and 23, improve the strength properties of the paper sheets. Based on the observed performance, they could be used as markers in security documents.

  15. Clinical application of quantitative computed tomography in osteogenesis imperfecta-suspected cat.

    Science.gov (United States)

    Won, Sungjun; Chung, Woo-Jo; Yoon, Junghee

    2017-09-30

    A one-year-old male Persian cat presented with multiple fractures and no known history of trauma. A marked decrease in bone radiopacity and thin cortices of all long bones were identified on radiography. The tentative diagnosis was osteogenesis imperfecta, a congenital disorder characterized by fragile bones. To determine bone mineral density (BMD), quantitative computed tomography (QCT) was performed. The QCT results revealed a mean trabecular BMD of the vertebral bodies of 149.9 ± 86.5 mg/cm³. After bisphosphonate therapy, BMD of the same site increased significantly (218.5 ± 117.1 mg/cm³, p < 0.05). QCT was a useful diagnostic tool for diagnosing osteopenia and quantifying the response to medical treatment.

  16. Determination of γ-rays emitting radionuclides in surface water: application of a quantitative biosensing method

    International Nuclear Information System (INIS)

    Wolterbeek, H. Th.; Van der Meer, A. J. G. M.

    1995-01-01

    A quantitative biosensing method has been developed for the determination of γ-rays emitting radionuclides in surface water. The method is based on the concept that at equilibrium the specific radioactivity in the biosensor is equal to the specific radioactivity in water. The method consists of the measurement of both the radionuclide and the related stable isotope (element) in the biosensor and the determination of the element in water. This three-way analysis eliminates problems such as unpredictable biosensor behaviour, effects of water elemental composition or further abiotic parameters on accumulation levels: what remains is the generally high enrichment (bioaccumulation factor BCF) of elements and radionuclides in the biosensor material. Using water plants, the method is shown to be three to five orders of magnitude more sensitive than the direct analysis of water. (author)
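
    The equilibrium assumption stated above (equal specific radioactivity in the biosensor and in the water) reduces to a one-line calculation; the sketch below, with entirely made-up numbers, only illustrates that arithmetic.

        def radionuclide_in_water(activity_biosensor_bq, element_biosensor_g, element_water_g_per_l):
            """Specific activity (Bq per g of stable element) is assumed equal in biosensor and
            water at equilibrium, so the water activity concentration (Bq/l) follows directly."""
            specific_activity = activity_biosensor_bq / element_biosensor_g   # Bq per g of element
            return specific_activity * element_water_g_per_l                  # Bq per litre of water

        # illustrative only: 120 Bq and 2 mg of the stable element measured in the plant sample,
        # 0.5 microgram of the element per litre of surface water
        print(radionuclide_in_water(120.0, 2e-3, 5e-7), "Bq/l")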

  17. Determination of {gamma}-rays emitting radionuclides in surface water: application of a quantitative biosensing method

    Energy Technology Data Exchange (ETDEWEB)

    Wolterbeek, H Th; Van der Meer, A. J. G. M. [Delft University of Technology, Interfaculty Reactor Institute, Mekelweg 15, 2629 JB Delft (Netherlands)

    1995-12-01

    A quantitative biosensing method has been developed for the determination of {gamma}-rays emitting radionuclides in surface water. The method is based on the concept that at equilibrium the specific radioactivity in the biosensor is equal to the specific radioactivity in water. The method consists of the measurement of both the radionuclide and the related stable isotope (element) in the biosensor and the determination of the element in water. This three-way analysis eliminates problems such as unpredictable biosensor behaviour, effects of water elemental composition or further abiotic parameters on accumulation levels: what remains is the generally high enrichment (bioaccumulation factor BCF) of elements and radionuclides in the biosensor material. Using water plants, the method is shown to be three to five orders of magnitude more sensitive than the direct analysis of water. (author)

  18. Application of quantitative autoradiography to the measurement of biochemical processes in vivo

    International Nuclear Information System (INIS)

    Sokoloff, L.

    1985-01-01

    Quantitative autoradiography makes it possible to measure the concentrations of isotopes in tissues of animals labeled in vivo. In a few cases, the administration of a judiciously selected labeled chemical compound and a properly designed procedure has made it possible to use this capability to measure the rate of a chemical process in animals in vivo. Emission tomography, and particularly positron emission tomography, provides a means to extend this capability to man and to assay the rates of biochemical processes in human tissues in vivo. It does not, however, obviate the need to adhere to established principles of chemical and enzyme kinetics and tracer theory. Generally, all such methods, whether to be used in man with positron emission tomography or in animals with autoradiography, must first be developed by research in animals with autoradiography, because it is only in animals that the measurements needed to validate the basic assumptions of the methods can be tested and evaluated

  19. The application of high-speed cinematography for the quantitative analysis of equine locomotion.

    Science.gov (United States)

    Fredricson, I; Drevemo, S; Dalin, G; Hjertën, G; Björne, K

    1980-04-01

    Locomotor disorders constitute a serious problem in horse racing which will only be rectified by a better understanding of the causative factors associated with disturbances of gait. This study describes a system for the quantitative analysis of the locomotion of horses at speed. The method is based on high-speed cinematography with a semi-automatic system for analysis of the films. The recordings are made with a 16 mm high-speed camera run at 500 frames per second (fps) and the films are analysed with special film-reading equipment and a mini-computer. The time and linear gait variables are presented in tabular form and the angles and trajectories of the joints and body segments are presented graphically.

  20. Quantitative real-time PCR approaches for microbial community studies in wastewater treatment systems: applications and considerations.

    Science.gov (United States)

    Kim, Jaai; Lim, Juntaek; Lee, Changsoo

    2013-12-01

    Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, which aids in better understandings of the complexity of wastewater microbial communities. Although qPCR can be used to provide more specific and accurate quantification than other molecular techniques, it does have limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is and will continue to be increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results. Copyright © 2013 Elsevier Inc. All rights reserved.
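
    As an illustration of the quantification principle reviewed above (not code from the review itself), absolute qPCR quantification is commonly performed by regressing the quantification cycles (Cq) of a standard dilution series against log10 copy number and reading unknowns off the resulting curve; all numbers below are invented.

        import numpy as np

        # standard curve: known copy numbers and their measured quantification cycles (Cq)
        copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
        cq = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

        slope, intercept = np.polyfit(np.log10(copies), cq, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0      # close to 1.0 means ~100% amplification efficiency

        def copies_from_cq(cq_unknown):
            return 10 ** ((cq_unknown - intercept) / slope)

        print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}, "
              f"unknown sample at Cq 24.5 ~ {copies_from_cq(24.5):.2e} copies")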

  1. Application of short-wave infrared (SWIR) spectroscopy in quantitative estimation of clay mineral contents

    International Nuclear Information System (INIS)

    You, Jinfeng; Xing, Lixin; Pan, Jun; Meng, Tao; Liang, Liheng

    2014-01-01

    Clay minerals are significant constituents of soil which are necessary for life. This paper studied three types of clay minerals, kaolinite, illite, and montmorillonite, as they are not only the most common soil-forming materials, but also important indicators of soil expansion and shrinkage potential. These clay minerals showed diagnostic absorption bands resulting from vibrations of hydroxyl groups and structural water molecules in the SWIR wavelength region. The short-wave infrared reflectance spectra of the soil were obtained from a Portable Near Infrared Spectrometer (PNIS, spectrum range: 1300∼2500 nm, interval: 2 nm). Owing to its simplicity, speed, and non-destructive nature, SWIR spectroscopy has been widely used in geological prospecting, chemical engineering and many other fields. The aim of this study was to use multiple linear regression (MLR) and partial least squares (PLS) regression to establish optimized quantitative estimation models of the kaolinite, illite and montmorillonite contents from soil reflectance spectra. Here, the soil reflectance spectra mainly refer to the spectral reflectivity of soil (SRS) corresponding to the absorption-band positions (AP) of the kaolinite, illite, and montmorillonite representative spectra from the USGS spectral library, the SRS corresponding to the AP of the soil spectrum, and the overall soil spectrum reflectance values. The optimal estimation models of the three clay mineral contents showed that the retrieval accuracy was satisfactory (kaolinite content: a Root Mean Square Error of Calibration (RMSEC) of 1.671 with a coefficient of determination (R²) of 0.791; illite content: an RMSEC of 1.126 with an R² of 0.616; montmorillonite content: an RMSEC of 1.814 with an R² of 0.707). Thus, the reflectance spectra of soil obtained from the PNIS could be used for quantitative estimation of kaolinite, illite and montmorillonite contents in soil
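
    A minimal sketch of the PLS calibration step described above, assuming a matrix of SWIR reflectance spectra and laboratory-determined kaolinite contents; the data are synthetic and the number of latent variables is an arbitrary choice, not the setting used in the study.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(0)
        X = rng.random((60, 600))                           # 60 soil samples x 600 SWIR reflectance values
        y = X[:, 150] * 20 + rng.normal(0, 0.5, 60)         # pretend kaolinite content (%) driven by one band

        X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
        pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
        y_hat = pls.predict(X_val).ravel()
        rmsep = float(np.sqrt(np.mean((y_val - y_hat) ** 2)))
        print(f"RMSEP = {rmsep:.3f}, R2 = {r2_score(y_val, y_hat):.3f}")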

  2. Culture Sustainability: Culture Quotient (CQ) and Its Quantitative Empirical Application to Chinese Cities

    Directory of Open Access Journals (Sweden)

    Jing Lin

    2016-11-01

    Full Text Available Culture sustainability is one of the indispensable components of sustainability. Culture has likely always been an important element for promoting urban and rural sustainable development. It is now playing an increasingly significant role in sparking and incubating innovation, which is becoming the main driver of economic growth and competitiveness. Unfortunately, little research has been conducted on how much culture matters to economic performance in a quantitative way. Therefore, in this paper, which is based on an intensive literature review, we try to specifically quantify the importance of culture to urban development in general and urban economic performance in particular, by proposing an index system dubbed the Culture Quotient (CQ). Following this, an integrated database of 297 prefectural-level cities in China is accordingly established. By manipulating the database, the CQ value for each city is then calculated using principal component analysis with SPSS (19.0). Afterwards, the spatial pattern by CQ value tier is presented and illustrates urban China's "winner-take-all" phenomenon, with predominance of the three giant urban clusters in the coastal area, i.e., the Jing (Beijing)-Jin (Tianjin)-Ji (Hebei) province-based Bohai rim region, the Yangtze River delta and the Pearl River delta, as well as some mega-cities such as Chengdu and Wuhan in other parts of China. More precisely, the regression analysis shows that there is a strong positive relationship between CQ and gross domestic product (GDP), with the striking result that every increase of one percentage point in CQ will induce a five percentage point increment in GDP. Although the finding makes an impressive and convincing case that culture does exert a great impact on urban economic development, and can also be measured in a quantitative way in Chinese cases, more cases from other countries need to be included for further verification and confirmation. We therefore urgently call for
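
    The index construction sketched above (principal component analysis over cultural indicators, followed by a CQ-GDP regression) can be illustrated generically as follows; the indicator matrix, weights and GDP proxy are entirely hypothetical and are not taken from the paper.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(42)
        indicators = rng.random((297, 6))                                 # 297 cities x 6 hypothetical cultural indicators
        gdp = indicators @ rng.random(6) * 5 + rng.normal(0, 0.5, 297)    # synthetic GDP proxy

        z = StandardScaler().fit_transform(indicators)
        pca = PCA(n_components=2).fit(z)
        # composite CQ: component scores weighted by explained variance, a common index construction
        cq = pca.transform(z) @ pca.explained_variance_ratio_

        slope, intercept = np.polyfit(cq, gdp, 1)                         # simple CQ-GDP regression
        print(f"one unit of the synthetic CQ is associated with {slope:.2f} units of the GDP proxy")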

  3. Application of ovine luteinizing hormone (LH) radioimmunoassay in the quantitation of LH in different mammalian species

    International Nuclear Information System (INIS)

    Millar, R.P.; Aehnelt, C.

    1977-01-01

    A sensitive double antibody radioimmunoassay has been developed for measuring luteinizing hormone (LH) in various African mammalian species, using rabbit anti-ovine LH serum (GDN 15) and radioiodinated rat LH or ovine LH. Serum and pituitary homogenates from some African mammals (hyrax, reedbuck, sable, impala, tsessebe, thar, spring-hare, ground squirrel and cheetah, as well as the domestic sheep, cow and horse and laboratory rat and hamster) produced displacement curves parallel to that of the ovine LH standards. The specificity of the assay was examined in detail for one species, the rock hyrax. Radioimmunoassay and bioassay estimates of LH in hyrax pituitaries containing widely differing quantities of pituitary hormones were similar. In sexually active male hyrax mean plasma LH was 12.1 ng/ml and pituitary LH 194 μg/gland, but in sexually quiescent hyrax mean plasma LH was 2.4 ng/ml and mean pituitary LH 76 μg/gland. Intravenous injection of 10 μg of luteinizing hormone releasing hormone increased mean LH levels in hyrax from 0.9 ng/ml to 23.2 ng/ml by 30 min. Conversely, im injection of 250 μg testosterone induced a fall in LH levels in male hyrax from 1.7 ng/ml to 0.7 ng/ml 6 h after administration. Although the specificity of the assay for quantitating plasma LH in other species was not categorically established, there was a good correlation between plasma LH concentration and reproductive state in the bontebok, impala, spring-hare, thar, cheetah, domestic horse and laboratory rat, suggesting the potential use of the antiserum in quantitating LH in a variety of mammalian species

  4. The power of joint application of LEED and DFT in quantitative surface structure determination

    International Nuclear Information System (INIS)

    Heinz, K; Hammer, L; Mueller, S

    2008-01-01

    It is demonstrated for several cases that the joint application of low-energy electron diffraction (LEED) and structural calculations using density functional theory (DFT) can retrieve the correct surface structure even when single application of either method fails. On the experimental side (LEED), the failure can be due to the simultaneous presence of weak and very strong scatterers, or to an insufficient data base leaving different structures with the same quality of fit between experimental data and calculated model intensities. On the theory side (DFT), it can be difficult to predict the coverage of an adsorbate, or two different structures may have almost the same total energy while only one of them is realized in experiment due to formation kinetics. It is demonstrated how, in these different cases, the joint application of both methods - which yield about the same structural precision - offers a way out of the dilemma

  5. Quantitative intracellular flux modeling and applications in biotherapeutic development and production using CHO cell cultures.

    Science.gov (United States)

    Huang, Zhuangrong; Lee, Dong-Yup; Yoon, Seongkyu

    2017-12-01

    Chinese hamster ovary (CHO) cells have been widely used for producing many recombinant therapeutic proteins. Constraint-based modeling, such as flux balance analysis (FBA) and metabolic flux analysis (MFA), has been developing rapidly for the quantification of intracellular metabolic flux distribution at a systematic level. Such methods would produce detailed maps of flows through metabolic networks, which contribute significantly to better understanding of metabolism in cells. Although these approaches have been extensively established in microbial systems, their application to mammalian cells is sparse. This review brings together the recent development of constraint-based models and their applications in CHO cells. The further development of constraint-based modeling approaches driven by multi-omics datasets is discussed, and a framework of potential modeling application in cell culture engineering is proposed. Improved cell culture system understanding will enable robust developments in cell line and bioprocess engineering thus accelerating consistent process quality control in biopharmaceutical manufacturing. © 2017 Wiley Periodicals, Inc.
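
    For readers unfamiliar with constraint-based modelling, flux balance analysis reduces to a linear programme: maximise an objective flux subject to the steady-state mass balance S·v = 0 and flux bounds. The toy network below only illustrates that formulation and has no relation to CHO metabolism.

        import numpy as np
        from scipy.optimize import linprog

        # toy stoichiometric matrix S (metabolites x reactions): uptake -> A -> B -> biomass
        S = np.array([[1, -1,  0,  0],     # metabolite A
                      [0,  1, -1,  0],     # metabolite B
                      [0,  0,  1, -1]])    # biomass precursor
        bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]   # uptake flux limited to 10 units
        c = np.zeros(4); c[3] = -1.0       # linprog minimises, so maximise biomass via -v_biomass

        res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
        print("optimal flux distribution:", res.x)            # every flux pinned at the uptake limit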

  6. Quantitative measures of corrosion and prevention: application to corrosion in agriculture

    NARCIS (Netherlands)

    Schouten, J.C.; Gellings, P.J.

    1987-01-01

    The corrosion protection factor (c.p.f.) and the corrosion condition (c.c.) are simple instruments for the study and evaluation of the contribution and efficiency of several methods of corrosion prevention and control. The application of c.p.f. and c.c. to corrosion and prevention in agriculture in

  7. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo [University of Ljubljana, Faculty of Electrical Engineering, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2008-04-07

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm ({+-}0.6 mm) for the first and 2.1 mm ({+-}1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and

  8. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2008-01-01

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and CA
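
    A hedged sketch of the geometric curvature descriptor discussed above: a polynomial vertebral body line is fitted through (hypothetical) vertebra centroids and the curvature of the resulting 3D curve is evaluated as |r' x r''| / |r'|^3; the polynomial degree and the centroid coordinates are placeholders, not the study's data.

        import numpy as np

        def fit_body_line(z, x, y, degree=4):
            # fit x(z) and y(z) polynomials through the vertebra centroids (z as the curve parameter)
            return np.polyfit(z, x, degree), np.polyfit(z, y, degree)

        def geometric_curvature(px, py, z):
            dx, dy = np.polyval(np.polyder(px), z), np.polyval(np.polyder(py), z)
            ddx, ddy = np.polyval(np.polyder(px, 2), z), np.polyval(np.polyder(py, 2), z)
            r1 = np.stack([dx, dy, np.ones_like(z)], axis=1)      # r'(z)
            r2 = np.stack([ddx, ddy, np.zeros_like(z)], axis=1)   # r''(z)
            return np.linalg.norm(np.cross(r1, r2), axis=1) / np.linalg.norm(r1, axis=1) ** 3

        z = np.linspace(0, 400, 17)               # 17 hypothetical vertebra centroids along 400 mm
        x = 20 * np.sin(z / 400 * np.pi)          # mild sagittal bow (mm)
        y = 5 * np.sin(z / 200 * np.pi)           # small coronal deviation (mm)
        px, py = fit_body_line(z, x, y)
        print("geometric curvature (1/mm):", np.round(geometric_curvature(px, py, z), 4))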

  9. Application of the quantitative autoradiography for determination of specific activity of labelled non-metallic inclusions

    International Nuclear Information System (INIS)

    Kowalczyk, J.T.; Wilczynski, A.W.

    1983-01-01

    The knowledge of the specific activity of labelled non-metallic inclusions, i.e. the content of the radiotracer in a single inclusion, makes it possible to obtain new information about the mechanism and the kinetics of steel deoxidation. Quantitative autoradiography was used to determine this specific activity. For this purpose, various standards of aluminium oxide with different amounts of cerium oxide (Ce2O3) and an aluminium-cerium alloy were prepared. The standards and the alloy were activated with thermal neutrons. Several autoradiographs were then made for these standards (ORWO AF-3 films were used). The autoradiographs served as the basis for establishing the calibration curves: optical density versus particle dimension at constant cerium concentration, and optical density versus cerium concentration at constant particle dimension. Samples of liquid steel were deoxidized with the Al-Ce alloy. After the labelled non-metallic inclusions had been isolated, autoradiographs were made under the same conditions as for the standards. The calibration curves were used to determine the cerium content in single inclusions. (author)

  10. Quantitation in PET using isotopes emitting prompt single gammas: application to yttrium-86

    International Nuclear Information System (INIS)

    Walrand, Stephan; Jamar, Francois; Mathieu, Isabelle; De Camps, Joelle; Lonneux, Max; Pauwels, Stanislas; Sibomana, Merence; Labar, Daniel; Michel, Christian

    2003-01-01

    Several yttrium-90 labelled somatostatin analogues are now available for cancer radiotherapy. After injection, a large amount of the compound is excreted via the urinary tract, while a variable part is trapped in the tumour(s), allowing the curative effect. Unfortunately, the compound may also be trapped in critical tissues such as kidney or bone marrow. As a consequence, a method for assessment of individual biodistribution and pharmacokinetics is required to predict the maximum dose that can be safely injected into patients. However, {sup 90}Y, a pure β{sup -} particle emitter, cannot be used for quantitative imaging. Yttrium-86 is a positron emitter that allows imaging of tissue uptake using a PET camera. In addition to the positron, {sup 86}Y also emits a multitude of prompt single γ-rays, leading to significant overestimation of uptake when using classical reconstruction methods. We propose a patient-dependent correction method based on sinogram tail fitting using an {sup 86}Y point spread function library. When applied to abdominal phantom acquisition data, the proposed correction method significantly improved the accuracy of the quantification: the initial 117% overestimation of background activity was reduced to 9%, while the initial 84% error in kidney uptake was reduced to 5%. In patient studies, the mean discrepancy between PET total body activity and the activity expected from urinary collections was reduced from 92% to 7%, showing the benefit of the proposed correction method. (orig.)

  11. Quantitative Investigation of Roasting-magnetic Separation for Hematite Oolitic-ores: Mechanisms and Industrial Application

    Directory of Open Access Journals (Sweden)

    Peng Tiefeng

    2017-12-01

    Full Text Available Natural high-quality iron ore can be directly applied to the pyro-metallurgy process; however, the availability of such ores has become lower and lower due to exploitation. This research reports a systematic approach using reduction roasting and magnetic separation for oolitic iron ores from west Hubei Province. Firstly, a mineralogical study was performed and it was shown that the oolitic particles were mainly composed of hematite, with some silicon-quartz inside the oolitic particles. Then, the roasting temperature was examined and shown to have a significant influence on both Fe recovery and the Fe content of the concentrate, with the Fe content gradually increasing as the temperature increased from 700 to 850 °C. The most important aspects are the quantitative investigation of the change of mineral phases and of the reduction area (and its ratio) during the reduction roasting process. The results showed that Fe2O3 decreased with temperature, and Fe3O4 (magnetite) increased considerably from 600 to 800 °C. The reductive reaction was found to occur from the outside in; the original oolitic structure and the embedding relationship among the minerals did not change after roasting. Finally, 5% surrounding rock was added to mimic real industrial iron beneficiation. This study could provide useful insight and practical support for the utilization of such iron ores.

  12. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.
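
    To make the censoring idea concrete, here is a minimal sketch (not the authors' code) of a maximum-likelihood fit of a log-normal model to left-censored peak intensities, where values below a detection limit contribute through the cumulative distribution instead of the density; the data and the detection limit are simulated.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        true_mu, true_sigma, lod = 2.0, 0.8, 4.0            # log-scale parameters and detection limit
        x = np.exp(rng.normal(true_mu, true_sigma, 500))    # simulated peak intensities
        observed = x >= lod                                 # intensities below the LOD are left-censored

        def neg_log_lik(theta):
            mu, log_sigma = theta
            sigma = np.exp(log_sigma)
            ll_obs = norm.logpdf(np.log(x[observed]), mu, sigma) - np.log(x[observed])   # log-normal density
            ll_cens = norm.logcdf(np.log(lod), mu, sigma) * (~observed).sum()            # P(X < LOD) per censored point
            return -(ll_obs.sum() + ll_cens)

        fit = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
        print("estimated mu and sigma:", fit.x[0], float(np.exp(fit.x[1])))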

  13. Quantitative Clinical Chemistry Proteomics (qCCP) using mass spectrometry: general characteristics and application.

    Science.gov (United States)

    Lehmann, Sylvain; Hoofnagle, Andrew; Hochstrasser, Denis; Brede, Cato; Glueckmann, Matthias; Cocho, José A; Ceglarek, Uta; Lenz, Christof; Vialaret, Jérôme; Scherl, Alexander; Hirtz, Christophe

    2013-05-01

    Proteomics studies typically aim to exhaustively detect peptides/proteins in a given biological sample. Over the past decade, the number of publications using proteomics methodologies has exploded. This was made possible due to the availability of high-quality genomic data and many technological advances in the fields of microfluidics and mass spectrometry. Proteomics in biomedical research was initially used in 'functional' studies for the identification of proteins involved in pathophysiological processes, complexes and networks. Improved sensitivity of instrumentation facilitated the analysis of even more complex sample types, including human biological fluids. It is at that point the field of clinical proteomics was born, and its fundamental aim was the discovery and (ideally) validation of biomarkers for the diagnosis, prognosis, or therapeutic monitoring of disease. Eventually, it was recognized that the technologies used in clinical proteomics studies [particularly liquid chromatography-tandem mass spectrometry (LC-MS/MS)] could represent an alternative to classical immunochemical assays. Prior to deploying MS in the measurement of peptides/proteins in the clinical laboratory, it seems likely that traditional proteomics workflows and data management systems will need to adapt to the clinical environment and meet in vitro diagnostic (IVD) regulatory constraints. This defines a new field, as reviewed in this article, that we have termed quantitative Clinical Chemistry Proteomics (qCCP).

  14. Application of quantitative variables in the sampling method to evaluate sugarcane brown rust

    Directory of Open Access Journals (Sweden)

    Joaquín Montalván Delgado

    2017-01-01

    Full Text Available To develop a system that increases the precision of evaluations of resistance to sugarcane brown rust through quantitative sampling, six cultivars with differential behaviour towards the disease (PR980, My5514, Ja60-5, C334-64, C323-68 and B4362) were studied. A randomized block experimental design with three replications was used under the heavy infection conditions provided by the highly susceptible cultivar B4362. Evaluations were done at three and five months of age, in the three thirds (bottom, middle and top) of the +1, +3 and +5 leaves of 10 plants per replicate. The total affected area of each leaf and of each third was analysed. Within 2 cm2, the length and width of the largest and most frequent pustules were observed, and the total number of pustules, the area of the largest and most frequent pustules, and the percentage of area occupied by the most frequent pustules per cm2 were determined. Analysis of variance, Tukey tests and confidence interval analysis were performed to determine the coefficient to be used as a constant for pustule width, given the small variation of this parameter. The +3 leaf and the middle third represented the average infection incidence of brown rust, making them the most appropriate for carrying out the observations. An equation was also obtained to estimate, with a high level of confidence, the area occupied by pustules.

  15. LEGO plot for simultaneous application of multiple quality requirements during trueness verification of quantitative laboratory tests.

    Science.gov (United States)

    Park, Hae-il; Chae, Hyojin; Kim, Myungshin; Lee, Jehoon; Kim, Yonggoo

    2014-03-01

    We developed a two-dimensional plot for viewing trueness that takes into account potential shift and variable quality requirements when verifying trueness using certified reference material (CRM). Glucose, total cholesterol (TC), and creatinine levels were determined by two kinds of assay at two levels of a CRM. Available quality requirements were collected, codified, and sorted in ascending order in the plot's header row. Centred on the mean of the measured values from the CRM, the "mean ± US CLIA '88 allowable total error" was located in the header of the leftmost and rightmost columns. Twenty points were created in the intervening columns as potential shifts. Uncertainties were calculated according to a regression between certified values and uncertainties of the CRM, and positioned in the corresponding columns. Cells were assigned different colors where column and row intersected, based on comparison of the 95% confidence interval of the percentage bias with each quality requirement. A glucose assay failed to meet the highest quality criteria, for which a shift of +0.13 to +0.14 mmol/l would be required. A TC assay met the quality requirement and a shift of ±0.03 mmol/l was tolerable. A creatinine assay also met the quality requirement but no shift was tolerable. The plot provides a systematic view of the trueness of quantitative laboratory tests. © 2014 Wiley Periodicals, Inc.
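
    The pass/fail logic behind each cell of the plot described above amounts to comparing the 95% confidence interval of the percentage bias (after an optional shift) against an allowable total error; the replicate values and quality requirements below are invented for illustration.

        import numpy as np
        from scipy import stats

        def trueness_verdict(measured, certified_value, allowable_total_error_pct, shift=0.0):
            """True if the 95% CI of the % bias lies entirely within +/- the allowable total error."""
            m = np.asarray(measured, dtype=float) + shift
            bias_pct = (m - certified_value) / certified_value * 100.0
            lo, hi = stats.t.interval(0.95, len(bias_pct) - 1, loc=bias_pct.mean(), scale=stats.sem(bias_pct))
            return -allowable_total_error_pct <= lo and hi <= allowable_total_error_pct

        glucose = [5.47, 5.52, 5.49, 5.55, 5.50]     # mmol/l, replicate CRM measurements (made up)
        print(trueness_verdict(glucose, certified_value=5.60, allowable_total_error_pct=10.0))   # passes
        print(trueness_verdict(glucose, certified_value=5.60, allowable_total_error_pct=1.0))    # fails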

  16. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.

  17. Quantitative prediction of twinning stress in fcc alloys: Application to Cu-Al

    Science.gov (United States)

    Kibey, Sandeep A.; Wang, L. L.; Liu, J. B.; Johnson, H. T.; Sehitoglu, H.; Johnson, D. D.

    2009-06-01

    Twinning is one of the most prevalent deformation mechanisms in materials. Having established a quantitative theory to predict the onset twinning stress τcrit in fcc elemental metals from their generalized planar-fault-energy (GPFE) surface, we exemplify its use in alloys where the Suzuki effect (i.e., solute energetically favors residing at and near planar faults) is operative; specifically, we apply it to Cu-xAl (x = 0, 5, and 8.3 at.%) in comparison with experimental data. We compute the GPFE via density-functional theory, and we predict the solute dependence of the GPFE and τcrit, in agreement with measured values. We show that τcrit correlates monotonically with the unstable twin fault energies (the barriers to twin nucleation) rather than with the stable intrinsic stacking-fault energies typically suggested. We correlate the twinning behavior and electronic structure with changes in solute content and proximity to the fault planes through charge-density redistribution at the fault and changes to the layer- and site-resolved density of states, where increased bonding charge correlates with a decrease in fault energies and τcrit.

  18. Radiation applications in art and archaeometry X-ray fluorescence applications to archaeometry. Possibility of obtaining non-destructive quantitative analyses

    International Nuclear Information System (INIS)

    Milazzo, Mario

    2004-01-01

    The possibility of obtaining quantitative XRF analysis in archaeometric applications is considered for the following cases: - examination of metallic objects with an irregular surface: coins, for instance; - metallic objects with a natural or artificial patina on the surface; - glass or ceramic samples for which the problems for quantitative analysis arise from the non-detectability of low-Z matrix elements. The fundamental parameter method for quantitative XRF analysis is based on a numerical procedure involving the relative values of the XRF line intensities. As a consequence it can also be applied to experimental XRF spectra obtained from metallic objects if the correction for the irregular shape consists only in introducing a constant factor which does not affect the relative XRF intensity values. This is in fact possible under not very restrictive conditions for the experimental set-up. The fineness of coins with a surface patina can be evaluated by measuring the Rayleigh-to-Compton scattering intensity ratio at an incident energy higher than that of the characteristic X-rays. For glasses and ceramics, measurement of the Compton-scattered intensity of the exciting radiation and the use of a proper scaling law make it possible to evaluate the matrix absorption coefficients for all characteristic X-ray line energies

  19. SU-E-T-661: Quantitative MRI Assessment of a Novel Direction-Modulated Brachytherapy Tandem Applicator for Cervical Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Soliman, A; Elzibak, A; Fatemi, A; Safigholi, H; Leung, E; Ravi, A; Song, W [Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); Han, D [Sunnybrook Research Institute, Sunnybrook Health Sciences Centre, Toronto, Ontario (Canada); University of California, San Diego, La Jolla, CA (United States)

    2015-06-15

    Purpose: To quantitatively evaluate the MR image quality of a novel direction modulated brachytherapy (DMBT) tandem applicator for cervical cancer, using the clinical MRI scanning protocol for image guided brachytherapy. Methods: The tungsten alloy-based applicator was placed in a water phantom and the clinical imaging protocol was performed. Axial images were acquired using a 2D turbo-spin echo (TSE) T2-weighted sequence on a 1.5T GE 450w MR scanner and an 8-channel body coil. As a multi-channel receiver coil was used, inhomogeneities in the B1 receive field must be considered before performing the quantification process. Therefore the applicator was removed from the phantom and the whole imaging session was repeated for the water phantom with the same parameters. Images from the two scans were then subtracted, resulting in a difference image that shows only the applicator with its surrounding magnetic susceptibility dipole artifact. Line profiles were drawn and plotted on the difference image at various angles and locations along the tandem. The full width at half maximum (FWHM) was measured for all the line profiles to quantify the extent of the artifact. Additionally, the extent of the artifact along the diameter of the tandem was measured at various angles and locations. Results: After removing the background inhomogeneities of the receiver coil, the FWHM of the tandem measured 5.75 ± 0.35 mm (the physical tandem diameter is 5.4 mm). The average extent of the artifacts along the diameter of the tandem was 2.14 ± 0.56 mm. In contrast to CT imaging of the same applicator (not shown here), the tandem can be easily identified without additional correction algorithms. Conclusion: This work demonstrated that the novel DMBT tandem applicator has minimal susceptibility artifact in the T2-weighted images employed in clinical practice for MRI-guided brachytherapy of cervical cancer.
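
    A minimal sketch of the FWHM measurement used above for a single line profile, assuming the profile has already been extracted from the subtraction image as a 1D array; the synthetic Gaussian profile below merely stands in for a real profile across the tandem.

        import numpy as np

        def fwhm(positions_mm, profile):
            """Full width at half maximum of a single-peaked profile, with linear
            interpolation of the half-maximum crossings on either side of the peak."""
            half = profile.max() / 2.0
            peak = int(np.argmax(profile))
            left = np.interp(half, profile[:peak + 1], positions_mm[:peak + 1])
            right = np.interp(half, profile[peak:][::-1], positions_mm[peak:][::-1])
            return right - left

        x = np.linspace(-10, 10, 401)                   # mm across the tandem
        profile = np.exp(-x ** 2 / (2 * 2.4 ** 2))      # synthetic peak roughly 5.7 mm wide
        print(f"FWHM = {fwhm(x, profile):.2f} mm")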

  20. Tracers application method for the quantitative determination of the source of oxygenic inclusions in steel

    International Nuclear Information System (INIS)

    Rewienska-Kosciukowa, B.; Dalecki, W.; Michalik, J.S.

    1976-01-01

    The rationale for and the possibility of applying radioactive and non-radioactive isotopic tracers in investigations of the origin of oxygenic non-metallic inclusions are presented. The methods discussed concern investigations of the origin of exogenous inclusions, which pass into the steel from external sources (refractory lining, slag), and of endogenous inclusions formed during the process of steel deoxidation. The choice of tracers for the refractory material and the subsequent investigations to determine the origin of non-metallic inclusions are discussed. The question of so-called isotopic replacement tracers for the main steel deoxidizing agents is considered. The criterion for identifying oxygenic inclusions formed during the process of steel deoxidation is also discussed. Several results of laboratory and industrial investigations, as well as examples of application of the discussed methods on an industrial scale, are presented. (author)

  1. Quantitative Infrared Spectroscopy in Challenging Environments: Applications to Passive Remote Sensing and Process Monitoring

    Science.gov (United States)

    2012-12-01

    ...and its corresponding kinetics [16,17], fermentation processes [18], and polymer extrusion [19,20]. The primary challenge for NIR measurement applications lies ... variables, the regression coefficient for that term in the model will be related to the analyte concentration. According to the Beer-Lambert law, the ... additive, the Beer-Lambert law relation is shown in Eq. 3.11, in which Ai denotes the total absorbance at wavelength i, εij is the absorptivity of ...
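
    The snippet above truncates before the equation it cites; for a multicomponent sample, the Beer-Lambert relation it refers to is conventionally written as below (this is the standard textbook form, not necessarily the exact Eq. 3.11 of the source):

        A_i = \sum_j \varepsilon_{ij} \, b \, c_j

    where A_i is the total absorbance at wavelength i, ε_ij the absorptivity of component j at wavelength i, b the optical path length, and c_j the concentration of component j.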

  2. Quantitative evaluation of geodiversity: development of methodological procedures with application to territorial management

    Science.gov (United States)

    Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.

    2012-04-01

    Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management in a practical and effective way. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in this whole process, but few researchers are as yet investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims at the development of an effective methodology for the assessment of as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. It is expected to attain the proper procedures in order to assess geodiversity

  3. Clinical applicability of quantitative nailfold capillaroscopy in differential diagnosis of connective tissue diseases with Raynaud's phenomenon.

    Science.gov (United States)

    Wu, Po-Chang; Huang, Min-Nung; Kuo, Yu-Min; Hsieh, Song-Chou; Yu, Chia-Li

    2013-08-01

    Nailfold capillaroscopy is a useful tool to distinguish primary from secondary Raynaud's phenomenon (RP) by examining the morphology of nailfold capillaries, but its role in disease diagnosis is not clearly established. The purpose of this study was to evaluate the role of quantitative nailfold capillaroscopy in the differential diagnosis of connective tissue diseases (CTDs) with RP. Data from 2005 to 2009 were retrieved from the nailfold capillaroscopic database of National Taiwan University Hospital (NTUH); only data from patients with RP were analyzed. The criteria for interpretation of capillaroscopic findings were predefined. The final diagnoses of the patients were based on the American College of Rheumatology classification criteria for the individual diseases, independent of the nailfold capillaroscopic findings. The sensitivity and specificity of each capillaroscopic pattern for the diseases were determined. Data from a total of 67 patients qualified for the current study. We found that the sensitivity and specificity of the scleroderma pattern for systemic sclerosis (SSc) were 89.47% and 80%, respectively, and that the specificity of the early, active, and late scleroderma patterns for SSc reached 87.5%, 97.5%, and 95%, respectively. The sensitivity/specificity of the systemic lupus erythematosus (SLE) pattern for SLE and of the polymyositis/dermatomyositis (PM/DM) pattern for PM/DM were 33.33%/95.45% and 60%/96.3%, respectively. The sensitivity/specificity of the mixed connective tissue disease (MCTD) pattern for MCTD was 20%/100%. The nailfold capillaroscopic (NC) patterns may therefore be useful in the differential diagnosis of CTDs with RP: the NC patterns for SSc and PM/DM are both sensitive and specific for the diseases, while the SLE and MCTD patterns exhibit high specificity but relatively low sensitivity. Copyright © 2012. Published by Elsevier B.V.

  4. Applications of quantitative time lapse holographic imaging to the development of complex pharmaceutical nano formulations

    Science.gov (United States)

    Luther, Ed; Mendes, Livia; Pan, Jiayi; Costa, Daniel; Sarisozen, Can; Torchilin, Vladimir

    2018-02-01

    We rely on in vitro cellular cultures to evaluate the effects of the components of multifunctional nano-based formulations under development. We employ an incubator-adapted, label-free holographic imaging cytometer, the HoloMonitor M4® (Phase Holographic Imaging, Lund, Sweden), to obtain multi-day time-lapse sequences at 5-minute intervals. An automated stage allows hands-free acquisition of multiple fields of view. Our system is based on the Mach-Zehnder interferometry principle to create interference patterns, which are deconvolved to produce images of the optical thickness of the field of view. These images are automatically segmented, resulting in a full complement of quantitative morphological features, such as optical volume, thickness, and area, amongst many others. Precise XY cell locations and the time of acquisition are also recorded. Visualization is best achieved by novel 4-dimensional plots, where XY position is plotted over time (Z-direction) and cell thickness is coded as color or grayscale brightness. Fundamental events of interest, i.e., cells undergoing mitosis or mitotic dysfunction, cell death, cell-to-cell interactions, and motility, are discernible. We use both 2D and 3D models of the tumor microenvironment. We report our new analysis method to track feature changes over time based on a 4-sample version of the Kolmogorov-Smirnov test. Feature A is compared to Control A, and Feature B is compared to Control B, to give a 2D probability plot of the feature changes over time. As a result, we efficiently obtain vectors quantifying feature changes over time in various sample conditions, i.e., changing compound concentrations or multi-compound combinations.
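
    The record describes comparing a feature distribution against its control to track changes over time; the paper's 4-sample Kolmogorov-Smirnov variant is not detailed here, so the sketch below only illustrates the underlying idea with two standard two-sample KS tests (SciPy), pairing feature A with control A and feature B with control B to give the pair of probabilities that would be plotted against each other. Data and feature names are simulated placeholders.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Simulated per-cell morphological features at one time point (placeholders).
control_a = rng.normal(loc=1.00, scale=0.10, size=300)   # e.g. optical volume, control
feature_a = rng.normal(loc=1.15, scale=0.12, size=300)   # e.g. optical volume, treated
control_b = rng.normal(loc=0.50, scale=0.05, size=300)   # e.g. cell area, control
feature_b = rng.normal(loc=0.50, scale=0.05, size=300)   # e.g. cell area, treated

# Two-sample KS tests: Feature A vs Control A and Feature B vs Control B.
stat_a, p_a = ks_2samp(feature_a, control_a)
stat_b, p_b = ks_2samp(feature_b, control_b)

# The pair (p_a, p_b) gives one point of a 2D probability plot over time.
print(f"feature A vs control A: D={stat_a:.3f}, p={p_a:.3g}")
print(f"feature B vs control B: D={stat_b:.3f}, p={p_b:.3g}")
```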

  5. Application of droplet digital PCR for quantitative detection of Spiroplasma citri in comparison with real time PCR.

    Directory of Open Access Journals (Sweden)

    Yogita Maheshwari

    Full Text Available Droplet digital polymerase chain reaction (ddPCR) is a method for performing digital PCR that is based on water-oil emulsion droplet technology. It is a unique approach to measure the absolute copy number of nucleic acid targets without the need for external standards. This study evaluated the applicability of ddPCR as a quantitative detection tool for Spiroplasma citri, the causal agent of citrus stubborn disease (CSD) in citrus. Two sets of primers, SP1, based on the spiralin housekeeping gene, and a multicopy prophage gene, SpV1 ORF1, were used to evaluate ddPCR in comparison with real-time quantitative PCR (qPCR) for S. citri detection in citrus tissues. Standard curve analyses on tenfold dilution series showed that both ddPCR and qPCR exhibited good linearity and efficiency. However, ddPCR had a tenfold greater sensitivity than qPCR and accurately quantified down to one copy of the spiralin gene. Receiver operating characteristic analysis indicated that the ddPCR methodology was more robust for diagnosis of CSD, and the area under the curve was significantly greater compared to qPCR. Field samples were used to validate ddPCR efficacy and demonstrated that it was equal to or better than qPCR in detecting S. citri infection in fruit columella, owing to the higher pathogen titer there. The ddPCR assay detected both the S. citri spiralin and the SpV1 ORF1 targets quantitatively, with high precision and accuracy compared to the qPCR assay. The ddPCR was highly reproducible and repeatable for both targets and showed higher resilience to PCR inhibitors in citrus tissue extract for the quantification of S. citri compared to qPCR.
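
    The comparison above rests on standard curves built from tenfold dilution series. As a hedged illustration of how qPCR linearity and amplification efficiency are commonly derived from such a series (not the authors' exact pipeline), the following sketch fits Cq against log10 copy number and converts the slope to efficiency; the Cq values are made up.

```python
import numpy as np

# Hypothetical tenfold dilution series: copies per reaction and measured Cq values.
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])
cq     = np.array([17.1, 20.5, 23.9, 27.2, 30.6, 34.0])

# Linear fit of Cq vs log10(copies): Cq = slope * log10(N0) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
r2 = np.corrcoef(np.log10(copies), cq)[0, 1] ** 2
efficiency = 10 ** (-1.0 / slope) - 1.0          # 1.0 corresponds to 100 % doubling per cycle

print(f"slope={slope:.3f}, R^2={r2:.4f}, efficiency={efficiency:.1%}")

# Quantifying an unknown sample from its Cq via the same curve:
cq_unknown = 25.4
copies_unknown = 10 ** ((cq_unknown - intercept) / slope)
print(f"estimated copies: {copies_unknown:.0f}")
```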

  6. Nano-graphene oxide carboxylation for efficient bioconjugation applications: a quantitative optimization approach

    Science.gov (United States)

    Imani, Rana; Emami, Shahriar Hojjati; Faghihi, Shahab

    2015-02-01

    A method for carboxylation of graphene oxide (GO) with chloroacetic acid that precisely optimizes and controls the efficacy of the process for bioconjugation applications is proposed. Quantification of COOH groups on nano-graphene oxide sheets (NGOS) is performed by a novel colorimetric methylene blue (MB) assay. The GO is synthesized and carboxylated by chloroacetic acid treatment under strongly basic conditions. The size and morphology of the as-prepared NGOS are characterized by scanning electron microscopy, transmission electron microscopy (TEM), and atomic force microscopy (AFM). The effect of the acid-to-base molar ratio on the physical, chemical, and morphological properties of NGOS is analyzed by Fourier-transform infrared spectroscopy (FTIR), UV-Vis spectroscopy, X-ray diffraction (XRD), AFM, and zeta potential. For evaluation of bioconjugation efficacy, the synthesized nano-carriers with different carboxylation ratios are functionalized by the octaarginine peptide sequence (R8) as a model biomolecule containing amine groups. The quantification of R8 peptides attached to the graphene nano-sheets' surface is performed with a colorimetric assay based on 2,4,6-trinitrobenzenesulfonic acid (TNBS). The results show that the thickness and lateral size of the nano-sheets are dramatically decreased to 0.8 nm and 50-100 nm, respectively, after the carboxylation process. X-ray analysis shows that the nano-sheets' interlayer spacing is affected by altering the chloroacetic acid-to-base ratio. The MB assay reveals that the COOH groups on the surface of NGOS are maximized at an acid-to-base ratio of 2, which is confirmed by FTIR, XRD, and zeta potential. The TNBS assay also shows that bioconjugation of the optimized carboxylated NGOS sample with the octaarginine peptide is 2.5 times more efficient compared to bare NGOS. The present work provides evidence that treatment of GO by chloroacetic acid under an optimized condition would create a functionalized high surface

  7. Recombinant plasmid-based quantitative Real-Time PCR analysis of Salmonella enterica serotypes and its application to milk samples.

    Science.gov (United States)

    Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan

    2016-03-01

    The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy to produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared to the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10^1 CFU/ml and 10^0 CFU/ml for the invA target and the ttrRSBC target, respectively), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method to be used in routine analyses in the medical, veterinary, food and water/environmental sectors: I--The method provides fast analyses including the simultaneous detection and determination of correct pathogen counts; II--The method is applicable to challenging samples, such as milk; III--The method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
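
    The method uses recombinant plasmids as calibration standards. A common way to turn a measured plasmid DNA mass into a copy number for such a calibration curve (a generic formula, not taken from this study) is sketched below; the plasmid length and DNA mass are illustrative values.

```python
AVOGADRO = 6.022e23          # molecules per mole
BP_MASS_G_PER_MOL = 650.0    # average molar mass of one double-stranded base pair

def plasmid_copies(mass_ng: float, length_bp: int) -> float:
    """Copy number of a plasmid standard from its mass (ng) and size (bp)."""
    mass_g = mass_ng * 1e-9
    moles = mass_g / (length_bp * BP_MASS_G_PER_MOL)
    return moles * AVOGADRO

# Example: 1 ng of a hypothetical 4,500 bp recombinant plasmid carrying invA.
print(f"{plasmid_copies(1.0, 4500):.2e} copies")   # roughly 2e8 copies
```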

  8. Applications in Bioastronautics and Bioinformatics: Early Radiation Cataracts Detected by Noninvasive, Quantitative, and Remote Means

    Science.gov (United States)

    Ansari, Rafat R.; King, James F.; Giblin, Frank J.

    2000-01-01

    (transparent) to several micrometers (cloudy). Ansari and Datiles have shown that DLS can detect cataracts at least two to three orders of magnitude earlier noninvasively and quantitatively than the best imaging (Scheimpflug) techniques in clinical use today (ref. 3).

  9. Effect of Nitrogen and Zinc Foliar Application on Quantitative Traits of Tea Roselle (Hibiscus sabdariffa) in the Jiroft Zone

    Directory of Open Access Journals (Sweden)

    abdolreza raisi sarbijan

    2017-02-01

    Full Text Available Introduction: Nitrogen is an essential element for plants and, in combination with elements such as carbon, oxygen, hydrogen and sulfur, results in even more valuable materials such as amino acids, nucleic acids and alkaloids. Hibiscus tea (Hibiscus sabdariffa), from the Malvaceae family, is known by different names in different parts of the world; in Iran it is called Maki tea, Mecca tea or red tea. As an important plant, it was decided to investigate its growth and development in Jiroft. Materials and Methods: The experiment was conducted as a factorial based on a randomized complete block design with three replications at the research farm of the Islamic Azad University of Jiroft during 2010. The first factor was nitrogen foliar application at four levels (0, 1, 2 and 3 percent) and the second factor was foliar application of zinc at two levels (0 and 1 percent). The measured quantitative characteristics were stem diameter, plant height, calycle fresh weight, calycle dry weight, plant fresh weight, plant dry weight, leaf fresh weight, leaf dry weight, mucilage percentage and mucilage yield. Results and Discussion: The results of ANOVA showed that nitrogen foliar application was effective on leaf dry weight and on calycle fresh and dry weight. Plant fresh weight, dry weight, stem diameter, plant height, mucilage percentage and mucilage yield showed significant effects. Zinc foliar application significantly affected leaf fresh weight, leaf dry weight, calycle fresh weight, plant fresh weight, plant dry weight, mucilage percentage and mucilage yield. The interaction effect of nitrogen and zinc on leaf dry weight, plant fresh weight and plant dry weight was also significant. The mean comparison of the studied characteristics revealed that by increasing the amount of nitrogen up to the N2 level, stem diameter, plant height, leaf dry weight, calycle dry weight, mucilage percentage and yield increased, but there was no significant difference between the N2 and N3 levels. Plant fresh weight and plant dry weight

  10. Nano-graphene oxide carboxylation for efficient bioconjugation applications: a quantitative optimization approach

    Energy Technology Data Exchange (ETDEWEB)

    Imani, Rana; Emami, Shahriar Hojjati, E-mail: semami@aut.ac.ir [Amirkabir University of Technology, Department of Biomedical Engineering (Iran, Islamic Republic of); Faghihi, Shahab, E-mail: shahabeddin.faghihi@mail.mcgill.ca, E-mail: sfaghihi@nigeb.ac.ir [National Institute of Genetic Engineering and Biotechnology, Tissue Engineering and Biomaterials Division (Iran, Islamic Republic of)

    2015-02-15

    A method for carboxylation of graphene oxide (GO) with chloroacetic acid that precisely optimizes and controls the efficacy of the process for bioconjugation applications is proposed. Quantification of COOH groups on nano-graphene oxide sheets (NGOS) is performed by novel colorimetric methylene blue (MB) assay. The GO is synthesized and carboxylated by chloroacetic acid treatment under strong basic condition. The size and morphology of the as-prepared NGOS are characterized by scanning electron microscopy, transmission electron microscopy (TEM), and atomic force microscopy (AFM). The effect of acid to base molar ratio on the physical, chemical, and morphological properties of NGOS is analyzed by Fourier-transformed infrared spectrometry (FTIR), UV–Vis spectroscopy, X-ray diffraction (XRD), AFM, and zeta potential. For evaluation of bioconjugation efficacy, the synthesized nano-carriers with different carboxylation ratios are functionalized by octaarginine peptide sequence (R8) as a biomolecule model containing amine groups. The quantification of attached R8 peptides to graphene nano-sheets’ surface is performed with a colorimetric-based assay which includes the application of 2,4,6-Trinitrobenzene sulfonic acid (TNBS). The results show that the thickness and lateral size of nano-sheets are dramatically decreased to 0.8 nm and 50–100 nm after carboxylation process, respectively. X-ray analysis shows the nano-sheets interlaying space is affected by the alteration of chloroacetic acid to base ratio. The MB assay reveals that the COOH groups on the surface of NGOS are maximized at the acid to base ratio of 2 which is confirmed by FTIR, XRD, and zeta potential. The TNBS assay also shows that bioconjugation of the optimized carboxylated NGOS sample with octaarginine peptide is 2.5 times more efficient compared to bare NGOS. The present work provides evidence that treatment of GO by chloroacetic acid under an optimized condition would create a functionalized high

  11. Nano-graphene oxide carboxylation for efficient bioconjugation applications: a quantitative optimization approach

    International Nuclear Information System (INIS)

    Imani, Rana; Emami, Shahriar Hojjati; Faghihi, Shahab

    2015-01-01

    A method for carboxylation of graphene oxide (GO) with chloroacetic acid that precisely optimizes and controls the efficacy of the process for bioconjugation applications is proposed. Quantification of COOH groups on nano-graphene oxide sheets (NGOS) is performed by novel colorimetric methylene blue (MB) assay. The GO is synthesized and carboxylated by chloroacetic acid treatment under strong basic condition. The size and morphology of the as-prepared NGOS are characterized by scanning electron microscopy, transmission electron microscopy (TEM), and atomic force microscopy (AFM). The effect of acid to base molar ratio on the physical, chemical, and morphological properties of NGOS is analyzed by Fourier-transformed infrared spectrometry (FTIR), UV–Vis spectroscopy, X-ray diffraction (XRD), AFM, and zeta potential. For evaluation of bioconjugation efficacy, the synthesized nano-carriers with different carboxylation ratios are functionalized by octaarginine peptide sequence (R8) as a biomolecule model containing amine groups. The quantification of attached R8 peptides to graphene nano-sheets’ surface is performed with a colorimetric-based assay which includes the application of 2,4,6-Trinitrobenzene sulfonic acid (TNBS). The results show that the thickness and lateral size of nano-sheets are dramatically decreased to 0.8 nm and 50–100 nm after carboxylation process, respectively. X-ray analysis shows the nano-sheets interlaying space is affected by the alteration of chloroacetic acid to base ratio. The MB assay reveals that the COOH groups on the surface of NGOS are maximized at the acid to base ratio of 2 which is confirmed by FTIR, XRD, and zeta potential. The TNBS assay also shows that bioconjugation of the optimized carboxylated NGOS sample with octaarginine peptide is 2.5 times more efficient compared to bare NGOS. The present work provides evidence that treatment of GO by chloroacetic acid under an optimized condition would create a functionalized high

  12. Improving quantitative gas chromatography-electron ionization mass spectrometry results using a modified ion source: demonstration for a pharmaceutical application.

    Science.gov (United States)

    D'Autry, Ward; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Van Schepdael, Ann

    2011-07-01

    Gas chromatography-mass spectrometry is a well-established analytical technique. However, mass spectrometers with electron ionization sources may suffer from signal drift, thereby negatively influencing quantitative performance. To demonstrate this phenomenon for a real application, a static headspace-gas chromatography method in combination with electron ionization-quadrupole mass spectrometry was optimized for the determination of residual dichloromethane in coronary stent coatings. In validating the method, the quantitative performance of an original stainless steel ion source was compared to that of a modified ion source. Ion source modification included the application of a gold coating on the repeller and exit plate. Several validation aspects such as limit of detection, limit of quantification, linearity and precision were evaluated using both ion sources. It was found that, as expected, the stainless steel ion source suffered from signal drift. As a consequence, non-linearity and high RSD values for repeated analyses were obtained. An additional experiment was performed to check whether an internal standard compound would lead to better results. It was found that the signal drift patterns of the analyte and the internal standard were different, consequently leading to high RSD values for the response factor. With the modified ion source, however, a more stable signal was observed, resulting in acceptable linearity and precision. Moreover, sensitivity was also found to improve compared to the stainless steel ion source. Finally, the optimized method with the modified ion source was applied to determine residual dichloromethane in the coating of coronary stents. The solvent was detected but found to be below the limit of quantification. Copyright © 2011 Elsevier B.V. All rights reserved.
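
    The drift problem described above is typically quantified through the relative standard deviation of repeated responses and of the analyte/internal-standard response factor. The fragment below shows that routine calculation on made-up peak areas; it is only a generic illustration of the metric, not the validation procedure of this paper.

```python
import numpy as np

# Made-up peak areas from repeated injections of the same standard solution.
analyte_area = np.array([10500, 10120, 9800, 9450, 9100, 8800])   # drifting signal
istd_area    = np.array([52000, 50800, 50950, 50500, 51200, 50700])

def rsd(x):
    """Relative standard deviation in percent."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

response_factor = analyte_area / istd_area   # analyte response normalised to internal standard

print(f"RSD analyte areas:     {rsd(analyte_area):.1f} %")
print(f"RSD internal standard: {rsd(istd_area):.1f} %")
print(f"RSD response factor:   {rsd(response_factor):.1f} %")
```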

  13. Quantitative application of Fermi-Dirac functions of two- and three-dimensional systems

    International Nuclear Information System (INIS)

    Grimmer, D.P.; Luszczynski, K.; Salibi, N.

    1981-01-01

    Expressions for the various physical parameters of the ideal Fermi-Dirac gas in two dimensions are derived and compared to the corresponding three-dimensional expressions. These derivations show that the Fermi-Dirac functions most applicable to the two-dimensional problem are F0(η), F1(η), and F'0(η). Analogous to the work of McDougall and Stoner in three dimensions, these functions and parameters derived from them are tabulated over a range of the argument η. As applications, the 3He monolayer and bulk liquid 3He nuclear magnetic susceptibilities, respectively, are considered. Calculational procedures for fitting data to theoretical parameters, and criteria for judging the quality of fit of data to both two- and three-dimensional Fermi-Dirac values, are discussed
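
    The Fermi-Dirac functions referred to above, F0(η), F1(η) and F'0(η), belong to the family of Fermi-Dirac integrals. The record does not reproduce the definition, so the convention shown below (the unnormalized form used by McDougall and Stoner, without a Gamma-function prefactor) is an assumption.

```latex
% Fermi-Dirac integral of order j (McDougall-Stoner convention, no Gamma normalization)
F_j(\eta) = \int_0^{\infty} \frac{x^{\,j}}{1 + e^{\,x-\eta}} \, dx ,
\qquad
F_j'(\eta) = \frac{dF_j}{d\eta} .
```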

  14. Improving the quantitative accuracy of optical-emission computed tomography by incorporating an attenuation correction: application to HIF1 imaging

    Science.gov (United States)

    Kim, E.; Bowsher, J.; Thomas, A. S.; Sakhalkar, H.; Dewhirst, M.; Oldham, M.

    2008-10-01

    Optical computed tomography (optical-CT) and optical-emission computed tomography (optical-ECT) are new techniques for imaging the 3D structure and function (including gene expression) of whole unsectioned tissue samples. This work presents a method of improving the quantitative accuracy of optical-ECT by correcting for the 'self'-attenuation of photons emitted within the sample. The correction is analogous to a method commonly applied in single-photon-emission computed tomography reconstruction. The performance of the correction method was investigated by application to a transparent cylindrical gelatin phantom, containing a known distribution of attenuation (a central ink-doped gelatine core) and a known distribution of fluorescing fibres. Attenuation corrected and uncorrected optical-ECT images were reconstructed on the phantom to enable an evaluation of the effectiveness of the correction. Significant attenuation artefacts were observed in the uncorrected images where the central fibre appeared ~24% less intense due to greater attenuation from the surrounding ink-doped gelatin. This artefact was almost completely removed in the attenuation-corrected image, where the central fibre was within ~4% of the others. The successful phantom test enabled application of attenuation correction to optical-ECT images of an unsectioned human breast xenograft tumour grown subcutaneously on the hind leg of a nude mouse. This tumour cell line had been genetically labelled (pre-implantation) with fluorescent reporter genes such that all viable tumour cells expressed constitutive red fluorescent protein and hypoxia-inducible factor 1 transcription-produced green fluorescent protein. In addition to the fluorescent reporter labelling of gene expression, the tumour microvasculature was labelled by a light-absorbing vasculature contrast agent delivered in vivo by tail-vein injection. Optical-CT transmission images yielded high-resolution 3D images of the absorbing contrast agent, and
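
    The correction described is said to be analogous to attenuation correction in SPECT reconstruction. Purely as an illustration of that family of corrections, and not the authors' implementation, the sketch below applies a first-order, Chang-style correction to a 2D emission image: for each pixel the attenuation factor exp(-integral of mu) is averaged over a set of ray directions and the emission value is divided by that average. The attenuation map, geometry and angle sampling are placeholder assumptions.

```python
import numpy as np

def chang_first_order(emission, mu, n_angles=32, step=1.0):
    """First-order (Chang-type) attenuation correction of a 2D emission image.

    emission : 2D array of reconstructed (uncorrected) emission values
    mu       : 2D attenuation map (per-pixel attenuation coefficient, 1/pixel)
    """
    ny, nx = emission.shape
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    corrected = np.zeros_like(emission, dtype=float)

    for iy in range(ny):
        for ix in range(nx):
            factors = []
            for a in angles:
                dx, dy = np.cos(a) * step, np.sin(a) * step
                x, y, path = float(ix), float(iy), 0.0
                # integrate mu along the ray until it leaves the image
                while 0 <= int(round(x)) < nx and 0 <= int(round(y)) < ny:
                    path += mu[int(round(y)), int(round(x))] * step
                    x += dx
                    y += dy
                factors.append(np.exp(-path))
            mean_factor = np.mean(factors)
            corrected[iy, ix] = emission[iy, ix] / max(mean_factor, 1e-6)
    return corrected

# Tiny synthetic phantom: uniform emission with a more attenuating ("ink-doped") core.
em = np.ones((32, 32))
mu = np.full((32, 32), 0.01)
mu[12:20, 12:20] = 0.05
print(chang_first_order(em, mu)[16, 16])   # the central pixel is boosted the most
```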

  15. Improving the quantitative accuracy of optical-emission computed tomography by incorporating an attenuation correction: application to HIF1 imaging

    International Nuclear Information System (INIS)

    Kim, E; Bowsher, J; Thomas, A S; Sakhalkar, H; Dewhirst, M; Oldham, M

    2008-01-01

    Optical computed tomography (optical-CT) and optical-emission computed tomography (optical-ECT) are new techniques for imaging the 3D structure and function (including gene expression) of whole unsectioned tissue samples. This work presents a method of improving the quantitative accuracy of optical-ECT by correcting for the 'self'-attenuation of photons emitted within the sample. The correction is analogous to a method commonly applied in single-photon-emission computed tomography reconstruction. The performance of the correction method was investigated by application to a transparent cylindrical gelatin phantom, containing a known distribution of attenuation (a central ink-doped gelatine core) and a known distribution of fluorescing fibres. Attenuation corrected and uncorrected optical-ECT images were reconstructed on the phantom to enable an evaluation of the effectiveness of the correction. Significant attenuation artefacts were observed in the uncorrected images where the central fibre appeared ∼24% less intense due to greater attenuation from the surrounding ink-doped gelatin. This artefact was almost completely removed in the attenuation-corrected image, where the central fibre was within ∼4% of the others. The successful phantom test enabled application of attenuation correction to optical-ECT images of an unsectioned human breast xenograft tumour grown subcutaneously on the hind leg of a nude mouse. This tumour cell line had been genetically labelled (pre-implantation) with fluorescent reporter genes such that all viable tumour cells expressed constitutive red fluorescent protein and hypoxia-inducible factor 1 transcription-produced green fluorescent protein. In addition to the fluorescent reporter labelling of gene expression, the tumour microvasculature was labelled by a light-absorbing vasculature contrast agent delivered in vivo by tail-vein injection. Optical-CT transmission images yielded high-resolution 3D images of the absorbing contrast agent

  16. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    Science.gov (United States)

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomogram (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. While comparing the 24 3DSRT regions of 12 patients, the squares of the correlation

  17. Quantitative Evaluation of Stereo Visual Odometry for Autonomous Vessel Localisation in Inland Waterway Sensing Applications

    Directory of Open Access Journals (Sweden)

    Thomas Kriechbaumer

    2015-12-01

    Full Text Available Autonomous survey vessels can increase the efficiency and availability of wide-area river environment surveying as a tool for environment protection and conservation. A key challenge is the accurate localisation of the vessel, where bank-side vegetation or urban settlement preclude the conventional use of line-of-sight global navigation satellite systems (GNSS). In this paper, we evaluate unaided visual odometry, via an on-board stereo camera rig attached to the survey vessel, as a novel, low-cost localisation strategy. Feature-based and appearance-based visual odometry algorithms are implemented on a six degrees of freedom platform operating under guided motion, but stochastic variation in yaw, pitch and roll. Evaluation is based on a 663 m-long trajectory (>15,000 image frames) and statistical error analysis against ground truth position from a target tracking tachymeter integrating electronic distance and angular measurements. The position error of the feature-based technique (mean of ±0.067 m) is three times smaller than that of the appearance-based algorithm. From multi-variable statistical regression, we are able to attribute this error to the depth of tracked features from the camera in the scene and variations in platform yaw. Our findings inform effective strategies to enhance stereo visual localisation for the specific application of river monitoring.

  18. Applications of rule-induction in the derivation of quantitative structure-activity relationships

    Science.gov (United States)

    A-Razzak, Mohammed; Glen, Robert C.

    1992-08-01

    Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.
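
    As a hedged, modern stand-in for the rule-induction approach described (the original work used expert-system rule-induction software, not scikit-learn), the sketch below induces a small decision tree on an invented two-descriptor activity dataset and prints the resulting tree-structured rules together with class probabilities for new compounds; all descriptor names and data are illustrative.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(42)

# Toy training set: two molecular descriptors (e.g. logP, molar refractivity) and an activity class.
X = rng.normal(size=(60, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)      # invented activity rule with some structure

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Tree-structured rules, readable as simple if/then statements.
print(export_text(tree, feature_names=["logP", "MR"]))

# Applying the rules to new data gives a classification and a probability of correctness.
new = np.array([[0.3, -0.2], [-1.0, 0.4]])
print("classes:", tree.predict(new))
print("probabilities:", tree.predict_proba(new))
```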

  19. A fortran program for elastic scattering of deuterons with an optical model containing tensorial potentials; Programme fortran pour la diffusion elastique de deutons avec un modele optique contenant des termes tensoriels

    Energy Technology Data Exchange (ETDEWEB)

    Raynal, J. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1963-07-01

    The optical model has been applied with success to the elastic scattering of particles of spin 0 and 1/2 and to a lesser degree to that of deuterons. For particles of spin l/2, an LS coupling term is ordinarily used; this term is necessary to obtain a polarization; for deuterons, this coupling has been already introduced, but the possible forms of potentials are more numerous (in this case, scalar products of a second rank spin tensor with a tensor of the same rank in space or momentum can occur). These terms which may be necessary are primarily important for the tensor polarization. This problem is of particular interest at Saclay since a beam of polarized deuterons has become available. The FORTRAN program SPM 037 permits the study of the effect of tensorial potentials constructed from the distance of the deuteron from the target and its angular momentum with respect to it. This report should make possible the use and even the modification of the program. It consists of: a description of the problem and of the notation employed, a presentation of the methods adopted, an indication of the necessary data and how they should be introduced, and finally tables of symbols which are in equivalence or common statements: these tables must be considered when making any modification. (author) [French] Le modele optique a ete applique avec succes a la diffusion elastique des particules de spin nul et 1/2 et dans une moindre mesure a celle des deutons. Pour les particules de spin 1/2, on utilise habituellement un couplage LS, necessaire pour calculer la polarisation; pour les deutons, ce couplage a deja ete introduit, mais les formes de potentiel possibles sont plus nombreuses (intervention de produits scalaires d'un tenseur d'ordre 2 de spin avec un tenseur du meme ordre d'espace ou d'impulsion) et celles qui peuvent etre eventuellement necessaires ont une importance capitale pour la polarisation tensorielle. Ce probleme revet a Saclay un interet

  20. A fortran program for elastic scattering of deuterons with an optical model containing tensorial potentials; Programme fortran pour la diffusion elastique de deutons avec un modele optique contenant des termes tensoriels

    Energy Technology Data Exchange (ETDEWEB)

    Raynal, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1963-07-01

    The optical model has been applied with success to the elastic scattering of particles of spin 0 and 1/2 and to a lesser degree to that of deuterons. For particles of spin l/2, an LS coupling term is ordinarily used; this term is necessary to obtain a polarization; for deuterons, this coupling has been already introduced, but the possible forms of potentials are more numerous (in this case, scalar products of a second rank spin tensor with a tensor of the same rank in space or momentum can occur). These terms which may be necessary are primarily important for the tensor polarization. This problem is of particular interest at Saclay since a beam of polarized deuterons has become available. The FORTRAN program SPM 037 permits the study of the effect of tensorial potentials constructed from the distance of the deuteron from the target and its angular momentum with respect to it. This report should make possible the use and even the modification of the program. It consists of: a description of the problem and of the notation employed, a presentation of the methods adopted, an indication of the necessary data and how they should be introduced, and finally tables of symbols which are in equivalence or common statements: these tables must be considered when making any modification. (author) [French] Le modele optique a ete applique avec succes a la diffusion elastique des particules de spin nul et 1/2 et dans une moindre mesure a celle des deutons. Pour les particules de spin 1/2, on utilise habituellement un couplage LS, necessaire pour calculer la polarisation; pour les deutons, ce couplage a deja ete introduit, mais les formes de potentiel possibles sont plus nombreuses (intervention de produits scalaires d'un tenseur d'ordre 2 de spin avec un tenseur du meme ordre d'espace ou d'impulsion) et celles qui peuvent etre eventuellement necessaires ont une importance capitale pour la polarisation tensorielle. Ce probleme revet a Saclay un interet particulier depuis la mise

  1. Application of Thermal Infrared Remote Sensing for Quantitative Evaluation of Crop Characteristics

    Science.gov (United States)

    Shaw, J.; Luvall, J.; Rickman, D.; Mask, P.; Wersinger, J.; Sullivan, D.; Arnold, James E. (Technical Monitor)

    2002-01-01

    Evidence suggests that thermal infrared emittance (TIR) at the field-scale is largely a function of the integrated crop/soil moisture continuum. Because soil moisture dynamics largely determine crop yields in non-irrigated farming (85 % of Alabama farms are non-irrigated), TIR may be an effective method of mapping within field crop yield variability, and possibly, absolute yields. The ability to map yield variability at juvenile growth stages can lead to improved soil fertility and pest management, as well as facilitating the development of economic forecasting. Researchers at GHCC/MSFC/NASA and Auburn University are currently investigating the role of TIR in site-specific agriculture. Site-specific agriculture (SSA), or precision farming, is a method of crop production in which zones and soils within a field are delineated and managed according to their unique properties. The goal of SSA is to improve farm profits and reduce environmental impacts through targeted agrochemical applications. The foundation of SSA depends upon the spatial and temporal characterization of soil and crop properties through the creation of management zones. Management zones can be delineated using: 1) remote sensing (RS) data, 2) conventional soil testing and soil mapping, and 3) yield mapping. Portions of this research have concentrated on using remote sensing data to map yield variability in corn (Zea mays L.) and soybean (Glycine max L.) crops. Remote sensing data have been collected for several fields in the Tennessee Valley region at various crop growth stages during the last four growing seasons. Preliminary results of this study will be presented.

  2. [Clinical application and optimization of HEAD-US quantitative ultrasound assessment scale for hemophilic arthropathy].

    Science.gov (United States)

    Li, J; Guo, X J; Ding, X L; Lyu, B M; Xiao, J; Sun, Q L; Li, D S; Zhang, W F; Zhou, J C; Li, C P; Yang, R C

    2018-02-14

    Objective: To assess the feasibility of HEAD-US scale in the clinical application of hemophilic arthropathy (HA) and propose an optimized ultrasound scoring system. Methods: From July 2015 to August 2017, 1 035 joints ultrasonographic examinations were performed in 91 patients. Melchiorre, HEAD-US (Hemophilic Early Arthropathy Detection with UltraSound) and HEAD-US-C (HEAD-US in China) scale scores were used respectively to analyze the results. The correlations between three ultrasound scales and Hemophilia Joint Health Scores (HJHS) were evaluated. The sensitivity differences of the above Ultrasonic scoring systems in evaluation of HA were compared. Results: All the 91 patients were male, with median age of 16 (4-55) years old, including 86 cases of hemophilia A and 5 cases hemophilia B. The median ( P 25 , P 75 ) of Melchiorre, HEAD-US and HEAD-US-C scores of 1 035 joints were 2(0,6), 1(0,5) and 2(0,6), respectively, and the correlation coefficients compared with HJHS was 0.747, 0.762 and 0.765 respectively, with statistical significance ( P cases of asymptomatic joints, the positive rates of Melchiorre, HEAD-US-C and HEAD-US scale score were 25.0% (95% CI 20.6%-29.6%), 17.0% (95% CI 12.6%-21.1%) and 11.9% (95% CI 8.4%-15.7%) respectively, and the difference was statistically significant ( P joints of 40 patients. The difference in variation amplitude of HEAD-US-C scores and HEAD-US scores before and after joint bleeding was statistically significant ( P <0.001). Conclusion: Compared with Melchiorre, there were similar good correlations between HEAD-US, HEAD-US-C and HJHS. HEAD-US ultrasound scoring system is quick, convenient and simple to use. The optimized HEAD-US-C scale score is more sensitive than HEAD-US, especially for patients with HA who have subclinical state, which make up for insufficiency of sensitivity in HEAD-US scoring system.

  3. Semi-quantitative approach of tracer uptake abnormalities in myocardial SPECT: application to inferior defects

    International Nuclear Information System (INIS)

    Damien, J.; Bontemps, L.; Gabain, M.; Felecan, R.; Itti, R.

    1997-01-01

    This study was designed to evaluate, in a more objective manner than visual inspection, the detection of reduced uptake (hypo-fixation) in the myocardial inferior wall in perfusion SPECT. We studied 90 patients, divided into four groups: G0 (7 M, 13 F, 55 ± 21 years) and G1 (9 M, 12 F, 49 ± 26 years) are groups of patients considered normal; G2 (20 M, 3 F, 60 ± 12 years) corresponds to patients with reversible ischaemia, where the stress examination is abnormal but the resting one is close to normal; G3 (21 M, 5 F, 63 ± 8 years) includes infarcts, where both examinations are definitely abnormal. Intra- and inter-group statistical comparisons were first done using polar maps (bull's eye), and an iterative method was then developed for comparing each image of groups 1, 2 and 3 to the mean normal data (G0). Finally, we built a network of ROC (receiver operating characteristic) curves for determining the best confidence interval and the optimal normality/abnormality criterion (number of pixels located outside the confidence interval). The results are expressed in terms of sensitivity and specificity using the most favourable situation derived from the ROC curves. For 2.5 standard deviations we obtain, for G2 (reversible injury) compared to G1, 78.3% sensitivity and 76.2% specificity at rest with at most 20 abnormal pixels as the normality criterion, and 81% and 82.6% at stress. For G3 (permanent injury) compared to G1, the values are, respectively: sensitivity = 88.5% and specificity = 85.2% at rest for 40 pixels; sensitivity = 92.3% and specificity = 85.2% at stress for 80 pixels. The method developed seems applicable on a wider scale, not limited to inferior-wall abnormalities. It is able to optimise, for each situation, the confidence interval defining an abnormal image and the most significant criterion, in terms of number of abnormal pixels, for detecting a diseased myocardial area. (authors)
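
    The normality/abnormality criterion above is a threshold on the number of pixels outside the confidence interval, optimized from ROC curves. A minimal sketch of that kind of threshold selection is given below, using synthetic pixel counts and Youden's J as the optimality criterion; the authors' exact optimisation rule is not stated in the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "number of abnormal pixels" for normal subjects and diseased patients.
normal_counts  = rng.poisson(15, size=40)
disease_counts = rng.poisson(60, size=40)

best = (None, -1.0)
for threshold in range(0, 150, 5):
    sensitivity = np.mean(disease_counts >= threshold)   # diseased correctly flagged
    specificity = np.mean(normal_counts < threshold)      # normals correctly passed
    youden_j = sensitivity + specificity - 1.0
    if youden_j > best[1]:
        best = (threshold, youden_j, sensitivity, specificity)

threshold, j, se, sp = best
print(f"optimal criterion: >= {threshold} abnormal pixels "
      f"(sensitivity {se:.1%}, specificity {sp:.1%}, Youden J {j:.2f})")
```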

  4. Propagation of error from parameter constraints in quantitative MRI: Example application of multiple spin echo T2 mapping.

    Science.gov (United States)

    Lankford, Christopher L; Does, Mark D

    2018-02-01

    Quantitative MRI may require correcting for nuisance parameters which can or must be constrained to independently measured or assumed values. The noise and/or bias in these constraints propagate to fitted parameters. For example, the case of refocusing pulse flip angle constraint in multiple spin echo T2 mapping is explored. An analytical expression for the mean-squared error of a parameter of interest was derived as a function of the accuracy and precision of an independent estimate of a nuisance parameter. The expression was validated by simulations and then used to evaluate the effects of flip angle (θ) constraint on the accuracy and precision of the T2 estimate, T̂2, for a variety of multi-echo T2 mapping protocols. Constraining θ improved T̂2 precision when the θ-map signal-to-noise ratio was greater than approximately one-half that of the first spin echo image. For many practical scenarios, constrained fitting was calculated to reduce not just the variance but the full mean-squared error of T̂2, for bias in the flip-angle estimate θ̂ of ≲6%. The analytical expression derived in this work can be applied to inform experimental design in quantitative MRI. The example application to T2 mapping provided specific cases, depending on θ̂ accuracy and precision, in which θ measurement and constraint would be beneficial to the T̂2 variance or mean-squared error. Magn Reson Med 79:673-682, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
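
    The abstract derives analytically how noise and bias in a constrained nuisance parameter propagate into a fitted parameter. The sketch below illustrates the same question numerically for a deliberately simplified model: a mono-exponential decay with a constrained amplitude stands in for the flip-angle-constrained multi-echo problem (which would require a full echo-train signal model). It compares the mean-squared error of the decay-constant estimate when the nuisance amplitude is fitted freely versus constrained to a noisy, slightly biased independent measurement. All parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

t = np.linspace(5, 150, 16)            # "echo times" (arbitrary units)
A_true, T2_true, sigma = 1.0, 60.0, 0.02

def model_free(t, A, T2):
    return A * np.exp(-t / T2)

def run_trials(n=2000, a_bias=0.02, a_sd=0.01):
    err_free, err_con = [], []
    for _ in range(n):
        y = model_free(t, A_true, T2_true) + rng.normal(0, sigma, t.size)
        # (a) both parameters fitted freely
        (A_f, T2_f), _ = curve_fit(model_free, t, y, p0=(1.0, 50.0))
        err_free.append((T2_f - T2_true) ** 2)
        # (b) amplitude constrained to a noisy, biased independent estimate
        A_meas = A_true * (1 + a_bias) + rng.normal(0, a_sd)
        (T2_c,), _ = curve_fit(lambda t, T2: model_free(t, A_meas, T2),
                               t, y, p0=(50.0,))
        err_con.append((T2_c - T2_true) ** 2)
    return np.mean(err_free), np.mean(err_con)

mse_free, mse_con = run_trials()
print(f"MSE of T2 estimate - free fit: {mse_free:.3f}, constrained fit: {mse_con:.3f}")
```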

  5. The X-ray spectrometry Si(Li) system and its application in quantitative analysis of rare-earth elements

    International Nuclear Information System (INIS)

    Barbosa, J.B.S.

    1985-11-01

    The basic principles of the Si(Li) system used in X-ray spectrometry are described, and its application in the energy range where its resolution is better than that of conventional spectrometers is demonstrated. The theoretical principles underlying the interaction between electromagnetic radiation and matter, and a review of semiconductors, are presented first. Emphasis is placed on the fluorescence phenomenon and on the process of photon detection by semiconductor crystals, whose properties and characteristics allow, in the specific case of the Si crystal, the fabrication of detectors with a large sensitive volume useful for X-ray spectrometry. In addition, the components of the Si(Li) system are described individually, with special attention to operating aspects and to the parameters affecting the quality of the pulse-height spectrum. Finally, the spectrometer performance is experimentally evaluated through the quantitative analysis of rare-earth element oxides (La, Ce, Pr, Nd). It should be stressed that this research indicates that X-ray emission-transmission analysis is the most adequate method under the excitation conditions provided by the spectrometer, in which Am-241, a gamma emitter at 60 keV, is the photon source for the fluorescence. Therefore, the experimental work was extended in order to include all the necessary treatment. (Author) [pt

  6. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins.

    Science.gov (United States)

    Mueller, Jenna L; Harmany, Zachary T; Mito, Jeffrey K; Kennedy, Stephanie A; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G; Willett, Rebecca M; Brown, J Quincy; Ramanujam, Nimmi

    2013-01-01

    To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular-resolution microendoscope after topical application of a fluorescent anatomical contrast agent, acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue.
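
    The segmentation combines sparse component analysis with a circle transform to isolate nuclei and quantify their density. The fragment below sketches only the circle-transform half, using scikit-image's Hough circle transform on a synthetic image of bright disks standing in for acriflavine-stained nuclei; the SCA step is omitted and all radius and peak-detection parameters are assumptions, not the published settings.

```python
import numpy as np
from skimage.draw import disk
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

# Synthetic field of view: bright disks as stand-ins for acriflavine-stained nuclei.
image = np.zeros((200, 200))
rng = np.random.default_rng(5)
centers = rng.integers(20, 180, size=(12, 2))
for r, c in centers:
    rr, cc = disk((r, c), radius=6, shape=image.shape)
    image[rr, cc] = 1.0
image += rng.normal(0, 0.05, image.shape)          # mild background noise

# Circle transform: detect circular objects of roughly nuclear size.
edges = canny(image, sigma=1.5)
radii = np.arange(4, 9)
hough = hough_circle(edges, radii)
accums, cx, cy, found_radii = hough_circle_peaks(
    hough, radii, total_num_peaks=12, min_xdistance=8, min_ydistance=8)

# Nuclear density = detected nuclei per unit area of the field of view.
density = len(cx) / (image.shape[0] * image.shape[1])
print(f"nuclei detected: {len(cx)}, density: {density:.5f} per pixel^2")
```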

  7. Detection of nonauthorized genetically modified organisms using differential quantitative polymerase chain reaction: application to 35S in maize.

    Science.gov (United States)

    Cankar, Katarina; Chauvensy-Ancel, Valérie; Fortabat, Marie-Noelle; Gruden, Kristina; Kobilinsky, André; Zel, Jana; Bertheau, Yves

    2008-05-15

    Detection of nonauthorized genetically modified organisms (GMOs) has always presented an analytical challenge, because the complete sequence data needed to detect them are generally unavailable, although sequence similarity to known GMOs can be expected. A new approach, differential quantitative polymerase chain reaction (PCR), for the detection of nonauthorized GMOs is presented here. This method is based on the presence of several common elements (e.g., promoter, genes of interest) in different GMOs. A statistical model was developed to study the difference between the number of molecules of such a common sequence and the number of molecules identifying the approved GMO (as determined by border-fragment-based PCR) and the donor organism of the common sequence. When this difference differs statistically from zero, the presence of a nonauthorized GMO can be inferred. The interest and scope of such an approach were tested in a case study of different proportions of genetically modified maize events, with the P35S promoter as the Cauliflower Mosaic Virus common sequence. The presence of a nonauthorized GMO was successfully detected in the mixtures analyzed, including in the presence of Cauliflower Mosaic Virus (the donor organism of the P35S promoter). This method could be easily transposed to other common GMO sequences and other species and is applicable to other detection areas such as microbiology.

  8. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    Science.gov (United States)

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.

  9. Potential application of a semi-quantitative method for mercury determination in soils, sediments and gold mining residues

    International Nuclear Information System (INIS)

    Yallouz, A.V.; Cesar, R.G.; Egler, S.G.

    2008-01-01

    An alternative, low cost method for analyzing mercury in soil, sediment and gold mining residues was developed, optimized and applied to 30 real samples. It is semiquantitative, performed using an acid extraction pretreatment step, followed by mercury reduction and collection in a detecting paper containing cuprous iodide. A complex is formed with characteristic color whose intensity is proportional to mercury concentration in the original sample. The results are reported as range of concentration and the minimum detectable is 100 ng/g. Method quality assurance was performed by comparing results obtained using the alternative method and the Cold Vapor Atomic Absorption Spectrometry techniques. The average results from duplicate analysis by CVAAS were 100% coincident with alternative method results. The method is applicable for screening tests and can be used in regions where a preliminary diagnosis is necessary, at programs of environmental surveillance or by scientists interested in investigating mercury geochemistry. - Semi-quantitative low-cost method for mercury determination in soil, sediments and mining residues

  10. Applicability of integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) for the simultaneous detection of the four human enteric enterovirus species in disinfection studies

    Science.gov (United States)

    A newly developed integrated cell culture reverse transcriptase quantitative PCR (ICC-RTqPCR) method and its applicability in UV disinfection studies is described. This method utilizes a singular cell culture system coupled with four RTqPCR assays to detect infectious serotypes t...

  11. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins.

    Directory of Open Access Journals (Sweden)

    Jenna L Mueller

    Full Text Available To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular-resolution microendoscope after topical application of a fluorescent anatomical contrast agent, acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue.

  12. Application of Quantitative MRI for Brain Tissue Segmentation at 1.5 T and 3.0 T Field Strengths

    Science.gov (United States)

    West, Janne; Blystad, Ida; Engström, Maria; Warntjes, Jan B. M.; Lundberg, Peter

    2013-01-01

    Background Brain tissue segmentation of white matter (WM), grey matter (GM), and cerebrospinal fluid (CSF) are important in neuroradiological applications. Quantitative Mri (qMRI) allows segmentation based on physical tissue properties, and the dependencies on MR scanner settings are removed. Brain tissue groups into clusters in the three dimensional space formed by the qMRI parameters R1, R2 and PD, and partial volume voxels are intermediate in this space. The qMRI parameters, however, depend on the main magnetic field strength. Therefore, longitudinal studies can be seriously limited by system upgrades. The aim of this work was to apply one recently described brain tissue segmentation method, based on qMRI, at both 1.5 T and 3.0 T field strengths, and to investigate similarities and differences. Methods In vivo qMRI measurements were performed on 10 healthy subjects using both 1.5 T and 3.0 T MR scanners. The brain tissue segmentation method was applied for both 1.5 T and 3.0 T and volumes of WM, GM, CSF and brain parenchymal fraction (BPF) were calculated on both field strengths. Repeatability was calculated for each scanner and a General Linear Model was used to examine the effect of field strength. Voxel-wise t-tests were also performed to evaluate regional differences. Results Statistically significant differences were found between 1.5 T and 3.0 T for WM, GM, CSF and BPF (p3.0 T. The mean differences between 1.5 T and 3.0 T were -66 mL WM, 40 mL GM, 29 mL CSF and -1.99% BPF. Voxel-wise t-tests revealed regional differences of WM and GM in deep brain structures, cerebellum and brain stem. Conclusions Most of the brain was identically classified at the two field strengths, although some regional differences were observed. PMID:24066153

  13. Study of elasticity and limit analysis of joints and branch pipe tee connections; Etude elastique et analyse limite des piquages et des tes

    Energy Technology Data Exchange (ETDEWEB)

    Plancq, David [Nantes Univ., 44 (France)

    1997-09-24

    The industrial context of this study is the behaviour and sizing of pipe joints in PWR and fast neutron reactors. Two aspects are addressed in this framework. The first is the elastic behaviour of a pipe joined to a plane or spherical surface or to another pipe, in order to gain a better understanding of these components, which are usually modelled in a very simplified way in classical calculations. We focused our work on the bending of the intersecting pipe. For the intersection with a plane surface, our study is based on results from the literature. For the intersection with a spherical surface, we solved the problem entirely by using a spherical-shell description different from the one usually employed. Finally, we give an approach for obtaining a simple result for the bending of branch pipe tee joints, allowing the formulation of a specific finite element. The second aspect is limit analysis, which characterises the plastic failure of these structures and defines reference stresses. These stresses are used in numerous applications; we mention here the rules for pipe sizing and analysis under primary load, fracture mechanics and the definition of global plasticity criteria. To address this problem we concentrated on the development of a new calculation technique for the limit load, called the elastic compensation method (ECM). We tested it on a large number of classical structures and on branch pipe tee connections. We also propose a very simple result regarding the lower bound for the bending limit load of a tee junction. 111 refs., 88 figs., 8 tabs.

  14. Discriminant content validity: a quantitative methodology for assessing content of theory-based measures, with illustrative applications.

    Science.gov (United States)

    Johnston, Marie; Dixon, Diane; Hart, Jo; Glidewell, Liz; Schröder, Carin; Pollard, Beth

    2014-05-01

    In studies involving theoretical constructs, it is important that measures have good content validity and that there is not contamination of measures by content from other constructs. While reliability and construct validity are routinely reported, to date, there has not been a satisfactory, transparent, and systematic method of assessing and reporting content validity. In this paper, we describe a methodology of discriminant content validity (DCV) and illustrate its application in three studies. Discriminant content validity involves six steps: construct definition, item selection, judge identification, judgement format, single-sample test of content validity, and assessment of discriminant items. In three studies, these steps were applied to a measure of illness perceptions (IPQ-R) and control cognitions. The IPQ-R performed well with most items being purely related to their target construct, although timeline and consequences had small problems. By contrast, the study of control cognitions identified problems in measuring constructs independently. In the final study, direct estimation response formats for theory of planned behaviour constructs were found to have as good DCV as Likert format. The DCV method allowed quantitative assessment of each item and can therefore inform the content validity of the measures assessed. The methods can be applied to assess content validity before or after collecting data to select the appropriate items to measure theoretical constructs. Further, the data reported for each item in Appendix S1 can be used in item or measure selection. Statement of contribution What is already known on this subject? There are agreed methods of assessing and reporting construct validity of measures of theoretical constructs, but not their content validity. Content validity is rarely reported in a systematic and transparent manner. What does this study add? The paper proposes discriminant content validity (DCV), a systematic and transparent method

  15. Cellular phone-based image acquisition and quantitative ratiometric method for detecting cocaine and benzoylecgonine for biological and forensic applications.

    Science.gov (United States)

    Cadle, Brian A; Rasmus, Kristin C; Varela, Juan A; Leverich, Leah S; O'Neill, Casey E; Bachtell, Ryan K; Cooper, Donald C

    2010-01-01

    Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA) is an automated method requiring end-users to utilize inexpensive (∼ $1 USD/each) immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is described whereby a central computer server and freely available IMAGEJ image analysis software records and analyzes the incoming image data with time-stamp and geo-tag information and performs the QRPDA using custom JAVA based macros (http://www.neurocloud.org). To demonstrate QRPDA we developed a standardized method using rapid immunotest strips directed against cocaine and its major metabolite, benzoylecgonine. Images from standardized samples were acquired using several devices, including a mobile phone camera, web cam, and scanner. We performed image analysis of three brands of commercially available dye-conjugated anti-cocaine/benzoylecgonine (COC/BE) antibody test strips in response to three different series of cocaine concentrations ranging from 0.1 to 300 ng/ml and BE concentrations ranging from 0.003 to 0.1 ng/ml. This data was then used to create standard curves to allow quantification of COC/BE in biological samples. Across all devices, QRPDA quantification of COC and BE proved to be a sensitive, economical, and faster alternative to more costly methods, such as gas chromatography-mass spectrometry, tandem mass spectrometry, or high pressure liquid chromatography. The limit of detection was determined to be between 0.1 and 5 ng/ml. To simulate conditions in the field, QRPDA was found to be robust under a variety of image acquisition and testing conditions that varied temperature, lighting, resolution, magnification and concentrations of biological fluid in a sample. To
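
    The ratiometric read-out and standard-curve step described above can be sketched as follows; the log-linear curve model, the band-density inputs and the function names are illustrative assumptions rather than the authors' actual QRPDA macros.

      import numpy as np

      def ratiometric_signal(test_band_density, control_band_density):
          # Normalise the test-line pixel density by the control line of the same strip
          return test_band_density / control_band_density

      def fit_standard_curve(concentrations_ng_ml, ratios):
          # Assume a log-linear response: ratio = a * log10(conc) + b
          a, b = np.polyfit(np.log10(concentrations_ng_ml), ratios, 1)
          return a, b

      def quantify(ratio, a, b):
          # Invert the fitted curve to estimate an unknown concentration
          return 10 ** ((ratio - b) / a)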

  16. Cellular Phone-Based Image Acquisition and Quantitative Ratiometric Method for Detecting Cocaine and Benzoylecgonine for Biological and Forensic Applications

    Directory of Open Access Journals (Sweden)

    Brian A. Cadle

    2010-01-01

    Full Text Available Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA is an automated method requiring end-users to utilize inexpensive (~ $1 USD/each immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is described whereby a central computer server and freely available IMAGEJ image analysis software records and analyzes the incoming image data with time-stamp and geo-tag information and performs the QRPDA using custom JAVA based macros ( http://www.neurocloud.org . To demonstrate QRPDA we developed a standardized method using rapid immunotest strips directed against cocaine and its major metabolite, benzoylecgonine. Images from standardized samples were acquired using several devices, including a mobile phone camera, web cam, and scanner. We performed image analysis of three brands of commercially available dye-conjugated anti-cocaine/benzoylecgonine (COC/BE antibody test strips in response to three different series of cocaine concentrations ranging from 0.1 to 300 ng/ml and BE concentrations ranging from 0.003 to 0.1 ng/ml. This data was then used to create standard curves to allow quantification of COC/BE in biological samples. Across all devices, QRPDA quantification of COC and BE proved to be a sensitive, economical, and faster alternative to more costly methods, such as gas chromatography-mass spectrometry, tandem mass spectrometry, or high pressure liquid chromatography. The limit of detection was determined to be between 0.1 and 5 ng/ml. To simulate conditions in the field, QRPDA was found to be robust under a variety of image acquisition and testing conditions that varied temperature, lighting, resolution, magnification and concentrations of biological fluid

  17. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    Science.gov (United States)

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as sample injector. The accurate amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.
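
    Repeatability figures of the kind quoted above (RSD of retention time and peak area) reduce to a simple relative-standard-deviation calculation; the replicate values below are invented for illustration.

      import numpy as np

      def rsd_percent(values):
          # Relative standard deviation (coefficient of variation) in percent
          values = np.asarray(values, dtype=float)
          return 100.0 * values.std(ddof=1) / values.mean()

      retention_times = [5.02, 5.04, 5.03, 5.01, 5.03]   # min, hypothetical replicate injections
      peak_areas      = [1021, 1040, 1008, 1032, 1019]   # arbitrary units
      print(rsd_percent(retention_times), rsd_percent(peak_areas))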

  18. Application of sensitivity analysis to a quantitative assessment of neutron cross-section requirements for the TFTR: an interim report

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.; Dudziak, D.J.; Muir, D.W.

    1975-09-01

    A computational method to determine cross-section requirements quantitatively is described and applied to the Tokamak Fusion Test Reactor (TFTR). In order to provide a rational basis for the priorities assigned to new cross-section measurements or evaluations, this method includes quantitative estimates of the uncertainty of currently available data, the sensitivity of important nuclear design parameters to selected cross sections, and the accuracy desired in predicting nuclear design parameters. Perturbation theory is used to combine estimated cross-section uncertainties with calculated sensitivities to determine the variance of any nuclear design parameter of interest
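
    The propagation step described in the last sentence is commonly written as the "sandwich rule", var(R) = S C Sᵀ, where S is the vector of relative sensitivities and C the cross-section covariance matrix; the small numerical example below is purely illustrative and not taken from the TFTR study.

      import numpy as np

      def response_variance(sensitivities, covariance):
          # Sandwich rule: variance of a design parameter from its sensitivities and
          # the covariance matrix of the underlying cross sections
          s = np.asarray(sensitivities, dtype=float)
          c = np.asarray(covariance, dtype=float)
          return s @ c @ s

      S = np.array([0.8, -0.3, 0.1])            # relative sensitivities (hypothetical)
      C = np.diag([0.05**2, 0.10**2, 0.20**2])  # assumed uncorrelated 5%, 10%, 20% uncertainties
      print(np.sqrt(response_variance(S, C)))   # relative standard deviation of the response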

  19. Elastic and plastic properties of iron-aluminium alloys. Special problems raised by the brittleness of alloys of high aluminium content; Proprietes elastiques et plastiques des alliages fer-aluminium. Problemes particuliers poses par la fragilite des alliages a forte teneur en aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Mouturat, P. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1966-06-01

    The present study presents the results obtained with iron-aluminium alloys whose composition runs from 0 to nearly 50 atomic per cent aluminium. The conditions of elaboration and transformation have been studied successively, as well as the Young's modulus and the flow stress; the last chapter is devoted to a study of the Portevin-Le Chatelier effect in alloys with 40 atomic per cent aluminium. I) The principal difficulty to overcome was the intergranular brittleness of the ordered alloys; this brittleness has been considerably reduced with appropriate conditions of elaboration and transformation. II) The studies of the Young's modulus cover the iron-aluminium alloys; the transformation temperatures are clearly revealed. The formation of covalent bonds from 25 atomic per cent onwards gives the highest values of the modulus. III) The analysis of the variations of the flow stress with temperature shows a connection with the ordered structures, the existence of antiphase domains and the existence of superlattice dislocations. IV) In the ordered FeAl domain the kinetics of the Portevin-Le Chatelier effect could be explained by a mechanism of vacancy diffusion. The role played by vacancies has been specified by the influence they exert upon the dislocations; this has led us to the inhomogeneous order of Rudman, which could explain the shape of the tensile curves. (author)

  20. Elastic and plastic properties of iron-aluminium alloys. Special problems raised by the brittleness of alloys of high aluminium content; Proprietes elastiques et plastiques des alliages fer-aluminium. Problemes particuliers poses par la fragilite des alliages a forte teneur en aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Mouturat, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1966-06-01

    The present study presents the results obtained with iron-aluminium alloys whose composition runs from 0 to nearly 50 atomic per cent aluminium. The conditions of elaboration and transformation have been studied successively, as well as the Young's modulus and the flow stress; the last chapter is devoted to a study of the Portevin-Le Chatelier effect in alloys with 40 atomic per cent aluminium. I) The principal difficulty to overcome was the intergranular brittleness of the ordered alloys; this brittleness has been considerably reduced with appropriate conditions of elaboration and transformation. II) The studies of the Young's modulus cover the iron-aluminium alloys; the transformation temperatures are clearly revealed. The formation of covalent bonds from 25 atomic per cent onwards gives the highest values of the modulus. III) The analysis of the variations of the flow stress with temperature shows a connection with the ordered structures, the existence of antiphase domains and the existence of superlattice dislocations. IV) In the ordered FeAl domain the kinetics of the Portevin-Le Chatelier effect could be explained by a mechanism of vacancy diffusion. The role played by vacancies has been specified by the influence they exert upon the dislocations; this has led us to the inhomogeneous order of Rudman, which could explain the shape of the tensile curves. (author)

  1. Application of crown ethers to selective extraction and quantitative analysis of technetium 99, iodine 129 and cesium 135 in effluents

    International Nuclear Information System (INIS)

    Paviet, P.

    1992-01-01

    The properties of crown ethers are first recalled. The extraction of technetium 99 from actual radioactive effluents is then studied; quantitative analysis is carried out by liquid scintillation counting, with correction for the interference of tritium. Iodine 129 is extracted from radioactive effluents and determined by gamma spectrometry. Finally, cesium 135 is extracted and determined by thermal ionization mass spectrometry.

  2. Application of myocardial perfusion quantitative imaging for the evaluation of therapeutic effect in canine with myocardial infarction

    International Nuclear Information System (INIS)

    Liang Hong; Chen Ju; Liu Sheng; Zeng Shiquan

    2000-01-01

    Myocardial blood perfusion (MBP) ECT and quantitative analysis were performed in 10 canines with experimental acute myocardial infarct (AMI). The accuracy of the main quantitative myocardial indices, including defect volume (DV) and defect fraction (DF), was estimated and correlated with histochemical staining (HS) of the infarcted area. Another 21 AMI canines were divided into an Nd:YAG laser trans-myocardial revascularization (LTMR) treated group and a control group. All canines underwent MBP ECT after experimental AMI. The results showed that the infarcted volume (IV) measured by HS correlated well (r = 0.88) with the DV estimated by myocardial quantitative analysis, while the DF values calculated by the two methods were not significantly different (t = 1.28, P > 0.05). The DF in the LTMR group (27.5% ± 3.9%) was smaller than in the control group (32.1% ± 4.6%) (t = 2.49, P < 0.05). 99mTc-MIBI myocardial perfusion SPECT and quantitative analysis can accurately predict the myocardial blood flow and the extent of injured myocardium. Nd:YAG LTMR could improve the myocardial blood perfusion of ischemic myocardium and effectively decrease the infarct area.

  3. Application of quantitative real-time PCR compared to filtration methods for the enumeration of Escherichia coli in surface waters within Vietnam.

    Science.gov (United States)

    Vital, Pierangeli G; Van Ha, Nguyen Thi; Tuyet, Le Thi Hong; Widmer, Kenneth W

    2017-02-01

    Surface water samples in Vietnam were collected from the Saigon River, rural and suburban canals, and urban runoff canals in Ho Chi Minh City, Vietnam, and were processed to enumerate Escherichia coli. Quantification was done through membrane filtration and quantitative real-time polymerase chain reaction (PCR). Mean log colony-forming unit (CFU)/100 ml E. coli counts in the dry season for river/suburban canals and urban canals were log 2.8 and 3.7, respectively, using a membrane filtration method, while using Taqman quantitative real-time PCR they were log 2.4 and 2.8 for river/suburban canals and urban canals, respectively. For the wet season, data determined by the membrane filtration method in river/suburban canals and urban canals samples had mean counts of log 3.7 and 4.1, respectively. While mean log CFU/100 ml counts in the wet season using quantitative PCR were log 3 and 2, respectively. Additionally, the urban canal samples were significantly lower than those determined by conventional culture methods for the wet season. These results show that while quantitative real-time PCR can be used to determine levels of fecal indicator bacteria in surface waters, there are some limitations to its application and it may be impacted by sources of runoff based on surveyed samples.

  4. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    Science.gov (United States)

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
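
    As a rough illustration of the two ingredients named above (a low-order polynomial baseline model and PLS calibration with RMSECV), the sketch below simply subtracts a fitted polynomial before a conventional cross-validated PLS fit; it does not reproduce the authors' embedding of the baseline constraint into the PLS weight selection, and all variable names are assumptions.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      def remove_polynomial_baseline(spectra, degree=2):
          # Fit and subtract a low-order polynomial baseline from each spectrum
          x = np.arange(spectra.shape[1])
          corrected = np.empty_like(spectra)
          for i, s in enumerate(spectra):
              coeffs = np.polyfit(x, s, degree)
              corrected[i] = s - np.polyval(coeffs, x)
          return corrected

      def rmsecv(spectra, concentrations, n_components=5, cv=10):
          X = remove_polynomial_baseline(np.asarray(spectra, dtype=float))
          y = np.asarray(concentrations, dtype=float)
          pls = PLSRegression(n_components=n_components)
          y_pred = cross_val_predict(pls, X, y, cv=cv).ravel()
          return np.sqrt(np.mean((y - y_pred) ** 2))   # root mean square error of cross-validation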

  5. Application of LC–MS/MS for quantitative analysis of glucocorticoids and stimulants in biological fluids

    Directory of Open Access Journals (Sweden)

    Jamshed Haneef

    2013-10-01

    Full Text Available Liquid chromatography tandem mass spectrometry (LC–MS/MS) is an important hyphenated technique for quantitative analysis of drugs in biological fluids. Because of its high sensitivity and selectivity, LC–MS/MS has been used for pharmacokinetic studies and for the identification of metabolites in plasma and urine. This manuscript gives a comprehensive analytical review, focusing on chromatographic separation approaches (column packing materials, column length and mobile phase) as well as different acquisition modes (SIM, MRM) for quantitative analysis of glucocorticoids and stimulants. This review is not meant to be exhaustive but rather to provide a general overview for detection and confirmation of target drugs using LC–MS/MS, and is thus useful in doping analysis, toxicological studies as well as in pharmaceutical analysis. Keywords: LC–MS/MS, Ionization techniques, Glucocorticoids, Stimulants, Hyphenated techniques, Biological fluid

  6. Application of 3D and 2D quantitative shear wave elastography (SWE) to differentiate between benign and malignant breast masses.

    Science.gov (United States)

    Tian, Jie; Liu, Qianqi; Wang, Xi; Xing, Ping; Yang, Zhuowen; Wu, Changjun

    2017-01-20

    As breast cancer tissues are stiffer than normal tissues, shear wave elastography (SWE) can locally quantify tissue stiffness and provide histological information. Moreover, tissue stiffness can be observed on three-dimensional (3D) colour-coded elasticity maps. Our objective was to evaluate the diagnostic performances of quantitative features in differentiating breast masses by two-dimensional (2D) and 3D SWE. Two hundred ten consecutive women with 210 breast masses were examined with B-mode ultrasound (US) and SWE. Quantitative features of 3D and 2D SWE were assessed, including the elastic modulus standard deviation measured on SWE-mode images (ESD-E) and on B-mode images (ESD-U), as well as the maximum elasticity (Emax). Adding quantitative features to B-mode US improved the diagnostic performance (p < 0.05) and reduced false-positive biopsies (p < 0.0001). The area under the receiver operating characteristic curve (AUC) of 3D SWE was similar to that of 2D SWE for ESD-E (p = 0.026) and ESD-U (p = 0.159) but inferior to that of 2D SWE for Emax (p = 0.002). Compared with ESD-U, ESD-E showed a higher AUC on 2D (p = 0.0038) and 3D SWE (p = 0.0057). Our study indicates that quantitative features of 3D and 2D SWE can significantly improve the diagnostic performance of B-mode US, especially the 3D SWE ESD-E, which shows considerable clinical value.

  7. Using quantitative image analysis to classify axillary lymph nodes on breast MRI: A new application for the Z 0011 Era

    Energy Technology Data Exchange (ETDEWEB)

    Schacht, David V., E-mail: dschacht@radiology.bsd.uchicago.edu; Drukker, Karen, E-mail: kdrukker@uchicago.edu; Pak, Iris, E-mail: irisgpak@gmail.com; Abe, Hiroyuki, E-mail: habe@radiology.bsd.uchicago.edu; Giger, Maryellen L., E-mail: m-giger@uchicago.edu

    2015-03-15

    Highlights: •Quantitative image analysis showed promise in evaluating axillary lymph nodes. •13 of 28 features performed better than guessing at metastatic status. •When all features were used together, a considerably higher AUC was obtained. -- Abstract: Purpose: To assess the performance of computer-extracted feature analysis of dynamic contrast enhanced (DCE) magnetic resonance images (MRI) of axillary lymph nodes, and to determine which quantitative features best predict nodal metastasis. Methods: This institutional board-approved, HIPAA compliant study, in which informed patient consent was waived, collected enhanced T1 images of the axilla from patients with breast cancer. Lesion segmentation and feature analysis were performed on 192 nodes using a laboratory-developed quantitative image analysis (QIA) workstation. The importance of 28 features was assessed. Classification used the features as input to a neural net classifier in a leave-one-case-out cross-validation and was evaluated with receiver operating characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) values for features in the task of distinguishing between positive and negative nodes ranged from just over 0.50 to 0.70. Five features yielded AUCs greater than 0.65: two morphological and three textural features. In cross-validation, the neural net classifier obtained an AUC of 0.88 (SE 0.03) for the task of distinguishing between positive and negative nodes. Conclusion: QIA of DCE MRI demonstrated promising performance in discriminating between positive and negative axillary nodes.
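
    A minimal sketch of the cross-validated classification and ROC evaluation described above is given below; the feature matrix, labels and the use of a small scikit-learn multilayer perceptron are illustrative assumptions, not the laboratory's own workstation code.

      import numpy as np
      from sklearn.model_selection import LeaveOneOut
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler
      from sklearn.pipeline import make_pipeline
      from sklearn.metrics import roc_auc_score

      def loo_auc(features, labels):
          # features: (n_nodes, n_features) array; labels: 1 = metastatic, 0 = benign
          X, y = np.asarray(features, dtype=float), np.asarray(labels)
          scores = np.empty(len(y))
          for train, test in LeaveOneOut().split(X):
              model = make_pipeline(StandardScaler(),
                                    MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0))
              model.fit(X[train], y[train])
              scores[test] = model.predict_proba(X[test])[:, 1]
          return roc_auc_score(y, scores)   # area under the ROC curve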

  8. Implantation of the method of quantitative analysis by proton induced X-ray analysis and application to the analysis of aerosols

    International Nuclear Information System (INIS)

    Margulis, W.

    1977-09-01

    Fundamental aspects of the implementation of a method of quantitative analysis by proton induced X-ray spectroscopy are discussed. The calibration of the system was carried out by determining a response coefficient for selected elements, both by irradiating known amounts of these elements and by using theoretical and experimental parameters. The results obtained by these two approaches agree within 5% for the analysed elements. A computer-based spectrum decomposition technique was developed to facilitate routine analysis. Finally, aerosol samples were measured as an example of a possible application of the method, and the results are discussed. (author)

  9. 2L-PCA: a two-level principal component analyzer for quantitative drug design and its applications.

    Science.gov (United States)

    Du, Qi-Shi; Wang, Shu-Qing; Xie, Neng-Zhong; Wang, Qing-Yan; Huang, Ri-Bo; Chou, Kuo-Chen

    2017-09-19

    A two-level principal component predictor (2L-PCA) was proposed based on the principal component analysis (PCA) approach. It can be used to quantitatively analyze various compounds and peptides about their functions or potentials to become useful drugs. One level is for dealing with the physicochemical properties of drug molecules, while the other level is for dealing with their structural fragments. The predictor has the self-learning and feedback features to automatically improve its accuracy. It is anticipated that 2L-PCA will become a very useful tool for timely providing various useful clues during the process of drug development.

  10. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    International Nuclear Information System (INIS)

    El Haddad, J.; Bruyère, D.; Ismaël, A.; Gallou, G.; Laperche, V.; Michel, K.; Canioni, L.; Bousquet, B.

    2014-01-01

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed the relative amounts of the silicate, calcareous and ore matrices in the soils to be retrieved. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. A series of artificial neural networks was then applied to quantify lead in the soil samples. More precisely, two models were designed for classification purposes, according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach achieved a relative error of prediction close to 20%, which is considered satisfactory for on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples
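
    The classify-then-quantify structure described above can be sketched as follows, with one classifier routing each spectrum to a subset-specific regression model; the scikit-learn estimators, layer sizes and variable names are assumptions for illustration, not the authors' network architectures.

      import numpy as np
      from sklearn.neural_network import MLPClassifier, MLPRegressor

      def train_local_models(spectra, subset_labels, lead_ppm):
          # One classifier assigns each spectrum to a data subset, then a dedicated
          # regressor per subset predicts the lead concentration
          spectra = np.asarray(spectra, dtype=float)
          subset_labels = np.asarray(subset_labels)
          lead_ppm = np.asarray(lead_ppm, dtype=float)
          router = MLPClassifier(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
          router.fit(spectra, subset_labels)
          local = {}
          for label in np.unique(subset_labels):
              idx = subset_labels == label
              reg = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
              local[label] = reg.fit(spectra[idx], lead_ppm[idx])
          return router, local

      def predict_lead(spectrum, router, local):
          subset = router.predict(spectrum.reshape(1, -1))[0]
          return local[subset].predict(spectrum.reshape(1, -1))[0]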

  11. Development of methods for quantitative in vivo NMR and their application to the study of hepatic encephalopathy in the brain

    International Nuclear Information System (INIS)

    Graaf, A.A. de.

    1989-01-01

    The aim of the work presented in this thesis was to develop reliable methods for quantitative MRS that are medically relevant for the study of Hepatic Encephalopathy (HE) in rats. The required modifications of the initiation and control software of the 7 Tesla spectrometer system of the Spin Imaging group at the Technical University Delft (Netherlands) are described. Experimental methods for localized, water-suppressed 1H MRS with a surface coil, including Spectroscopic Imaging, were developed in order to solve the problems of irreproducibility and spectral overlap caused by water and lipid signals. A method for the correction of line-shape distortions caused by static magnetic field imperfections was developed and evaluated both theoretically and experimentally. An approach to solve the problems in the quantification of 1H MRS spectra, caused especially by spectral overlap, frequency-dependent intensity distortions and intensity modulations in coupled spin systems, was developed and evaluated. The brain energy state during HE was investigated using 31P MRS. The developed methods for quantitative 1H MRS were applied to monitor the concentrations of several important brain amino acids and other metabolic compounds during the development of acute HE, and during the development of ammonia-induced encephalopathy in two different animal models. (author). 201 refs.; 32 figs.; 28 schemes.; 11 tabs

  12. Qualitative vs. quantitative data: Controls on the accuracy of PID field screening in petroleum contamination assessment applications

    International Nuclear Information System (INIS)

    Luessen, M.J.; Allex, M.K.; Holzel, F.R.

    1995-01-01

    The use of photoionization detectors (PIDs) for field screening of soils for volatile organic contaminants has become a standard industry practice. PID screening data is generally utilized as a qualitative basis for selection of samples for laboratory analysis to quantify concentrations of specific contaminants of concern. Both qualitative field screening data and quantitative laboratory analytical data were reviewed for more than 100 hydrogeologic assessment sites in Ohio to evaluate controls on the effectiveness of field screening data. Assessment data evaluated was limited to sites at which the suspected contaminant source was a gasoline underground storage tanks system. In each case, a 10.0 eV (or greater) PID calibrated for benzene was used to screen soils which were analyzed for benzene, toluene, ethylbenzene and xylene (BTEX) by SW 846 method 8020. Controls on field screening which were evaluated for each site included (1) soil classification, (2) soil moisture, (3) weather conditions, (4) background levels, (5) equipment quality, (6) screening methodology, and (7) laboratory QA/QC. Statistical data analysis predictably indicated a general overestimate of total BTEX levels based on field screening (gasoline is approximately 25 weight percent BTEX). However, data locally indicated cases of both significant (i.e., more than an order of magnitude difference) over- and under-estimation of actual BTEX concentrations (i.e., quantitative laboratory data) by field screening data

  13. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography.

    Science.gov (United States)

    Jha, Abhinav K; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis.

  14. Development and application of an oligonucleotide microarray and real-time quantitative PCR for detection of wastewater bacterial pathogens

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Dae-Young [National Water Research Institute, Environment Canada, 867 Lakeshore Road, Burlington, Ontario, L7R 4A6 (Canada)], E-mail: daeyoung.lee@ec.gc.ca; Lauder, Heather; Cruwys, Heather; Falletta, Patricia [National Water Research Institute, Environment Canada, 867 Lakeshore Road, Burlington, Ontario, L7R 4A6 (Canada); Beaudette, Lee A. [Environmental Science and Technology Centre, Environment Canada, 335 River Road South, Ottawa, Ontario, K1A 0H3 (Canada)], E-mail: lee.beaudette@ec.gc.ca

    2008-07-15

    Conventional microbial water quality test methods are well known for their technical limitations, such as lack of direct pathogen detection capacity and low throughput capability. The microarray assay has recently emerged as a promising alternative for environmental pathogen monitoring. In this study, bacterial pathogens were detected in municipal wastewater using a microarray equipped with short oligonucleotide probes targeting 16S rRNA sequences. To date, 62 probes have been designed against 38 species, 4 genera, and 1 family of pathogens. The detection sensitivity of the microarray for a waterborne pathogen Aeromonas hydrophila was determined to be approximately 1.0% of the total DNA, or approximately 10³ A. hydrophila cells per sample. The efficacy of the DNA microarray was verified in a parallel study where pathogen genes and E. coli cells were enumerated using real-time quantitative PCR (qPCR) and standard membrane filter techniques, respectively. The microarray and qPCR successfully detected multiple wastewater pathogen species at different stages of the disinfection process (i.e. secondary effluents vs. disinfected final effluents) and at two treatment plants employing different disinfection methods (i.e. chlorination vs. UV irradiation). This result demonstrates the effectiveness of the DNA microarray as a semi-quantitative, high throughput pathogen monitoring tool for municipal wastewater.

  15. Quantitative Structure-Use Relationship Model thresholds for Model Validation, Domain of Applicability, and Candidate Alternative Selection

    Data.gov (United States)

    U.S. Environmental Protection Agency — This file contains value of the model training set confusion matrix, domain of applicability evaluation based on training set to predicted chemicals structural...

  16. Quantitation of clevidipine in dog blood by liquid chromatography tandem mass spectrometry: application to a pharmacokinetic study.

    Science.gov (United States)

    Wei, Huihui; Gu, Yuan; Liu, Yanping; Chen, Yong; Liu, Changxiao; Si, Duanyun

    2014-11-15

    Clevidipine, a vascular selective calcium channel antagonist of the dihydropyridine class, is rapidly metabolized by ester hydrolysis because of incorporation of an ester linkage into the drug molecule. To characterize its pharmacokinetic profiles in dogs, a simple, rapid and sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for quantitation of clevidipine in dog blood. After one-step protein precipitation with methanol, the chromatographic separation was carried out on an Ecosil C18 column (150mm×4.6mm, 5μm) with a gradient mobile phase consisting of methanol and 5mM ammonium formate at a flow rate of 0.5mL/min. The quantitation analysis was performed using multiple reaction monitoring (MRM) at the specific ion transitions of m/z 454.1 [M-H](-)→m/z 234.1 for clevidipine and m/z 256.1 [M-H](-)→m/z 227.1 for elofesalamide (internal standard) in the negative ion mode with electrospray ionization (ESI) source. This validated LC-MS/MS method showed good linearity over the range 0.5-100ng/mL with the lower limit of quantitation (LLOQ) of 0.5ng/mL together with the satisfied intra- and inter-day precision, accuracy, extraction recovery and matrix effect. Stability testing indicated that clevidipine in dog blood with the addition of denaturant methanol was stable on workbench for 1h, at -80°C for up to 30 days, and after three freeze-thaw cycles. Extracted samples were also observed to be stable over 24h in an auto-sampler at 4°C. The validated method has been successfully applied to a pharmacokinetic study of clevidipine injection to 8 healthy Beagle dogs following intravenous infusion at a flow rate of 5mg/h for 0.5h. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Application of a Multiplex Quantitative PCR to Assess Prevalence and Intensity Of Intestinal Parasite Infections in a Controlled Clinical Trial

    DEFF Research Database (Denmark)

    Llewellyn, Stacey; Inpankaew, Tawin; Nery, Susana Vaz

    2016-01-01

    Background Accurate quantitative assessment of infection with soil transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large scale treatment efficacy and effectiveness studies. As morbidity and transmission...... of helminth infections are directly related to both the prevalence and intensity of infection, there is particular need for improved techniques for assessment of infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays to determine prevalence and intensity...... of intestinal parasite infections, and compare them to standard microscopy. Methodology/Principal Findings Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subject to two...

  18. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    Science.gov (United States)

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  19. Qualitative and Quantitative Analysis of Congested Marine Traffic Environment – An Application Using Marine Traffic Simulation System

    Directory of Open Access Journals (Sweden)

    Kazuhiko Hasegawa

    2013-06-01

    Full Text Available Difficulty of sailing is quite a subjective matter. It depends on various factors. Using the Marine Traffic Simulation System (MTSS) developed by Osaka University, this challenging subject is discussed. In this system realistic traffic flow, including collision avoidance manoeuvres, can be reproduced in a given area. Simulations are done for the southward part of Tokyo Bay, the Strait of Singapore and the off-Shanghai area, changing the traffic volume from 5 or 50 to 150 or 200% of the present volume. As a result, a strong proportional relation between near-miss ratio and traffic density per hour per sailed area is found, independent of traffic volume, area size and configuration. A quantitative evaluation index of the difficulty of sailing, here called the risk rate of the area, is defined using the traffic density defined in this way and the near-miss ratio.

  20. Impact of the Application of Redecision Methods in Executive Coaching Workshops on Psychological Wellbeing: a Quantitative Evaluation of Effectiveness

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2016-07-01

    Full Text Available Previous research has found that participants in redecision marathons experience increased personal growth and improvements in psychological well-being (McNeel, 1982; Noriega-Gayol, 1997; Widdowson & Rosseau, 2014). In this article, the authors conducted a quantitative analysis based on the use of the Ryff Scales of Psychological Wellbeing to determine whether participants (n=49) at an executive coaching redecision marathon would experience an increase in psychological well-being. The findings show statistically significant improvements in psychological well-being overall, and specifically within the sub-scales of autonomy, environmental mastery, personal growth and self-acceptance, suggesting that redecision-based workshops are effective for improving subjective psychological well-being.

  1. Pentobarbital quantitation using EMIT serum barbiturate assay reagents: application to monitoring of high-dose pentobarbital therapy.

    Science.gov (United States)

    Pape, B E; Cary, P L; Clay, L C; Godolphin, W

    1983-01-01

    Pentobarbital serum concentrations associated with a high-dose therapeutic regimen were determined using EMIT immunoassay reagents. Replicate analyses of serum controls resulted in a within-assay coefficient of variation of 5.0% and a between-assay coefficient of variation of 10%. Regression analysis of 44 serum samples analyzed by this technique (y) against a reference procedure (x) yielded y = 0.98x + 3.6 (r = 0.98; x = ultraviolet spectroscopy) and y = 1.04x + 2.4 (r = 0.96; x = high-performance liquid chromatography). Clinical evaluation of the results indicates that the immunoassay is sufficiently sensitive and selective for pentobarbital to allow accurate quantitation within the therapeutic range associated with high-dose therapy.
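
    A method-comparison regression of the kind quoted above (immunoassay result y against a reference result x) can be reproduced with an ordinary least-squares fit; the paired concentrations below are invented for illustration.

      import numpy as np
      from scipy.stats import linregress

      # Hypothetical paired results (mg/L): reference method (x) vs immunoassay (y)
      x = np.array([10.0, 18.0, 25.0, 32.0, 41.0, 55.0])
      y = np.array([13.2, 21.5, 28.1, 35.0, 43.9, 57.5])

      fit = linregress(x, y)
      print(f"y = {fit.slope:.2f}x + {fit.intercept:.1f}, r = {fit.rvalue:.2f}")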

  2. New quantitative methods for mineral and porosity mapping in clayey materials: application to the compacted bentonites of engineered barriers

    International Nuclear Information System (INIS)

    Pret, D.

    2003-12-01

    Clayey materials are well known for their impermeable properties and their textural changes between the dry and hydrated states. Their porous network is classically investigated in the dry state using bulk measurements. However, the relationship between porosity and the spatial heterogeneity of minerals in the hydrated state is poorly understood. The limits of textural analysis make it difficult to understand the migration of solute species in compacted bentonites (as used in nuclear waste repositories). The goal of this work is to improve the analysis techniques for hydrated clayey materials in order to provide a multi-scale quantitative petrography. The bentonite samples are impregnated using a resin whose properties are close to those of water. The classical petrographic study reveals strong heterogeneities in the spatial and size distributions of porosity and minerals. SEM image analysis allows a quantification and a simple mapping of pores and minerals in unaltered bentonites. Nevertheless, as alterations are suspected to occur in the repository context, two methods for the analysis of all types of materials have also been developed. Two dedicated software packages handle the processing of autoradiographs and of chemical element maps obtained with an electron microprobe. The results are quantitative maps highlighting the spatial porosity heterogeneities from the decimetric down to the micrometric scale. All pore sizes are taken into account, including clay interlayer spaces. Moreover, an accurate mineral mapping is also supplied over millimetric areas with a spatial resolution close to one micrometer. More broadly, this work provides new complementary tools for the textural analysis of fine-grained materials and for the improvement of the modelling of solute species migration. (author)

  3. Real Patient and its Virtual Twin: Application of Quantitative Systems Toxicology Modelling in the Cardiac Safety Assessment of Citalopram.

    Science.gov (United States)

    Patel, Nikunjkumar; Wiśniowska, Barbara; Jamei, Masoud; Polak, Sebastian

    2017-11-27

    A quantitative systems toxicology (QST) model for citalopram was established to simulate, in silico, a 'virtual twin' of a real patient to predict the occurrence of cardiotoxic events previously reported in patients under various clinical conditions. The QST model considers the effects of citalopram and its most notable electrophysiologically active primary (desmethylcitalopram) and secondary (didesmethylcitalopram) metabolites on cardiac electrophysiology. The in vitro cardiac ion channel current inhibition data was coupled with the biophysically detailed model of human cardiac electrophysiology to investigate the impact of (i) the inhibition of multiple ion currents (IKr, IKs, ICaL); (ii) the inclusion of metabolites in the QST model; and (iii) unbound or total plasma as the operating drug concentration, in predicting clinically observed QT prolongation. The inclusion of multiple ion channel current inhibition and metabolites in the simulation with unbound plasma citalopram concentration provided the lowest prediction error. The predictive performance of the model was verified with three additional therapeutic and supra-therapeutic drug exposure clinical cases. The results indicate that considering only the hERG ion channel inhibition of the parent drug is potentially misleading, and the inclusion of active metabolite data and the influence of other ion channel currents should be considered to improve the prediction of potential cardiac toxicity. Mechanistic modelling can help bridge the gaps existing in the quantitative translation from preclinical cardiac safety assessment to clinical toxicology. Moreover, this study shows that the QST models, in combination with appropriate drug and systems parameters, can pave the way towards personalised safety assessment.

  4. System Establishment and Method Application for Quantitatively Evaluating the Green Degree of the Products in Green Public Procurement

    Directory of Open Access Journals (Sweden)

    Shengguo Xu

    2016-09-01

    Full Text Available Government green purchasing is widely considered to be an effective means of promoting sustainable consumption. However, identifying the greener product is the biggest obstacle to government green purchasing, and it has not been well solved. A quantitative evaluation method is provided to measure the green degree of different products with the same use function, based on an established indicator system that includes fundamental indicators, general indicators, and leading indicators. It can clearly show the green extent of products by scoring the different products, which gives the government a tool to compare the green degree of different products and select greener ones. A comprehensive evaluation case of a project purchasing 1635 desktop computers in the Tianjin government procurement center was conducted using the green degree evaluation system. The environmental performance of the products was assessed quantitatively, and the evaluation price was calculated as the bid price minus a discount (the discount rate depending on the total score attained for environmental performance); the final evaluation prices, ranked from low to high, were those of suppliers C, D, E, A, and B. The winner, supplier C, offered neither the lowest bid price nor the best environmental performance, but performed well on both and therefore deserved the project. This shows that the green degree evaluation system can help classify different products by evaluating their environmental performance (including structure and connection technology, selection of materials and marks, prolonged use, hazardous substances, energy consumption, recyclability rate, etc.) together with price, so that it can help in choosing the greener products.
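
    The pricing rule sketched in the abstract (evaluation price = bid price minus a score-dependent discount) can be written as a small helper; the score-to-discount mapping below is an invented assumption, since the abstract does not give the actual rates.

      def evaluation_price(bid_price, green_score, max_discount_rate=0.10):
          # Hypothetical rule: the discount rate grows linearly with the green-degree
          # score (0-100), up to an assumed maximum of 10% of the bid price
          discount_rate = max_discount_rate * min(max(green_score, 0), 100) / 100.0
          return bid_price * (1.0 - discount_rate)

      # Rank suppliers by evaluation price (lowest wins); bid prices and scores are invented
      bids = {"A": (5200, 62), "B": (5400, 70), "C": (4900, 85)}
      ranking = sorted(bids, key=lambda s: evaluation_price(*bids[s]))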

  5. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    Science.gov (United States)

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has been focused on modeling techniques like Bayesian Networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we proposed a comparison among Bayesian Networks and other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction between the purely data-driven approach and the combination of expert knowledge with objective data is made. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great usefulness in addressing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks appeared to have both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well. © The Author(s) 2013.

  6. Evaluation of Quantitative Exposure Assessment Method for Nanomaterials in Mixed Dust Environments: Application in Tire Manufacturing Facilities.

    Science.gov (United States)

    Kreider, Marisa L; Cyrs, William D; Tosiano, Melissa A; Panko, Julie M

    2015-11-01

    Current recommendations for nanomaterial-specific exposure assessment require adaptation in order to be applied to complicated manufacturing settings, where a variety of particle types may contribute to the potential exposure. The purpose of this work was to evaluate a method that would allow for exposure assessment of nanostructured materials by chemical composition and size in a mixed dust setting, using carbon black (CB) and amorphous silica (AS) from tire manufacturing as an example. This method combined air sampling with a low pressure cascade impactor with analysis of elemental composition by size to quantitatively assess potential exposures in the workplace. This method was first pilot-tested in one tire manufacturing facility; air samples were collected with a Dekati Low Pressure Impactor (DLPI) during mixing where either CB or AS were used as the primary filler. Air samples were analyzed via scanning transmission electron microscopy (STEM) coupled with energy dispersive spectroscopy (EDS) to identify what fraction of particles were CB, AS, or 'other'. From this pilot study, it was determined that ~95% of all nanoscale particles were identified as CB or AS. Subsequent samples were collected with the Dekati Electrical Low Pressure Impactor (ELPI) at two tire manufacturing facilities and analyzed using the same methodology to quantify exposure to these materials. This analysis confirmed that CB and AS were the predominant nanoscale particle types in the mixing area at both facilities. Air concentrations of CB and AS ranged from ~8900 to 77600 and 400 to 22200 particles cm(-3), respectively. This method offers the potential to provide quantitative estimates of worker exposure to nanoparticles of specific materials in a mixed dust environment. With pending development of occupational exposure limits for nanomaterials, this methodology will allow occupational health and safety practitioners to estimate worker exposures to specific materials, even in scenarios

  7. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    Science.gov (United States)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase, thus it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes a prioritization of mechanisms for implementation in models challenging. Therefore this paper takes on a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.

  8. Application of quantitative structure-activity relationship to the determination of binding constant based on fluorescence quenching

    Energy Technology Data Exchange (ETDEWEB)

    Wen Yingying [Department of Applied Chemistry, Yantai University, Yantai 264005 (China); Liu Huitao, E-mail: liuht-ytu@163.co [Department of Applied Chemistry, Yantai University, Yantai 264005 (China); Luan Feng; Gao Yuan [Department of Applied Chemistry, Yantai University, Yantai 264005 (China)

    2011-01-15

    A quantitative structure-activity relationship (QSAR) model was used to predict and explain binding constants (log K) determined by fluorescence quenching. This method allowed us to predict binding constants of a variety of compounds with human serum albumin (HSA) based on their structures alone. Stepwise multiple linear regression (MLR) and a nonlinear radial basis function neural network (RBFNN) were performed to build the models. The statistical parameters provided by the MLR model (R² = 0.8521, RMS = 0.2678) indicated satisfactory stability and predictive ability, while the RBFNN predictive ability is somewhat superior (R² = 0.9245, RMS = 0.1736). The proposed models were used to predict the binding constants of two bioactive components in traditional Chinese medicines (isoimperatorin and chrysophanol) whose experimental results were obtained in our laboratory, and the predicted results were in good agreement with the experimental results. This QSAR approach can contribute to a better understanding of the structural factors of the compounds responsible for drug-protein interactions, and can be useful in predicting the binding constants of other compounds. - Research Highlights: QSAR models for binding constants of some compounds to HSA were developed. The models provide a simple and straightforward way to predict binding constants. QSAR can give some insight into structural features related to binding behavior.
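
    A bare-bones version of the MLR branch described above (molecular descriptors in, log K out, judged by R² and RMS error) might look like the following; the descriptor matrix and its contents are assumptions, and the stepwise variable selection and the RBF network are not reproduced.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      def fit_logK_model(descriptors, log_k):
          # descriptors: (n_compounds, n_descriptors) array; log_k: measured binding constants
          X, y = np.asarray(descriptors, dtype=float), np.asarray(log_k, dtype=float)
          model = LinearRegression().fit(X, y)
          pred = model.predict(X)
          r2 = model.score(X, y)                      # squared correlation on the training set
          rms = np.sqrt(np.mean((y - pred) ** 2))     # root mean square error
          return model, r2, rms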

  9. Application of a quantitative histological health index for Antarctic rock cod (Trematomus bernacchii) from Davis Station, East Antarctica.

    Science.gov (United States)

    Corbett, Patricia A; King, Catherine K; Mondon, Julie A

    2015-08-01

    A quantitative Histological Health Index (HHI) was applied to Antarctic rock cod (Trematomus bernacchii) using gill, liver, spleen, kidney and gonad to assess the impact of wastewater effluent from Davis Station, East Antarctica. A total of 120 fish were collected from 6 sites in the Prydz Bay region of East Antarctica at varying distances from the wastewater outfall. The HHI revealed a greater severity of alteration in fish at the wastewater outfall, which decreased stepwise with distance. Gill and liver displayed the greatest severity of alteration in fish occurring in close proximity to the wastewater outfall, showing severe and pronounced alteration respectively. Findings of the HHI add to a growing weight of evidence indicating that the current level of wastewater treatment at Davis Station is insufficient to prevent impact to the surrounding environment. The HHI for T. bernacchii developed in this study is recommended as a useful risk assessment tool for assessing in situ, sub-lethal impacts from station-derived contamination in coastal regions throughout Antarctica. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Novel atomic absorption spectrometric and rapid spectrophotometric methods for the quantitation of paracetamol in saliva: application to pharmacokinetic studies.

    Science.gov (United States)

    Issa, M M; Nejem, R M; El-Abadla, N S; Al-Kholy, M; Saleh, Akila A

    2008-01-01

    A novel atomic absorption spectrometric method and two highly sensitive spectrophotometric methods were developed for the determination of paracetamol. These techniques are based on the oxidation of paracetamol by iron(III) (method I) and on the oxidation of p-aminophenol after hydrolysis of paracetamol (method II). The iron(II) formed then reacts with potassium ferricyanide to give a Prussian blue colour with a maximum absorbance at 700 nm. The atomic absorption method was accomplished by extracting the excess iron(III) in method II and aspirating the aqueous layer into an air-acetylene flame to measure the absorbance of iron(II) at 302.1 nm. The reactions were evaluated spectrometrically to establish the optimum experimental conditions. Linear responses were exhibited over the ranges 1.0-10, 0.2-2.0 and 0.1-1.0 µg/ml for method I, method II and the atomic absorption spectrometric method, respectively. High sensitivities were recorded for methods I and II and the atomic absorption spectrometric method: 0.05, 0.022 and 0.012 µg/ml, respectively. The limits of quantitation of paracetamol by method II and the atomic absorption spectrometric method were 0.20 and 0.10 µg/ml. Method II and the atomic absorption spectrometric method were applied to a pharmacokinetic study using salivary samples from normal volunteers who received 1.0 g paracetamol. Intra- and inter-day precision did not exceed 6.9%.
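
    To make the calibration arithmetic concrete, the following sketch fits a linear calibration curve and estimates detection and quantitation limits using the common 3.3σ/slope and 10σ/slope conventions; the concentrations and absorbance readings are invented, not the paper's data.

```python
# Hedged sketch: linear calibration curve and LOD/LOQ estimation
# for a spectrophotometric assay; concentrations/absorbances are invented.
import numpy as np

conc = np.array([0.2, 0.5, 1.0, 1.5, 2.0])               # µg/ml standards
absorb = np.array([0.061, 0.148, 0.297, 0.451, 0.602])   # measured absorbance

slope, intercept = np.polyfit(conc, absorb, 1)
residuals = absorb - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                            # residual standard deviation

lod = 3.3 * sigma / slope                                # limit of detection
loq = 10.0 * sigma / slope                               # limit of quantitation

print(f"slope={slope:.4f}, intercept={intercept:.4f}")
print(f"LOD ~ {lod:.3f} µg/ml, LOQ ~ {loq:.3f} µg/ml")

# Back-calculate an unknown sample from its absorbance
unknown_abs = 0.35
print("unknown conc ~", round((unknown_abs - intercept) / slope, 3), "µg/ml")
```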

  11. Vibrational algorithms for quantitative crystallographic analyses of hydroxyapatite-based biomaterials: II, application to decayed human teeth.

    Science.gov (United States)

    Adachi, Tetsuya; Pezzotti, Giuseppe; Yamamoto, Toshiro; Ichioka, Hiroaki; Boffelli, Marco; Zhu, Wenliang; Kanamura, Narisato

    2015-05-01

    A systematic investigation, based on highly spectrally resolved Raman spectroscopy, was undertaken to assess the efficacy of vibrational analyses in locating chemical and crystallographic fingerprints for the characterization of dental caries and the early detection of non-cavitated carious lesions. Raman results published by other authors have indicated possible approaches for this method. However, they conspicuously lacked physical insight at the molecular scale and, thus, the rigor necessary to prove the efficacy of this spectroscopic method. After solving basic physical challenges in a companion paper, we apply those solutions here in the form of newly developed Raman algorithms for practical dental research. Relevant differences in mineral crystallite (average) orientation and texture distribution were revealed for diseased enamel at different stages compared with healthy mineralized enamel. Clear spectroscopic features could be directly translated into a rigorous and quantitative classification of the crystallographic and chemical characteristics of diseased enamel structures. The Raman procedure enabled us to trace back otherwise invisible characteristics in early caries, in the translucent zone (i.e., the advancing front of the disease) and in the body of the lesion of cavitated caries.

  12. Recent technologic developments on high-resolution beta imaging systems for quantitative autoradiography and double labeling applications

    CERN Document Server

    Barthe, N; Chatti, K; Coulon, P; Maitrejean, S; 10.1016/j.nima.2004.03.014

    2004-01-01

    Two novel beta imaging systems, particularly interesting in the field of radiopharmacology and molecular biology research, were developed in recent years. (1) A beta imager was derived from research conducted by Pr Charpak at CERN. This parallel plate avalanche chamber is a direct detection system for beta radioactivity, which is particularly adapted to qualitative and quantitative autoradiography. With this detector, autoradiographic techniques can be performed with emitters such as ⁹⁹ᵐTc, because this radionuclide emits many low-energy electrons and the detector has a very low sensitivity to low-range gamma-rays. Its sensitivity (smallest activity detected: 0.007 cpm/mm² for ³H and 0.01 for ¹⁴C), linearity (over a dynamic range of 10⁴) and spatial resolution (50 µm for ³H or ⁹⁹ᵐTc to 150 µm for ³²P or ¹⁸F (β⁺)) make this system a genuinely interesting new imaging device. Its principle of detection is based on the analysis of light emitte...

  13. The application of Near Infrared Reflectance Spectroscopy (NIRS) for the quantitative analysis of hydrocortisone in primary materials

    OpenAIRE

    A. PITTAS; C. SERGIDES; K. NIKOLICH

    2001-01-01

    Near Infrared Reflectance Spectroscopy (NIRS), coupled with fiber optic probes, has been shown to be a quick and reliable analytical tool for quality assurance and quality control in the pharmaceutical industry, both for verifications of raw materials and quantification of the active ingredients in final products. In this paper, a typical pharmaceutical product, hydrocortisone sodium succinate, is used as an example for the application of NIR spectroscopy for quality control. In order to deve...

  14. The Effects of Simultaneous Application of Different Organic and Biological Fertilizers on Quantitative and Qualitative Characteristics of Cucurbita pepo L.

    Directory of Open Access Journals (Sweden)

    M Jahan

    2013-08-01

    Full Text Available Understanding the relations and interactions between ecosystem components and plants is one of the main conditions for sustainable production of medicinal plants. To study the effect of simultaneous application of organic and biological fertilizers on yield and yield components of zucchini squash, a split plot arrangement of factors based on a randomized complete block design with three replications was used during the 2009-10 growing season. The main plot factor was the type of organic fertilizer: 1- cow manure, 2- sheep manure, 3- chicken manure, 4- vermicompost and 5- control. The subplot factor was the use of biofertilizer (namely Nitragin, containing Azotobacter sp., Azospirillum sp. and Pseudomonas sp.). The results showed a positive but non-significant effect of organic and biological fertilizers on yield and yield components of zucchini squash. Among the organic fertilizers, cow and chicken manure were superior to the others. The highest seed oil and protein percentages resulted from chicken manure, although there were no significant differences between treatments for seed oil percentage. The positive effect of organic and biological fertilizers on seed yield was greater than that on fruit yield. Positive correlations were found between fruit and seed yield, and between single fruit weight and single fruit seed weight (R2=0.72** and 0.56**, respectively). Overall, sole application of cow manure was better than its application with Nitragin. Nitragin application had no significant effect on some traits when applied with sheep manure or vermicompost. The possibility of antagonistic effects between organic and biological fertilizers needs further study.

  15. Effect of Foliar Application of Iron, Zinc and Manganese on Quantitative and Qualitative Characteristics of Two Varieties of Grain Millet

    Directory of Open Access Journals (Sweden)

    H. Javadi

    2016-12-01

    Full Text Available In order to study the effect of foliar application of Fe, Zn and Mn on yield, yield components and protein content of two grain millet varieties, a factorial experiment based on a randomized complete block design with three replications was conducted in the research field of the Birjand branch, Islamic Azad University, in 2010. Two millet varieties, Bastan (Setaria italica) and Pishahang (Panicum miliaceum), and six levels of foliar micronutrient fertilizer (control, Fe, Zn, Mn, Fe+Zn and Fe+Zn+Mn) were investigated. The results indicated that panicle length, 1000-grain weight and panicle number per m2 were higher in Pishahang than in Bastan, whereas grain yield, number of seeds per panicle, harvest index and protein yield were higher in Bastan. Characteristics such as panicle length, biological yield, harvest index and protein percentage were affected by foliar micronutrient fertilizer, but grain yield remained unchanged. Foliar application of Fe+Zn+Mn increased protein content compared with the control but did not affect protein yield. According to the results of this experiment, the Bastan variety combined with foliar application of Zn has the potential to produce the maximum grain yield, although this warrants further study.

  16. Quantitative phosphoproteomics using acetone-based peptide labeling: Method evaluation and application to a cardiac ischemia/reperfusion model

    Science.gov (United States)

    Wijeratne, Aruna B.; Manning, Janet R.; Schultz, Jo El J.; Greis, Kenneth D.

    2013-01-01

    Mass spectrometry (MS) techniques to globally profile protein phosphorylation in cellular systems that are relevant to physiological or pathological changes have been of significant interest in biological research. In this report, an MS-based strategy utilizing an inexpensive acetone-based peptide labeling technique known as reductive alkylation by acetone (RABA) for quantitative phosphoproteomics was explored to evaluate its capacity. Since the chemistry of RABA labeling for phosphorylation profiling had not been previously reported, it was first validated using a standard phosphoprotein and identical phosphoproteomes from cardiac tissue extracts. A workflow was then utilized to compare cardiac tissue phosphoproteomes from mouse hearts not expressing FGF2 vs. hearts expressing low molecular weight fibroblast growth factor-2 (LMW FGF2), in order to relate LMW FGF2-mediated cardioprotective phenomena induced by ischemia/reperfusion (I/R) injury of hearts with downstream phosphorylation changes in LMW FGF2 signaling cascades. Statistically significant phosphorylation changes were identified at 14 different sites on 10 distinct proteins, including some with mechanisms already established for LMW FGF2-mediated cardioprotective signaling (e.g. connexin-43), some with new details linking LMW FGF2 to the cardioprotective mechanisms (e.g. cardiac myosin binding protein C or cMyBPC), and also several new downstream effectors not previously recognized for cardioprotective signaling by LMW FGF2. Additionally, one of the phosphopeptides identified, cMyBPC/pSer-282, was further verified with site-specific quantification using an SRM (selected reaction monitoring)-based approach that also relies on isotope labeling of a synthetic phosphopeptide with deuterated acetone as an internal standard. Overall, this study confirms that the inexpensive acetone-based peptide labeling can be used in both exploratory and targeted quantification

  17. Application of propidium monoazide quantitative real-time PCR to quantify the viability of Lactobacillus delbrueckii ssp. bulgaricus.

    Science.gov (United States)

    Shao, Yuyu; Wang, Zhaoxia; Bao, Qiuhua; Zhang, Heping

    2016-12-01

    In this study, a combination of propidium monoazide (PMA) and quantitative real-time PCR (qPCR) was used to develop a method to determine the viability of cells of Lactobacillus delbrueckii ssp. bulgaricus ND02 (L. bulgaricus) that may have entered into a viable but nonculturable state. This can happen due to its susceptibility to cold shock during lyophilization and storage. Propidium monoazide concentration, PMA incubation time, and light exposure time were optimized to fully exploit the PMA-qPCR approach to accurately assess the total number of living L. bulgaricus ND02. Although PMA has little influence on living cells, when concentrations of PMA were higher than 30 μg/mL the number of PCR-positive living bacteria decreased from 10⁶ to 10⁵ cfu/mL in comparison with qPCR enumeration. Mixtures of living and dead cells were used as method verification samples for enumeration by PMA-qPCR, demonstrating that this method was feasible and effective for distinguishing living cells of L. bulgaricus when mixed with a known number of dead cells. We suggest that several conditions need to be studied further before PMA-qPCR methods can be accurately used to distinguish living from dead cells for enumeration under more realistic sampling situations. However, this research provides a rapid way to enumerate living cells of L. bulgaricus and could be used to optimize selection of cryoprotectants in the lyophilization process and develop technologies for high cell density cultivation and optimal freeze-drying processes. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
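
    Enumeration by qPCR typically relies on a standard curve relating threshold cycle (Ct) to log cell number; the sketch below shows that conversion with made-up calibration values standing in for the study's actual curve.

```python
# Hedged sketch: converting PMA-qPCR threshold cycles (Ct) into viable-cell
# estimates via a standard curve Ct = slope * log10(cfu) + intercept.
# Calibration points below are invented for illustration.
import numpy as np

# Standard curve built from serial dilutions of a culture of known count
log_cfu_std = np.array([3, 4, 5, 6, 7, 8])           # log10 cfu/mL
ct_std = np.array([30.1, 26.8, 23.4, 20.0, 16.7, 13.3])

slope, intercept = np.polyfit(log_cfu_std, ct_std, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0               # amplification efficiency

def viable_count(ct):
    """Estimate viable cfu/mL from a PMA-treated sample's Ct value."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
for ct in (18.5, 22.0, 27.3):                         # hypothetical sample Cts
    print(f"Ct={ct:>5} -> ~{viable_count(ct):,.0f} cfu/mL")
```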

  18. PETROGRAPHY AND APPLICATION OF THE RIETVELD METHOD TO THE QUANTITATIVE ANALYSIS OF PHASES OF NATURAL CLINKER GENERATED BY COAL SPONTANEOUS COMBUSTION

    Directory of Open Access Journals (Sweden)

    Pinilla A. Jesús Andelfo

    2010-06-01

    Full Text Available Fine-grained, mainly reddish, compact, slightly brecciated and vesicular pyrometamorphic rocks (natural clinker) are associated with the spontaneous combustion of coal seams of the Cerrejón Formation exploited by Carbones del Cerrejón Limited in the La Guajira Peninsula (Caribbean Region of Colombia). These rocks constitute the remaining inorganic materials derived from claystones, mudstones and sandstones originally associated with the coal and are essentially a complex mixture of various amorphous and crystalline inorganic constituents. In this paper, a petrographic characterization of natural clinker, as well as the application of the X-ray diffraction (Rietveld) method for quantitative analysis of its mineral phases, was carried out. The RIQAS program was used for the refinement of X-ray powder diffraction profiles, analysing the importance of using the correct isostructural models for each of the phases present, which were obtained from the Inorganic Crystal Structure Database (ICSD). The results obtained in this investigation show that the Rietveld method can be used as a powerful tool for quantitative phase analysis of polycrystalline samples, which has been a traditional problem in geology.

  19. Chemical applicability domain of the local lymph node assay (LLNA) for skin sensitisation potency. Part 4. Quantitative correlation of LLNA potency with human potency.

    Science.gov (United States)

    Roberts, David W; Api, Anne Marie

    2018-07-01

    Prediction of skin sensitisation potential and potency by non-animal methods is the target of many active research programmes. Although the aim is to predict sensitisation potential and potency in humans, data from the murine local lymph node assay (LLNA) constitute much the largest source of quantitative data on in vivo skin sensitisation. The LLNA has been the preferred in vivo method for identification of skin sensitising chemicals and as such is potentially valuable as a benchmark for assessment of non-animal approaches. However, in common with all predictive test methods, the LLNA is subject to false positives and false negatives with an overall level of accuracy said variously to be approximately 80% or 90%. It is also necessary to consider the extent to which, for true positives, LLNA potency correlates with human potency. In this paper LLNA potency and human potency are compared so as to express quantitatively the correlation between them, and reasons for non-agreement between LLNA and human potency are analysed. This leads to a better definition of the applicability domain of the LLNA, within which LLNA data can be used confidently to predict human potency and as a benchmark to assess the performance of non-animal approaches. Copyright © 2018. Published by Elsevier Inc.

  20. Application of Near-Infrared Spectroscopy to Quantitatively Determine Relative Content of Puccinia striiformis f. sp. tritici DNA in Wheat Leaves in Incubation Period

    Directory of Open Access Journals (Sweden)

    Yaqiong Zhao

    2017-01-01

    Full Text Available Stripe rust, caused by Puccinia striiformis f. sp. tritici (Pst), is a devastating wheat disease worldwide. The potential application of near-infrared spectroscopy (NIRS) for detecting pathogen amounts in latently Pst-infected wheat leaves was investigated for disease prediction and control. A total of 300 near-infrared spectra were acquired from Pst-infected leaf samples during the incubation period, and the relative contents of Pst DNA in the samples were obtained using duplex TaqMan real-time PCR arrays. Models for determining the relative content of Pst DNA in the samples were built using quantitative partial least squares (QPLS), support vector regression (SVR), and a method integrating QPLS and SVR. The results showed that the integrated kQPLS-SVR model built on the original spectra with a ratio of training set to testing set equal to 3:1 was the best, when the number of randomly selected wavelength points was 700, the number of principal components was 8, and the number of QPLS models built was 5. The results indicated that quantitative detection of Pst DNA in leaves during the incubation period can be implemented using NIRS. A novel method for determining latent infection levels of Pst and for early detection of stripe rust is provided.
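
    As a rough sketch of the spectra-to-DNA-content regression described above (not the authors' kQPLS-SVR implementation), support vector regression on synthetic spectra with a 3:1 train/test split could look like this:

```python
# Hedged sketch: support vector regression mapping NIR spectra to relative
# pathogen DNA content; spectra here are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 300, 700

# Fake "spectra": a smooth peak whose height tracks DNA content, plus noise
dna_content = rng.uniform(0.0, 1.0, n_samples)
wavelength_axis = np.linspace(0, 1, n_wavelengths)
spectra = (np.exp(-((wavelength_axis - 0.4) ** 2) / 0.01)[None, :]
           * dna_content[:, None]
           + 0.05 * rng.normal(size=(n_samples, n_wavelengths)))

X_train, X_test, y_train, y_test = train_test_split(
    spectra, dna_content, test_size=0.25, random_state=0)  # ~3:1 split

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X_train, y_train)
print("test R^2:", round(model.score(X_test, y_test), 3))
```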

  1. A guide through the computational analysis of isotope-labeled mass spectrometry-based quantitative proteomics data: an application study

    Directory of Open Access Journals (Sweden)

    Haußmann Ute

    2011-06-01

    Full Text Available Abstract Background Mass spectrometry-based proteomics has reached a stage where it is possible to comprehensively analyze the whole proteome of a cell in one experiment. Here, the employment of stable isotopes has become a standard technique to yield relative abundance values of proteins. In recent times, more and more experiments are conducted that depict not only a static image of the up- or down-regulated proteins at a distinct time point but instead compare developmental stages of an organism or varying experimental conditions. Results Although the scientific questions behind these experiments are of course manifold, there are, nevertheless, two questions that commonly arise: (1) which proteins are differentially regulated regarding the selected experimental conditions, and (2) are there groups of proteins that show similar abundance ratios, indicating that they have a similar turnover? We give advice on how these two questions can be answered and comprehensively compare a variety of commonly applied computational methods and their outcomes. Conclusions This work provides guidance through the jungle of computational methods to analyze mass spectrometry-based isotope-labeled datasets and recommends an effective and easy-to-use evaluation strategy. We demonstrate our approach with three recently published datasets on Bacillus subtilis [1,2] and Corynebacterium glutamicum [3]. Special focus is placed on the application and validation of cluster analysis methods. All applied methods were implemented within the rich internet application QuPE [4]. Results can be found at http://qupe.cebitec.uni-bielefeld.de.
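
    For the second question (groups of proteins with similar abundance ratios), one of the commonly applied methods is hierarchical cluster analysis; a minimal sketch on synthetic log-ratio profiles (not the published datasets) is shown below.

```python
# Hedged sketch: hierarchical clustering of protein log2 abundance ratios
# across several experimental conditions; values are synthetic.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

proteins = ["ProtA", "ProtB", "ProtC", "ProtD", "ProtE", "ProtF"]
# rows = proteins, columns = log2(ratio) in three condition comparisons
log2_ratios = np.array([
    [ 2.1,  1.8,  2.3],   # up-regulated group
    [ 1.9,  2.2,  2.0],
    [-1.5, -1.8, -1.2],   # down-regulated group
    [-1.7, -1.4, -1.6],
    [ 0.1, -0.2,  0.05],  # essentially unchanged
    [ 0.0,  0.1, -0.1],
])

Z = linkage(log2_ratios, method="average", metric="euclidean")
labels = fcluster(Z, t=3, criterion="maxclust")

for name, lab in zip(proteins, labels):
    print(f"{name}: cluster {lab}")
```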

  2. Quantitative Campylobacter spp., antibiotic resistance genes, and veterinary antibiotics in surface and ground water following manure application: Influence of tile drainage control.

    Science.gov (United States)

    Frey, Steven K; Topp, Edward; Khan, Izhar U H; Ball, Bonnie R; Edwards, Mark; Gottschall, Natalie; Sunohara, Mark; Lapen, David R

    2015-11-01

    This work investigated chlortetracycline, tylosin, and tetracycline (plus transformation products), and DNA-based quantitative Campylobacter spp. and Campylobacter tetracycline antibiotic resistance genes (tet(O)) in tile drainage, groundwater, and soil before and following a liquid swine manure (LSM) application on clay loam plots under controlled (CD) and free (FD) tile drainage. Chlortetracycline/tetracycline was strongly bound to manure solids, while tylosin dominated in the liquid portion of the manure. The chlortetracycline transformation product isochlortetracycline was the most persistent analyte in water. Rhodamine WT (RWT) tracer was mixed with manure and monitored in tile and groundwater. RWT and veterinary antibiotic (VA) concentrations were strongly correlated in water, which supported the use of RWT as a surrogate tracer. While CD reduced tile discharge and eliminated application-induced VA movement (via tile) to surface water, total VA mass loading to surface water was not affected by CD. At both CD and FD test plots, the biggest 'flush' of VA mass and the highest VA concentrations occurred in response to precipitation received 2 d after application, which strongly influenced the flow abatement capacity of CD on account of highly elevated water levels in the field initiating overflow drainage for CD systems. Concentrations in tile and groundwater became very low within 10 d following application. Both Campylobacter spp. and Campylobacter tet(O) genes were present in groundwater and soil prior to application, and increased thereafter. Unlike the VA compounds, Campylobacter spp. and Campylobacter tet(O) gene loadings in tile drainage were reduced by CD in relation to FD. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  3. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  4. Potential application of quantitative microbiological risk assessment techniques to an aseptic-UHT process in the food industry.

    Science.gov (United States)

    Pujol, Laure; Albert, Isabelle; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2013-04-01

    Aseptic ultra-high-temperature (UHT)-type processed food products (e.g., milk or soup) are ready-to-eat products which are consumed extensively globally due to a combination of their comparatively high quality and long shelf life, with no cold chain or other preservation requirements. Due to the inherent microbial vulnerability of aseptic-UHT product formulations, the safety and stability-related performance objectives (POs) required at the end of the manufacturing process are the most demanding found in the food industry. The key determinants in achieving sterility, which also differentiate aseptic-UHT from in-pack sterilised products, are the challenges associated with the processes of aseptic filling and sealing. This is a complex process that has traditionally been run using deterministic or empirical process settings. Quantifying the risk of microbial contamination and recontamination along the aseptic-UHT process, using the scientifically based process of quantitative microbial risk assessment (QMRA), offers the possibility to improve on the currently tolerable sterility failure rate (i.e., 1 defect per 10,000 units). In addition, the benefits of applying QMRA are (i) to implement process settings in a transparent and scientific manner; (ii) to develop a uniform common structure whatever the production line, leading to a harmonisation of these process settings; and (iii) to bring elements of a cost-benefit analysis of the management measures. The objective of this article is to explore how QMRA techniques and risk management metrics may be applied to aseptic-UHT-type processed food products. In particular, the aseptic-UHT process should benefit from a number of novel mathematical and statistical concepts that have been developed in the field of QMRA. Probabilistic techniques such as Monte Carlo simulation, Bayesian inference and sensitivity analysis should help in assessing compliance with safety and stability-related POs set at the end of the manufacturing
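
    As a toy illustration of the probabilistic techniques mentioned, the following Monte Carlo sketch estimates a sterility failure rate from entirely invented per-step contamination probabilities; it is not based on the authors' process data.

```python
# Hedged sketch: Monte Carlo estimate of the fraction of aseptic-UHT units
# failing a sterility performance objective; all rates are invented.
import numpy as np

rng = np.random.default_rng(42)
n_units = 1_000_000

# Hypothetical per-unit probabilities of contamination at each process step,
# each itself uncertain (drawn from a beta distribution per simulated unit).
p_fill = rng.beta(2, 60_000, n_units)                 # aseptic filling
p_seal = rng.beta(2, 80_000, n_units)                 # sealing integrity
p_sterilisation_survival = rng.beta(1, 500_000, n_units)

# A unit is defective if contamination occurs at any step.
p_defect = 1 - (1 - p_fill) * (1 - p_seal) * (1 - p_sterilisation_survival)
defective = rng.random(n_units) < p_defect

rate = defective.mean()
print(f"simulated defect rate: {rate:.2e} (~1 per {1/rate:,.0f} units)"
      if rate > 0 else "no defects simulated")
```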

  5. Application of a Multiplex Quantitative PCR to Assess Prevalence and Intensity Of Intestinal Parasite Infections in a Controlled Clinical Trial

    Science.gov (United States)

    Llewellyn, Stacey; Inpankaew, Tawin; Nery, Susana Vaz; Gray, Darren J.; Verweij, Jaco J.; Clements, Archie C. A.; Gomes, Santina J.; Traub, Rebecca; McCarthy, James S.

    2016-01-01

    Background Accurate quantitative assessment of infection with soil-transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large-scale treatment efficacy and effectiveness studies. As morbidity and transmission of helminth infections are directly related to both the prevalence and intensity of infection, there is a particular need for improved techniques for assessing infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays for determining the prevalence and intensity of intestinal parasite infections, and to compare them to standard microscopy. Methodology/Principal Findings Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subjected to two multiplex real-time PCR reactions, the first targeting Necator americanus, Ancylostoma spp., Ascaris spp. and Trichuris trichiura, and the second Entamoeba histolytica, Cryptosporidium spp., Giardia duodenalis and Strongyloides stercoralis. Samples were also subjected to sodium nitrate flotation for identification and quantification of STH eggs, and to zinc sulphate centrifugal flotation for detection of protozoan parasites. Higher parasite prevalence was detected by multiplex PCR (hookworms 2.9 times higher, Ascaris 1.2, Giardia 1.6), along with superior polyparasitism detection, with this effect magnified as the number of parasites present increased (one: 40.2% vs. 38.1%, two: 30.9% vs. 12.9%, three: 7.6% vs. 0.4%, four: 0.4% vs. 0%). Although all STH-positive samples were low-intensity infections by microscopy as defined by WHO guidelines, the DNA load detected by multiplex PCR suggested higher-intensity infections. Conclusions/Significance Multiplex PCR, in addition to superior sensitivity, enabled more accurate determination of infection intensity for Ascaris, hookworms and

  6. Effect of Zinc and Salicylic acid Foliar Application on Quantitative and Qualitative Characteristics of Soybean under Deficit Irrigation Conditions

    Directory of Open Access Journals (Sweden)

    Z Zarei

    2017-03-01

    Full Text Available Introduction Soybean (Glycine max (L.) Merrill) is a leguminous annual crop belonging to the Fabaceae family. Because it is an important food source containing 20 to 28 percent grain oil and high protein, it is the most important oilseed of worldwide interest. Recently, cultivation of this plant has been considered valuable as an oil crop in rotations. Drought, salinity, heat and freezing are environmental conditions that adversely affect the growth of plants. Water deficit limits the growth of crops more than other stresses, and soybean yield decreases under drought stress. The use of fertilizers increases the quality of crops. According to the findings of Yasari and Vahedi (2012), use of Zn in soil and as a foliar application increases the percentage and the amount of oil and protein in the soybean product. Salicylic acid (SA) plays a role in reducing the effects of environmental stresses. It appears that water stress impairs plants and that zinc alleviates water-stress injuries. Thus, the purpose of this study was to evaluate the effect of water stress and of zinc and salicylic acid foliar application on oil and grain protein percentage and their relation to the oil and protein yield of soybean. Materials and Methods This study was carried out in the agricultural garden of Lorestan, Iran, in 2013. The meteorological data of the region are presented in Table 2. The soil had a clay-loam texture (Table 1). The experiment was performed as a split factorial based on a randomized complete block design with four replications. The main factor was two irrigation regimes: irrigation after 60 mm (optimum irrigation) and after 120 mm (stress) of evaporation from a class A evaporation pan; the subplots were combinations of zinc foliar application at two levels (zero and 1 L/ha) and salicylic acid (0, 0.5 and 1 mM). All statistical analyses were carried out using SAS software and correlations were computed using the MSTAT-C program. Results and Discussion In the

  7. Quantitative habitability.

    Science.gov (United States)

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.

  8. Application of Least-Squares Support Vector Machines for Quantitative Evaluation of Known Contaminant in Water Distribution System Using Online Water Quality Parameters

    Directory of Open Access Journals (Sweden)

    Kexin Wang

    2018-03-01

    Full Text Available In water-quality early warning systems, the qualitative detection of contaminants is always challenging. A number of parameters need to be measured, and they are not entirely linearly related to pollutant concentrations; in addition, the complex correlations between water quality parameters that need to be analyzed also impair the accuracy of quantitative detection. In view of these problems, least-squares support vector machines (LS-SVM) are applied to quantitatively evaluate water contamination from the responses of various conventional water quality sensors. Different contaminants may cause different correlated sensor responses, and the degree of response is related to the concentration of the injected contaminant. Therefore, to enhance the reliability and accuracy of water contamination detection, a new method is proposed. In this method, a new relative response parameter is introduced to calculate the differences between water quality parameters and their baselines. A variety of regression models were examined and, as a result of its high performance, a regression model based on a genetic algorithm (GA) was combined with LS-SVM. In this paper, the practical application of the proposed method is considered: controlled experiments were designed, and data were collected from the experimental setup. The measured data were used to analyze the water contamination concentration. The evaluation of the results validated that the LS-SVM model can adapt to the local nonlinear variations between water quality parameters and contamination concentration with excellent generalization ability and accuracy. The validity of the proposed approach was demonstrated for potassium ferricyanide concentrations above 0.5 mg/L in water distribution systems.
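
    In its dual form, LS-SVM regression with an RBF kernel is closely related to kernel ridge regression; the sketch below uses scikit-learn's KernelRidge as a stand-in (the GA-tuned LS-SVM of the paper is not reproduced), with synthetic sensor responses and a plain grid search in place of the genetic algorithm.

```python
# Hedged sketch: kernel ridge regression (a close relative of LS-SVM) mapping
# relative responses of water-quality sensors to contaminant concentration.
# Sensor responses are synthetic placeholders.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
n = 200
concentration = rng.uniform(0.5, 10.0, n)      # mg/L of injected contaminant

# Relative responses (parameter minus baseline) of five hypothetical sensors,
# each a different nonlinear function of concentration plus noise.
X = np.column_stack([
    0.8 * concentration + rng.normal(0, 0.2, n),            # conductivity shift
    1.5 * np.log1p(concentration) + rng.normal(0, 0.1, n),  # ORP shift
    -0.3 * concentration + rng.normal(0, 0.1, n),            # free chlorine drop
    0.05 * concentration ** 2 + rng.normal(0, 0.2, n),       # turbidity rise
    0.1 * concentration + rng.normal(0, 0.05, n),            # pH shift
])

# Grid search stands in for the genetic-algorithm hyperparameter tuning.
search = GridSearchCV(
    KernelRidge(kernel="rbf"),
    {"alpha": [1e-3, 1e-2, 1e-1], "gamma": [0.01, 0.1, 1.0]},
    cv=5,
)
search.fit(X, concentration)
print("best params:", search.best_params_)
print("cross-validated R^2:", round(search.best_score_, 3))
```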

  9. Quantitative determination of sirolimus in dog blood using liquid chromatography-tandem mass spectrometry, and its applications to pharmacokinetic studies.

    Science.gov (United States)

    Lee, Jong-Hwa; Cha, Kwang-Ho; Cho, Wonkyung; Park, Junsung; Park, Hee Jun; Cho, Youngseok; Hwang, Sung-Joo

    2010-12-01

    A rapid, sensitive method of detecting sirolimus in blood was developed and applied in pharmacokinetic studies employing deionized water for hemolysis and a weakly basic mobile phase to enhance chromatographic peak intensity. Dog blood samples were processed via liquid-liquid extraction and the amounts of sirolimus and tacrolimus, an internal standard, were quantified by LC-MS/MS. Specificity, the lower limit of quantification, linearity, accuracy, precision, dilution, recovery, matrix effects, robustness and stability were within the acceptable range for assay validation. The concentration of sirolimus was quantifiable in blood samples for up to 36 h after the dog had received a 3 mg/kg dose of sirolimus. These observations suggest that sirolimus can be detected at low levels in dog blood using a basic mobile phase and metal-free hemolysis. This method is therefore applicable to pharmacokinetic studies in dogs. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  10. The application of Near Infrared Reflectance Spectroscopy (NIRS) for the quantitative analysis of hydrocortisone in primary materials

    Directory of Open Access Journals (Sweden)

    A. PITTAS

    2001-03-01

    Full Text Available Near Infrared Reflectance Spectroscopy (NIRS), coupled with fiber optic probes, has been shown to be a quick and reliable analytical tool for quality assurance and quality control in the pharmaceutical industry, both for verification of raw materials and for quantification of the active ingredients in final products. In this paper, a typical pharmaceutical product, hydrocortisone sodium succinate, is used as an example of the application of NIR spectroscopy for quality control. In order to develop an NIRS method with higher precision and accuracy than the official UV/VIS spectroscopic method (BP '99), 19 samples, taken from one year's production and several prepared in the laboratory, having hydrocortisone sodium succinate concentrations in the range from 89.05% to 95.83%, were analysed by NIR and UV/VIS spectroscopy. Three frequency ranges (5939.73–5627.32 cm⁻¹; 4863.64–4574.36 cm⁻¹; 4308.23–4200.24 cm⁻¹), with the best positive correlation between the changes in the spectral and concentration data, were chosen. The validity of the NIRS chemometric method developed for determining the hydrocortisone sodium succinate concentration, constructed by the partial least squares (PLS) regression technique, is discussed. A correlation coefficient of 0.9758 and a standard error of cross validation (RMSECV) of 1.06% were found between the UV/VIS and the NIR spectroscopic results for the hydrocortisone sodium succinate concentration in the samples.
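
    A minimal sketch of the chemometric step (PLS regression with leave-one-out cross-validation to obtain an RMSECV and a correlation coefficient), using synthetic spectra rather than the 19 production samples described above:

```python
# Hedged sketch: PLS calibration of NIR spectra against assay concentration,
# with leave-one-out cross-validation to estimate RMSECV. Data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(7)
n_samples, n_points = 19, 150

concentration = rng.uniform(89.0, 96.0, n_samples)       # % of label claim
base = np.sin(np.linspace(0, 3, n_points))
spectra = (base[None, :] * concentration[:, None] / 100.0
           + 0.01 * rng.normal(size=(n_samples, n_points)))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, concentration, cv=LeaveOneOut())
rmsecv = np.sqrt(np.mean((pred.ravel() - concentration) ** 2))
r = np.corrcoef(pred.ravel(), concentration)[0, 1]

print(f"RMSECV ~ {rmsecv:.2f} %, correlation coefficient ~ {r:.4f}")
```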

  11. Biomization and quantitative climate reconstruction techniques in northwestern Mexico—With an application to four Holocene pollen sequences

    Science.gov (United States)

    Ortega-Rosas, C. I.; Guiot, J.; Peñalba, M. C.; Ortiz-Acosta, M. E.

    2008-04-01

    New paleovegetation and paleoclimatic reconstructions from the Sierra Madre Occidental (SMO) in northwestern Mexico are presented. This work involves climate and biome reconstruction using Plant Functional Types (PFT) assigned to pollen taxa. We used fossil pollen data from four Holocene peat bogs located at different altitudes (1500-2000 m) at the border region of Sonora and Chihuahua at around 28° N latitude (Ortega-Rosas, C.I. 2003. Palinología de la Ciénega de Camilo: datos para la historia de la vegetación y el clima del Holoceno medio y superior en el NW de la Sierra Madre Occidental, Sonora, Mexico. Master Thesis, Universidad Nacional Autónoma de México, México D.F.; Ortega-Rosas, C.I., Peñalba, M.C., Guiot, J. Holocene altitudinal shifts in vegetation belts and environmental changes in the Sierra Madre Occidental, Northwestern Mexico. Submitted for publication of Palaeobotany and Palynology). The closest modern pollen data come from pollen analysis across an altitudinal transect from the Sonoran Desert towards the highlands of the temperate SMO at the same latitude (Ortega-Rosas, C.I. 2003. Palinología de la Ciénega de Camilo: datos para la historia de la vegetación y el clima del Holoceno medio y superior en el NW de la Sierra Madre Occidental, Sonora, Mexico. Master Thesis, Universidad Nacional Autónoma de México, México D.F.). An additional modern pollen dataset of 400 sites across NW Mexico and the SW United States was compiled from different sources (Davis, O.K., 1995. Climate and vegetation pattern in surface samples from arid western U.S.A.: application to Holocene climatic reconstruction. Palynology 19, 95-119, North American Pollen Database, Latin-American Pollen Database, personal data, and different scientific papers). For the biomization method (Prentice, I.C., Guiot, J., Huntley, B., Jolly, D., Cheddadi, R., 1996. Reconstructing biomes from paleoecological data: a general method and its application to European pollen data at 0 and

  12. Quantitative Finance

    Science.gov (United States)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  13. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    Science.gov (United States)

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with the least-squares matching method provided by the analyzer software was performed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt the analytical conditions to the therapeutic range of each mAb. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved, and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. The results were shown to be concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within specific concentration ranges. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and concentrations within ±15% of the target belonging to the calibration range. The successful use of the combination of second-derivative spectroscopy and the partial least-squares matching method demonstrates the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
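
    The identification step can be illustrated by a least-squares match of a sample's second-derivative spectrum against a small reference library; in the sketch below the "spectra" are synthetic Gaussians, not real mAb absorption data.

```python
# Hedged sketch: identify a compounded mAb by least-squares matching of its
# second-derivative UV spectrum against a small reference library.
# Spectra below are synthetic Gaussians, not real mAb absorption data.
import numpy as np
from scipy.signal import savgol_filter

wl = np.linspace(240, 320, 400)                  # wavelength axis, nm

def fake_spectrum(center, width, scale=1.0):
    return scale * np.exp(-((wl - center) ** 2) / (2 * width ** 2))

library = {
    "bevacizumab": fake_spectrum(278, 12),
    "infliximab":  fake_spectrum(280, 10),
    "rituximab":   fake_spectrum(276, 14),
}

def second_derivative(spectrum):
    # Savitzky-Golay smoothing differentiation, a common spectral pretreatment
    return savgol_filter(spectrum, window_length=21, polyorder=3, deriv=2)

# "Unknown" sample: rituximab-like spectrum at 0.8x concentration plus noise
sample = (fake_spectrum(276, 14, scale=0.8)
          + 0.002 * np.random.default_rng(0).normal(size=wl.size))
d2_sample = second_derivative(sample)

scores = {}
for name, ref in library.items():
    d2_ref = second_derivative(ref)
    # least-squares scale factor, then residual sum of squares as match score
    k = np.dot(d2_ref, d2_sample) / np.dot(d2_ref, d2_ref)
    scores[name] = np.sum((d2_sample - k * d2_ref) ** 2)

print("best match:", min(scores, key=scores.get))
```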

  14. Qupe--a Rich Internet Application to take a step forward in the analysis of mass spectrometry-based quantitative proteomics experiments.

    Science.gov (United States)

    Albaum, Stefan P; Neuweger, Heiko; Fränzel, Benjamin; Lange, Sita; Mertens, Dominik; Trötschel, Christian; Wolters, Dirk; Kalinowski, Jörn; Nattkemper, Tim W; Goesmann, Alexander

    2009-12-01

    The goal of present -omics sciences is to understand biological systems as a whole in terms of interactions of the individual cellular components. One of the main building blocks in this field of study is proteomics where tandem mass spectrometry (LC-MS/MS) in combination with isotopic labelling techniques provides a common way to obtain a direct insight into regulation at the protein level. Methods to identify and quantify the peptides contained in a sample are well established, and their output usually results in lists of identified proteins and calculated relative abundance values. The next step is to move ahead from these abstract lists and apply statistical inference methods to compare measurements, to identify genes that are significantly up- or down-regulated, or to detect clusters of proteins with similar expression profiles. We introduce the Rich Internet Application (RIA) Qupe providing comprehensive data management and analysis functions for LC-MS/MS experiments. Starting with the import of mass spectra data the system guides the experimenter through the process of protein identification by database search, the calculation of protein abundance ratios, and in particular, the statistical evaluation of the quantification results including multivariate analysis methods such as analysis of variance or hierarchical cluster analysis. While a data model to store these results has been developed, a well-defined programming interface facilitates the integration of novel approaches. A compute cluster is utilized to distribute computationally intensive calculations, and a web service allows to interchange information with other -omics software applications. To demonstrate that Qupe represents a step forward in quantitative proteomics analysis an application study on Corynebacterium glutamicum has been carried out. Qupe is implemented in Java utilizing Hibernate, Echo2, R and the Spring framework. We encourage the usage of the RIA in the sense of the 'software as a

  15. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging

    International Nuclear Information System (INIS)

    Rong Xing; Du Yong; Frey, Eric C

    2012-01-01

    Quantitative Yttrium-90 (⁹⁰Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of ⁹⁰Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for ⁹⁰Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, for situations, such as ⁹⁰Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction there will be bias in the activity estimates. In ⁹⁰Y bremsstrahlung imaging this will be especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Thus variance will not be a complete measure of reliability of the estimates and thus is not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative ⁹⁰Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the absorbed energy from the radiation per unit mass of tissues in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were derived for

  16. A method for energy window optimization for quantitative tasks that includes the effects of model-mismatch on bias: application to Y-90 bremsstrahlung SPECT imaging.

    Science.gov (United States)

    Rong, Xing; Du, Yong; Frey, Eric C

    2012-06-21

    Quantitative Yttrium-90 ((90)Y) bremsstrahlung single photon emission computed tomography (SPECT) imaging has shown great potential to provide reliable estimates of (90)Y activity distribution for targeted radionuclide therapy dosimetry applications. One factor that potentially affects the reliability of the activity estimates is the choice of the acquisition energy window. In contrast to imaging conventional gamma photon emitters where the acquisition energy windows are usually placed around photopeaks, there has been great variation in the choice of the acquisition energy window for (90)Y imaging due to the continuous and broad energy distribution of the bremsstrahlung photons. In quantitative imaging of conventional gamma photon emitters, previous methods for optimizing the acquisition energy window assumed unbiased estimators and used the variance in the estimates as a figure of merit (FOM). However, for situations, such as (90)Y imaging, where there are errors in the modeling of the image formation process used in the reconstruction there will be bias in the activity estimates. In (90)Y bremsstrahlung imaging this will be especially important due to the high levels of scatter, multiple scatter, and collimator septal penetration and scatter. Thus variance will not be a complete measure of reliability of the estimates and thus is not a complete FOM. To address this, we first aimed to develop a new method to optimize the energy window that accounts for both the bias due to model-mismatch and the variance of the activity estimates. We applied this method to optimize the acquisition energy window for quantitative (90)Y bremsstrahlung SPECT imaging in microsphere brachytherapy. Since absorbed dose is defined as the absorbed energy from the radiation per unit mass of tissues in this new method we proposed a mass-weighted root mean squared error of the volume of interest (VOI) activity estimates as the FOM. To calculate this FOM, two analytical expressions were
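
    The proposed figure of merit can be illustrated with a small numeric sketch (hypothetical VOI activities and masses, not the paper's phantom data):

```python
# Hedged sketch: mass-weighted root mean squared error over volumes of interest,
# the figure of merit described for choosing the acquisition energy window.
# Activities and masses below are invented for illustration.
import numpy as np

true_activity = np.array([10.0, 4.0, 2.5, 0.8])   # MBq per VOI
est_activity  = np.array([ 9.1, 4.4, 2.1, 1.0])   # estimates for one window
voi_mass      = np.array([1.2, 0.6, 0.3, 0.1])    # kg

def mass_weighted_rmse(true, est, mass):
    weights = mass / mass.sum()
    return np.sqrt(np.sum(weights * (est - true) ** 2))

print("mass-weighted RMSE:",
      round(mass_weighted_rmse(true_activity, est_activity, voi_mass), 3), "MBq")
```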

  17. Novel Quantitative Real-Time LCR for the Sensitive Detection of SNP Frequencies in Pooled DNA: Method Development, Evaluation and Application

    Science.gov (United States)

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-01

    Background Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808

  18. Effects of Single and Combined Application of Organic, Biological and Chemical Fertilizers on Quantitative and Qualitative Yield of Coriander (Coriandrum sativum)

    Directory of Open Access Journals (Sweden)

    M. Aghhavani Shajari

    2016-07-01

    Full Text Available Introduction: Medicinal plants have been one of the main natural resources of Iran since ancient times. Coriander (Coriandrum sativum L.) belongs to the Apiaceae family and is cultivated extensively throughout the world. Management and environmental factors, such as nutritional management, have a significant impact on the quantity and quality of plants. Application of organic fertilizers is not common in conventional farming systems, and most of the nutritional needs of plants are supplied through chemical fertilizers over short periods. Excessive and unbalanced use of fertilizers over long periods reduces crop yield and soil biological activity, leads to the accumulation of nitrates and heavy metals, causes negative environmental effects and increases the cost of production. The use of bio-fertilizers and organic matter is therefore considered as a way to reduce the use of chemical fertilizers and increase the quality of most crops. Organic fertilizers are important for the stability and fertility of soil because they contain most of the elements required by plants and have beneficial effects on the physical, chemical and biological properties of soil. Therefore, the aim of this research was to evaluate the effects of organic, biological and chemical fertilizers on the quality and quantity characteristics of coriander. Materials and Methods: In order to study the effects of single and combined applications of organic, biological and chemical fertilizers on the quantitative and qualitative characteristics of coriander (Coriandrum sativum), an experiment was conducted based on a randomized complete block design with three replications and 12 treatments at the Research Station, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2011. Treatments included: (1) mycorrhizae (Glomus mosseae), (2) biosulfur (Thiobacillus sp.), (3) chemical fertilizer (NPK), (4) cow manure, (5) vermicompost, (6) mycorrhizae + chemical fertilizer, (7) mycorrhizae + cow manure, (8) mycorrhizae + vermicompost, (9) biosulfur

  19. Quantitative radiography

    International Nuclear Information System (INIS)

    Brase, J.M.; Martz, H.E.; Waltjen, K.E.; Hurd, R.L.; Wieting, M.G.

    1986-01-01

    Radiographic techniques have been used in nondestructive evaluation primarily to develop qualitative information (i.e., defect detection). This project applies and extends the techniques developed in medical x-ray imaging, particularly computed tomography (CT), to develop quantitative information (both spatial dimensions and material quantities) on the three-dimensional (3D) structure of solids. Accomplishments in FY 86 include (1) improvements in experimental equipment - an improved microfocus system that will give 20-μm resolution and has potential for increased imaging speed, and (2) development of a simple new technique for displaying 3D images so as to clearly show the structure of the object. Image reconstruction and data analysis for a series of synchrotron CT experiments conducted by LLNL's Chemistry Department has begun

  20. Quantitative lymphography

    International Nuclear Information System (INIS)

    Mostbeck, A.; Lofferer, O.; Kahn, P.; Partsch, H.; Koehn, H.; Bialonczyk, Ch.; Koenig, B.

    1984-01-01

    Labelled colloids and macromolecules are removed lymphatically. The uptake of tracer in the regional lymph nodes is a parameter of lymphatic flow. Due to great variations in patient shape - obesity, cachexia - and accompanying variations in counting efficiency, quantitative measurements with reasonable accuracy have not been reported to date. A new approach to regional absorption correction is based on the combination of transmission and emission scans for each patient. The transmission scan is used for calculation of an absorption correction matrix. Accurate superposition of the correction matrix and the emission scan is achieved by computing the centers of gravity of point sources and - in the case of aligning opposite views - by cross-correlation of binary images. In phantom studies the recovery was high (98.3%) and the coefficient of variation of repeated measurements was below 1%. In patient studies a standardized stress is a prerequisite for reliable and comparable results. Discrimination between normals (14.3 ± 4.2 D%) and patients with lymphedema (2.05 ± 2.5 D%) was highly significant using praefascial lymphography and sc injection. Clearance curve analysis of the activities at the injection site, however, gave no reliable data for this purpose. In normals, the uptake in lymph nodes after im injection is one order of magnitude lower than the uptake after sc injection. The discrimination between normals and patients with postthrombotic syndrome was significant. Lymphography after ic injection was in the normal range in 2/3 of the patients with lymphedema and is therefore of no diagnostic value. The difference in uptake after ic and sc injection, demonstrated for the first time by our quantitative method, provides new insights into the pathophysiology of lymphedema and needs further investigation. (Author)

  1. Determination of correction coefficients for quantitative analysis by mass spectrometry. Application to uranium impurities analysis; Recherche des coefficients de correction permettant l'analyse quantitative par spectrometrie de masse. Application a l'analyse d'impuretes dans l'uranium

    Energy Technology Data Exchange (ETDEWEB)

    Billon, J P [Commissariat a l' Energie Atomique, Bruyeres-le-Chatel (France). Centre d' Etudes

    1970-07-01

    Some basic principles of spark source mass spectrometry are recalled, and it is shown that, provided a number of precautions are taken, the method can be used for quantitative analysis. Assuming a relation between the analysed solid sample and the ionic beam it produces that is constant in time, experimental relative sensitivity factors were first determined for impurities in uranium matrices. Since these first practical results are in fairly good agreement with a simple theory of ionization yield in the spark source, the possibility of directly applying the theoretically derived relative sensitivity factors was studied, the application again being made on uranium matrices. (author)
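
    Applying such correction coefficients amounts to dividing each measured intensity ratio by the element's relative sensitivity factor; a tiny sketch with invented RSF values (not the experimentally determined ones) follows.

```python
# Hedged sketch: applying relative sensitivity factors (RSFs) to convert
# measured ion-intensity ratios into impurity concentrations in a uranium
# matrix. RSF values and intensities below are invented for illustration.

rsf = {"Fe": 2.1, "Ni": 1.7, "Cr": 1.9, "B": 0.6}        # hypothetical RSFs

# measured impurity/matrix ion-intensity ratios, expressed in ppm atomic
measured_ppma = {"Fe": 42.0, "Ni": 10.2, "Cr": 15.3, "B": 1.8}

for element, raw in measured_ppma.items():
    corrected = raw / rsf[element]       # corrected concentration = measured / RSF
    print(f"{element}: measured {raw:6.1f} ppma -> corrected {corrected:6.1f} ppma")
```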

  2. The Relationship between Student's Quantitative Skills, Application of Math, Science Courses, and Science Marks at Single-Sex Independent High Schools

    Science.gov (United States)

    Cambridge, David

    2012-01-01

    For independent secondary schools that offer a rigorous curriculum to attract students, integration of quantitative skills into science courses has become an important part of how rigor is defined. However, there is little research examining students' quantitative skills in relation to high school science performance within the single-sex independent school…

  3. Quantitative Thermochronology

    Science.gov (United States)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach built around a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate it to Earth processes; essential background material aids in understanding and using thermochronological data; the book provides a thorough treatise on numerical modeling of heat transport in the Earth's crust; and it is supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.
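    As a flavour of the forward heat-transport calculations the book builds on, the following Python sketch relaxes a perturbed one-dimensional crustal geotherm by explicit finite differences; the diffusivity, depth, boundary temperatures and thermal anomaly are generic assumed values, and the code is not the material distributed on the book's website.

      import numpy as np

      # Explicit finite-difference solution of dT/dt = kappa * d2T/dz2 with
      # fixed surface and basal temperatures.
      kappa = 1.0e-6                      # thermal diffusivity, m^2/s
      depth, nz = 30.0e3, 301
      z = np.linspace(0.0, depth, nz)
      dz = z[1] - z[0]
      dt = 0.4 * dz**2 / kappa            # satisfies the explicit stability limit
      nsteps = int(1.0e6 * 3.15e7 / dt)   # integrate for 1 Myr

      T = np.linspace(0.0, 600.0, nz)     # linear background geotherm, degC
      T += 50.0 * np.exp(-((z - 15.0e3) / 3.0e3) ** 2)   # mid-crustal thermal anomaly
      for _ in range(nsteps):
          T[1:-1] += dt * kappa * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2
      print(f"temperature at 15 km after 1 Myr: {T[nz // 2]:.1f} degC")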

  4. Effect of Sodium Chloride Concentrations and Its Foliar Application Time on Quantitative and Qualitative Characteristics of Pomegranate Fruit (Punica granatum L. CV. “Malas Saveh”)

    Directory of Open Access Journals (Sweden)

    V. Rouhi

    2016-02-01

    Full Text Available Introduction: Pomegranate (Punica granatum L.), belonging to the Punicaceae family, is native to Iran and grown extensively in arid and semi-arid regions worldwide. Pomegranate is also important in human medicine, and its components have a wide range of clinical applications. Cracking causes major fruit loss, which is a serious commercial loss to farmers, since cracked fruit lose much of their marketability. Fruit cracking is one of the physiological disorders occurring wherever pomegranate trees are grown. It may be due to moisture imbalance, as this fruit is very sensitive to variation in soil moisture: prolonged drought causes hardening of the skin, and if this is followed by heavy irrigation the pulp grows faster than the skin, which then cracks. Many factors, i.e. climate, soil and irrigation, variety, pruning, insects and nutritional status, influence the growth and production of fruit trees. Deficiencies of various nutrients are related to soil type, plant species and even cultivar. Most nutrients are readily fixed in soils of different pH, and plant roots are unable to absorb these nutrients adequately from dry topsoil. Foliar fertilization is particularly useful under conditions where absorption of nutrients through the soil is difficult, as is the case for nutrients such as calcium. Since the calcium element is needed, spraying it at the right time is the correct way to meet the plant's requirements. Therefore, a study was conducted on the effect of sodium chloride concentrations and its foliar application time on quantitative and qualitative characteristics of pomegranate fruit (Punica granatum L. CV. “Malas Saveh”). Materials and Methods: An experiment was conducted at Jarghoyeh, Esfahan, Iran in 2012. The factors were sodium chloride (0, 5 and 10 g/L) and time of spray (15, 45 and 75 days before harvest). The study was a factorial experiment based on a randomized complete block design with three replications

  5. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  6. Qualitative and quantitative combined nonlinear dynamics model and its application in analysis of price, supply–demand ratio and selling rate

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    The qualitative and quantitative combined nonlinear dynamics model proposed in this paper fills a gap in nonlinear dynamics modelling: it allows a qualitative model and a quantitative model to be combined so that each compensates for the weaknesses of the other. The combined model overcomes the limitation that a qualitative model cannot be applied and verified quantitatively, as well as the high cost and long time required to repeatedly construct and verify a quantitative model. The combined model is therefore more practical and efficient, which is of great significance for nonlinear dynamics. The combined modelling and model-analysis method proposed here is not restricted to nonlinear dynamics; it can also be adopted in the modelling and analysis of other fields. Additionally, the analytical method for the qualitative and quantitative combined nonlinear dynamics model proposed in this paper satisfactorily resolves the problems with the existing nonlinear dynamics model analysis of the price system. The three-dimensional dynamics model of price, supply–demand ratio and selling rate established in this paper estimates the best commodity prices from the model results, thereby providing a theoretical basis for the government's macro-control of prices. The model also offers theoretical guidance on how to enhance people's purchasing power and consumption levels through price regulation and hence improve living standards.
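    Because the model equations themselves are not reproduced in this record, the Python sketch below only illustrates the general form of such a three-variable nonlinear dynamics model (price, supply-demand ratio, selling rate) integrated numerically; the coupling terms, coefficients and initial state are entirely hypothetical.

      import numpy as np
      from scipy.integrate import solve_ivp

      def rhs(t, y, a=0.8, b=0.5, c=0.3):
          p, r, s = y                      # price, supply-demand ratio, selling rate
          dp = a * (1.0 - r) * p           # price rises when demand exceeds supply
          dr = b * (p - 1.0) - c * s       # supply responds to price, demand to sales
          ds = c * (1.0 - p) * s           # selling rate falls as price increases
          return [dp, dr, ds]

      sol = solve_ivp(rhs, (0.0, 50.0), y0=[1.2, 0.9, 1.0], max_step=0.1)
      print(sol.y[:, -1])                  # state after 50 time units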

  7. Effects of Various Substrates and Foliar Application of Humic Acid on Growth and some Qualitative and Quantitative Characteristics of Tomato (Lycopersicon esculentum) Seedling

    Directory of Open Access Journals (Sweden)

    Nasibeh Pourghasemian

    2018-03-01

    history. However, the chemistry and function of organic matter have been a subject of controversy since humans began postulating about it in the 18th century. Selection of the proper media components is critical to the successful production of plants. The objective of this study was therefore to assess the effect of humic acid foliar application and various substrates on quantitative and qualitative characteristics of tomato seedlings. Material and Methods: The experiment was conducted in a greenhouse at the Bardsir Faculty of Agriculture, Shahid Bahonar University of Kerman in 2015, as a factorial arrangement based on a completely randomized design with five replications. The experimental treatments were substrate at 7 levels (peat, coco-peat, leaf-soil, compost, vermi-compost, manure and clay soil) and humic acid at two levels (foliar application and no foliar application). After preparation of the substrates, plastic boxes 12 cm in diameter and 10 cm in height were chosen. After extraction of gravity water, tomato (cv. Canyon) seeds were sown in the pots. Rain irrigation was applied daily. Foliar application of humic acid at a concentration of 0.001 liter was performed every two days from seedling emergence to transplanting. The germinated seeds were counted daily, and the number and rate of seedling emergence were estimated. Plant height, stem diameter, number of internodes, leaf area, shoot and root dry matter and chlorophyll contents were measured at the transplanting time of the seedlings. Results and Discussion: The substrate treatment had a significant effect on the rate and percentage of germination, plant height, shoot dry matter, leaf area, number of internodes, and chlorophyll a and carotenoid contents. According to the results, the greatest and smallest rate and percentage of germination were found in the peat and manure treatments, respectively. Also, the greatest shoot dry matter (1.17 g), leaf area (125.9 cm plant-1), number of internodes (6.19), plant height (13.51 cm) and chlorophyll a concentration (2

  8. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model based methodology. This approach has been found to be very flexible and provide useful results for identifying high risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods available from currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)
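    The calibration idea, mapping semi-quantitative likelihood scores to failure frequencies observed on peer systems and multiplying by a point-value consequence, can be sketched in Python as follows; the scores, frequencies and log-linear mapping are invented for illustration and are not taken from the paper.

      import numpy as np

      # Hypothetical calibration points: index scores vs. observed failure
      # frequencies on peer pipeline systems (per km-year).
      scores = np.array([20.0, 40.0, 60.0, 80.0])
      freqs = np.array([1.0e-5, 5.0e-5, 3.0e-4, 1.5e-3])
      a, b = np.polyfit(scores, np.log10(freqs), 1)   # log10(freq) = a*score + b

      def failure_frequency(score):
          return 10.0 ** (a * score + b)

      def risk(score, consequence_cost):
          # Deterministic point-value risk = expected frequency x consequence.
          return failure_frequency(score) * consequence_cost

      print(f"{failure_frequency(55.0):.2e} failures per km-year")
      print(f"risk: {risk(55.0, 2.0e6):.1f} cost units per km-year")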

  9. A kinetic-based sigmoidal model for the polymerase chain reaction and its application to high-capacity absolute quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Stewart Don

    2008-05-01

    Full Text Available Abstract. Background: Based upon defining a common reference point, current real-time quantitative PCR technologies compare relative differences in amplification profile position. As such, absolute quantification requires construction of target-specific standard curves that are highly resource intensive and prone to introducing quantitative errors. Sigmoidal modeling using nonlinear regression has previously demonstrated that absolute quantification can be accomplished without standard curves; however, quantitative errors caused by distortions within the plateau phase have impeded effective implementation of this alternative approach. Results: Recognition that amplification rate is linearly correlated to amplicon quantity led to the derivation of two sigmoid functions that allow target quantification via linear regression analysis. In addition to circumventing quantitative errors produced by plateau distortions, this approach allows the amplification efficiency within individual amplification reactions to be determined. Absolute quantification is accomplished by first converting individual fluorescence readings into target quantity expressed in fluorescence units, followed by conversion into the number of target molecules via optical calibration. Founded upon expressing reaction fluorescence in relation to amplicon DNA mass, a seminal element of this study was to implement optical calibration using lambda gDNA as a universal quantitative standard. Not only does this eliminate the need to prepare target-specific quantitative standards, it relegates establishment of quantitative scale to a single, highly defined entity. The quantitative competency of this approach was assessed by exploiting "limiting dilution assay" for absolute quantification, which provided an independent gold standard from which to verify quantitative accuracy. This yielded substantive corroborating evidence that absolute accuracies of ± 25% can be routinely achieved. Comparison
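    A toy Python version of the underlying observation, that the per-cycle amplification rate falls linearly with amplicon quantity so that linear regression recovers the maximal efficiency and allows back-extrapolation of the initial target quantity, is sketched below; the synthetic profile, thresholds and simple extrapolation are assumptions for illustration rather than the sigmoid functions derived in the paper.

      import numpy as np

      true_F0, E_max, F_max = 1.0e-4, 0.95, 10.0
      F = [true_F0]
      for _ in range(40):
          eff = E_max * (1.0 - F[-1] / F_max)   # efficiency declines as amplicon accumulates
          F.append(F[-1] * (1.0 + eff))
      F = np.array(F)

      eff_obs = np.diff(F) / F[:-1]             # observed per-cycle amplification rate
      mask = (F[:-1] > 0.05 * F_max) & (F[:-1] < 0.8 * F_max)
      slope, intercept = np.polyfit(F[:-1][mask], eff_obs[mask], 1)

      q = int(np.argmax(F > 0.05 * F_max))      # first clearly measurable cycle
      F0_est = F[q] / (1.0 + intercept) ** q    # back-extrapolate assuming E_max throughout
      print(f"estimated E_max = {intercept:.3f}, F0 = {F0_est:.2e} (true {true_F0:.1e})")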

  10. Quantitative traits and diversification.

    Science.gov (United States)

    FitzJohn, Richard G

    2010-12-01

    Quantitative traits have long been hypothesized to affect speciation and extinction rates. For example, smaller body size or increased specialization may be associated with increased rates of diversification. Here, I present a phylogenetic likelihood-based method (quantitative state speciation and extinction [QuaSSE]) that can be used to test such hypotheses using extant character distributions. This approach assumes that diversification follows a birth-death process where speciation and extinction rates may vary with one or more traits that evolve under a diffusion model. Speciation and extinction rates may be arbitrary functions of the character state, allowing much flexibility in testing models of trait-dependent diversification. I test the approach using simulated phylogenies and show that a known relationship between speciation and a quantitative character could be recovered in up to 80% of the cases on large trees (500 species). Consistent with other approaches, detecting shifts in diversification due to differences in extinction rates was harder than when due to differences in speciation rates. Finally, I demonstrate the application of QuaSSE to investigate the correlation between body size and diversification in primates, concluding that clade-specific differences in diversification may be more important than size-dependent diversification in shaping the patterns of diversity within this group.
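    A forward simulation of the kind of process QuaSSE models, lineages whose speciation rate depends on a quantitative trait evolving by diffusion, is sketched below in Python; the sigmoid rate function, constant extinction rate and discretized time stepping are illustrative choices, and the code does not implement the likelihood method itself.

      import numpy as np

      rng = np.random.default_rng(1)

      def speciation_rate(x, lam_min=0.05, lam_max=0.3, x_mid=0.0, scale=1.0):
          # Sigmoid trait-dependent speciation rate.
          return lam_min + (lam_max - lam_min) / (1.0 + np.exp(-(x - x_mid) / scale))

      mu, sigma2, dt, t_max = 0.03, 0.1, 0.01, 30.0
      traits = [0.0]                        # a single founding lineage
      t = 0.0
      while t < t_max and 0 < len(traits) < 5000:
          nxt = []
          for x in traits:
              x += rng.normal(0.0, np.sqrt(sigma2 * dt))   # Brownian trait evolution
              u = rng.random()
              if u < speciation_rate(x) * dt:
                  nxt.extend([x, x])                        # speciation: two daughters
              elif u < (speciation_rate(x) + mu) * dt:
                  pass                                      # extinction
              else:
                  nxt.append(x)
          traits, t = nxt, t + dt
      print(f"{len(traits)} extant lineages" if traits else "clade went extinct")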

  11. Application of multivariable analysis methods to the quantitative detection of gas by tin dioxide micro-sensors; Application des methodes d'analyse multivariables a la detection quantitative de gaz par microcapteurs a base de dioxyde d'etain

    Energy Technology Data Exchange (ETDEWEB)

    Perdreau, N.

    2000-01-17

    The electric conductivity of tin dioxide depends on the temperature of the material and on the nature of the surrounding gaseous environment. This work shows that treatment of the electric conductance signals of a single sensor by multivariable analysis methods allows the concentrations of binary or ternary mixtures of ethanol (0-80 ppm), carbon monoxide (0-300 ppm) and methane (0-1000 ppm) to be determined. Part of this study consisted of the design and implementation of an automatic test bench for acquiring the electric conductance of four sensors under thermal and gas cycles. It also revealed some disturbing effects (humidity, etc.) on the measurement. Two sensor fabrication techniques were used to obtain conductances (as a function of temperature) that are distinct for each gas, reproducible across sensors and stable enough over time to allow the signals to be exploited by multivariable analysis methods (tin dioxide in the form of thin layers obtained by reactive evaporation or of sintered powder bars). In the last part, it is shown that quantitative determination of the gases by chemometric methods is possible even though the relation between the electric conductances on the one hand and the temperatures and concentrations on the other is nonlinear. Moreover, modelling with the Partial Least Squares method together with a pretreatment yields performance comparable to that obtained with neural networks. (O.M.)
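    The multivariable treatment can be illustrated with a generic partial least squares regression on synthetic conductance profiles in Python; the response shapes, noise level and number of latent variables are assumptions, and only the overall PLS workflow mirrors the approach described in the abstract.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)

      # Synthetic stand-in for one sensor's conductance sampled over a thermal
      # cycle (50 temperature points) for mixtures of three gases.
      n_samples, n_points = 200, 50
      temp = np.linspace(0.0, 1.0, n_points)
      conc = rng.uniform(0.0, 1.0, size=(n_samples, 3))    # scaled ethanol, CO, CH4
      profiles = np.stack([np.exp(-((temp - c) ** 2) / 0.02) for c in (0.2, 0.5, 0.8)])
      X = conc @ profiles + 0.3 * (conc ** 2) @ profiles   # mildly nonlinear response
      X += 0.01 * rng.normal(size=(n_samples, n_points))

      X_tr, X_te, y_tr, y_te = train_test_split(X, conc, test_size=0.3, random_state=0)
      pls = PLSRegression(n_components=6).fit(X_tr, y_tr)
      rmse = np.sqrt(np.mean((pls.predict(X_te) - y_te) ** 2, axis=0))
      print("RMSE per gas:", rmse)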

  12. Quantitative ion implantation

    International Nuclear Information System (INIS)

    Gries, W.H.

    1976-06-01

    This is a report of the study of the implantation of heavy ions at medium keV-energies into electrically conducting mono-elemental solids, at ion doses too small to cause significant loss of the implanted ions by resputtering. The study has been undertaken to investigate the possibility of accurate portioning of matter in submicrogram quantities, with some specific applications in mind. The problem is extensively investigated both on a theoretical level and in practice. A mathematical model is developed for calculating the loss of implanted ions by resputtering as a function of the implanted ion dose and the sputtering yield. Numerical data are produced therefrom which permit a good order-of-magnitude estimate of the loss for any ion/solid combination in which the ions are heavier than the solid atoms, and for any ion energy from 10 to 300 keV. The implanted ion dose is measured by integration of the ion beam current, and equipment and techniques are described which make possible the accurate integration of an ion current in an electromagnetic isotope separator. The methods are applied to two sample cases, one being a stable isotope, the other a radioisotope. In both cases independent methods are used to show that the implantation is indeed quantitative, as predicted. At the same time the sample cases are used to demonstrate two possible applications for quantitative ion implantation, viz. firstly for the manufacture of calibration standards for instrumental micromethods of elemental trace analysis in metals, and secondly for the determination of the half-lives of long-lived radioisotopes by a specific activity method. It is concluded that the present study has advanced quantitative ion implantation to the state where it can be successfully applied to the solution of problems in other fields

  13. Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers - specific application to Listeria monocytogenes and ready-to-eat meat products

    NARCIS (Netherlands)

    Mataragas, M.; Zwietering, M.H.; Skandamis, P.N.; Drosinos, E.H.

    2010-01-01

    The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered as high risk products. These ready-to-eat (RTE) products (no special preparation, e.g. thermal treatment, before eating is required),

  14. Quantitative model calculation of the time-dependent protoporphyrin IX concentration in normal human epidermis after delivery of ALA by passive topical application or iontophoresis

    NARCIS (Netherlands)

    Star, Willem M.; Aalders, Maurice C. G.; Sac, Arnoldo; Sterenborg, Henricus J. C. M.

    2002-01-01

    We present a mathematical layer model to quantitatively calculate the diffusion of 5-aminolevulinic acid (ALA) in the skin in vivo, its uptake into the cells and its conversion to protoporphyrin IX (PpIX) and subsequently to heme. The model is a modification and extension of a recently presented

  15. Novel application of quantitative single-photon emission computed-tomography/computed tomography to predict early response to methimazole in Graves' disease

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Joo; Bang, Ji In; Kim, Ji Young; Moon, Jae Hoon [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam (Korea, Republic of); So, Young [Dept. of Nuclear Medicine, Konkuk University Medical Center, Seoul (Korea, Republic of); Lee, Won Woo [Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of)

    2017-06-15

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of {sup 99m}Tc-pertechnetate (5 mCi). The association between the time to biochemical euthyroidism after MMI treatment and uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables was investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). An uptake cutoff of 5.0% dichotomized the faster responding versus the slower responding GD patients (p = 0.006). A novel parameter of thyroid uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.

  16. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  17. Quantitative in-depth state analysis by means of x-ray photoelectron spectroscopy and its application to surface Layer of SiC coatings

    International Nuclear Information System (INIS)

    Yabe, Katsumasa; Yamashina, Toshiro.

    1980-01-01

    An attempt at quantitative state analysis of the surface and depth profile of inorganic compounds was made by X-ray photoelectron spectroscopy (XPS) combined with sputter-etching by argon ions. A masking attachment was designed to shield the area of the sample exposed to the non-uniform portion of the ion beam. Uniform sputter-etching could thus be attained, with the advantages for XPS observation of a low background level and fewer impurity spectra from origins other than the sample. The photoelectron yields were examined for quantitative analysis by XPS. The method established here was applied to analyze the surface and in-depth composition of SiC coatings on carbon and molybdenum, which are promising candidate first-wall materials for a controlled thermonuclear reactor. (author)

  18. A Novel HPLC Method for the Concurrent Analysis and Quantitation of Seven Water-Soluble Vitamins in Biological Fluids (Plasma and Urine): A Validation Study and Application

    Directory of Open Access Journals (Sweden)

    Margherita Grotzkyj Giorgi

    2012-01-01

    Full Text Available An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B1, B2, B5, B6, B9, B12 in biological matrices (plasma and urine. Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males. Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%.

  19. Fabrication of type I collagen microcarrier using a microfluidic 3D T-junction device and its application for the quantitative analysis of cell-ECM interactions.

    Science.gov (United States)

    Yoon, Junghyo; Kim, Jaehoon; Jeong, Hyo Eun; Sudo, Ryo; Park, Myung-Jin; Chung, Seok

    2016-08-26

    We present a new quantitative analysis of cell-extracellular matrix (ECM) interactions, using cell-coated ECM hydrogel microbeads (hydrobeads) made of type I collagen. The hydrobeads can carry cells in three-dimensional spheroidal form with ECM inside, facilitating direct interaction between the cells and the ECM. The cells on hydrobeads do not have a hypoxic core, which opens the possibility of using them as a cell microcarrier for bottom-up tissue reconstitution. This technique can utilize various types of cells, even MDA-MB-231 cells, which have weak cell-cell interactions and do not form spheroids in conventional spheroid culture methods. Morphological indices of the cell-coated hydrobeads visually present cell-ECM interactions in a quantitative manner.

  20. A novel HPLC method for the concurrent analysis and quantitation of seven water-soluble vitamins in biological fluids (plasma and urine): a validation study and application.

    Science.gov (United States)

    Giorgi, Margherita Grotzkyj; Howland, Kevin; Martin, Colin; Bonner, Adrian B

    2012-01-01

    An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B(1), B(2), B(5), B(6), B(9), B(12)) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with a diode array detector set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%.

  1. Exploring alternative models for sex-linked quantitative trait loci in outbred populations: application to an iberian x landrace pig intercross.

    OpenAIRE

    Pérez-Enciso, Miguel; Clop, Alex; Folch, Josep M; Sánchez, Armand; Oliver, Maria A; Ovilo, Cristina; Barragán, C; Varona, Luis; Noguera, José L

    2002-01-01

    We present a very flexible method that allows us to analyze X-linked quantitative trait loci (QTL) in crosses between outbred lines. The dosage compensation phenomenon is modeled explicitly in an identity-by-descent approach. A variety of models can be fitted, ranging from considering alternative fixed alleles within the founder breeds to a model where the only genetic variation is within breeds, as well as mixed models. Different genetic variances within each founder breed can be estimated. ...

  2. Application of High-Performance Liquid Chromatography Coupled with Linear Ion Trap Quadrupole Orbitrap Mass Spectrometry for Qualitative and Quantitative Assessment of Shejin-Liyan Granule Supplements

    OpenAIRE

    Jifeng Gu; Weijun Wu; Mengwei Huang; Fen Long; Xinhua Liu; Yizhun Zhu

    2018-01-01

    A method using high-performance liquid chromatography coupled with linear ion trap quadrupole Orbitrap high-resolution mass spectrometry (HPLC-LTQ-Orbitrap MS) was developed and validated for the qualitative and quantitative assessment of Shejin-liyan Granule. According to the fragmentation mechanisms and high-resolution MS data, 54 compounds, including fourteen isoflavones, eleven lignans, eight flavonoids, six physalins, six organic acids, four triterpenoid saponins, two xanthones, two alkaloi...

  3. Performance assessment of a NaI(Tl) gamma counter for PET applications with methods for improved quantitative accuracy and greater standardization

    OpenAIRE

    Lodge, Martin A; Holt, Daniel P; Kinahan, Paul E; Wong, Dean F; Wahl, Richard L

    2015-01-01

    Background Although NaI(Tl) gamma counters play an important role in many quantitative positron emission tomography (PET) protocols, their calibration for positron-emitting samples has not been standardized across imaging sites. In this study, we characterized the operational range of a gamma counter specifically for positron-emitting radionuclides, and we assessed the role of traceable 68Ge/68Ga sources for standardizing system calibration. Methods A NaI(Tl) gamma counter was characterized w...

  4. Turn-on Fluorescent Probe for Exogenous and Endogenous Imaging of Hypochlorous Acid in Living Cells and Quantitative Application in Flow Cytometry.

    Science.gov (United States)

    Zhan, Zixuan; Liu, Rui; Chai, Li; Li, Qiuyan; Zhang, Kexin; Lv, Yi

    2017-09-05

    Hypochlorous acid (HClO) acts as a dominant microbicidal mediator in the natural immune system, and the excess production of hypochlorites is related to a series of diseases. Thus, it is vitally important to develop a highly sensitive and selective method for HClO detection in living systems, yet most fluorescent probes are focused mainly on cell imaging. Besides, accurate quantitative information on HClO in individual cells within a large cell population is extremely important for understanding inflammation and cellular apoptosis as well. In our work, a turn-on fluorescent probe has been synthesized which can selectively and sensitively detect HClO with a fast response time. The probe is almost nonfluorescent, possibly due to both the spirolactam form of fluorescein and unbridged C═N bonds, which can undergo a nonradiative decay process in the excited state. Upon the addition of ClO-, the probe was oxidized to the ring-opened fluorescent form and the fluorescence intensity was greatly enhanced. In live cell experiments, the probe was successfully applied to image exogenous ClO- in HeLa cells and endogenous HClO in RAW 264.7 macrophage cells. In particular, quantitative information on exogenous and endogenous HClO can also be acquired in flow cytometry. Therefore, the probe not only can image exogenous and endogenous HClO but also provides a new and promising platform to quantitatively detect HClO in flow cytometry.

  5. Development and validation of a high throughput LC–MS/MS method for simultaneous quantitation of pioglitazone and telmisartan in rat plasma and its application to a pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Pinaki Sengupta

    2017-12-01

    Full Text Available Management of cardiovascular risk factors in diabetes demands special attention due to their co-existence. A combination of pioglitazone (PIO) and telmisartan (TLM) can be beneficial in the effective control of cardiovascular complications in diabetes. In this research, we developed and validated a high-throughput LC–MS/MS method for simultaneous quantitation of PIO and TLM in rat plasma. The developed method is more sensitive and can quantitate the analytes in a shorter time than the previously reported methods for their individual quantification. Moreover, to date, there is no bioanalytical method available to simultaneously quantitate PIO and TLM in a single run. The method was validated according to the USFDA guidelines for bioanalytical method validation. A linear response of the analytes was observed over the range of 0.005–10 µg/mL with satisfactory precision and accuracy. Accuracy at four quality control levels was within 94.27%–106.10%. The intra- and inter-day precision ranged from 2.32% to 10.14% and from 5.02% to 8.12%, respectively. The method was reproducible and sensitive enough to quantitate PIO and TLM in rat plasma samples from a preclinical pharmacokinetic study. Given the potential of the PIO-TLM combination to be therapeutically explored, this method is expected to have significant usefulness in the future. Keywords: LC–MS/MS, Rat plasma, Pharmacokinetic applicability, Telmisartan, Pioglitazone, Pharmacokinetic application
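    The calibration and accuracy figures quoted for assays of this kind can be illustrated with a generic weighted linear calibration in Python; the concentrations, instrument responses and the 1/x^2 weighting below are synthetic choices commonly used in bioanalysis, not necessarily those of this study.

      import numpy as np

      cal_conc = np.array([0.005, 0.01, 0.05, 0.1, 0.5, 1.0, 5.0, 10.0])   # ug/mL
      cal_resp = np.array([0.0021, 0.0040, 0.021, 0.039, 0.21, 0.41, 2.02, 3.98])

      # np.polyfit squares the supplied weights, so passing 1/conc gives 1/x^2 weighting.
      slope, intercept = np.polyfit(cal_conc, cal_resp, 1, w=1.0 / cal_conc)

      def back_calc(response):
          return (response - intercept) / slope

      qc_nominal = np.array([0.015, 0.4, 8.0])
      qc_resp = np.array([0.0062, 0.165, 3.22])
      accuracy = 100.0 * back_calc(qc_resp) / qc_nominal
      print(f"slope {slope:.4f}, intercept {intercept:.5f}")
      print("QC accuracy (%):", np.round(accuracy, 1))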

  6. Resonant elastic scattering of {sup 15}O and a new reaction path in the CNO cycle; Spectroscopie par diffusion elastique resonante d' {sup 15}O et nouveau chemin de reaction dans le cycle CNO

    Energy Technology Data Exchange (ETDEWEB)

    Stefan, Gheorghe Iulian [Ecole doctorale SIMEM, U.F.R. Sciences, Universite de Caen Basse-Normandie, 14032 Caen Cedex (France)

    2006-12-15

    This work presents a very accurate experimental method, based on radioactive beams, for the study of the spectroscopic properties of unbound states. It makes use of elastic scattering in inverse kinematics of the ions of a radioactive beam from a target of stable nuclei. An application of the method to the study of radioactive nuclei of astrophysical interest, namely the {sup 19}Ne and {sup 16}F nuclei, is given. It is shown that, on the basis of the properties of the proton-emitting unbound levels of {sup 19}Ne, one can develop a method for the experimental study of nova explosions. It is based on observation of the gamma emissions following the decays of the radionuclides generated in the explosion. The most interesting radioactive nucleus involved in this process is {sup 18}F, the yield of which depends strongly on the rate of the {sup 18}F(p,{alpha}){sup 15}O reaction. This yield depends in turn on the properties of the states of the ({sup 18}F + p) compound nucleus, i.e. the {sup 19}Ne nucleus. In addition, the unbound {sup 16}F nucleus, also of astrophysical significance in {sup 15}O-rich environments, was studied. Since {sup 16}F is an unbound nucleus, the reaction of {sup 15}O with protons, although protons are abundant in most astrophysical media, appears to be negligible. The question posed was therefore whether the exotic {sup 15}O(p,{beta}{sup +}){sup 16}O resonant reaction acquires some importance in various astrophysical media. This work describes a novel approach to studying the reaction mechanisms which could drastically change the role of unbound nuclei in stellar processes, and applies this mechanism to the (p,{gamma})({beta}{sup +}) and (p,{gamma})(p,{gamma}) processes within {sup 15}O-rich media. The experimental studies of {sup 19}Ne and {sup 16}F were carried out with a radioactive beam of {sup 15}O ions of very low energy produced by SPIRAL at GANIL. To improve the energy resolution, thin targets were used with a 0° angle of observation relative to the beam

  7. Simultaneous quantitation of hydroxychloroquine and its metabolites in mouse blood and tissues using LC-ESI-MS/MS: An application for pharmacokinetic studies.

    Science.gov (United States)

    Chhonker, Yashpal S; Sleightholm, Richard L; Li, Jing; Oupický, David; Murry, Daryl J

    2018-01-01

    Hydroxychloroquine (HCQ) has been shown to disrupt autophagy and sensitize cancer cells to radiation and chemotherapeutic agents. However, the optimal delivery method, dose, and tumor concentrations required for these effects are not known. This is in part due to a lack of sensitive and reproducible analytical methods for HCQ quantitation in small animals. As such, we developed and validated a selective and sensitive liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) method for simultaneous quantitation of hydroxychloroquine and its metabolites in mouse blood and tissues. The chromatographic separation and detection of analytes were achieved on a reversed phase Thermo Aquasil C18 (50 × 4.6 mm, 3 μm) column, with gradient elution using 0.2% formic acid and 0.1% formic acid in methanol as mobile phase at a flow rate of 0.5 mL/min. Simple protein precipitation was utilized for extraction of analytes from the desired matrix. Analytes were separated and quantitated using MS/MS with an electrospray ionization source in positive multiple reaction monitoring (MRM) mode. The MS/MS response was linear over the concentration range from 1 to 2000 ng/mL for all analytes with a correlation coefficient (R²) of 0.998 or better. The within- and between-day precision (relative standard deviation, % RSD) and accuracy were within the acceptable limits per FDA guidelines. The validated method was successfully applied to a preclinical pharmacokinetic mouse study involving low volume blood and tissue samples for hydroxychloroquine and metabolites. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    International Nuclear Information System (INIS)

    Neuland, M B; Riedo, A; Tulej, M; Wurz, P; Grimaudo, V; Moreno-García, P; Mezger, K

    2016-01-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples, USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite, with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standard-less measurement technique for in situ quantitative chemical composition measurements on planetary surfaces. (paper)

  9. Improved simultaneous quantitation of candesartan and hydrochlorthiazide in human plasma by UPLC–MS/MS and its application in bioequivalence studies

    Directory of Open Access Journals (Sweden)

    Bhupinder Singh

    2014-04-01

    Full Text Available A validated ultra-performance liquid chromatography mass spectrometric method (UPLC–MS/MS) was used for the simultaneous quantitation of candesartan (CN) and hydrochlorothiazide (HCT) in human plasma. The analysis was performed on a UPLC–MS/MS system using a turbo ion spray interface. Negative ions were measured in multiple reaction monitoring (MRM) mode. The analytes were extracted by a liquid–liquid extraction (LLE) method using 0.1 mL of plasma. The lower limit of quantitation for CN and HCT was 1.00 ng/mL, whereas the upper limit of quantitation was 499.15 ng/mL and 601.61 ng/mL for CN and HCT, respectively. CN-d4 and HCT-13Cd2 were used as the internal standards for CN and HCT, respectively. The chromatography was achieved within a 2.0 min run time using a C18 Phenomenex Gemini NX (100 mm × 4.6 mm, 5 µm) column with an organic mixture:buffer solution (80:20, v/v) at a flow rate of 0.800 mL/min. The method has been successfully applied to establish the bioequivalence of candesartan cilexetil (CNC) and HCT immediate release tablets with a reference product in human subjects. Keywords: Candesartan cilexetil, Hydrochlorothiazide, UPLC–MS/MS, Bioequivalence, Candesartan cilexetil-hydrochlorothiazide (ATACAND HCT)

  10. Development of a method for urine bikunin/urinary trypsin inhibitor (UTI) quantitation and structural characterization: Application to type 1 and type 2 diabetes.

    Science.gov (United States)

    Lepedda, Antonio Junior; Nieddu, Gabriele; Rocchiccioli, Silvia; Fresu, Pietro; De Muro, Pierina; Formato, Marilena

    2013-12-01

    Bikunin is a plasma proteinase inhibitor often associated with inflammatory conditions. It has a half-life of few minutes and it is rapidly excreted into urine as urinary trypsin inhibitor (UTI). UTI levels are usually low in healthy individuals but they can increase up to tenfold in both acute and chronic inflammatory diseases. This article describes a sensitive method for both direct UTI quantitation and structural characterization. UTI purification was performed by anion exchange micro-chromatography followed by SDS-PAGE. A calibration curve for protein quantitation was set up by using a purified UTI fraction. UTI identification and structural characterization was performed by Nano-LC-MS/MS analysis. The method was applied on urine samples from 9 patients with type 1 diabetes, 11 patients with type 2 diabetes, and 28 healthy controls, matched for age and sex with patients, evidencing higher UTI levels in both groups of patients with respect to controls (p UTI levels and age in each group tested. Owing to the elevated sensitivity and specificity, the described method allows UTI quantitation from very low quantities of specimen. Furthermore, as UTI concentration is normalized for creatinine level, the analysis could be also performed on randomly collected urine samples. Finally, MS/MS analysis prospects the possibility of characterizing PTM sites potentially able to affect UTI localization, function, and pathophysiological activity. Preliminary results suggest that UTI levels could represent a useful marker of chronic inflammatory condition in type 1 and 2 diabetes. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Identification and quantitation of 3,4-methylenedioxy-N-methylamphetamine (MDMA, ecstasy) in human urine by 1H NMR spectroscopy. Application to five cases of intoxication.

    Science.gov (United States)

    Liu, Jonathan; Decatur, John; Proni, Gloria; Champeil, Elise

    2010-01-30

    Identification of 3,4-methylenedioxy-N-methylamphetamine (MDMA, ecstasy) in five cases of intoxication using nuclear magnetic resonance (NMR) spectroscopy of human urine is reported. A new water suppression technique, PURGE (Presaturation Utilizing Relaxation Gradients and Echoes), was used. A calibration curve was obtained using spiked samples. The method gave a linear response (correlation coefficient of 0.992) over the range 0.01-1 mg/mL. Subsequently, quantitation of the amount of MDMA present in the samples was performed. The benefit and reliability of NMR investigations of human urine for cases of intoxication with MDMA are discussed. Published by Elsevier Ireland Ltd.

  12. PLS-based quantitative structure-activity relationship for substituted benzamides of clebopride type. Application of experimental design in drug design.

    Science.gov (United States)

    Norinder, U; Högberg, T

    1992-04-01

    The advantageous approach of using an experimentally designed training set as the basis for establishing a quantitative structure-activity relationship with good predictive capability is described. The training set was selected from a fractional factorial design scheme based on a principal component description of physico-chemical parameters of aromatic substituents. The derived model successfully predicts the activities of additional substituted benzamides of 6-methoxy-N-(4-piperidyl)salicylamide type. The major influence on activity of the 3-substituent is demonstrated.

  13. The 2D Hotelling filter - a quantitative noise-reducing principal-component filter for dynamic PET data, with applications in patient dose reduction

    International Nuclear Information System (INIS)

    Axelsson, Jan; Sörensen, Jens

    2013-01-01

    In this paper we apply the principal-component analysis filter (Hotelling filter) to reduce noise in dynamic positron-emission tomography (PET) patient data for a number of different radio-tracer molecules. We furthermore show how preprocessing images with this filter improves parametric images created from such dynamic sequences. We use zero-mean, unit-variance normalization prior to performing a Hotelling filter on the slices of a dynamic time-series. The Scree-plot technique was used to determine which principal components to reject in the filter process. This filter was applied to [11C]-acetate on heart and head-neck tumors, [18F]-FDG on liver tumors and brain, and [11C]-Raclopride on brain. Simulations of blood and tissue regions with noise properties matched to real PET data were used to analyze how quantitation and resolution are affected by the Hotelling filter. Summing varying parts of a 90-frame [18F]-FDG brain scan, we created 9-frame dynamic scans with image statistics comparable to 20 MBq, 60 MBq and 200 MBq injected activity. Hotelling filtering performed on slices (2D) and on volumes (3D) was compared. The 2D Hotelling filter reduces noise in the tissue uptake drastically, so that it becomes simple to manually pick out regions-of-interest from noisy data. The 2D Hotelling filter introduces less bias than the 3D Hotelling filter in focal Raclopride uptake. Simulations show that the Hotelling filter is sensitive to the typical blood peak in PET before tissue uptake has commenced, introducing a negative bias in early tissue uptake. Quantitation on real dynamic data is reliable. Two examples clearly show that pre-filtering the dynamic sequence with the Hotelling filter prior to Patlak-slope calculations gives improved parametric image quality. We also show that a dramatic dose reduction can be achieved for Patlak slope images without changing image quality or quantitation. The 2D Hotelling-filtering of dynamic PET data is a computer
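    A compact Python version of the filtering principle, per-frame zero-mean unit-variance normalization followed by retention of only the leading principal components across frames, is sketched below; the toy time-activity data and the explicit component cutoff (chosen in the paper from a Scree plot) are illustrative assumptions, not the authors' code.

      import numpy as np

      def hotelling_filter_2d(frames, n_keep):
          # frames: (n_frames, ny, nx). Normalise each frame, decompose the
          # series by SVD over the frame dimension, keep n_keep components,
          # then undo the normalisation.
          n_frames = frames.shape[0]
          flat = frames.reshape(n_frames, -1).astype(float)
          mean = flat.mean(axis=1, keepdims=True)
          std = flat.std(axis=1, keepdims=True) + 1e-12
          z = (flat - mean) / std
          u, s, vt = np.linalg.svd(z, full_matrices=False)
          s[n_keep:] = 0.0                        # reject the noise components
          return ((u * s) @ vt * std + mean).reshape(frames.shape)

      # Toy dynamic sequence: a smooth time-activity curve plus Gaussian noise.
      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 1.0, 24)[:, None, None]
      clean = 100.0 * t * np.exp(-2.0 * t) * np.ones((1, 32, 32))
      noisy = clean + rng.normal(0.0, 5.0, clean.shape)
      filtered = hotelling_filter_2d(noisy, n_keep=2)
      print(np.abs(filtered - clean).mean(), np.abs(noisy - clean).mean())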

  14. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? or the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative and misleads, both for the current climate and as a function of the state of the (each) climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  15. Application of a tri-axial accelerometry-based portable motion recorder for the quantitative assessment of hippotherapy in children and adolescents with cerebral palsy.

    Science.gov (United States)

    Mutoh, Tomoko; Mutoh, Tatsushi; Takada, Makoto; Doumura, Misato; Ihara, Masayo; Taki, Yasuyuki; Tsubone, Hirokazu; Ihara, Masahiro

    2016-10-01

    [Purpose] This case series aims to evaluate the effects of hippotherapy on gait and balance ability of children and adolescents with cerebral palsy using quantitative parameters for physical activity. [Subjects and Methods] Three patients with gait disability as a sequela of cerebral palsy (one female and two males; age 5, 12, and 25 years old) were recruited. Participants received hippotherapy for 30 min once a week for 2 years. Gait parameters (step rate, step length, gait speed, mean acceleration, and horizontal/vertical displacement ratio) were measured using a portable motion recorder equipped with a tri-axial accelerometer attached to the waist before and after a 10-m walking test. [Results] There was a significant increase in step length between before and after a single hippotherapy session. Over the course of 2 year intervention, there was a significant increase in step rate, gait speed, step length, and mean acceleration and a significant improvement in horizontal/vertical displacement ratio. [Conclusion] The data suggest that quantitative parameters derived from a portable motion recorder can track both immediate and long-term changes in the walking ability of children and adolescents with cerebral palsy undergoing hippotherapy.

  16. Quantitative analysis of Fe and Co in Co-substituted magnetite using XPS: The application of non-linear least squares fitting (NLLSF)

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongmei, E-mail: hmliu@gig.ac.cn [CAS Key Laboratory of Mineralogy and Metallogeny/Guangdong Provincial Key Laboratory of Mineral Physics and Materials, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou, 510640 (China); Wei, Gaoling [Guangdong Key Laboratory of Agricultural Environment Pollution Integrated Control, Guangdong Institute of Eco-Environmental and Soil Sciences, Guangzhou, 510650 (China); Xu, Zhen [School of Materials Science and Engineering, Central South University, Changsha, 410012 (China); Liu, Peng; Li, Ying [CAS Key Laboratory of Mineralogy and Metallogeny/Guangdong Provincial Key Laboratory of Mineral Physics and Materials, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou, 510640 (China); University of Chinese Academy of Sciences, Beijing, 100049 (China)

    2016-12-15

    Highlights: • XPS and Auger peak overlapping complicates Co-substituted magnetite quantification. • Disturbance of Auger peaks was eliminated by non-linear least squares fitting. • Fitting greatly improved the accuracy of quantification for Co and Fe. • Catalytic activity of magnetite was enhanced with the increase of Co substitution. - Abstract: Quantitative analysis of Co and Fe using X-ray photoelectron spectroscopy (XPS) is important for the evaluation of the catalytic ability of Co-substituted magnetite. However, the overlap of XPS peaks and Auger peaks for Co and Fe complicates quantification. In this study, non-linear least squares fitting (NLLSF) was used to calculate the relative Co and Fe contents of a series of synthesized Co-substituted magnetite samples with different Co doping levels. NLLSF separated the XPS peaks of Co 2p and Fe 2p from the Auger peaks of Fe and Co, respectively. Compared with a control group without fitting, the accuracy of quantification of Co and Fe was greatly improved after elimination by NLLSF of the disturbance of Auger peaks. A catalysis study confirmed that the catalytic activity of magnetite was enhanced with the increase of Co substitution. This study confirms the effectiveness and accuracy of the NLLSF method in XPS quantitative calculation of Fe and Co coexisting in a material.
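    The NLLSF idea of separating a photoelectron peak from an overlapping Auger contribution can be sketched with a generic two-Gaussian fit on a linear background in Python; the peak positions, widths, intensities and the Gaussian line shape are invented for illustration and do not reproduce the authors' fitting model.

      import numpy as np
      from scipy.optimize import curve_fit

      def gauss(x, area, center, fwhm):
          sigma = fwhm / 2.3548
          return area * np.exp(-0.5 * ((x - center) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

      def model(x, a1, c1, w1, a2, c2, w2, b0, b1):
          # Photoelectron peak + overlapping Auger peak + linear background.
          return gauss(x, a1, c1, w1) + gauss(x, a2, c2, w2) + b0 + b1 * x

      x = np.linspace(700.0, 740.0, 400)            # binding energy, eV
      rng = np.random.default_rng(3)
      y = model(x, 900.0, 711.0, 3.5, 400.0, 715.5, 6.0, 50.0, 0.2) + rng.normal(0.0, 5.0, x.size)

      p0 = [800.0, 710.0, 3.0, 300.0, 716.0, 5.0, 40.0, 0.0]   # initial guesses
      popt, _ = curve_fit(model, x, y, p0=p0)
      print(f"photoelectron area {popt[0]:.0f}, Auger area {popt[3]:.0f}")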

  17. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  18. Application of local approach to quantitative prediction of degradation in fracture toughness of steels due to pre-straining and irradiation

    International Nuclear Information System (INIS)

    Miyata, T.; Tagawa, T.

    1996-01-01

    Degradation of the cleavage fracture toughness of low carbon steels due to pre-straining and irradiation was investigated on the basis of the local fracture criterion approach. The formulation of cleavage fracture toughness through the statistical modelling proposed by BEREMIN has been simplified by the present authors to an expression involving the yield stress and the cleavage fracture stress of the material. A pre-strain of a few percent induced by cold rolling significantly deteriorates the cleavage fracture toughness. The ductile-brittle transition temperature is raised by more than 70 C by 8% straining in a 500 MPa class high strength steel. Quantitative prediction of the degradation has been successfully examined through the formulation of the cleavage fracture toughness. Analytical and experimental results indicate that the degradation in toughness is caused by the increase of flow stress in pre-strained materials. Quantitative prediction of the degradation of toughness due to irradiation has also been examined for past experiments on the basis of the local fracture criterion approach. The analytical prediction from the variation of yield stress due to irradiation is consistent with the experimental results. (orig.)

  19. Application of femtosecond laser ablation inductively coupled plasma mass spectrometry for quantitative analysis of thin Cu(In,Ga)Se{sub 2} solar cell films

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seokhee [School of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong, Buk-gu, Gwangju 500-712 (Korea, Republic of); Gonzalez, Jhanis J. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Applied Spectra Inc., 46665 Fremont Boulevard, Fremont, CA 94538 (United States); Yoo, Jong H. [Applied Spectra Inc., 46665 Fremont Boulevard, Fremont, CA 94538 (United States); Chirinos, Jose R. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Facultad de Ciencias, Universidad Central de Venezuela, Caracas 1041A (Venezuela, Bolivarian Republic of); Russo, Richard E. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Applied Spectra Inc., 46665 Fremont Boulevard, Fremont, CA 94538 (United States); Jeong, Sungho, E-mail: shjeong@gist.ac.kr [School of Mechatronics, Gwangju Institute of Science and Technology, 1 Oryong-dong, Buk-gu, Gwangju 500-712 (Korea, Republic of)

    2015-02-27

    This work reports that the composition of Cu(In,Ga)Se₂ (CIGS) thin solar cell films can be quantitatively predicted with high accuracy and precision by femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS). It is demonstrated that the results are strongly influenced by the sampling conditions during fs-laser beam (λ = 1030 nm, τ = 450 fs) scanning on the CIGS surface. The fs-LA-ICP-MS signals measured under optimal sampling conditions generally provide a straight-line calibration with respect to the reference concentrations measured by inductively coupled plasma optical emission spectroscopy (ICP-OES). The concentration ratios predicted by fs-LA-ICP-MS showed high accuracy, 95–97% relative to the values measured with ICP-OES, for Cu, In, Ga, and Se. - Highlights: • Laser ablation inductively coupled plasma mass spectrometry of thin films is reported. • Concentration ratio prediction with a confidence level of 95–97% is achieved. • Quantitative determination of composition is demonstrated.
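
    As a minimal illustration of the straight-line calibration described above, the following sketch fits LA-ICP-MS signals against ICP-OES reference concentration ratios and reports the agreement; all numerical values are hypothetical placeholders, not data from the paper.

```python
import numpy as np

# Hypothetical example: calibrate LA-ICP-MS intensity signals against
# reference concentration ratios from ICP-OES, then check the agreement.
ref_conc = np.array([0.20, 0.35, 0.50, 0.65, 0.80])        # reference concentration ratios
ms_signal = np.array([1.1e5, 1.9e5, 2.8e5, 3.6e5, 4.4e5])  # integrated ion signals (a.u.)

slope, intercept = np.polyfit(ms_signal, ref_conc, 1)       # straight-line calibration
predicted = slope * ms_signal + intercept

r2 = 1 - np.sum((ref_conc - predicted) ** 2) / np.sum((ref_conc - ref_conc.mean()) ** 2)
accuracy = 100 * (1 - np.abs(predicted - ref_conc) / ref_conc)  # % agreement per standard
print(f"R^2 = {r2:.4f}, accuracy range = {accuracy.min():.1f}-{accuracy.max():.1f} %")
```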

  20. Development and application of a quantitative method based on LC-QqQ MS/MS for determination of steviol glycosides in Stevia leaves.

    Science.gov (United States)

    Molina-Calle, M; Sánchez de Medina, V; Delgado de la Torre, M P; Priego-Capote, F; Luque de Castro, M D

    2016-07-01

    Stevia is currently a well-known plant thanks to the presence of steviol glycosides, which are considered sweeteners obtained from a natural source. In this research, a method based on LC-MS/MS using a triple quadrupole detector was developed for the quantitation of eight steviol glycosides in extracts from Stevia leaves. The ionization and fragmentation parameters for selected reaction monitoring were optimized. Detection and quantitation limits ranging from 0.1 to 0.5 ng/mL and from 0.5 to 1 ng/mL, respectively, were achieved: the lowest attained so far. The steviol glycosides were quantified in extracts from leaves of seven varieties of Stevia cultivated in the laboratory, in a greenhouse and in the field. Plants cultivated in the field presented higher concentrations of steviol glycosides than those cultivated in the greenhouse; thus, the way of cultivation clearly influences the concentration of these compounds. The inclusion of branches together with leaves as raw material was also evaluated, showing that this inclusion modifies, either positively or negatively, the concentration of steviol glycosides. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Development and validation of an improved method for the quantitation of sertraline in human plasma using LC-MS-MS and its application to bioequivalence studies.

    Science.gov (United States)

    Zhang, Mengliang; Gao, Feng; Cui, Xiangyong; Zhang, Yunhui; Sun, Yantong; Gu, Jingkai

    2011-02-01

    A rapid and sensitive LC-MS-MS method for the quantitation of sertraline in human plasma was developed and validated. Sertraline and the internal standard, telmisartan, were cleaned up by protein precipitation from 100 μL of plasma sample and analyzed on a TC-C18 column (5 μm, 150 × 4.6 mm i.d.) using 70% acetonitrile and 30% 10 mM ammonium acetate (0.1% formic acid) as mobile phase. The method was demonstrated to be linear from 0.1 ng/mL to 50 ng/mL, with a lower limit of quantitation of 0.1 ng/mL. Intra- and inter-day precision were below 4.40% and 3.55%, respectively. Recoveries of sertraline at low, medium, and high levels were 88.0 ± 2.3%, 88.2 ± 1.9%, and 90.0 ± 2.0%, respectively. The method was successfully applied to a bioequivalence study of sertraline after a single oral administration of 50 mg sertraline hydrochloride tablets.

  2. Development of a quantitative method for trace elements determination in ores by XRF: an application to phosphorite from Olinda (PE), Brazil

    International Nuclear Information System (INIS)

    Imakuma, K.; Sato, I.M.; Cretella Neto, J.; Costa, M.I.

    1976-01-01

    A quantitative X-ray fluorescence method for determining trace amounts of Zn, Cu and Ni in a phosphorite ore from Olinda, PE, Brazil was established. The double dilution method, with borax as the melting flux, was chosen because ores diluted in borax as fused samples show matrix effects with respect to the element being analysed; it was possible to identify the elements already present in the ore that interfered with the Zn, Cu and Ni determinations. Such elements included Ca, and their quantities were subsequently determined. The addition of appropriate quantities of Fe and Ca to the standards made it possible to minimize the matrix effects without the undesired introduction of extraneous elements into the ore; moreover, the need to know the exact amounts of Fe and Ca present in the ore led to the simultaneous development of another analytical method suitable for measuring medium to high contents. This method also made use of the fusion-dilution technique. These methods offer advantages such as quantitative analysis with highly reproducible results and extension to routine determinations on all kinds of ores. The main sources of error can be controlled, allowing an accuracy as good as ±1 ppm for Cu, ±4 ppm for Ni, ±6 ppm for Zn and ±1% for both Fe and Ca under the most unfavourable conditions.

  3. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  4. A 96-well-plate-based optical method for the quantitative and qualitative evaluation of Pseudomonas aeruginosa biofilm formation and its application to susceptibility testing.

    Science.gov (United States)

    Müsken, Mathias; Di Fiore, Stefano; Römling, Ute; Häussler, Susanne

    2010-08-01

    A major reason for bacterial persistence during chronic infections is the survival of bacteria within biofilm structures, which protect cells from environmental stresses, host immune responses and antimicrobial therapy. Thus, there is concern that laboratory methods developed to measure the antibiotic susceptibility of planktonic bacteria may not be relevant to chronic biofilm infections, and it has been suggested that alternative methods should test antibiotic susceptibility within a biofilm. In this paper, we describe a fast and reliable protocol for using 96-well microtiter plates for the formation of Pseudomonas aeruginosa biofilms; the method is easily adaptable for antimicrobial susceptibility testing. This method is based on bacterial viability staining in combination with automated confocal laser scanning microscopy. The procedure simplifies qualitative and quantitative evaluation of biofilms and has proven to be effective for standardized determination of antibiotic efficiency on P. aeruginosa biofilms. The protocol can be performed within approximately 60 h.

  5. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    International Nuclear Information System (INIS)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai; Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun; Chu, Young Hwan; Cho, Kwang-Hwi

    2016-01-01

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available from the Internet at http://qsrvr.opengsi.org/.

  6. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai [Bioinformatics and Molecular Design Research Center, Seoul (Korea, Republic of); Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun [Yonsei University, Seoul (Korea, Republic of); Chu, Young Hwan [Sangji University, Wonju (Korea, Republic of); Cho, Kwang-Hwi [Soongsil University, Seoul (Korea, Republic of)

    2016-04-15

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available from the Internet at http://qsrvr.opengsi.org/.

  7. Application of quantitative light-induced fluorescence to determine the depth of demineralization of dental fluorosis in enamel microabrasion: a case report

    Directory of Open Access Journals (Sweden)

    Tae-Young Park

    2016-08-01

    Enamel microabrasion has become accepted as a conservative, nonrestorative method of removing intrinsic and superficial dysmineralization defects from dental fluorosis, restoring esthetics with minimal loss of enamel. However, it can be difficult to determine if restoration is necessary in dental fluorosis, because the lesion depth is often not easily recognized. This case report presents a method for the analysis of enamel hypoplasia that uses quantitative light-induced fluorescence (QLF), followed by a combination of enamel microabrasion with carbamide peroxide home bleaching. We describe the utility of QLF when selecting a conservative treatment plan and confirming treatment efficacy. In this case, the treatment plan was based on QLF analysis, and the selected combination treatment of microabrasion and bleaching had good results.

  8. Analytical performance of reciprocal isotope labeling of proteome digests for quantitative proteomics and its application for comparative studies of aerobic and anaerobic Escherichia coli proteomes

    International Nuclear Information System (INIS)

    Lo, Andy; Weiner, Joel H.; Li, Liang

    2013-01-01

    Highlights: • Investigating a strategy of reciprocal isotope labeling of comparative samples. • Filtering out incorrect peptide identification or quantification values. • Analyzing the proteome changes of E. coli cells grown aerobically or anaerobically. • Presenting guidelines for reciprocal labeling experimental design. Abstract: Due to limited sample amounts, instrument time considerations, and reagent costs, only a small number of replicate experiments are typically performed for quantitative proteome analyses. Generation of reproducible data that can be readily assessed for consistency within a small number of datasets is critical for accurate quantification. We report our investigation of a strategy using reciprocal isotope labeling of two comparative samples as a tool for determining proteome changes. Reciprocal labeling was evaluated to determine the internal consistency of quantified proteome changes from Escherichia coli grown under aerobic and anaerobic conditions. Qualitatively, the peptide overlap between replicate analyses of the same sample and reverse-labeled samples was within 8%. Quantitatively, reciprocal analyses showed only a slight increase in average overall inconsistency compared with replicate analyses (1.29- vs. 1.24-fold difference). Most importantly, reverse labeling was successfully used to identify spurious values resulting from incorrect peptide identifications and poor peak fitting. After removal of 5% of the peptide data with low reproducibility, a total of 275 differentially expressed proteins (>1.50-fold difference) were consistently identified and then subjected to bioinformatics analysis. General considerations and guidelines for reciprocal labeling experimental design and the biological significance of the obtained results are discussed.
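
    A minimal sketch of the reciprocal-labeling consistency check described above: in a label-swapped run the expected ratio is the reciprocal of the forward run, so the product of the two measured ratios should be close to one. The function name, threshold and example values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def filter_reciprocal(ratios_forward, ratios_reverse, max_fold_inconsistency=1.5):
    """Compare peptide ratios from forward and reverse (label-swapped) runs.

    In the reverse-labeled run the expected ratio is the reciprocal of the
    forward run, so forward * reverse should be ~1 for consistent peptides.
    Peptides whose product deviates by more than the chosen fold threshold
    are flagged as irreproducible.  Illustrative sketch only.
    """
    ratios_forward = np.asarray(ratios_forward, dtype=float)
    ratios_reverse = np.asarray(ratios_reverse, dtype=float)
    product = ratios_forward * ratios_reverse
    inconsistency = np.maximum(product, 1.0 / product)   # fold deviation from 1
    keep = inconsistency <= max_fold_inconsistency
    return keep, inconsistency

keep, inc = filter_reciprocal([2.1, 0.9, 3.0], [0.5, 1.2, 1.1])
print(keep, inc)  # third peptide (3.0 forward vs. 1.1 reverse) is flagged as inconsistent
```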

  9. Socioeconomic influences on biodiversity, ecosystem services and human well-being: a quantitative application of the DPSIR model in Jiangsu, China.

    Science.gov (United States)

    Hou, Ying; Zhou, Shudong; Burkhard, Benjamin; Müller, Felix

    2014-08-15

    One focus of ecosystem service research is the connection between biodiversity, ecosystem services and human well-being, as well as the socioeconomic influences on them. Despite existing investigations, the exact impacts of the human system on the dynamics of biodiversity, ecosystem services and human well-being remain uncertain because of the insufficiency of the respective quantitative analyses. Our research aims to discern the socioeconomic influences on biodiversity, ecosystem services and human well-being and to demonstrate the mutual impacts between these items. We propose a DPSIR framework coupling ecological integrity, ecosystem services as well as human well-being and suggest DPSIR indicators for the case study area Jiangsu, China. Based on available statistical and surveying data, we revealed the factors significantly impacting biodiversity, ecosystem services and human well-being in the research area through factor analysis and correlation analysis, using the 13 prefecture-level cities of Jiangsu as samples. The results show that urbanization and industrialization in the urban areas have predominantly positive influences on regional biodiversity, agricultural productivity and tourism services as well as on rural residents' living standards. Additionally, the knowledge, technology and finance inputs for agriculture also have generally positive impacts on these system components. Concerning regional carbon storage, non-cropland vegetation cover plays a significant positive role. Contrarily, the expansion of farming land and the increase of total food production are two important negative influential factors for biodiversity, the ecosystem's food provisioning service capacity, regional tourism income and the well-being of the rural population. Our study provides a promising approach based on the DPSIR model to quantitatively capture the socioeconomic influential factors of biodiversity, ecosystem services and human well-being for human-environmental systems.

  10. Application of ovine luteinizing hormone (LH) radioimmunoassay in the quantitation of LH in different mammalian species [¹²⁵I tracer technique]

    Energy Technology Data Exchange (ETDEWEB)

    Millar, R.P.; Aehnelt, C.

    1977-09-01

    A sensitive double antibody radioimmunoassay has been developed for measuring luteinizing hormone (LH) in various African mammalian species, using rabbit anti-ovine LH serum (GDN 15) and radioiodinated rat LH or ovine LH. Serum and pituitary homogenates from some African mammals (hyrax, reedbuck, sable, impala, tsessebe, thar, spring-hare, ground squirrel and cheetah, as well as the domestic sheep, cow and horse and laboratory rat and hamster) produced displacement curves parallel to that of the ovine LH standards. The specificity of the assay was examined in detail for one species, the rock hyrax. Radioimmunoassay and bioassay estimates of LH in hyrax pituitaries containing widely differing quantities of pituitary hormones were similar. In sexually active male hyrax mean plasma LH was 12.1 ng/ml and pituitary LH 194 μg/gland, but in sexually quiescent hyrax mean plasma LH was 2.4 ng/ml and mean pituitary LH 76 μg/gland. Intravenous injection of 10 μg of luteinizing hormone releasing hormone increased mean LH levels in hyrax from 0.9 ng/ml to 23.2 ng/ml by 30 min. Conversely, im injection of 250 μg testosterone induced a fall in LH levels in male hyrax from 1.7 ng/ml to 0.7 ng/ml 6 h after administration. Although the specificity of the assay for quantitating plasma LH in other species was not categorically established, there was a good correlation between plasma LH concentration and reproductive state in the bontebok, impala, spring-hare, thar, cheetah, domestic horse and laboratory rat, suggesting the potential use of the antiserum in quantitating LH in a variety of mammalian species.

  11. Development of pharmacophore similarity-based quantitative activity hypothesis and its applicability domain: applied on a diverse data-set of HIV-1 integrase inhibitors.

    Science.gov (United States)

    Kumar, Sivakumar Prasanth; Jasrai, Yogesh T; Mehta, Vijay P; Pandya, Himanshu A

    2015-01-01

    A quantitative pharmacophore hypothesis combines the 3D spatial arrangement of pharmacophore features with the biological activities of the ligand data-set and predicts the activities of geometrically and/or pharmacophorically similar ligands. Most pharmacophore discovery programs face difficulties in conformational flexibility, molecular alignment, pharmacophore feature sampling, and feature selection for scoring models if the data-set comprises diverse ligands. With this focus, we describe a ligand-based computational procedure to introduce flexibility in aligning the small molecules and generating a pharmacophore hypothesis without geometrical constraints to define the pharmacophore space, enriched with the chemical features necessary to elucidate common pharmacophore hypotheses (CPHs). A maximal common substructure (MCS)-based alignment method was adopted to guide the alignment of the carbon skeletons, decipher the MCS atom connectivity to cluster molecules in bins and subsequently calculate the pharmacophore similarity matrix with the bin-specific reference molecules. After alignment, the carbon skeletons were enriched with the original atoms in their respective positions and conventional pharmacophore features were perceived. Distance-based pharmacophoric descriptors were enumerated by computing the distances between the perceived features and the MCS-aligned 'centroid' position. The descriptor set and biological activities were used to develop support vector machine models to predict the activities of the external test set. Finally, a fitness score was estimated based on pharmacophore similarity with the bin-specific reference molecules to recognize the best and poor alignments, and also with each reference molecule to predict outliers of the quantitative hypothesis model. We applied this procedure to a diverse data-set of 40 HIV-1 integrase inhibitors and discussed its effectiveness with respect to the reported CPH model.
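
    The final modeling step described above (support vector machine regression on distance-based pharmacophore descriptors to predict activities of an external test set) can be sketched as below; the descriptor values and activities are synthetic placeholders, and scikit-learn's SVR is used as a generic stand-in for whatever SVM implementation the authors employed.

```python
import numpy as np
from sklearn.svm import SVR

# Synthetic placeholder data: rows are ligands, columns are distance-based
# pharmacophore descriptors (e.g. feature-to-centroid distances in angstroms).
rng = np.random.default_rng(0)
X_train = rng.uniform(2.0, 12.0, size=(40, 6))
y_train = rng.uniform(4.0, 9.0, size=40)          # e.g. pIC50 values (placeholders)
X_test = rng.uniform(2.0, 12.0, size=(5, 6))

model = SVR(kernel="rbf", C=10.0, epsilon=0.1)    # support vector regression
model.fit(X_train, y_train)
print(model.predict(X_test))                       # predicted activities of the test set
```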

  12. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  13. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  14. Applicability of integrated cell culture quantitative PCR (ICC-qPCR) for the detection of infectious adenovirus type 2 in UV disinfection studies

    Science.gov (United States)

    Human adenovirus is relatively resistant to UV radiation and has been used as a conservative testing microbe for evaluations of UV disinfection systems as components of water treatment processes. In this study, we attempted to validate the applicability of integrated cell culture...

  15. Quantitative neutron radiography using neutron absorbing honeycomb

    International Nuclear Information System (INIS)

    Tamaki, Masayoshi; Oda, Masahiro; Takahashi, Kenji; Ohkubo, Kohei; Tasaka, Kanji; Tsuruno, Akira; Matsubayashi, Masahito.

    1993-01-01

    This investigation concerns quantitative neutron radiography and computed tomography using a neutron-absorbing honeycomb collimator. By setting the neutron-absorbing honeycomb collimator between the object and the imaging system, neutrons scattered in the object were absorbed by the honeycomb material and eliminated before reaching the imaging system, whereas neutrons transmitted through the object without interaction could reach the imaging system. The image formed by purely transmitted neutrons provides the quantitative information. Two honeycombs, coated with boron nitride and gadolinium oxide respectively, were prepared and evaluated for quantitative application. The measured attenuation coefficients were in fairly good agreement with the neutron total cross sections. Application to quantitative computed tomography was also successfully conducted. The new neutron radiography method, using the neutron-absorbing honeycomb collimator to eliminate scattered neutrons, remarkably improved the quantitative accuracy of neutron radiography and computed tomography. (author)
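
    The agreement between the attenuation coefficient and the neutron total cross section mentioned above follows from the narrow-beam attenuation law that purely transmitted neutrons obey; it is sketched here for reference with standard symbols.

```latex
% Narrow-beam (purely transmitted) neutron attenuation, the relation the
% honeycomb collimator is meant to restore by removing scattered neutrons:
\[
  I \;=\; I_0\, e^{-\mu t},
  \qquad
  \mu \;=\; N\,\sigma_{\mathrm{tot}} \;=\; \frac{\rho N_A}{M}\,\sigma_{\mathrm{tot}},
\]
% where $t$ is the object thickness, $N$ the atomic number density,
% $\rho$ the mass density, $M$ the molar mass and
% $\sigma_{\mathrm{tot}}$ the total neutron cross section.
```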

  16. Development and Application of an MSALL-Based Approach for the Quantitative Analysis of Linear Polyethylene Glycols in Rat Plasma by Liquid Chromatography Triple-Quadrupole/Time-of-Flight Mass Spectrometry.

    Science.gov (United States)

    Zhou, Xiaotong; Meng, Xiangjun; Cheng, Longmei; Su, Chong; Sun, Yantong; Sun, Lingxia; Tang, Zhaohui; Fawcett, John Paul; Yang, Yan; Gu, Jingkai

    2017-05-16

    Polyethylene glycols (PEGs) are synthetic polymers composed of repeating ethylene oxide subunits. They display excellent biocompatibility and are widely used as pharmaceutical excipients. Fully understanding the biological fate of PEGs requires accurate and sensitive analytical methods for their quantitation. Application of conventional liquid chromatography-tandem mass spectrometry (LC-MS/MS) is difficult because PEGs have polydisperse molecular weights (MWs) and tend to produce multicharged ions in-source, resulting in innumerable precursor ions. As a result, multiple reaction monitoring (MRM) fails to scan all ion pairs, so information on the fate of unselected ions is missed. This article addresses the problem by applying liquid chromatography-triple-quadrupole/time-of-flight mass spectrometry (LC-Q-TOF MS) based on the MSALL technique. This technique performs information-independent acquisition by allowing all PEG precursor ions to enter the collision cell (Q2). In-quadrupole collision-induced dissociation (CID) in Q2 then effectively generates several fragments from all PEGs owing to the high collision energy (CE). A particular PEG product ion (m/z 133.08592) was found to be common to all linear PEGs and allowed their total quantitation in rat plasma with high sensitivity, excellent linearity and reproducibility. Assay validation showed the method was linear for all linear PEGs over the concentration range 0.05-5.0 μg/mL. The assay was successfully applied to a pharmacokinetic study in rats involving intravenous administration of linear PEG 600, PEG 4000, and PEG 20000. It is anticipated that the method will have wide-ranging applications and stimulate the development of assays for other pharmaceutical polymers in the future.

  17. EASY: a simple tool for simultaneously removing background, deadtime and acoustic ringing in quantitative NMR spectroscopy--part I: basic principle and applications.

    Science.gov (United States)

    Jaeger, Christian; Hemmann, Felix

    2014-01-01

    Elimination of Artifacts in NMR SpectroscopY (EASY) is a simple but very effective tool for simultaneously removing any real NMR probe background signal, any spectral distortions due to deadtime ringdown effects and, specifically, severe acoustic ringing artifacts in NMR spectra of low-gamma nuclei. EASY enables and maintains quantitative NMR (qNMR), as only a single pulse (preferably 90°) is used for data acquisition. After the acquisition of the first scan (which contains the wanted NMR signal and the background/deadtime/ringing artifacts), the same experiment is repeated immediately afterwards, before the T1 waiting delay. This second scan contains only the background/deadtime/ringing parts. Hence, the simple difference of the two yields clean NMR line shapes free of artifacts. In this Part I, various examples of complete ¹H, ¹¹B, ¹³C and ¹⁹F probe background removal caused by construction parts of the NMR probes are presented. Furthermore, ²⁵Mg EASY of Mg(OH)₂ is presented, and this example shows how extremely strong acoustic ringing can be suppressed (by more than a factor of 200) such that phase and baseline correction for spectra acquired with a single pulse is no longer a problem. EASY is also a step towards deadtime-free data acquisition, as these effects are also canceled completely. EASY can be combined with any other NMR experiment, including 2D NMR, if baseline distortions are a big problem. © 2013 Published by Elsevier Inc.
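
    The EASY scheme described above reduces, at the data level, to a simple difference of two acquisitions: the first scan containing signal plus background/deadtime/ringing artifacts, and the immediately repeated scan containing the artifacts only. A minimal sketch with synthetic data (not the authors' acquisition code) is given below.

```python
import numpy as np

def easy_difference(scan1, scan2):
    """EASY-style artifact removal: scan1 = NMR signal + background/ringing,
    scan2 = background/ringing only (re-excited before T1 recovery),
    so the difference retains only the wanted signal."""
    return np.asarray(scan1) - np.asarray(scan2)

# Synthetic illustration (all waveforms invented for demonstration)
t = np.linspace(0.0, 0.1, 2048)                                       # acquisition time axis (s)
signal = np.exp(-t / 0.02) * np.cos(2 * np.pi * 500 * t)              # wanted FID
ringing = 0.5 * np.exp(-t / 0.002) * np.cos(2 * np.pi * 5000 * t)     # acoustic ringing / background
scan1 = signal + ringing
scan2 = ringing                                                        # second scan: artifacts only
clean = easy_difference(scan1, scan2)
print(np.allclose(clean, signal))                                      # True
```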

  18. Quantitative characterization of chitosan in the skin by Fourier-transform infrared spectroscopic imaging and ninhydrin assay: application in transdermal sciences.

    Science.gov (United States)

    Nawaz, A; Wong, T W

    2016-07-01

    Chitosan has been used as the primary excipient in transdermal particulate dosage form design. Its distribution pattern across the epidermis and dermis is not easily accessible through chemical assay and has been limited to radiolabelled molecules via quantitative autoradiography. This study explored a Fourier-transform infrared spectroscopic imaging technique with a built-in microscope as a means to examine the molecular distribution of chitosan over the epidermis and dermis with the aid of histological processing. Fourier-transform infrared spectroscopic skin imaging was conducted using chitosan of varying molecular weights, deacetylation degrees, particle sizes and zeta potentials, obtained via microwave ligation of polymer chains in the solution state. Both the skin permeation and retention characteristics of chitosan increased with the use of smaller chitosan molecules with reduced acetyl content and size, and increased positive charge density. The ratio of epidermal to dermal chitosan content decreased with the use of these chitosan molecules, as their accumulation in the dermis (3.90% to 18.22%) was raised to a greater extent than in the epidermis (0.62% to 1.92%). A larger dermal chitosan accumulation nonetheless did not promote transdermal polymer passage more than the epidermal chitosan. A small increase in epidermal chitosan content apparently could fluidize the stratum corneum and was more essential in dictating molecular permeation into the dermis and systemic circulation. The histology-aided Fourier-transform infrared spectroscopic imaging approach introduces a new dimension to the mechanistic study of chitosan in transdermal delivery. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  19. The optokinetic reflex as a tool for quantitative analyses of nervous system function in mice: application to genetic and drug-induced variation.

    Directory of Open Access Journals (Sweden)

    Hugh Cahill

    2008-04-01

    The optokinetic reflex (OKR), which serves to stabilize a moving image on the retina, is a behavioral response that has many favorable attributes as a test of CNS function. The OKR requires no training, assesses the function of diverse CNS circuits, can be induced repeatedly with minimal fatigue or adaptation, and produces an electronic record that is readily and objectively quantifiable. We describe a new type of OKR test apparatus in which computer-controlled visual stimuli and streamlined data analysis facilitate a relatively high throughput behavioral assay. We used this apparatus, in conjunction with infrared imaging, to quantify basic OKR stimulus-response characteristics for C57BL/6J and 129/SvEv mouse strains and for genetically engineered lines lacking one or more photoreceptor systems or with an alteration in cone spectral sensitivity. A second generation (F2) cross shows that the characteristic difference in OKR frequency between C57BL/6J and 129/SvEv is inherited as a polygenic trait. Finally, we demonstrate the sensitivity and high temporal resolution of the OKR for quantitative analysis of CNS drug action. These experiments show that the mouse OKR is well suited for neurologic testing in the context of drug discovery and large-scale phenotyping programs.

  20. Direct resolution and quantitative analysis of flurbiprofen enantiomers using microcrystalline cellulose triacetate plates: applications to the enantiomeric purity control and optical isomer determination in widely consumed drugs.

    Science.gov (United States)

    Del Bubba, M; Checchini, L; Ciofi, L; Furlanetto, S; Lepri, L

    2014-01-01

    Flurbiprofen enantiomers have very different pharmacological properties, since the (S)-(+) form has a much higher anti-inflammatory activity than the (R)-(-) isomer, the latter being responsible for very undesirable side effects such as gastrointestinal irritation. Based on the different biological properties of the flurbiprofen enantiomers, the development of chiral chromatographic methods for the control of enantiomeric purity is a very important topic. In this study the separation of flurbiprofen enantiomers was achieved using, for the first time, noncommercial MCTA layers with polyvinyl alcohol as binder, which gives these plates a mechanical stability equivalent to that of marketed ones. Baseline resolution (α = 1.31; RS = 2.0) was obtained with an ethanol-acetic acid solution (pH 3.0 ± 0.1; 60:40, v/v) as eluent and a migration distance of about 14.5 cm. Under these experimental conditions, the thin-layer chromatographic determination of the enantiomeric purity of the pharmacologically active (S)-(+)-flurbiprofen in the presence of 1% of the undesired (R)-(-) form was demonstrated. Moreover, the quantitative analysis of flurbiprofen enantiomers was achieved, with quantification limits and detection limits of 50 and 25 ng of each enantiomer applied to the plate, respectively. The method was successfully applied to the enantiomer determination in widely consumed drugs, giving results consistent with the flurbiprofen content declared in the drug facts. Copyright © 2013 John Wiley & Sons, Ltd.

  1. Application of High-Performance Liquid Chromatography Coupled with Linear Ion Trap Quadrupole Orbitrap Mass Spectrometry for Qualitative and Quantitative Assessment of Shejin-Liyan Granule Supplements.

    Science.gov (United States)

    Gu, Jifeng; Wu, Weijun; Huang, Mengwei; Long, Fen; Liu, Xinhua; Zhu, Yizhun

    2018-04-11

    A method for high-performance liquid chromatography coupled with linear ion trap quadrupole Orbitrap high-resolution mass spectrometry (HPLC-LTQ-Orbitrap MS) was developed and validated for the qualitative and quantitative assessment of Shejin-liyan Granule. According to the fragmentation mechanisms and high-resolution MS data, 54 compounds, including fourteen isoflavones, eleven lignans, eight flavonoids, six physalins, six organic acids, four triterpenoid saponins, two xanthones, two alkaloids, and one licorice coumarin, were identified or tentatively characterized. In addition, ten of the representative compounds (matrine, galuteolin, tectoridin, iridin, arctiin, tectorigenin, glycyrrhizic acid, irigenin, arctigenin, and irisflorentin) were quantified using the validated HPLC-LTQ-Orbitrap MS method. The method validation showed good linearity, with coefficients of determination (r²) above 0.9914 for all analytes. The accuracy of the intra- and inter-day determinations of the investigated compounds was 95.0-105.0%, and the precision values were less than 4.89%. The mean recoveries and reproducibilities for each analyte were 95.1-104.8%, with relative standard deviations below 4.91%. The method successfully quantified the ten compounds in Shejin-liyan Granule, and the results show that the method is accurate, sensitive, and reliable.

  2. Simultaneous quantitation of lamivudine, zidovudine and nevirapine in human plasma by liquid chromatography–tandem mass spectrometry and application to a pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Murali Krishna Matta

    2012-10-01

    A rapid and sensitive LC–MS/MS method for the simultaneous quantitation of lamivudine, zidovudine and nevirapine in human plasma, using abacavir as internal standard, has been developed and validated. The analytes and IS were extracted from plasma by solid phase extraction using Oasis HLB cartridges and separated on a Hypurity Advance C18 column using a mixture of acetonitrile:0.1% formic acid (76:24, v/v) at a flow rate of 0.8 mL/min. Detection involved an API-4000 LC–MS/MS with electrospray ionization in the positive ion mode and multiple-reaction monitoring for analysis. The method was validated according to FDA guidelines and shown to provide intra- and inter-day precision and accuracy within acceptable limits in a run time of only 3.5 min. The method was successfully applied to a bioequivalence study involving a single oral administration of a combination tablet to human male volunteers.

  3. Development of an LC–MS/MS method for the quantitation of deoxyglycychloxazol in rat plasma and its application in pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Rongshan Li

    2016-06-01

    Deoxyglycychloxazol (TY501) is a glycyrrhetinic acid derivative which exhibits high anti-inflammatory activity and reduced pseudoaldosteronism compared to glycyrrhetinic acid. In this study, a sensitive and rapid liquid chromatography–tandem mass spectrometry (LC–MS/MS) method was established for the quantitation of TY501 in rat plasma. Plasma samples were treated by precipitating protein with methanol, and the supernatants were separated on a Symmetry C8 column with a mobile phase consisting of methanol and 10 mM ammonium formate (containing 0.1% formic acid) (90:10, v/v). The selected reaction monitoring (SRM) transitions were performed at m/z 647.4→191.2 for TY501 and m/z 473.3→143.3 for astragaloside aglycone (IS) in the positive ion mode with an atmospheric pressure chemical ionization (APCI) source. The calibration curve was linear over the concentration range of 5–5000 ng/mL. The lower limit of quantification was 5 ng/mL. The mean recovery was over 88%. The intra- and inter-day precisions were lower than 6.0% and 12.8%, respectively, and the accuracy was within ±1.3%. TY501 was stable under usual storage conditions and handling procedures. The validated method has been successfully applied to a pharmacokinetic study after oral administration of TY501 to rats at a dosage of 10 mg/kg.

  4. Application of Quantitative Microbial Risk Assessment to analyze the public health risk from poor drinking water quality in a low income area in Accra, Ghana.

    Science.gov (United States)

    Machdar, E; van der Steen, N P; Raschid-Sally, L; Lens, P N L

    2013-04-01

    In Accra, Ghana, a majority of inhabitants live in over-crowded areas with limited access to piped water supply, which is often also intermittent. This study assessed, in a densely populated area, the risk from microbial contamination of various sources of drinking water by conducting a Quantitative Microbiological Risk Assessment (QMRA) to estimate the risk to human health from microorganism exposure using dose-response relationships. Furthermore, the cost-effectiveness of reducing the disease burden through targeted interventions was evaluated. Five risk pathways for drinking water were identified through a survey (110 families), namely household storage, private yard taps, communal taps, communal wells and water sachets. Samples from each source were analyzed for Escherichia coli and Ascaris contamination. Published ratios between E. coli and other pathogens were used for the QMRA and disease burden calculations. The major part of the burden of disease originated from E. coli O157:H7 (78%) and the least important contributor was Cryptosporidium (0.01%). Other pathogens contributed 16% (Campylobacter), 5% (Rotavirus) and 0.3% (Ascaris). The sum of the disease burden of these pathogens was 0.5 DALYs per person per year, which is much higher than the WHO reference level. The major contamination pathway was found to be household storage. Disinfection of water at household level was the most cost-effective intervention; water supply network improvements were significantly less cost-effective. Copyright © 2013 Elsevier B.V. All rights reserved.
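
    A minimal sketch of the QMRA chain used in studies of this kind, from ingested dose through a dose-response model to an annual disease burden; the exponential dose-response form and all parameter values below are illustrative placeholders, not the pathogen ratios or parameters used in the Accra study.

```python
import numpy as np

def annual_dalys(conc_per_litre, litres_per_day, r, daly_per_case,
                 p_ill=1.0, days=365):
    """Single-pathogen QMRA sketch (exponential dose-response model).

    conc_per_litre : pathogen concentration in drinking water
    litres_per_day : daily consumption
    r              : exponential dose-response parameter
    daly_per_case  : disease burden per case of illness
    p_ill          : probability of illness given infection
    All parameter values used below are illustrative placeholders.
    """
    dose = conc_per_litre * litres_per_day
    p_inf_daily = 1.0 - np.exp(-r * dose)
    p_inf_annual = 1.0 - (1.0 - p_inf_daily) ** days
    return p_inf_annual * p_ill * daly_per_case

print(annual_dalys(conc_per_litre=0.1, litres_per_day=1.5,
                   r=0.5, daly_per_case=0.01))
```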

  5. Application of High-Performance Liquid Chromatography Coupled with Linear Ion Trap Quadrupole Orbitrap Mass Spectrometry for Qualitative and Quantitative Assessment of Shejin-Liyan Granule Supplements

    Directory of Open Access Journals (Sweden)

    Jifeng Gu

    2018-04-01

    A method for high-performance liquid chromatography coupled with linear ion trap quadrupole Orbitrap high-resolution mass spectrometry (HPLC-LTQ-Orbitrap MS) was developed and validated for the qualitative and quantitative assessment of Shejin-liyan Granule. According to the fragmentation mechanisms and high-resolution MS data, 54 compounds, including fourteen isoflavones, eleven lignans, eight flavonoids, six physalins, six organic acids, four triterpenoid saponins, two xanthones, two alkaloids, and one licorice coumarin, were identified or tentatively characterized. In addition, ten of the representative compounds (matrine, galuteolin, tectoridin, iridin, arctiin, tectorigenin, glycyrrhizic acid, irigenin, arctigenin, and irisflorentin) were quantified using the validated HPLC-LTQ-Orbitrap MS method. The method validation showed good linearity, with coefficients of determination (r²) above 0.9914 for all analytes. The accuracy of the intra- and inter-day determinations of the investigated compounds was 95.0–105.0%, and the precision values were less than 4.89%. The mean recoveries and reproducibilities for each analyte were 95.1–104.8%, with relative standard deviations below 4.91%. The method successfully quantified the ten compounds in Shejin-liyan Granule, and the results show that the method is accurate, sensitive, and reliable.

  6. Quantitative anatomical analysis of facial expression using a 3D motion capture system: Application to cosmetic surgery and facial recognition technology.

    Science.gov (United States)

    Lee, Jae-Gi; Jung, Su-Jin; Lee, Hyung-Jin; Seo, Jung-Hyuk; Choi, You-Jin; Bae, Hyun-Sook; Park, Jong-Tae; Kim, Hee-Jin

    2015-09-01

    The topography of the facial muscles differs between males and females and among individuals of the same gender. To explain the unique expressions that people can make, it is important to define the shapes of the muscles, their associations with the skin, and their relative functions. Three-dimensional (3D) motion-capture analysis, often used to study facial expression, was used in this study to identify characteristic skin movements in males and females when they made six representative basic expressions. The movements of 44 reflective markers (RMs) positioned on anatomical landmarks were measured. Their mean displacement was large in males [ranging from 14.31 mm (fear) to 41.15 mm (anger)], and 3.35-4.76 mm smaller in females [ranging from 9.55 mm (fear) to 37.80 mm (anger)]. The percentages of RMs involved in the ten highest mean maximum displacement values in making at least one expression were 47.6% in males and 61.9% in females. The movements of the RMs were thus larger in males than in females but involved a more limited set of markers. Expanding our understanding of facial expression requires morphological studies of facial muscles together with studies of their related complex functionality. Conducting these together with quantitative analyses, as in the present study, will yield data valuable for medicine, dentistry, and engineering, for example for surgical operations on facial regions, software for predicting changes in facial features and expressions after corrective surgery, and the development of face-mimicking robots. © 2015 Wiley Periodicals, Inc.

  7. An overview on development and application of an experimental platform for quantitative cardiac imaging research in rabbit models of myocardial infarction.

    Science.gov (United States)

    Feng, Yuanbo; Bogaert, Jan; Oyen, Raymond; Ni, Yicheng

    2014-10-01

    To exploit the advantages of using rabbits for cardiac imaging research and to tackle the technical obstacles, efforts have been made under the framework of a doctoral research program. In this overview article, by cross-referencing the current literature, we summarize how we have developed a preclinical cardiac research platform based on modified models of reperfused myocardial infarction (MI) in rabbits; how the in vivo manifestations of cardiac imaging could be closely matched with those ex vivo macro- and microscopic findings; how these imaging outcomes could be quantitatively analyzed, validated and demonstrated; and how we could apply this cardiac imaging platform to provide possible solutions to certain lingering diagnostic and therapeutic problems in experimental cardiology. In particular, tissue components in acute cardiac ischemia have been stratified and characterized, post-infarct lipomatous metaplasia (LM) as a common but hardly illuminated clinical pathology has been identified in rabbit models, and a necrosis avid tracer as well as an anti-ischemic drug have been successfully assessed for their potential utilities in clinical cardiology. These outcomes may interest the researchers in the related fields and help strengthen translational research in cardiovascular diseases.

  8. Quantitative inspection by computerized tomography

    International Nuclear Information System (INIS)

    Lopes, R.T.; Assis, J.T. de; Jesus, E.F.O. de

    1989-01-01

    Computerized tomography (CT) is a nondestructive testing method that furnishes quantitative information, permitting the detection and accurate localization of defects, the measurement of internal dimensions, and the measurement and mapping of the density distribution. CT technology is highly versatile, presenting no restrictions with respect to the form, size or composition of the object. A tomographic system, designed and constructed in our laboratory, is presented. The applications and limitations of this system, illustrated by tomographic images, are shown. (V.R.B.)

  9. Effect of Sodium Chloride Concentrations and Its Foliar Application Time on Quantitative and Qualitative Characteristics of Pomegranate Fruit (Punica granatum L.) CV. “Malas Saveh”

    OpenAIRE

    V. Rouhi; A. Nikbakht; S. Houshmand

    2016-01-01

    Introduction: Pomegranate (Punica granatum L.), which belongs to the Punicaceae family, is native to Iran and grown extensively in arid and semi-arid regions worldwide. Pomegranate is also important in human medicine, and its components have a wide range of clinical applications. Cracking causes major fruit loss, which is a serious commercial loss to farmers. Fruit cracking seems to be a problem that greatly lessens marketability. Fruit cracking is one of the physiological disorders wherev...

  10. A novel baseline-correction method for standard addition based derivative spectra and its application to quantitative analysis of benzo(a)pyrene in vegetable oil samples.

    Science.gov (United States)

    Li, Na; Li, Xiu-Ying; Zou, Zhe-Xiang; Lin, Li-Rong; Li, Yao-Qun

    2011-07-07

    In the present work, a baseline-correction method based on peak-to-derivative-baseline measurement was proposed for the elimination of complex matrix interference, mainly caused by unknown components and/or background, in the analysis of derivative spectra. This novel method is particularly applicable when the matrix interfering components show a broad spectral band, which is common in practical analysis. The derivative baseline is established by connecting two crossing points of the spectral curves obtained with a standard addition method (SAM). The applicability and reliability of the proposed method were demonstrated through both theoretical simulation and practical application. Firstly, Gaussian bands were used to simulate 'interfering' and 'analyte' bands to investigate the effect of different parameters of the interfering band on the derivative baseline. This simulation analysis verified that the accuracy of the proposed method was remarkably better than that of conventional methods such as peak-to-zero, tangent, and peak-to-peak measurements. The proposed baseline-correction method was then applied to the determination of benzo(a)pyrene (BaP) in vegetable oil samples by second-derivative synchronous fluorescence spectroscopy. Satisfactory results were obtained by using the new method to analyze a certified reference material (coconut oil, BCR®-458), with a relative error of -3.2% from the certified BaP concentration. Potentially, the proposed method can be applied to various types of derivative spectra in different fields such as UV-visible absorption spectroscopy, fluorescence spectroscopy and infrared spectroscopy.
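
    A rough sketch of the peak-to-derivative-baseline idea with synthetic Gaussian bands: second-derivative spectra are computed for a standard-addition series, the two crossing points of the series are located, and the analyte peak is measured from the straight line connecting them. All band positions, widths and addition levels are invented for illustration, and the crossing-point search is deliberately simplistic; this is not the authors' implementation.

```python
import numpy as np

def gauss(x, center, width):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Synthetic standard-addition series: broad interfering band + narrow analyte band,
# with increasing amounts of analyte standard added (illustrative only).
x = np.linspace(300.0, 500.0, 1001)
interferent = gauss(x, 400.0, 60.0)                  # broad matrix band
analyte_band = gauss(x, 405.0, 8.0)                  # narrow analyte band
additions = [0.0, 0.5, 1.0, 1.5]                     # added standard (a.u.)
spectra = [interferent + (0.8 + a) * analyte_band for a in additions]

# Second-derivative spectra (numerical differentiation).
d2 = [np.gradient(np.gradient(s, x), x) for s in spectra]

# The addition-series curves cross where the added analyte's second derivative
# is zero; locate the two crossings bracketing the derivative peak.
diff = d2[-1] - d2[0]                                # depends only on the added analyte
sign_change = np.where(np.sign(diff[:-1]) != np.sign(diff[1:]))[0]
peak_idx = np.argmin(d2[0])                          # derivative peak of the analyte band
left = sign_change[sign_change < peak_idx][-1]
right = sign_change[sign_change > peak_idx][0]

# Derivative baseline: straight line through the two crossing points;
# measure each curve's peak depth relative to that baseline.
for a, curve in zip(additions, d2):
    baseline = np.interp(x[peak_idx], [x[left], x[right]],
                         [curve[left], curve[right]])
    print(f"added {a:.1f}: peak-to-derivative-baseline = {baseline - curve[peak_idx]:.4g}")
```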

  11. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    Science.gov (United States)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of a probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climate change. In contrast, the major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thicknesses and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) the spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
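
    As a minimal illustration of coupling parameter variability with a slope-stability calculation by Monte Carlo sampling, the sketch below uses the much simpler infinite-slope factor of safety rather than the Morgenstern-Price method implemented in ALICE®; all parameter distributions and values are placeholders, not those used by BRGM.

```python
import numpy as np

def failure_probability(n=100_000, seed=0):
    """Monte Carlo probability of failure for a single slope cell.

    Minimal sketch of coupling parameter variability with a stability model:
    an infinite-slope factor of safety is used here for simplicity, NOT the
    Morgenstern-Price method implemented in ALICE. All parameter
    distributions below are illustrative placeholders.
    """
    rng = np.random.default_rng(seed)
    phi = np.radians(rng.normal(30.0, 3.0, n))      # friction angle (deg -> rad)
    c = rng.lognormal(np.log(5000.0), 0.3, n)       # effective cohesion (Pa)
    gamma = 19000.0                                  # unit weight of soil (N/m^3)
    gamma_w = 9810.0                                 # unit weight of water (N/m^3)
    z = 2.0                                          # slip surface depth (m)
    beta = np.radians(25.0)                          # slope angle
    m = rng.uniform(0.0, 1.0, n)                     # saturated fraction of the profile

    fs = (c + (gamma - m * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)) / (
        gamma * z * np.sin(beta) * np.cos(beta))
    return np.mean(fs < 1.0)                         # fraction of failed realisations

print(f"P(failure) ~ {failure_probability():.3f}")
```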

  12. Quantitative sensory testing in the German Research Network on Neuropathic Pain (DFNS): reference data for the trunk and application in patients with chronic postherpetic neuralgia.

    Science.gov (United States)

    Pfau, Doreen B; Krumova, Elena K; Treede, Rolf-Detlef; Baron, Ralf; Toelle, Thomas; Birklein, Frank; Eich, Wolfgang; Geber, Christian; Gerhardt, Andreas; Weiss, Thomas; Magerl, Walter; Maier, Christoph

    2014-05-01

    Age- and gender-matched reference values are essential for the clinical use of quantitative sensory testing (QST). To extend the standard test sites for QST, according to the German Research Network on Neuropathic Pain, to the trunk, we collected QST profiles on the back in 162 healthy subjects. Sensory profiles for the standard test sites were within normal interlaboratory differences. QST revealed lower sensitivity on the upper back than on the hand, and higher sensitivity on the lower back than on the foot, but no systematic differences between these trunk sites. Age effects were significant for most parameters. Females exhibited lower pressure pain thresholds (PPT) than males, which was the only significant gender difference. Values outside the 95% confidence interval of healthy subjects (considered abnormal) required temperature changes of >3.3-8.2 °C for thermal detection. For cold pain thresholds, confidence intervals extended mostly beyond safety cutoffs; hence only relative reference data (left-right differences, hand-trunk differences) were sufficiently sensitive. For mechanical detection and pain thresholds, left-right differences were 1.5-2.3 times more sensitive than absolute reference data. The most sensitive parameter was PPT, where side-to-side differences >35% were already abnormal. Compared to trunk reference data, patients with postherpetic neuralgia exhibited thermal and tactile deficits and dynamic mechanical allodynia, mostly without reduced mechanical pain thresholds. This pattern deviates from other types of neuropathic pain. QST reference data for the trunk will also be useful for patients with postthoracotomy pain or chronic back pain. Copyright © 2014 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  13. Simultaneous determination of linagliptin and metformin by reverse phase-high performance liquid chromatography method: An application in quantitative analysis of pharmaceutical dosage forms

    Directory of Open Access Journals (Sweden)

    Prathyusha Vemula

    2015-01-01

    To enhance patient compliance with treatment in diseases like diabetes, a combination of drugs is usually prescribed. Therefore, an anti-diabetic fixed-dose combination of 2.5 mg of linagliptin and 500 mg of metformin was taken for the simultaneous estimation of both drugs by a reverse phase-high performance liquid chromatography (RP-HPLC) method. The present study aimed to develop a simple and sensitive RP-HPLC method for the simultaneous determination of linagliptin and metformin in pharmaceutical dosage forms. The chromatographic separation was designed and evaluated using linagliptin and metformin working standard and sample solutions in the linearity range. Chromatographic separation was performed on a C18 column using a mobile phase of a 70:30 (v/v) mixture of methanol and 0.05 M potassium dihydrogen orthophosphate (pH adjusted to 4.6 with orthophosphoric acid) delivered at a flow rate of 0.6 mL/min, with UV detection at 267 nm. Linagliptin and metformin showed linearity in the ranges of 2-12 μg/mL and 400-2400 μg/mL, respectively, with correlation coefficients of 0.9996 and 0.9989. The resultant findings were analyzed for standard deviation (SD) and relative standard deviation to validate the developed method. The retention times of linagliptin and metformin were found to be 6.3 and 4.6 min, and separation was complete in <10 min. The method was validated for linearity, accuracy and precision, which were found to be acceptable over the linearity ranges of linagliptin and metformin. The method was found suitable for the routine quantitative analysis of linagliptin and metformin in pharmaceutical dosage forms.

  14. Quantitation of itopride in human serum by high-performance liquid chromatography with fluorescence detection and its application to a bioequivalence study.

    Science.gov (United States)

    Singh, Sonu Sundd; Jain, Manish; Sharma, Kuldeep; Shah, Bhavin; Vyas, Meghna; Thakkar, Purav; Shah, Ruchy; Singh, Shriprakash; Lohray, Brajbhushan

    2005-04-25

    A new method was developed for the determination of itopride in human serum by reversed phase high-performance liquid chromatography (HPLC) with fluorescence detection (excitation at 291 nm and emission at 342 nm). The method employed one-step extraction of itopride from the serum matrix with a mixture of tert-butyl methyl ether and dichloromethane (70:30, v/v), using etoricoxib as an internal standard. Chromatographic separation was obtained within 12.0 min using a reverse phase YMC-Pack AM ODS column (250 mm x 4.6 mm, 5 μm) and an isocratic mobile phase consisting of a mixture of 0.05% trifluoroacetic acid in water and acetonitrile (75:25, v/v) at a flow rate of 1.0 ml/min. The method was linear in the range of 14.0 ng/ml to 1000.0 ng/ml. The lower limit of quantitation (LLOQ) was 14.0 ng/ml. The average recoveries of itopride and the internal standard from the biological matrix were more than 66.04% and 64.57%, respectively. The inter-day accuracy of the drug-containing serum samples was more than 97.81%, with a precision of 2.31-3.68%. The intra-day accuracy was 96.91% or more, with a precision of 5.17-9.50%. Serum samples containing itopride were stable for 180.0 days at -70 ± 5 °C and for 24.0 h at ambient temperature (25 ± 5 °C). The method was successfully applied to a bioequivalence study of itopride in healthy male human subjects.

  15. Quantitative analysis of cellular glutathione by flow cytometry utilizing monochlorobimane: some applications to radiation and drug resistance in vitro and in vivo.

    Science.gov (United States)

    Rice, G C; Bump, E A; Shrieve, D C; Lee, W; Kovacs, M

    1986-12-01

    An assay using a bimane derivative has been developed to detect free glutathione (GSH) in individual viable cells by flow cytometry. Monochlorobimane [syn-(ClCH2CH3)-1,5-diazabicyclo[3.3.0]octa-3,6-diene-2,8-dione], itself nonfluorescent, reacts with GSH to form a highly fluorescent derivative. High pressure liquid chromatography analysis showed that, using specific staining conditions, the only low molecular weight fluorescent derivative formed in Chinese hamster ovary cells was that formed with GSH. Very little reaction with protein sulfhydryls was observed. Rates of GSH depletion in Chinese hamster ovary cells exposed to diethylmaleate were essentially the same, whether measured by relative fluorescence intensity, by flow cytometry or by enzymatic assay on cellular extracts. This method was shown to be useful for measurement of GSH resynthesis, uptake, and depletion by prolonged hypoxia and misonidazole treatment. Since measurements are made on individual cells, cell-to-cell variation and populational heterogeneity in GSH content are revealed by flow cytometry. Although under most conditions in vitro GSH content is relatively homogeneous, under certain circumstances, such as release from hypoxia, heterogeneity in populational GSH levels was observed. The significance of this heterogeneity is discussed in regard to the induction of gene amplification and drug resistance by transient hypoxia. Numerous subclones of Chinese hamster ovary cells selected by growth in Adriamycin or methotrexate-containing medium express elevated levels of GSH per cell. The method was extended to quantitate the GSH content of cells excised from EMT-6/SF mouse tumors that had been treated in vivo with L-buthionine-S-R-sulfoximine, an inhibitor of GSH synthesis. The bivariate analysis (forward angle light scatter versus monochlorobimane fluorescence) of cells derived from these tumors gave excellent resolution of normal and tumor cells and demonstrated extensive heterogeneity in the tumor

  16. An orientation sensitive approach in biomolecule interaction quantitative structure-activity relationship modeling and its application in ion-exchange chromatography.

    Science.gov (United States)

    Kittelmann, Jörg; Lang, Katharina M H; Ottens, Marcel; Hubbuch, Jürgen

    2017-01-27

    Quantitative structure-activity relationship (QSAR) modeling for prediction of biomolecule parameters has become an established technique in chromatographic purification process design. Unfortunately, available descriptor sets fail to describe the orientation of biomolecules and the effects of ionic strength in the mobile phase on the interaction with the stationary phase. The literature describes several special descriptors used for chromatographic retention modeling, but none of these describe the screening of the electrostatic potential by the mobile phase in use. In this work we introduce two new approaches to descriptor calculation, namely surface patches and plane projection, which capture oriented binding to charged surfaces and steric hindrance of the interaction with chromatographic ligands with regard to electrostatic potential screening by mobile phase ions. We present the use of the developed descriptor sets for predictive modeling of Langmuir isotherms for proteins at different pH values between pH 5 and 10 and varying ionic strength in the range of 10-100 mM. The resulting model shows a high correlation of calculated descriptors and experimental results, with a coefficient of determination of 0.82 and a predictive coefficient of determination of 0.92 for unknown molecular structures and conditions. The agreement of calculated molecular interaction orientations with both experimental results and molecular dynamics simulations from the literature is shown. The developed descriptors provide the means for improved QSAR models of chromatographic processes, as they reflect the complex interactions of biomolecules with chromatographic phases. Copyright © 2016 Elsevier B.V. All rights reserved.
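
    As a loose illustration of how the calibration and prediction statistics quoted above are computed, the sketch below fits a linear model of hypothetical descriptor values against a measured isotherm parameter and evaluates the coefficient of determination on calibration data and on held-out conditions. The descriptor matrix and response are synthetic; this is not the authors' modeling workflow.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic descriptor matrix (e.g. surface-patch / plane-projection values)
        # and a Langmuir isotherm parameter for each protein/condition.
        X = rng.normal(size=(30, 4))
        y = X @ np.array([1.5, -0.7, 0.3, 0.9]) + rng.normal(scale=0.3, size=30)

        def r2(y_true, y_pred):
            ss_res = np.sum((y_true - y_pred) ** 2)
            ss_tot = np.sum((y_true - y_true.mean()) ** 2)
            return 1.0 - ss_res / ss_tot

        # Fit on a calibration subset, then predict unseen conditions.
        train, test = np.arange(22), np.arange(22, 30)
        A = lambda M: np.c_[M, np.ones(len(M))]            # add an intercept column
        beta, *_ = np.linalg.lstsq(A(X[train]), y[train], rcond=None)

        print("R2 (calibration):", round(r2(y[train], A(X[train]) @ beta), 3))
        print("Q2 (prediction): ", round(r2(y[test], A(X[test]) @ beta), 3))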

  17. The application of continuous wavelet transform and least squares support vector machine for the simultaneous quantitative spectrophotometric determination of Myricetin, Kaempferol and Quercetin as flavonoids in pharmaceutical plants

    Science.gov (United States)

    Sohrabi, Mahmoud Reza; Darabi, Golnaz

    2016-01-01

    Flavonoids are γ-benzopyrone derivatives that are highly regarded by researchers for their antioxidant properties. In this study, two new signal processing methods were coupled with UV spectroscopy for spectral resolution and simultaneous quantitative determination of Myricetin, Kaempferol and Quercetin as flavonoids in Laurel, St. John's Wort and Green Tea without the need for any previous separation procedure. The developed methods are continuous wavelet transform (CWT) and least squares support vector machine (LS-SVM) methods, each integrated with UV spectroscopy individually. Different wavelet families were tested in the CWT method, and finally the Daubechies wavelet family (Db4) for Myricetin and the Gaussian wavelet families for Kaempferol (Gaus3) and Quercetin (Gaus7) were selected and applied for simultaneous analysis under the optimal conditions. The LS-SVM was applied to build the flavonoid prediction model based on the absorption spectra. The root mean square errors of prediction (RMSEP) for Myricetin, Kaempferol and Quercetin were 0.0552, 0.0275 and 0.0374, respectively. The developed methods were validated by the analysis of various synthetic mixtures with known flavonoid contents. Mean recovery values of Myricetin, Kaempferol and Quercetin were 100.123, 100.253 and 100.439 in the CWT method and 99.94, 99.81 and 99.682 in the LS-SVM method, respectively. The results achieved by analyzing real samples with the CWT and LS-SVM methods were compared to the HPLC reference method and were very close to it. Meanwhile, the results of a one-way ANOVA (analysis of variance) test revealed that there was no significant difference between the suggested methods.
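
    A minimal sketch of the CWT part of such an approach is given below, assuming the PyWavelets package is available. The wavelength grid, band shapes and chosen scale are invented for illustration; in the study the wavelet family and zero-crossing point would be optimized for each analyte.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        # Synthetic UV spectrum: two heavily overlapping Gaussian bands (a.u.).
        wavelength = np.linspace(240, 450, 421)                 # nm
        band = lambda a, mu, w: a * np.exp(-((wavelength - mu) / w) ** 2)
        spectrum = band(1.0, 360, 30) + band(0.6, 375, 28)

        # Continuous wavelet transform with a Gaussian-derivative wavelet ('gaus3'),
        # analogous to the Gaus3 signal used for Kaempferol in the study.
        scales = np.arange(8, 64)
        coeffs, _ = pywt.cwt(spectrum, scales, 'gaus3')

        # In CWT calibration, the transformed-signal amplitude is read at a
        # zero-crossing of the interfering component and regressed against concentration.
        cwt_signal = coeffs[20]                                  # one scale of the transform
        print("CWT amplitude at 360 nm:", cwt_signal[np.argmin(abs(wavelength - 360))])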

  18. Application of targeted quantitative proteomics analysis in human cerebrospinal fluid using a liquid chromatography matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (LC MALDI TOF/TOF) platform.

    Science.gov (United States)

    Pan, Sheng; Rush, John; Peskind, Elaine R; Galasko, Douglas; Chung, Kathryn; Quinn, Joseph; Jankovic, Joseph; Leverenz, James B; Zabetian, Cyrus; Pan, Catherine; Wang, Yan; Oh, Jung Hun; Gao, Jean; Zhang, Jianpeng; Montine, Thomas; Zhang, Jing

    2008-02-01

    Targeted quantitative proteomics by mass spectrometry aims to selectively detect one or a panel of peptides/proteins in a complex sample and is particularly appealing for novel biomarker verification/validation because it does not require specific antibodies. Here, we demonstrated the application of targeted quantitative proteomics in searching, identifying, and quantifying selected peptides in human cerebrospinal fluid (CSF) using a matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (MALDI TOF/TOF)-based platform. The approach involved two major components: the use of isotope-labeled synthetic peptides as references for targeted identification and quantification and a highly selective mass spectrometric analysis based on the unique characteristics of the MALDI instrument. The platform provides high confidence for targeted peptide detection in a complex system and can potentially be developed into a high-throughput system. Using the liquid chromatography (LC) MALDI TOF/TOF platform and the complementary identification strategy, we were able to selectively identify and quantify a panel of targeted peptides in the whole proteome of CSF without prior depletion of abundant proteins. The effectiveness and robustness of the approach with respect to different sample complexities, sample preparation strategies, and mass spectrometric quantification were evaluated. Other issues related to chromatographic separation and the feasibility of high-throughput analysis were also discussed. Finally, we applied targeted quantitative proteomics to analyze a subset of previously identified candidate markers in CSF samples of patients with Parkinson's disease (PD) at different stages and Alzheimer's disease (AD) along with normal controls.
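
    The quantification step of such isotope-dilution experiments reduces to a light/heavy ratio; the sketch below illustrates it with invented peak areas and spike amounts, which are placeholders rather than values from the paper.

        import numpy as np

        # Hypothetical MALDI peak areas for one targeted peptide: the endogenous
        # ("light") form and its spiked, isotope-labeled ("heavy") reference.
        light_area = np.array([1.8e4, 2.1e4, 1.9e4])   # technical replicates
        heavy_area = np.array([4.0e4, 4.3e4, 4.1e4])
        spiked_fmol = 50.0                             # amount of heavy standard added

        ratios = light_area / heavy_area
        endogenous_fmol = ratios * spiked_fmol         # one-point isotope-dilution estimate

        print(f"L/H ratio {ratios.mean():.3f} +/- {ratios.std(ddof=1):.3f}")
        print(f"endogenous peptide ~ {endogenous_fmol.mean():.1f} fmol on column")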

  19. Application of quantitative time-lapse imaging (QTLI) for evaluation of Mrp2-based drug–drug interaction induced by liver metabolites

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, Takeo; Ikenaga, Miho; Fukuda, Hajime; Matsunaga, Norikazu; Tamai, Ikumi, E-mail: tamai@p.kanazawa-w.ac.jp

    2012-09-01

    We previously reported a quantitative time-lapse imaging (QTLI)-based analysis method to assess drug–drug interactions (DDI) at multidrug resistance-associated protein 2 (Mrp2) in rat sandwich-cultured hepatocyte (SCH) system, utilizing the fluorescent Mrp2 substrate, 5-(and 6)-carboxy-2′,7′-dichlorofluorescein (CDF). Here, we aimed to examine the feasibility of using QTLI to evaluate DDI involving drug metabolite(s) generated in hepatocytes. We used estradiol (E2) and bilirubin as model compounds; both are not substrates of MRP2, whereas their hepatic metabolites, estradiol-17β-glucuronide (E17G) or bilirubin glucuronides, are known to be its substrates as well as inhibitors. When rat SCHs were pre-exposed with E2, fluorescence of CDF accumulated in bile canaliculi decreased depending upon both the duration of pre-exposure and the concentration of extracellular E2. The decrease corresponded with the increase in intracellular concentration of E17G in hepatocytes. Furthermore, cytotoxicity of vinblastine, a substrate of MRP2, was enhanced in SCHs treated with E2. Similarly, CDF accumulated in bile canaliculi was significantly reduced in rat SCHs pre-exposed with bilirubin. In conclusion, these results suggest that phase II biotransformation of a competitor is reflected in alteration of MRP2-mediated CDF transport detected in QTLI. The QTLI might provide a convenient platform to evaluate transporter-based DDIs involving hepatic metabolites of drug candidates without the need to identify the metabolites. -- Highlights: ► Mrp2-mediated CDF transport is inhibited by E2, but not E17G in vesicle study. ► Both E2 and E17G do not compromise CDF formation from CDFDA in hepatocytes. ► CDF accumulation in bile canaliculi is inhibited by E2 or E17G in QTLI. ► Increasing exposure to E2 decreases CDF accumulation in bile canaliculi in QTLI. ► QTLI is feasible to assess Mrp2-based DDI involving drug metabolite in hepatocytes.

  20. Quantitative analysis of the z-spectrum using a numerically simulated look-up table: Application to the healthy human brain at 7T.

    Science.gov (United States)

    Geades, Nicolas; Hunt, Benjamin A E; Shah, Simon M; Peters, Andrew; Mougin, Olivier E; Gowland, Penny A

    2017-08-01

    To develop a method that fits a multipool model to z-spectra acquired with non-steady-state sequences, taking into account the effects of variations in T1 or B1 amplitude, and to present results estimating the parameters of a four-pool model describing the z-spectrum of the healthy brain. We compared measured spectra with a look-up table (LUT) of possible spectra and investigated the potential advantages of simultaneously considering spectra acquired at different saturation powers (coupled spectra) to provide sensitivity to a range of different physicochemical phenomena. The LUT method provided reproducible results in healthy controls. The average values of the macromolecular pool sizes measured in white matter (WM) and gray matter (GM) of 10 healthy volunteers were 8.9% ± 0.3% (intersubject standard deviation) and 4.4% ± 0.4%, respectively, whereas the average nuclear Overhauser effect pool sizes in WM and GM were 5% ± 0.1% and 3% ± 0.1%, respectively, and average amide proton transfer pool sizes in WM and GM were 0.21% ± 0.03% and 0.20% ± 0.02%, respectively. The proposed method demonstrated increased robustness when compared with existing methods (such as Lorentzian fitting and asymmetry analysis) while yielding fully quantitative results. The method can be adjusted to measure other parameters relevant to the z-spectrum. Magn Reson Med 78:645-655, 2017. © 2016 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
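
    The core of the LUT approach - matching a measured z-spectrum against a grid of pre-simulated spectra - can be sketched as below. The two-pool toy model, parameter grid and noise level are invented for illustration and are far simpler than the four-pool, coupled-power simulations used in the paper.

        import numpy as np

        offsets = np.linspace(-5, 5, 61)                        # saturation offsets, ppm

        def pool(amp, width, centre):
            """Lorentzian saturation contribution of one pool."""
            return amp * width**2 / (width**2 + (offsets - centre) ** 2)

        def z_spectrum(mt, noe):
            return 1 - pool(mt, 10.0, 0.0) - pool(noe, 3.0, -3.5)

        # Look-up table over a grid of macromolecular (MT) and NOE pool sizes.
        params = [(mt, noe) for mt in np.linspace(0.02, 0.12, 11)
                            for noe in np.linspace(0.01, 0.06, 11)]
        lut = np.array([z_spectrum(mt, noe) for mt, noe in params])

        # A "measured" spectrum (one LUT entry plus noise) matched by least squares.
        measured = lut[37] + np.random.default_rng(1).normal(scale=0.005, size=offsets.size)
        best = int(np.argmin(np.sum((lut - measured) ** 2, axis=1)))
        print("best-matching (MT, NOE) pool sizes:", params[best])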

  1. Application of quantitative time-lapse imaging (QTLI) for evaluation of Mrp2-based drug–drug interaction induced by liver metabolites

    International Nuclear Information System (INIS)

    Nakanishi, Takeo; Ikenaga, Miho; Fukuda, Hajime; Matsunaga, Norikazu; Tamai, Ikumi

    2012-01-01

    We previously reported a quantitative time-lapse imaging (QTLI)-based analysis method to assess drug–drug interactions (DDI) at multidrug resistance-associated protein 2 (Mrp2) in rat sandwich-cultured hepatocyte (SCH) system, utilizing the fluorescent Mrp2 substrate, 5-(and 6)-carboxy-2′,7′-dichlorofluorescein (CDF). Here, we aimed to examine the feasibility of using QTLI to evaluate DDI involving drug metabolite(s) generated in hepatocytes. We used estradiol (E2) and bilirubin as model compounds; both are not substrates of MRP2, whereas their hepatic metabolites, estradiol-17β-glucuronide (E17G) or bilirubin glucuronides, are known to be its substrates as well as inhibitors. When rat SCHs were pre-exposed with E2, fluorescence of CDF accumulated in bile canaliculi decreased depending upon both the duration of pre-exposure and the concentration of extracellular E2. The decrease corresponded with the increase in intracellular concentration of E17G in hepatocytes. Furthermore, cytotoxicity of vinblastine, a substrate of MRP2, was enhanced in SCHs treated with E2. Similarly, CDF accumulated in bile canaliculi was significantly reduced in rat SCHs pre-exposed with bilirubin. In conclusion, these results suggest that phase II biotransformation of a competitor is reflected in alteration of MRP2-mediated CDF transport detected in QTLI. The QTLI might provide a convenient platform to evaluate transporter-based DDIs involving hepatic metabolites of drug candidates without the need to identify the metabolites. -- Highlights: ► Mrp2-mediated CDF transport is inhibited by E2, but not E17G in vesicle study. ► Both E2 and E17G do not compromise CDF formation from CDFDA in hepatocytes. ► CDF accumulation in bile canaliculi is inhibited by E2 or E17G in QTLI. ► Increasing exposure to E2 decreases CDF accumulation in bile canaliculi in QTLI. ► QTLI is feasible to assess Mrp2-based DDI involving drug metabolite in hepatocytes.

  2. Development and application of a quantitative PCR assay to study equine herpesvirus 5 invasion and replication in equine tissues in vitro and in vivo.

    Science.gov (United States)

    Zarski, Lila M; High, Emily A; Nelli, Rahul K; Bolin, Steven R; Williams, Kurt J; Hussey, Gisela

    2017-10-01

    Equine herpesvirus 5 (EHV-5) infection is associated with pulmonary fibrosis in horses, but further studies on EHV-5 persistence in equine cells are needed to fully understand viral and host contributions to disease pathogenesis. Our aim was to develop a quantitative PCR (qPCR) assay to measure EHV-5 viral copy number in equine cell cultures, blood lymphocytes, and nasal swabs of horses. Furthermore, we used a recently developed equine primary respiratory cell culture system to study EHV-5 pathogenesis at the respiratory tract. PCR primers and a probe were designed to target gene E11 of the EHV-5 genome. Sensitivity and repeatability were established, and specificity was verified by testing multiple isolates of EHV-5, as well as DNA from other equine herpesviruses. Four-week-old, fully differentiated (mature) and newly seeded (immature) primary equine respiratory epithelial cell (EREC) cultures, as well as equine dermal cell cultures, were inoculated with EHV-5, and the cells and supernatants were collected daily for 14 days. Blood lymphocytes and nasal swabs were collected from horses experimentally infected with equine herpesvirus 1 (EHV-1). The qPCR assay detected EHV-5 at stable concentrations throughout 14 days in inoculated mature EREC and equine dermal cell cultures (peaking at 202 and 5861 viral genomes per 10^6 copies of cellular β-actin, respectively). EHV-5 copies detected in the immature EREC cultures increased over 14 days and reached levels greater than 10,000 viral genomes per 10^6 copies of cellular β-actin. Moreover, before experimental inoculation with EHV-1, EHV-5 was detected in the lymphocytes of 76% of horses and in the nasal swabs of 84% of horses. Post-inoculation with EHV-1, EHV-5 was detected in lymphocytes of 52% of horses, while EHV-5 levels in nasal swabs were not significantly different from pre-inoculation levels. In conclusion, qPCR was a reliable technique to investigate viral load in in vivo and in vitro samples, and EHV-5 replication in equine epithelial cells
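
    Absolute quantification of the kind reported here normally relies on plasmid standard curves for the viral target and the cellular reference gene; the sketch below illustrates that normalization with hypothetical standard-curve parameters and Cq values (none of them taken from the study).

        def copies_from_cq(cq, slope, intercept):
            """Copies from a Cq value, given a standard curve Cq = slope*log10(copies) + intercept."""
            return 10 ** ((cq - intercept) / slope)

        # Hypothetical standard-curve parameters (slope near -3.32 means ~100% efficiency).
        e11_slope, e11_int = -3.34, 38.1     # EHV-5 gene E11 assay
        actb_slope, actb_int = -3.30, 36.5   # equine beta-actin reference assay

        # One culture sample: Cq values for the viral and cellular targets.
        ehv5_copies = copies_from_cq(24.7, e11_slope, e11_int)
        actb_copies = copies_from_cq(18.2, actb_slope, actb_int)

        viral_load = ehv5_copies / actb_copies * 1e6   # genomes per 10^6 beta-actin copies
        print(f"{viral_load:.0f} EHV-5 genomes per 10^6 beta-actin copies")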

  3. Establishment of real time allele specific locked nucleic acid quantitative PCR for detection of HBV YIDD (ATT) mutation and evaluation of its application.

    Directory of Open Access Journals (Sweden)

    Yongbin Zeng

    Full Text Available BACKGROUND: Long-term use of nucleos(t)ide analogues can increase the risk of HBV drug-resistance mutations. The rtM204I (ATT, coding for isoleucine) mutation is one of the most important resistance mutation sites. Establishing a simple, rapid, reliable and highly sensitive assay to detect the resistant mutants as early as possible is of great clinical significance. METHODS: Recombinant plasmids for HBV YMDD (tyrosine-methionine-aspartate-aspartate) and YIDD (tyrosine-isoleucine-aspartate-aspartate) were constructed by TA cloning. Real time allele specific locked nucleic acid quantitative PCR (RT-AS-LNA-qPCR) with SYBR Green I was established with LNA-modified primers and evaluated with standard recombinant plasmids, clinical templates (mixtures of clinical wild-type and mutant HBV DNA) and 102 serum samples from nucleos(t)ide analogue-experienced patients. Serum samples from a chronic hepatitis B (CHB) patient who first received LMV monotherapy and then switched to LMV + ADV combination therapy were also analyzed dynamically at 10 time points. RESULTS: The linear range of the assay was between 1 × 10^9 copies/μl and 1 × 10^2 copies/μl. The lower detection limit was 1 × 10^1 copies/μl. The sensitivities of the assay were 10^-6, 10^-4 and 10^-2 in wild-type backgrounds of 1 × 10^9 copies/μl, 1 × 10^7 copies/μl and 1 × 10^5 copies/μl, respectively. The sensitivity of the assay in detection of clinical samples was 0.03%. The complete coincidence rate between RT-AS-LNA-qPCR and direct sequencing was 91.2% (93/102), the partial coincidence rate was 8.8% (9/102), and no complete discordance was observed. The two assays showed a high concordance (Kappa = 0.676, P = 0.000). Minor variants could be detected 18 weeks earlier than the rebound of HBV DNA load and alanine aminotransferase level. CONCLUSIONS: A rapid, cost-effective, highly sensitive, specific and reliable RT-AS-LNA-qPCR method with SYBR Green I for early and absolute quantification of HBV YIDD (ATT, coding for isoleucine)

  4. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
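
    The conversion from measured fibre distances to physical map distances is a simple scaling by the stretching factor quoted above; a small sketch (with invented measurements) is:

        # Convert inter-probe distances measured on stretched DNA fibres (in µm)
        # into kilobase pairs using the ~2.3 kb/µm stretching factor quoted above.
        STRETCH_KB_PER_UM = 2.3

        measured_um = [4.8, 12.5, 31.0]      # hypothetical fibre measurements
        for d_um in measured_um:
            print(f"{d_um:5.1f} µm  ->  {d_um * STRETCH_KB_PER_UM:6.1f} kb")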

  5. Effect of Irrigation Cut-Off at the Flowering Stage and Foliar Application of Spermidine on Some Quantitative and Qualitative Characteristics of Various Ecotypes of Cumin

    Directory of Open Access Journals (Sweden)

    Sarah Bakhtari

    2017-02-01

    Full Text Available Introduction: Medicinal plants play major roles in human health. Cumin (Cuminum cyminum L.) is an annual plant that is commonly cultivated in arid and semi-arid regions of Iran. The crop has a wide range of uses in the medicinal, cosmetic and food industries. Cumin occupies about 26% of the total area devoted to medicinal plants in Iran. However, cumin is seriously affected by Fusarium wilt and blight diseases. The diseases usually increase under warm and wet conditions. It has been demonstrated that the peak of disease incidence occurs at the flowering stage, and irrigation cut-off at this time may reduce disease density. Materials and methods: In order to evaluate the effects of irrigation cut-off at the flowering stage and foliar application of spermidine on some characteristics of various ecotypes of cumin, an experiment was conducted in a split-split-plot arrangement in a randomized complete block design with three replications at the research farm of Shahid Bahonar University of Kerman in 2014. The experimental treatments were irrigation at two levels (complete irrigation and irrigation cut-off at the flowering stage) assigned to main plots, foliar application of spermidine at three levels (0, 1 and 2 mM) as subplots, and cumin ecotypes at three levels (Kerman, Khorasan and Esfahan) randomized in sub-subplots. Plot size was 4 m × 3 m, giving 50 cm inter-row spacing with six rows. The ideal density of the crop was considered to be 120 plants m-2. As soon as the seeds were sown, irrigation was applied every 10 days. Foliar application of spermidine was done at three stages (after thinning, before the flowering stage and in the middle of the flowering stage). No herbicides or chemical fertilizers were applied during the experiment. Results and discussion: In this study the number of branches, umbels per plant, 1000-seed weight, seed yield per plant and per hectare, harvest index, essential oil percentage and yield, infected

  6. Quantitative analysis chemistry

    International Nuclear Information System (INIS)

    Ko, Wansuk; Lee, Choongyoung; Jun, Kwangsik; Hwang, Taeksung

    1995-02-01

    This book is about quantitative analysis chemistry. It is divided into ten chapters, which deal with the basic conception of material with the meaning of analysis chemistry and SI units, chemical equilibrium, basic preparation for quantitative analysis, introduction of volumetric analysis, acid-base titration of outline and experiment examples, chelate titration, oxidation-reduction titration with introduction, titration curve, and diazotization titration, precipitation titration, electrometric titration and quantitative analysis.

  7. Assessment of vulnerability in karst aquifers using a quantitative integrated numerical model: catchment characterization and high resolution monitoring - Application to semi-arid regions- Lebanon.

    Science.gov (United States)

    Doummar, Joanna; Aoun, Michel; Andari, Fouad

    2016-04-01

    Karst aquifers are highly heterogeneous and characterized by a duality of recharge (concentrated and fast versus diffuse and slow) and a duality of flow which directly influences groundwater flow and spring responses. Given this heterogeneity in flow and infiltration, karst aquifers do not always obey standard hydraulic laws. Therefore, the assessment of their vulnerability proves to be challenging. Studies have shown that the vulnerability of aquifers is highly governed by recharge to groundwater. On the other hand, specific parameters appear to play a major role in the spatial and temporal distribution of infiltration on a karst system, thus greatly influencing the discharge rates observed at a karst spring, and consequently the vulnerability of a spring. This heterogeneity can only be depicted using an integrated numerical model to quantify recharge spatially and assess the spatial and temporal vulnerability of a catchment to contamination. In the framework of a three-year PEER NSF/USAID funded project, the vulnerability of a karst catchment in Lebanon is assessed quantitatively using a numerical approach. The aim of the project is also to refine actual evapotranspiration rates and the spatial recharge distribution in a semi-arid environment. For this purpose, a monitoring network has been installed since July 2014 on two different pilot karst catchments (drained by the Qachqouch Spring and the Assal Spring) to collect high-resolution data to be used in an integrated catchment numerical model (MIKE SHE, DHI) including climate, the unsaturated zone, and the saturated zone. Catchment characterization essential for the model included geological mapping and a survey of karst features (e.g., dolines), as they contribute to fast flow. Tracer experiments were performed under different flow conditions (snowmelt and low flow) to delineate the catchment area and reveal groundwater velocities and the response to snowmelt events. An assessment of spring response after precipitation events allowed the estimation of the

  8. Added value of experts' knowledge to improve a quantitative microbial exposure assessment model--Application to aseptic-UHT food products.

    Science.gov (United States)

    Pujol, Laure; Johnson, Nicholas Brian; Magras, Catherine; Albert, Isabelle; Membré, Jeanne-Marie

    2015-10-15

    In a previous study, a quantitative microbial exposure assessment (QMEA) model applied to an aseptic-UHT food process was developed [Pujol, L., Albert, I., Magras, C., Johnson, N. B., Membré, J. M. Probabilistic exposure assessment model to estimate aseptic UHT product failure rate. 2015. International Journal of Food Microbiology, 192, 124-141]. It quantified the Sterility Failure Rate (SFR) associated with Bacillus cereus and Geobacillus stearothermophilus per process module (nine modules in total, from raw material reception to end-product storage). Previously, the probabilistic model inputs were set by experts (using knowledge and in-house data), but only the variability dimension was taken into account. The model was then improved using expert elicitation knowledge in two ways. First, the model was refined by adding the uncertainty dimension to the probabilistic inputs, enabling a second-order Monte Carlo analysis. The following eight inputs, and their impact on SFR, are presented in detail in the present study: the D-value of each bacterium of interest (B. cereus and G. stearothermophilus) associated with the inactivation model for the UHT treatment step, i.e., two inputs; the log reduction (decimal reduction) number associated with the inactivation model for the packaging sterilization step for each bacterium and each part of the packaging (product container and sealing component), i.e., four inputs; and the bacterial spore air load of the aseptic tank and filler cabinet rooms, i.e., two inputs. Second, the model was improved by leveraging expert knowledge to develop the existing model further. The proportion of bacteria in the product that settled on the surface of pipes (between the UHT treatment and the aseptic tank on the one hand, and between the aseptic tank and the filler cabinet on the other), leading to possible biofilm formation for each bacterium, was better characterized. It was modeled as a function of the hygienic design level of the aseptic
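
    A second-order (two-dimensional) Monte Carlo of the kind described above separates uncertainty about a parameter's distribution from the run-to-run variability drawn from it. The sketch below illustrates the idea for a single D-value input and a grossly simplified sterility-failure calculation; all distributions, holding times and spore loads are invented, not the model's values.

        import numpy as np

        rng = np.random.default_rng(42)
        N_UNC, N_VAR = 200, 5000           # outer (uncertainty) and inner (variability) samples

        holding_time_min = 4.0             # illustrative UHT holding time
        sfr = np.empty(N_UNC)
        for i in range(N_UNC):
            # Outer loop: uncertain parameters of the D-value distribution.
            mean_log_d = rng.normal(loc=np.log10(0.30), scale=0.05)
            sd_log_d = rng.uniform(0.05, 0.15)
            # Inner loop: variability of D-values and initial spore loads between units.
            d_values = 10 ** rng.normal(mean_log_d, sd_log_d, N_VAR)   # minutes
            log_red = holding_time_min / d_values                      # decimal reductions
            p_survive = 10.0 ** (-log_red)                             # per-spore survival
            n0 = rng.poisson(5.0, N_VAR)                               # spores per unit
            # Probability that at least one spore survives in a unit (numerically stable form).
            sfr[i] = np.mean(-np.expm1(n0 * np.log1p(-p_survive)))

        lo, hi = np.percentile(sfr, [2.5, 97.5])
        print(f"median SFR {np.median(sfr):.2e}, 95% uncertainty interval {lo:.2e} - {hi:.2e}")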

  9. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b, which we think of as saying that “a is approximately equal to b up to an error of ε”. We have 4 interesting examples where we have a quantitative equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...

  10. Quantitative autoradiography of neurochemicals

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-01-01

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms

  11. Quantitative determination of trigonelline in mouse serum by means of hydrophilic interaction liquid chromatography-MS/MS analysis: Application to a pharmacokinetic study.

    Science.gov (United States)

    Szczesny, Damian; Bartosińska, Ewa; Jacyna, Julia; Patejko, Małgorzata; Siluk, Danuta; Kaliszan, Roman

    2018-02-01

    Trigonelline is a pyridine alkaloid found in fenugreek seeds and coffee beans. Most of the previous studies are concerned with the quantification of trigonelline along with other constituents in coffee, herbs or beverages. Only a few have focused on its determination in animal or human tissues by applying different modes of HPLC with UV or MS detection. The aim of the study was to develop and validate a fast and simple method for trigonelline determination in serum by the use of hydrophilic interaction liquid chromatography (HILIC) with ESI-MS/MS detection. Separation of trigonelline was achieved on a Kinetex HILIC column operated at 35°C with an acetonitrile-ammonium formate (10 mM, pH = 3) buffer mixture (55:45, v/v) as the mobile phase. The developed method was successfully applied to determine trigonelline concentration in mouse serum after intravenous administration of 10 mg/kg. The developed assay is sensitive (limit of detection = 1.5 ng/mL, limit of quantification = 5.0 ng/mL) and linear in a concentration range from 5.0 to 250.0 ng/mL. Sample preparation is limited to deproteinization, centrifugation and filtration. The application of the HILIC mode of chromatography with MS detection and the selection of deuterated trigonelline as internal standard allowed a rapid and precise method of trigonelline quantification to be developed. Copyright © 2017 John Wiley & Sons, Ltd.

  12. Post-Processing of Dynamic Gadolinium-Enhanced Magnetic Resonance Imaging Exams of the Liver: Explanation and Potential Clinical Applications for Color-Coded Qualitative and Quantitative Analysis

    International Nuclear Information System (INIS)

    Wang, L.; Bos, I.C. Van den; Hussain, S.M.; Pattynama, P.M.; Vogel, M.W.; Krestin, G.P.

    2008-01-01

    The purpose of this article is to explain and illustrate the current status and potential applications of automated and color-coded post-processing techniques for the analysis of dynamic multiphasic gadolinium-enhanced magnetic resonance imaging (MRI) of the liver. Post-processing of these images on dedicated workstations allows the generation of time-intensity curves (TIC) as well as color-coded images, which provides useful information on (neo)-angiogenesis within a liver lesion, if necessary combined with information on enhancement patterns of the surrounding liver parenchyma. Analysis of TIC and color-coded images, which are based on pharmacokinetic modeling, provides an easy-to-interpret schematic presentation of tumor behavior, providing additional characteristics for adequate differential diagnosis. Inclusion of TIC and color-coded images as part of the routine abdominal MRI workup protocol may help to further improve the specificity of MRI findings, but needs to be validated in clinical decision-making situations. In addition, these tools may facilitate the diagnostic workup of disease for detection, characterization, staging, and monitoring of antitumor therapy, and hold incremental value to the widely used tumor response criteria
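
    As a rough illustration of the time-intensity-curve (TIC) part of such post-processing, the sketch below derives a few simple descriptors (relative peak enhancement, time to peak, washout) from hypothetical ROI intensities; it does not implement the pharmacokinetic model referred to in the article.

        import numpy as np

        # Hypothetical dynamic series: mean ROI signal intensity at each time point (s).
        t = np.array([0, 20, 45, 70, 180, 300])               # pre, arterial, portal, late phases
        lesion = np.array([210, 520, 470, 430, 360, 330], dtype=float)
        liver = np.array([230, 300, 420, 450, 410, 400], dtype=float)

        def tic_features(signal):
            relative = (signal - signal[0]) / signal[0] * 100  # relative enhancement, %
            return {"peak_%": round(relative.max(), 1),
                    "time_to_peak_s": int(t[relative.argmax()]),
                    "washout_%": round(relative.max() - relative[-1], 1)}

        print("lesion:", tic_features(lesion))
        print("liver: ", tic_features(liver))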

  13. 87Sr/86Sr as a quantitative geochemical proxy for 14C reservoir age in dynamic, brackish waters: assessing applicability and quantifying uncertainties.

    Science.gov (United States)

    Lougheed, Bryan; van der Lubbe, Jeroen; Davies, Gareth

    2016-04-01

    Accurate geochronologies are crucial for reconstructing the sensitivity of brackish and estuarine environments to rapidly changing past external impacts. A common geochronological method used for such studies is radiocarbon (14C) dating, but its application in brackish environments is severely limited by an inability to quantify spatiotemporal variations in 14C reservoir age, or R(t), due to dynamic interplay between river runoff and marine water. Additionally, old carbon effects and species-specific behavioural processes also influence 14C ages. Using the world's largest brackish water body (the estuarine Baltic Sea) as a test-bed, combined with a comprehensive approach that objectively excludes both old carbon and species-specific effects, we demonstrate that it is possible to use 87Sr/86Sr ratios to quantify R(t) in ubiquitous mollusc shell material, leading to almost one order of magnitude increase in Baltic Sea 14C geochronological precision over the current state-of-the-art. We propose that this novel proxy method can be developed for other brackish water bodies worldwide, thereby improving geochronological control in these climate sensitive, near-coastal environments.

  14. Application of a compact diode pumped solid-state laser source for quantitative laser-induced breakdown spectroscopy analysis of steel

    Science.gov (United States)

    Tortschanoff, Andreas; Baumgart, Marcus; Kroupa, Gerhard

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) technology holds the potential for onsite real-time measurements of steel products. However, for a mobile and robust LIBS measurement system, an adequate small and ruggedized laser source is a key requirement. In this contribution, we present tests with our compact high-power laser source, which, initially, was developed for ignition applications. The CTR HiPoLas® laser is a robust diode pumped solid-state laser with a passive Q-switch with dimensions of less than 10 cm3. The laser generates 2.5-ns pulses with 30 mJ at a maximum continuous repetition rate of about 30 Hz. Feasibility of LIBS experiments with the laser source was experimentally verified with steel samples. The results show that the laser with its current optical output parameters is very well-suited for LIBS measurements. We believe that the miniaturized laser presented here will enable very compact and robust portable high-performance LIBS systems.

  15. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    Full Text Available This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells is described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.

  16. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2018-03-01

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One of such innovations is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Quantitative spatially resolved measurement of tissue chromophore concentrations using photoacoustic spectroscopy: application to the measurement of blood oxygenation and haemoglobin concentration

    Science.gov (United States)

    Laufer, Jan; Delpy, Dave; Elwell, Clare; Beard, Paul

    2007-01-01

    A new approach based on pulsed photoacoustic spectroscopy for non-invasively quantifying tissue chromophore concentrations with high spatial resolution has been developed. The technique is applicable to the quantification of tissue chromophores such as oxyhaemoglobin (HbO2) and deoxyhaemoglobin (HHb) for the measurement of physiological parameters such as blood oxygen saturation (SO2) and total haemoglobin concentration. It can also be used to quantify the local accumulation of targeted contrast agents used in photoacoustic molecular imaging. The technique employs a model-based inversion scheme to recover the chromophore concentrations from photoacoustic measurements. This comprises a numerical forward model of the detected time-dependent photoacoustic signal that incorporates a multiwavelength diffusion-based finite element light propagation model to describe the light transport and a time-domain acoustic model to describe the generation, propagation and detection of the photoacoustic wave. The forward model is then inverted by iteratively fitting it to measurements of photoacoustic signals acquired at different wavelengths to recover the chromophore concentrations. To validate this approach, photoacoustic signals were generated in a tissue phantom using nanosecond laser pulses between 740 nm and 1040 nm. The tissue phantom comprised a suspension of intralipid, blood and a near-infrared dye in which three tubes were immersed. Blood at physiological haemoglobin concentrations and oxygen saturation levels ranging from 2% to 100% was circulated through the tubes. The signal amplitude from different temporal sections of the detected photoacoustic waveforms was plotted as a function of wavelength and the forward model fitted to these data to recover the concentrations of HbO2 and HHb, total haemoglobin concentration and SO2. The performance was found to compare favourably to that of a laboratory CO-oximeter with measurement resolutions of ±3.8 g l-1 (±58 µM) and ±4
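
    A much-simplified version of the spectroscopic inversion - linear unmixing of multiwavelength photoacoustic amplitudes into HbO2 and HHb contributions, ignoring the fluence modelling that the article's forward model handles - can be sketched as follows. The extinction coefficients and amplitudes below are illustrative numbers only, not tabulated literature values or measured data.

        import numpy as np

        wavelengths = [750, 800, 850, 900]          # nm
        # Illustrative relative extinction coefficients, columns [HbO2, HHb]
        # (only roughly shaped like the real spectra).
        E = np.array([[0.6, 1.4],
                      [0.8, 0.8],
                      [1.1, 0.7],
                      [1.3, 0.6]])

        # Hypothetical photoacoustic amplitudes, assumed proportional to the local
        # absorption coefficient at each wavelength.
        pa = np.array([1.02, 1.18, 1.39, 1.55])

        c, *_ = np.linalg.lstsq(E, pa, rcond=None)  # [C_HbO2, C_HHb] in arbitrary units
        so2 = c[0] / c.sum() * 100
        print(f"C_HbO2 = {c[0]:.2f}, C_HHb = {c[1]:.2f} (a.u.), SO2 ~ {so2:.0f}%")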

  18. Quantitative spatially resolved measurement of tissue chromophore concentrations using photoacoustic spectroscopy: application to the measurement of blood oxygenation and haemoglobin concentration

    Energy Technology Data Exchange (ETDEWEB)

    Laufer, Jan; Delpy, Dave; Elwell, Clare; Beard, Paul [Department of Medical Physics and Bioengineering, University College London, Malet Place Engineering Building, London WC1E 6BT (United Kingdom)

    2007-01-07

    A new approach based on pulsed photoacoustic spectroscopy for non-invasively quantifying tissue chromophore concentrations with high spatial resolution has been developed. The technique is applicable to the quantification of tissue chromophores such as oxyhaemoglobin (HbO2) and deoxyhaemoglobin (HHb) for the measurement of physiological parameters such as blood oxygen saturation (SO2) and total haemoglobin concentration. It can also be used to quantify the local accumulation of targeted contrast agents used in photoacoustic molecular imaging. The technique employs a model-based inversion scheme to recover the chromophore concentrations from photoacoustic measurements. This comprises a numerical forward model of the detected time-dependent photoacoustic signal that incorporates a multiwavelength diffusion-based finite element light propagation model to describe the light transport and a time-domain acoustic model to describe the generation, propagation and detection of the photoacoustic wave. The forward model is then inverted by iteratively fitting it to measurements of photoacoustic signals acquired at different wavelengths to recover the chromophore concentrations. To validate this approach, photoacoustic signals were generated in a tissue phantom using nanosecond laser pulses between 740 nm and 1040 nm. The tissue phantom comprised a suspension of intralipid, blood and a near-infrared dye in which three tubes were immersed. Blood at physiological haemoglobin concentrations and oxygen saturation levels ranging from 2% to 100% was circulated through the tubes. The signal amplitude from different temporal sections of the detected photoacoustic waveforms was plotted as a function of wavelength and the forward model fitted to these data to recover the concentrations of HbO2 and HHb, total haemoglobin concentration and SO2. The performance was found to compare favourably to that of a laboratory CO-oximeter with measurement resolutions of ±3

  19. Quantitative spatially resolved measurement of tissue chromophore concentrations using photoacoustic spectroscopy: application to the measurement of blood oxygenation and haemoglobin concentration

    International Nuclear Information System (INIS)

    Laufer, Jan; Delpy, Dave; Elwell, Clare; Beard, Paul

    2007-01-01

    A new approach based on pulsed photoacoustic spectroscopy for non-invasively quantifying tissue chromophore concentrations with high spatial resolution has been developed. The technique is applicable to the quantification of tissue chromophores such as oxyhaemoglobin (HbO2) and deoxyhaemoglobin (HHb) for the measurement of physiological parameters such as blood oxygen saturation (SO2) and total haemoglobin concentration. It can also be used to quantify the local accumulation of targeted contrast agents used in photoacoustic molecular imaging. The technique employs a model-based inversion scheme to recover the chromophore concentrations from photoacoustic measurements. This comprises a numerical forward model of the detected time-dependent photoacoustic signal that incorporates a multiwavelength diffusion-based finite element light propagation model to describe the light transport and a time-domain acoustic model to describe the generation, propagation and detection of the photoacoustic wave. The forward model is then inverted by iteratively fitting it to measurements of photoacoustic signals acquired at different wavelengths to recover the chromophore concentrations. To validate this approach, photoacoustic signals were generated in a tissue phantom using nanosecond laser pulses between 740 nm and 1040 nm. The tissue phantom comprised a suspension of intralipid, blood and a near-infrared dye in which three tubes were immersed. Blood at physiological haemoglobin concentrations and oxygen saturation levels ranging from 2% to 100% was circulated through the tubes. The signal amplitude from different temporal sections of the detected photoacoustic waveforms was plotted as a function of wavelength and the forward model fitted to these data to recover the concentrations of HbO2 and HHb, total haemoglobin concentration and SO2. The performance was found to compare favourably to that of a laboratory CO-oximeter with measurement resolutions of ±3.8 g l-1 (±58

  20. Quantitative analysis of boron by neutron radiography

    International Nuclear Information System (INIS)

    Bayuelken, A.; Boeck, H.; Schachner, H.; Buchberger, T.

    1990-01-01

    The quantitative determination of boron in ores is a long process with chemical analysis techniques. As nuclear techniques like X-ray fluorescence and activation analysis are not applicable for boron, only the neutron radiography technique, using the high neutron absorption cross section of this element, can be applied for quantitative determinations. This paper describes preliminary tests and calibration experiments carried out at a 250 kW TRIGA reactor. (orig.)

  1. Variance in total levels of phospholipase C zeta (PLC-ζ) in human sperm may limit the applicability of quantitative immunofluorescent analysis as a diagnostic indicator of oocyte activation capability.

    Science.gov (United States)

    Kashir, Junaid; Jones, Celine; Mounce, Ginny; Ramadan, Walaa M; Lemmon, Bernadette; Heindryckx, Bjorn; de Sutter, Petra; Parrington, John; Turner, Karen; Child, Tim; McVeigh, Enda; Coward, Kevin

    2013-01-01

    To examine whether similar levels of phospholipase C zeta (PLC-ζ) protein are present in sperm from men whose ejaculates resulted in normal oocyte activation, and to examine whether a predominant pattern of PLC-ζ localization is linked to normal oocyte activation ability. Laboratory study. University laboratory. Control subjects (men with proven oocyte activation capacity; n = 16) and men whose sperm resulted in recurrent intracytoplasmic sperm injection failure (oocyte activation deficient [OAD]; n = 5). Quantitative immunofluorescent analysis of PLC-ζ protein in human sperm. Total levels of PLC-ζ fluorescence, proportions of sperm exhibiting PLC-ζ immunoreactivity, and proportions of PLC-ζ localization patterns in sperm from control and OAD men. Sperm from control subjects presented a significantly higher proportion of sperm exhibiting PLC-ζ immunofluorescence compared with infertile men diagnosed with OAD (82.6% and 27.4%, respectively). Total levels of PLC-ζ in sperm from individual control and OAD patients exhibited significant variance, with sperm from 10 out of 16 (62.5%) exhibiting levels similar to OAD samples. Predominant PLC-ζ localization patterns varied between control and OAD samples with no predictable or consistent pattern. The results indicate that sperm from control men exhibited significant variance in total levels of PLC-ζ protein, as well as significant variance in the predominant localization pattern. Such variance may hinder the diagnostic application of quantitative PLC-ζ immunofluorescent analysis. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Elastic and inelastic {alpha}-scattering cross-sections obtained with the 44 MeV fixed energy Saclay cyclotron on separated targets of {sup 24}Mg, {sup 25}Mg, {sup 26}Mg, {sup 40}Ca, {sup 46}Ti, {sup 48}Ti, {sup 50}Ti, {sup 52}Cr, {sup 54}Fe, {sup 56}Fe, {sup 58}Fe, {sup 58}Ni, {sup 60}Ni, {sup 62}Ni, {sup 64}Ni, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 112}Sn, {sup 114}Sn, {sup 116}Sn, {sup 118}Sn, {sup 120}Sn, {sup 122}Sn, {sup 124}Sn and {sup 208}Pb using the Saclay fixed-energy cyclotron; Sections efficaces differentielles elastiques et inelastiques obtenues par diffusion de particules {alpha} de 44 MeV sur des cibles de {sup 24}Mg, {sup 25}Mg, {sup 26}Mg, {sup 40}Ca, {sup 46}Ti, {sup 48}Ti, {sup 50}Ti, {sup 52}Cr, {sup 54}Fe, {sup 56}Fe, {sup 58}Fe, {sup 58}Ni, {sup 60}Ni, {sup 62}Ni, {sup 64}Ni, {sup 63}Cu, {sup 65}Cu, {sup 64}Zn, {sup 112}Sn, {sup 114}Sn, {sup 116}Sn, {sup 118}Sn, {sup 120}Sn, {sup 122}Sn, {sup 124}Sn et {sup 208}Pb au cyclotron a energie fixe de saclay

    Energy Technology Data Exchange (ETDEWEB)

    Bruge, G [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires. Departement de physique nucleaire, service de physique nucleaire a moyenne energie

    1967-01-01

    This report contains elastic and inelastic {alpha}-scattering cross-sections obtained with the 44 MeV fixed energy Saclay cyclotron on Mg, Ca, Ti, Cr, Fe, Ni, Co, Zn, Sn and Pb enriched targets. (author)

  3. Effect of simultaneous application of mycorrhiza with compost, vermicompost and sulfural geranole on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system

    Directory of Open Access Journals (Sweden)

    P rezvani moghaddam

    2016-03-01

    quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system was investigated. Materials and methods: In order to evaluate the effects of simultaneous application of mycorrhiza and organic fertilizers on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.), an experiment was conducted based on a randomized complete block design with three replications at the Agricultural Research Farm, Ferdowsi University of Mashhad, Iran, during the 2009-2010 growing season. Treatments were mycorrhiza (Glomus mosseae), mycorrhiza + compost, mycorrhiza + vermicompost, mycorrhiza + organic sulfural geranole, compost, vermicompost, organic sulfural geranole and control (no fertilizer). Finally, data analysis was done using SAS 9.1 and means were compared by Duncan's multiple range test at the 5% level of probability. Results and discussion: The results showed that the effects of the different organic and biological fertilizers on seed yield were significant. Seed yield increased significantly with mycorrhiza, both alone and mixed with organic sulfural geranole and vermicompost, compared to the control treatment. Biological yield increased significantly under simultaneous application of vermicompost and organic sulfural geranole with mycorrhiza compared to separate use of these fertilizers. All studied organic fertilizers combined with mycorrhiza had a significant effect on increasing the oil content of sesame. Seed oil increased under simultaneous application of mycorrhiza with compost, vermicompost and organic sulfural geranole by 12, 13 and 10%, respectively, compared to separate application of mycorrhiza. It seems that mycorrhiza and organic fertilizers improved the quantitative and qualitative characteristics of sesame by providing better conditions for the absorption and transport of nutrients to the plant (Hawkes et al., 2008). Conclusion: In general, the results showed that the simultaneous use of ecological inputs can improve

  4. Quantitative film radiography

    International Nuclear Information System (INIS)

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-01-01

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects

  5. Quantitative Robust Control Engineering: Theory and Applications

    Science.gov (United States)

    2006-09-01


  6. Practical application of qualitative and quantitative methods

    CSIR Research Space (South Africa)

    Funke, Nicola S

    2017-12-01

    Full Text Available The presentation discusses cost-benefit analysis (CBA) as a decision-making tool that can be applied to a myriad of socio-economic decisions in the public and/or private sphere. CBA weighs direct costs and benefits, indirect effects, third-party effects and social adjustments (social prices), considered from both private and public perspectives, and is illustrated with a group-exercise case study on community adoption of drip irrigation technology.

  7. Infrared thermography quantitative image processing

    Science.gov (United States)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system where temperature distribution of the body’s surface can contribute to the diagnosis and follow up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study a quantitative approach in thermographic image processing is attempted based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first order statistical parameters describing temperature distribution. Analysis and comparison of these indices result in evaluating the temperature distribution pattern of the back trunk expected in healthy, regarding spinal problems, subjects.
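
    A minimal sketch of such a first-order-statistics index, computed for two hypothetical symmetric ROIs, is shown below; the temperature values are simulated and the simple left-right mean difference stands in for the more elaborate indices discussed in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical temperature maps (deg C) of two symmetric ROIs on the lower back.
        roi_left = rng.normal(loc=32.4, scale=0.35, size=(40, 60))
        roi_right = rng.normal(loc=31.9, scale=0.50, size=(40, 60))

        def first_order_stats(roi):
            return {"mean": roi.mean(), "std": roi.std(ddof=1),
                    "min": roi.min(), "max": roi.max()}

        left, right = first_order_stats(roi_left), first_order_stats(roi_right)
        asymmetry = left["mean"] - right["mean"]      # simple left-right index, deg C
        print("left :", {k: round(v, 2) for k, v in left.items()})
        print("right:", {k: round(v, 2) for k, v in right.items()})
        print(f"mean left-right asymmetry: {asymmetry:.2f} deg C")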

  8. Quantitative Analysis of Renogram

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Keun Chul [Seoul National University College of Medicine, Seoul (Korea, Republic of)]

    1969-03-15

    Radioisotope renography was carried out in 564 cases consisting of 150 normal controls, 140 hypertensive, 102 hypertensive nephropathies, 62 chronic renal diseases, 53 unilateral, and 57 bilateral non-functioning kidneys. It was aimed to study which parameter of the renogram is most applicable to any definite disease of the kidney. The analytical methods adopted were: Tobe, Spencer, Krueger, Matchida and Takeuchi. In the non-functioning kidney groups, the hemograms and serum nitrogen series were also studied to evaluate the relationships between the renograms and renal anemia. The parameters were: time of maximum amplitude (Tmax), half-time of maximum amplitude (T-1/2), the Kac value calculated from these two parameters in Tobe's method, slopes of the B and C phases, B/A and B/C values in Spencer's method, total concentration (T.C.), minute concentration (M.C.) and minute excretion (M.E.) in Krueger's method, Matchida's K value and Takeuchi's renal function index (R.F.I.). The results were as follows: 1) In general, marked differences in the patterns of the renogram were observed between the normal controls and nephropathies. In Tobe's method, each parameter showed a statistically significant delay or decrease in patients with hypertensive nephropathies and chronic renal diseases. In Spencer's method, the slopes of the B and C phases and B/C also showed a statistically significant decrease in patients with hypertension, hypertensive nephropathies and chronic renal diseases. In Krueger's method, M.C. and M.E. showed statistically significant differences between the controls and patients with hypertension, hypertensive nephropathies and chronic renal diseases. In Matchida's method, the K value showed statistically significant differences between the controls and patients with hypertensive nephropathies and chronic renal diseases. 2) It appeared, therefore, that Tobe's T-1/2, Kac, Spencer's slopes of the B and C phases, B/A, B/C values, Krueger's T.C., M.C., and M.E. values, Matchida's K

  9. Quantitative Analysis of Renogram

    International Nuclear Information System (INIS)

    Choi, Keun Chul

    1969-01-01

    Radioisotope renography was carried out in 564 cases consisting of 150 normal controls, 140 hypertensive, 102 hypertensive nephropathies, 62 chronic renal diseases, 53 unilateral, and 57 bilateral non-functioning kidneys. It was aimed to study which parameter of the renogram is most applicable to any definite disease of the kidney. The analytical methods adopted were: Tobe, Spencer, Krueger, Matchida and Takeuchi. In the non-functioning kidney groups, the hemograms and serum nitrogen series were also studied to evaluate the relationships between the renograms and renal anemia. The parameters were: time of maximum amplitude (Tmax), half-time of maximum amplitude (T-1/2), the Kac value calculated from these two parameters in Tobe's method, slopes of the B and C phases, B/A and B/C values in Spencer's method, total concentration (T.C.), minute concentration (M.C.) and minute excretion (M.E.) in Krueger's method, Matchida's K value and Takeuchi's renal function index (R.F.I.). The results were as follows: 1) In general, marked differences in the patterns of the renogram were observed between the normal controls and nephropathies. In Tobe's method, each parameter showed a statistically significant delay or decrease in patients with hypertensive nephropathies and chronic renal diseases. In Spencer's method, the slopes of the B and C phases and B/C also showed a statistically significant decrease in patients with hypertension, hypertensive nephropathies and chronic renal diseases. In Krueger's method, M.C. and M.E. showed statistically significant differences between the controls and patients with hypertension, hypertensive nephropathies and chronic renal diseases. In Matchida's method, the K value showed statistically significant differences between the controls and patients with hypertensive nephropathies and chronic renal diseases. 2) It appeared, therefore, that Tobe's T-1/2, Kac, Spencer's slopes of the B and C phases, B/A, B/C values, Krueger's T.C., M.C., and M.E. values, Matchida's K value
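
    Two of the renogram parameters named above, the time of maximum amplitude (Tmax) and the half-time of maximum amplitude (T-1/2), can be extracted from a time-activity curve as in the sketch below; the curve itself is a synthetic rise-and-washout shape, not patient data.

        import numpy as np

        # Synthetic renogram: counts versus time (min) for one kidney.
        t = np.arange(0, 20.5, 0.5)
        counts = 1000 * (1 - np.exp(-t / 1.5)) * np.exp(-t / 12.0)   # uptake then washout

        i_max = int(counts.argmax())
        t_max = t[i_max]                              # time of maximum amplitude (Tmax)

        # T-1/2: time after the peak at which the curve first falls below half its maximum.
        below = np.where(counts[i_max:] <= counts[i_max] / 2)[0]
        t_half = t[i_max + below[0]] - t_max if below.size else float("nan")

        print(f"Tmax = {t_max:.1f} min, T-1/2 = {t_half:.1f} min")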

  10. Quantitative secondary electron detection

    Science.gov (United States)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    2018-05-08

    Quantitative Secondary Electron Detection (QSED) using an array of solid state device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. Methods and devices effect quantitative detection of secondary electrons with an array comprising a number of solid state detectors: the array senses the secondary electrons with a plurality of solid state detectors, and the number of secondary electrons is counted with a time-to-digital converter circuit operated in counter mode.

  11. [Methods of quantitative proteomics].

    Science.gov (United States)

    Kopylov, A T; Zgoda, V G

    2007-01-01

    In modern science, proteomic analysis is inseparable from other fields of systems biology. With its huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modification of proteins and metabolic and enzymatic methods of isotope labeling.

  12. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on
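    To make the idea of query-time services concrete, the sketch below issues a federated SPARQL query from Python through the standard SERVICE keyword. It is only a hedged illustration: the endpoint URLs and the ex: predicates are hypothetical placeholders and are not SCRY's actual interface; SPARQLWrapper is used merely as a generic SPARQL client.

      # Hypothetical sketch: the endpoint URLs and ex: predicates are invented
      # placeholders, not SCRY's real API.
      from SPARQLWrapper import SPARQLWrapper, JSON

      QUERY = """
      PREFIX ex: <http://example.org/vocab#>
      SELECT ?measurement ?sqrt
      WHERE {
        ?measurement ex:value ?n .
        SERVICE <http://localhost:5000/scry/> {        # hypothetical service endpoint
          ?measurement ex:squareRootOfValue ?sqrt .    # output computed at query time
        }
      }
      """

      client = SPARQLWrapper("http://example.org/sparql")  # hypothetical triple store
      client.setQuery(QUERY)
      client.setReturnFormat(JSON)
      for row in client.query().convert()["results"]["bindings"]:
          print(row["measurement"]["value"], row["sqrt"]["value"])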

  13. Quantitative sample preparation of some heavy elements

    International Nuclear Information System (INIS)

    Jaffey, A.H.

    1977-01-01

    A discussion is given of some techniques that have been useful in quantitatively preparing and analyzing samples used in the half-life determinations of some plutonium and uranium isotopes. Application of these methods to the preparation of uranium and plutonium samples used in neutron experiments is discussed

  14. Quantitative and Qualitative Extensions of Event Structures

    NARCIS (Netherlands)

    Katoen, Joost P.

    1996-01-01

    An important application of formal methods is the specification, design, and analysis of functional aspects of (distributed) systems. Recently the study of quantitative aspects of such systems based on formal methods has come into focus. Several extensions of formal methods where the occurrence of

  15. Analyse quantitative détaillée des distillats moyens par couplage CG/SM. Application à l'étude des schémas réactionnels du procédé d'hydrotraitement / Quantitative Analysis of Middle Distillates by GC/MS Coupling. Application to Hydrotreatment Process Mechanisms

    Directory of Open Access Journals (Sweden)

    Fafet A.

    2006-11-01

    Full Text Available Detailed analysis of middle distillates is an essential step towards understanding the reaction mechanisms and the kinetics of certain refining processes such as hydrotreatment. A new method has been developed which combines gas chromatography/mass spectrometry (GC/MS) coupling with a quantitative analysis by chemical family using mass spectrometry. Gas chromatography, carried out on an apolar column, performs the distillation of the compounds present in the gas oil, and mass spectrometry quantifies the chemical families per interval of carbon-atom number or boiling point. The method thus gives access to the distribution by carbon-atom number of each chemical family (alkanes, cycloalkanes, aromatic hydrocarbons with one or several rings, sulfur-containing aromatic hydrocarbons). The method has been validated and applied to a hydrotreatment feed and product. A detailed analysis of middle distillates is essential for understanding the reaction mechanism and for studying the kinetics of refining processes such as hydrotreatment. In fact, when we see the complexity of saturated and aromatic hydrocarbon mixtures appearing in gas oil, we realize that it is necessary to have a very detailed analysis of those cuts to understand the mechanisms involved in refining processes and to be able to describe their kinetics. Each gas oil has a very different composition and therefore a specific reactivity. That is why we have tried to develop predictive kinetic models to avoid experimenting in pilot plants, which is very expensive. But, even if all the compounds of a gasoline (PI-200°C) have now been identified and quantified using gas chromatography (1), such is not the case for heavier cuts. Only an overall characterization can be made, by chemical family. The techniques employed are, for example, HPLC (3,4 or

  16. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with extending the ECB's quantitative easing programme of asset purchases. The notes have been requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  17. Quantitative Moessbauer analysis

    International Nuclear Information System (INIS)

    Collins, R.L.

    1978-01-01

    The quantitative analysis of Moessbauer data, as in the measurement of Fe³⁺/Fe²⁺ concentration, has not been possible because of the different mean square velocities ⟨x²⟩ of Moessbauer nuclei at chemically different sites. A method is now described which, based on Moessbauer data at several temperatures, permits the comparison of absorption areas at ⟨x²⟩ = 0. (Auth.)
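    As background to the approach described (standard Mössbauer relations stated here as an illustration, not taken from the paper): the absorption area of site i scales with its population n_i through the recoil-free fraction, which depends on the nuclear mean square displacement ⟨x²⟩, so extrapolating areas measured at several temperatures to ⟨x²⟩ = 0 removes the site dependence:

      \[
        A_i(T) \propto n_i\, f_i(T), \qquad
        f_i(T) = \exp\!\bigl(-k^2 \langle x_i^2\rangle_T\bigr), \qquad
        k = \frac{E_\gamma}{\hbar c},
      \]
      \[
        \frac{n_{\mathrm{Fe}^{3+}}}{n_{\mathrm{Fe}^{2+}}}
          = \left.\frac{A_{\mathrm{Fe}^{3+}}}{A_{\mathrm{Fe}^{2+}}}\right|_{\langle x^2\rangle \to 0}.
      \]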

  18. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  19. Radiological interpretation 2020: Toward quantitative image assessment

    International Nuclear Information System (INIS)

    Boone, John M.

    2007-01-01

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  20. Quantitative risk assessment system (QRAS)

    Science.gov (United States)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  1. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  2. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for study of proteins and identification of diseases, reveals more comprehensive and accurate information of an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication process to have low-cost and small-size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  3. Quantitative valuation of platform technology based intangibles companies

    OpenAIRE

    Achleitner, Ann-Kristin; Nathusius, Eva; Schraml, Stephanie

    2007-01-01

    In the course of raising external equity, e.g. from venture capitalists, a quantitative valuation is usually required for entrepreneurial ventures. This paper examines the challenges of quantitatively valuing platform technology based entrepreneurial ventures. The distinct characteristics of such companies pose specific requirements on the applicability of quantitative valuation methods. The entrepreneur can choose from a wide range of potential commercialization strategies to pursue in the c...

  4. Using Live-Crown Ratio to Control Wood Quality: An Example of Quantitative Silviculture

    Science.gov (United States)

    Thomas J. Dean

    1999-01-01

    Quantitative silviculture is the application of biological relationships in meeting specific, quantitative management objectives. It is a two-sided approach requiring the identification and application of biological relationships. An example of quantitative silviculture is presented that uses a relationship between average-live crown ratio and relative stand density...

  5. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
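    For orientation, the observed-to-liability scale relationship discussed in those papers is commonly written as follows (a standard form of the liability-threshold transformation, quoted as an illustration rather than verbatim from the review): for a disease with population prevalence K,

      \[
        h_l^2 = h_{01}^2\,\frac{K(1-K)}{z^2},
      \]

    where h_{01}^2 is the heritability estimated on the observed 0/1 scale, z is the height of the standard normal density at the liability threshold t = Φ⁻¹(1 − K), and h_l^2 is the heritability of the underlying liability.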

  6. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and the end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference Risk Management Reloaded and this proceedings volume contribute to bridging the gap between academia (providing methodological advances) and practice (having a firm understanding of the economic conditions in which a given model is used). Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  7. Quantitative phase analysis in industrial research

    International Nuclear Information System (INIS)

    Ahmad Monshi

    1996-01-01

    X-Ray Diffraction (XRD) is the only technique able to identify phases; all the other analytical techniques give information about the elements. Quantitative phase analysis of minerals and industrial products is logically the next step after a qualitative examination and is of great importance in industrial research. Since the application of XRD in industry, early in this century, workers have been trying to develop quantitative XRD methods. In this paper some of the important methods are briefly discussed and partly compared. These methods are Internal Standard, Known Additions, Double Dilution, External Standard, Direct Comparison, Diffraction Absorption and Ratio of Slopes.
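    As an illustration of one of the listed methods, the internal standard method rests on the standard working relation (textbook form, not quoted from the paper):

      \[
        \frac{I_i}{I_s} = k\,\frac{w_i}{w_s},
      \]

    where I_i and I_s are the intensities of chosen reflections of the analyte phase i and the added standard s, w_i and w_s are their weight fractions, and k is obtained from calibration mixtures; taking the ratio cancels the matrix absorption factor, which is what makes the comparison quantitative.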

  8. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  9. Quantitative skeletal scintiscanning

    International Nuclear Information System (INIS)

    Haushofer, R.

    1982-01-01

    330 patients were examined by skeletal scintiscanning with sup(99m)Tc pyrophosphate and sup(99m)Tc methylene diphosphonate in the years between 1977 and 1979. Course control examinations were carried out in 12 patients. The patients presented with primary skeletal tumours, metastases, and inflammatory and degenerative skeletal diseases. Bone scintiscanning combined with the ''region of interest'' technique was found to be an objective and reproducible technique for quantitative measurement of skeletal radioactivity concentrations. The validity of nuclear skeletal examinations can thus be enhanced as far as diagnosis, course control, and differential diagnosis are concerned. Quantitative skeletal scintiscanning by means of the ''region of interest'' technique has opened up a new era in skeletal diagnosis by nuclear methods. (orig./MG) [de]

  10. Research and application of quantitative assessment model on chemical substances dermal exposure / 化学物质经皮职业暴露定量评估模型的研究及应用

    Institute of Scientific and Technical Information of China (English)

    陈会祥; 黄德寅; 王卉; 薄亚莉; 孙倩; 张倩; 李敏嫣

    2017-01-01

    Objective: To verify the reliability and accuracy of the simulation results of the NIOSH quantitative assessment model and, through example applications, to evaluate the degree of dermal absorption of toxic chemical substances and to propose corresponding protective measures. Methods: Several typical chemical substances that are readily absorbed through the skin were selected, and their percutaneous absorption experimental data were obtained from the European database for the assessment and prediction of dermal absorption of toxic chemicals (EDETOX Database). These cases were simulated with the NIOSH model to estimate absorption rates, and the simulation results were compared with the experimental data and analysed statistically to evaluate the reliability and accuracy of the NIOSH model. Tricresyl phosphate and phenol were then taken as examples: the NIOSH model was used to estimate the absorbed dose (mg), the Chinese occupational exposure concentration limits (mg/m3) were converted into exposure-limit equivalents (mg), and the degree of occupational exposure via dermal absorption was assessed. Results: The differences between the model simulation results and the experimental data were not statistically significant (P > 0.05). The example simulations showed that the 8 h and 150 h absorbed doses of tricresyl phosphate in three simulated scenarios were 0.01 mg, 0.03 mg, 0.03 mg and 0.76 mg, 4.48 mg, 6.93 mg respectively, which did not exceed the time-converted exposure-limit equivalents (0.67 mg and 12.56 mg); the 8 h absorbed doses of phenol in three simulated scenarios were 7.10 mg, 2.35 mg and 15.40 mg, which likewise did not exceed the time-converted exposure-limit equivalent (22.46 mg). Conclusion: The model shows a reasonable degree of accuracy and reliability. The example applications indicate that the NIOSH model takes the factors influencing percutaneous absorption into account comprehensively, that occupational exposure scenarios can be set up flexibly and conveniently, and that the model is highly practical for assessing occupational dermal exposure to chemical substances in the workplace.

  11. Quantitative FDG in depression

    Energy Technology Data Exchange (ETDEWEB)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D. [Austin Hospital, Melbourne, VIC (Australia). Dept of Psychiatry and Centre for PET]

    1998-03-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 µmol/100g/min; right = 25.6 ± 7.0 µmol/100g/min) was slightly reduced compared to the ipsilateral hemispherical rate (left = 30.4 ± 6.8 µmol/100g/min; right = 29.5 ± 7.2 µmol/100g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right to left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  12. Quantitative FDG in depression

    International Nuclear Information System (INIS)

    Chua, P.; O'Keefe, G.J.; Egan, G.F.; Berlangieri, S.U.; Tochon-Danguy, H.J.; Mckay, W.J.; Morris, P.L.P.; Burrows, G.D.

    1998-01-01

    Full text: Studies of regional cerebral glucose metabolism (rCMRGlu) using positron emission tomography (PET) in patients with affective disorders have consistently demonstrated reduced metabolism in the frontal regions. Different quantitative and semi-quantitative rCMRGlu regions of interest (ROI) comparisons, e.g. absolute metabolic rates, ratios of dorsolateral prefrontal cortex (DLPFC) to ipsilateral hemisphere cortex, have been reported. These studies suffered from the use of a standard brain atlas to define ROIs, whereas in this case study, the individual's magnetic resonance imaging (MRI) scan was registered with the PET scan to enable accurate neuroanatomical ROI definition for the subject. The patient is a 36-year-old female with a six-week history of major depression (HAM-D = 34, MMSE = 28). A quantitative FDG PET study and an MRI scan were performed. Six MRI-guided ROIs (DLPFC, PFC, whole hemisphere) were defined. The average rCMRGlu in the DLPFC (left = 28.8 ± 5.8 µmol/100g/min; right = 25.6 ± 7.0 µmol/100g/min) was slightly reduced compared to the ipsilateral hemispherical rate (left = 30.4 ± 6.8 µmol/100g/min; right = 29.5 ± 7.2 µmol/100g/min). The ratios of DLPFC to ipsilateral hemispheric rate were close to unity (left = 0.95 ± 0.29; right = 0.87 ± 0.32). The right to left DLPFC ratio did not show any significant asymmetry (0.91 ± 0.30). These results do not correlate with earlier published results reporting decreased left DLPFC rates compared to right DLPFC, although our results will need to be replicated with a group of depressed patients. Registration of PET and MRI studies is necessary in ROI-based quantitative FDG PET studies to allow for the normal anatomical variation among individuals, and thus is essential for accurate comparison of rCMRGlu between individuals.

  13. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
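    As one example of the "core equation plus solution technique" pattern such a text follows, the sketch below (my own illustration, not code from the book or its website) integrates linear hillslope diffusion, dz/dt = D d²z/dx², with an explicit finite-difference scheme.

      import numpy as np

      def hillslope_diffusion(z0, dx, dt, D, n_steps):
          """Explicit FTCS integration of dz/dt = D d2z/dx2 with fixed ends."""
          # stability of the explicit scheme requires D*dt/dx**2 <= 0.5
          assert D * dt / dx**2 <= 0.5, "time step too large for explicit scheme"
          z = z0.astype(float).copy()
          for _ in range(n_steps):
              z[1:-1] += D * dt / dx**2 * (z[2:] - 2.0 * z[1:-1] + z[:-2])
          return z

      # relaxation of an initially sharp scarp (x in m, D in m^2/yr, dt in yr)
      x = np.linspace(0.0, 100.0, 101)
      z0 = np.where(x < 50.0, 10.0, 0.0)
      print(hillslope_diffusion(z0, dx=1.0, dt=0.4, D=1.0, n_steps=2500)[48:53])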

  14. Integrated quantitative pharmacology for treatment optimization in oncology

    NARCIS (Netherlands)

    Hasselt, J.G.C. van

    2014-01-01

    This thesis describes the development and application of quantitative pharmacological models in oncology for treatment optimization and for the design and analysis of clinical trials with respect to pharmacokinetics, toxicity, efficacy and cost-effectiveness. A recurring theme throughout this

  15. Practical quantitative measures of ALARA

    International Nuclear Information System (INIS)

    Kathren, R.L.; Larson, H.V.

    1982-06-01

    Twenty specific quantitative measures to assist in evaluating the effectiveness of as low as reasonably achievable (ALARA) programs are described along with their applicability, practicality, advantages, disadvantages, and potential for misinterpretation or distortion. Although no single index or combination of indices is suitable for all facilities, generally these five: (1) mean individual dose equivalent (MIDE) to the total body from penetrating radiations; (2) statistical distribution of MIDE to the whole body from penetrating radiations; (3) cumulative penetrating whole body dose equivalent; (4) MIDE evaluated by job classification; and (5) MIDE evaluated by work location, apply to most programs. Evaluation of other programs may require other specific dose-equivalent-based indices, including extremity exposure data, cumulative dose equivalent to organs or to the general population, and nonpenetrating radiation dose equivalents. Certain indices not based on dose equivalent, such as the size of the radiation or contamination area, may also be used; an airborne activity index based on air concentration, room volume, and radiotoxicity is developed for application in some ALARA programs.
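    A minimal sketch of two of these indices, the mean individual dose equivalent (MIDE) overall and evaluated by job classification; the record layout and numbers are invented purely for illustration.

      from collections import defaultdict

      # (worker id, job classification, annual whole-body dose equivalent in mSv)
      records = [
          ("w01", "operator", 1.2), ("w02", "operator", 0.8),
          ("w03", "maintenance", 3.5), ("w04", "maintenance", 2.1),
          ("w05", "health physics", 0.4),
      ]

      def mide(rows):
          """Mean individual dose equivalent over a set of monitored workers."""
          return sum(dose for _, _, dose in rows) / len(rows)

      by_job = defaultdict(list)
      for row in records:
          by_job[row[1]].append(row)

      print("overall MIDE:", round(mide(records), 2), "mSv")
      for job, rows in sorted(by_job.items()):
          print(job, round(mide(rows), 2), "mSv")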

  16. Quantitative cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Thelen, M.; Dueber, C.; Wolff, P.; Erbel, R.; Hoffmann, T.

    1985-06-01

    The scope and limitations of quantitative cardiac CT have been evaluated in a series of experimental and clinical studies. The left ventricular muscle mass was estimated by computed tomography in 19 dogs (using volumetric methods, measurements in two axes and planes, and a reference volume). There was good correlation with anatomical findings. The end-diastolic volume of the left ventricle was estimated in 22 patients with cardiomyopathies; using angiography as a reference, CT led to systematic under-estimation. It is also shown that ECG-triggered magnetic resonance tomography results in improved visualisation and may be expected to improve measurements of cardiac morphology.

  17. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  18. Quantitative performance monitoring

    International Nuclear Information System (INIS)

    Heller, A.S.

    1987-01-01

    In the recently published update of NUREG/CR 3883, it was shown that Japanese plants of size and design similar to those in the US have significantly fewer trips in a given year of operation. One way to reduce such imbalance is the efficient use of available plant data. Since plant data are recorded and monitored continuously for management feedback and timely resolution of problems, this data should be actively used to increase the efficiency of operations and, ultimately, for a reduction of plant trips in power plants. A great deal of information is lost, however, if the analytical tools available for the data evaluation are misapplied or not adopted at all. This paper deals with a program developed to use quantitative techniques to monitor personnel performance in an operating power plant. Visual comparisons of ongoing performance with predetermined quantitative performance goals are made. A continuous feedback is provided to management for early detection of adverse trends and timely resolution of problems. Ultimately, costs are reduced through effective resource management and timely decision making

  19. Quantitative clinical radiobiology

    International Nuclear Information System (INIS)

    Bentzen, S.M.

    1993-01-01

    Based on a series of recent papers, a status is given of our current ability to quantify the radiobiology of human tumors and normal tissues. Progress has been made in the methods of analysis. This includes the introduction of 'direct' (maximum likelihood) analysis, incorporation of latent-time in the analyses, and statistical approaches to allow for the many factors of importance in predicting tumor-control probability of normal-tissue complications. Quantitative clinical radiobiology of normal tissues is reviewed with emphasis on fractionation sensitivity, repair kinetics, regeneration, latency, and the steepness of dose-response curves. In addition, combined modality treatment, functional endpoints, and the search for a correlation between the occurrence of different endpoints in the same individual are discussed. For tumors, quantitative analyses of fractionation sensitivity, repair kinetics, reoxygenation, and regeneration are reviewed. Other factors influencing local control are: Tumor volume, histopathologic differentiation and hemoglobin concentration. Also, the steepness of the dose-response curve for tumors is discussed. Radiobiological strategies for improving radiotherapy are discussed with emphasis on non-standard fractionation and individualization of treatment schedules. (orig.)

  20. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016) held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities in fields such as quantitative logic and soft computing worldwide as follows: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  1. Quantitative imaging of bilirubin by photoacoustic microscopy

    Science.gov (United States)

    Zhou, Yong; Zhang, Chi; Yao, Da-Kang; Wang, Lihong V.

    2013-03-01

    Noninvasive detection of both bilirubin concentration and its distribution is important for disease diagnosis. Here we implemented photoacoustic microscopy (PAM) to detect bilirubin distribution. We first demonstrate that our PAM system can measure the absorption spectra of bilirubin and blood. We also image bilirubin distributions in tissue-mimicking samples, both without and with blood mixed in. Our results show that PAM has the potential to quantitatively image bilirubin in vivo for clinical applications.

  2. Automatic quantitative metallography

    International Nuclear Information System (INIS)

    Barcelos, E.J.B.V.; Ambrozio Filho, F.; Cunha, R.C.

    1976-01-01

    The quantitative determination of metallographic parameters is analysed through a description of the Micro-Videomat automatic image analysis system and its application to the volumetric percentage of pearlite in nodular cast irons, porosity and average grain size in high-density sintered UO2 pellets, and grain size of ferritic steel. The techniques adopted are described and the results obtained are compared with the corresponding ones from the direct counting processes: counting of systematic points (grid) to measure volume, and the intersection method, utilizing a circumference of known radius, for the average grain size. The technique adopted for nodular cast iron resulted from the small difference in optical reflectivity between graphite and pearlite. Porosity evaluation of sintered UO2 pellets is also analyzed. [pt]
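    The point-count and intercept relations behind those measurements are, in their standard stereological form (quoted as background, not from the paper):

      \[
        V_V = P_P, \qquad \bar{\ell} = \frac{L}{N\,M},
      \]

    where V_V is the volume fraction of a phase and P_P the fraction of systematically placed grid points falling on it, while the mean lineal intercept (average grain size) \bar{\ell} follows from the length L of the test circle drawn on the image, the number N of grain-boundary intersections with that circle, and the magnification M.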

  3. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  4. Quantitative isotopes miction cystoureterography (QIMCU)

    International Nuclear Information System (INIS)

    Szy, D.A.G.; Stroetges, M.W.; Funke-Voelkers, R.

    1982-01-01

    A simple method for the quantitative evaluation of vesicoureteral reflux was developed. It allows the determination of a) the volume of reflux and b) the volume of the bladder at each point in time during the examination. The QIMCU gives an insight into the dynamics of reflux, the reflux volume, and the actual bladder volume. The clinical application in 37 patients with 53 insufficient ureteral orifices (i.e. reflux) showed that the onset of reflux occurred in 60% as early as the first five minutes of the examination, but later in the remaining 40%. The maximal reflux was found in only 26% during the first five minutes. The reflux volume exceeded 3.5 ml in more than 50% of cases. The international grading corresponds with the reflux volume determined by this method. Radionuclide cystoureterography can be used in children as well as in adults. Because the radiation exposure is low, the method can be recommended for the initial examination and for follow-up studies. (Author)

  5. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND, B.M.; BENNETT, P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to approximately a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
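    The quantitation in this approach ultimately rests on a Poisson relation between lesion frequency and the number-average fragment length after lesion-specific cleavage; in the form commonly used with such gels (stated as background, not quoted from the paper):

      \[
        \phi = \frac{1}{\langle L\rangle_{n,\ \mathrm{treated}}}
             - \frac{1}{\langle L\rangle_{n,\ \mathrm{control}}},
      \]

    where φ is the average number of lesions per unit length (e.g. per kb) and ⟨L⟩_n is the number-average length obtained from the gel's migration-versus-length calibration and the measured DNA mass profile.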

  6. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Key words: Polymerase chain reaction (PCR), quantitative real time polymerase chain reaction (qPCR), quantitative sexing, Siberian tiger.

  7. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
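    As a concrete illustration of sample normalization of the kind reviewed here, the sketch below shows two commonly used options, total-intensity scaling and probabilistic quotient normalization; the data layout is assumed, and the code is not taken from the review.

      import numpy as np

      def total_intensity_normalize(X):
          """Scale each sample (row) so its summed intensity equals the mean total."""
          totals = X.sum(axis=1, keepdims=True)
          return X * (totals.mean() / totals)

      def pqn_normalize(X):
          """Probabilistic quotient normalization against the median reference sample."""
          reference = np.median(X, axis=0)
          quotients = X / reference                        # per-feature fold changes
          dilution = np.median(quotients, axis=1, keepdims=True)
          return X / dilution                              # remove per-sample dilution

      # 3 samples x 4 metabolites; sample 2 was effectively measured at double strength
      X = np.array([[10.0, 5.0, 2.0, 1.0],
                    [20.0, 10.0, 4.0, 2.0],
                    [11.0, 4.0, 2.5, 1.0]])
      print(pqn_normalize(X))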

  8. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a quantitative phase microscope based on a smartphone, using the transport of intensity equation method. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its advantages of accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that could be adopted in the future in remote healthcare and medical diagnosis.
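    A minimal numerical sketch of the transport-of-intensity idea behind such a device (my own illustration under a uniform-intensity approximation, not the authors' Android implementation): the axial intensity derivative estimated from two defocused images is converted to phase by solving a Poisson-type equation in Fourier space.

      import numpy as np

      def tie_phase(I_minus, I_plus, dz, wavelength, pixel_size):
          """Phase from two defocused images via the transport of intensity equation.

          Uses the uniform-intensity approximation laplacian(phi) = -(k/I0) dI/dz,
          solved with an FFT-based inverse Laplacian (phase returned with zero mean).
          """
          k = 2.0 * np.pi / wavelength
          dI_dz = (I_plus - I_minus) / (2.0 * dz)
          I0 = 0.5 * (I_plus + I_minus).mean()

          ny, nx = dI_dz.shape
          fy = 2.0 * np.pi * np.fft.fftfreq(ny, d=pixel_size)
          fx = 2.0 * np.pi * np.fft.fftfreq(nx, d=pixel_size)
          k2 = fy[:, None] ** 2 + fx[None, :] ** 2
          k2[0, 0] = 1.0                      # avoid division by zero at DC

          phi_hat = np.fft.fft2(dI_dz) * (k / I0) / k2
          phi_hat[0, 0] = 0.0                 # phase is defined up to a constant
          return np.real(np.fft.ifft2(phi_hat))

      # example call with synthetic defocused frames (shapes and units are illustrative)
      rng = np.random.default_rng(0)
      I0 = np.ones((64, 64)); dI = 1e-3 * rng.standard_normal((64, 64))
      phi = tie_phase(I0 - dI, I0 + dI, dz=1e-6, wavelength=550e-9, pixel_size=1e-6)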

  9. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  10. Balance between qualitative and quantitative verification methods

    International Nuclear Information System (INIS)

    Nidaira, Kazuo

    2012-01-01

    The amount of inspection effort for verification of declared nuclear material needs to be optimized in the situation where qualitative and quantitative measures are applied. Game theory was referred to investigate the relation of detection probability and deterrence of diversion. Payoffs used in the theory were quantified for cases of conventional safeguards and integrated safeguards by using AHP, Analytical Hierarchy Process. Then, it became possible to estimate detection probability under integrated safeguards which had equivalent deterrence capability for detection probability under conventional safeguards. In addition the distribution of inspection effort for qualitative and quantitative measures was estimated. Although the AHP has some ambiguities in quantifying qualitative factors, its application to optimization in safeguards is useful to reconsider the detection probabilities under integrated safeguards. (author)

  11. Quality control in quantitative computed tomography

    International Nuclear Information System (INIS)

    Jessen, K.A.; Joergensen, J.

    1989-01-01

    Computed tomography (CT) has for several years been an indispensable tool in diagnostic radiology, but it is only recently that extraction of quantitative information from CT images has been of practical clinical value. Only careful control of the scan parameters, and especially the scan geometry, allows useful information to be obtained; and it can be demonstrated by simple phantom measurements how sensitive a CT system can be to variations in size, shape and position of the phantom in the gantry aperture. Significant differences exist between systems that are not manifested in normal control of image quality and general performance tests. Therefore an actual system has to be analysed for its suitability for quantitative use of the images before critical clinical applications are justified. (author)

  12. Quantitative Nuclear Medicine. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, J.; El Fakhri, G. [Massachusetts General Hospital and Harvard Medical School, Boston (United States)

    2014-12-15

    Planar imaging is still used in clinical practice although tomographic imaging (single photon emission computed tomography (SPECT) and positron emission tomography (PET)) is becoming more established. In this chapter, quantitative methods for both imaging techniques are presented. Planar imaging is limited to single photon. For both SPECT and PET, the focus is on the quantitative methods that can be applied to reconstructed images.

  13. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. If you wish to perfectly take up the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  14. Quantitative analysis of receptor imaging

    International Nuclear Information System (INIS)

    Fu Zhanli; Wang Rongfu

    2004-01-01

    Model-based methods for the quantitative analysis of receptor imaging, including kinetic, graphical and equilibrium methods, are introduced in detail. Some technical problems facing the quantitative analysis of receptor imaging, such as the correction for in vivo metabolism of the tracer, the radioactivity contribution from blood volume within the ROI, and the estimation of the nondisplaceable ligand concentration, are also reviewed briefly.

  15. Shedding quantitative fluorescence light on novel regulatory mechanisms in skeletal biomedicine and biodentistry.

    Science.gov (United States)

    Lee, Ji-Won; Iimura, Tadahiro

    2017-02-01

    Digitalized fluorescence images contain numerical information such as color (wavelength), fluorescence intensity and spatial position. However, quantitative analyses of acquired data and their validation remained to be established. Our research group has applied quantitative fluorescence imaging on tissue sections and uncovered novel findings in skeletal biomedicine and biodentistry. This review paper includes a brief background of quantitative fluorescence imaging and discusses practical applications by introducing our previous research. Finally, the future perspectives of quantitative fluorescence imaging are discussed.

  16. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    Science.gov (United States)

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg and with mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported the combination of solifenacin and mirabegron at these dose regimens for phase III clinical development. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
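    A minimal sketch of the weighted-score idea behind such an MCDA clinical-utility comparison; the attribute names mirror those mentioned above, but the weights and scores are invented for illustration and are not the values used in the study.

      # simple additive MCDA model: clinical utility = weighted sum of attribute scores
      weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

      regimens = {
          "solifenacin 5 mg":                    {"efficacy": 60, "safety": 80, "tolerability": 75},
          "mirabegron 50 mg":                    {"efficacy": 55, "safety": 85, "tolerability": 80},
          "solifenacin 5 mg + mirabegron 25 mg": {"efficacy": 75, "safety": 78, "tolerability": 72},
      }

      def clinical_utility(scores, weights):
          return sum(weights[a] * scores[a] for a in weights)

      for name, scores in sorted(regimens.items(),
                                 key=lambda kv: clinical_utility(kv[1], weights),
                                 reverse=True):
          print(f"{name}: {clinical_utility(scores, weights):.1f}")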

  17. Simultaneous quantitation of polygalaxanthone III and four ginsenosides by ultra-fast liquid chromatography with tandem mass spectrometry in rat and beagle dog plasma after oral administration of Kai-Xin-San: application to a comparative pharmacokinetic study.

    Science.gov (United States)

    Lv, Chunxiao; Li, Qing; Zhang, Xiaowen; He, Bosai; Xu, Huarong; Yin, Yidi; Liu, Ran; Liu, Jingjing; Chen, Xiaohui; Bi, Kaishun

    2014-05-01

    A fast, selective, and quantitative ultra-fast liquid chromatography with tandem mass spectrometry method has been developed and validated for the simultaneous quantitation of polygalaxanthone III, ginsenoside Rb1, ginsenoside Rd, ginsenoside Re, and ginsenoside Rg1 in the plasma of rat and beagle dog after oral administration of Kai-Xin-San. After addition of the internal standard, salidroside, the plasma samples were extracted by liquid-liquid extraction and separated on a Venusil MP C18 column with methanol/0.01% acetic acid water as the mobile phase. The tandem mass spectrometric detection was performed in multiple reaction monitoring mode with a turbo ion spray source under switching ionization. The method was examined and found to be precise and accurate within the linearity range of the compounds. The intra- and interday precision and accuracy of the analytes were well within acceptance criteria (±15%). The mean extraction recoveries of the analytes and internal standard were all >75.0%. The validated method has been successfully applied to comparing pharmacokinetic profiles of the analytes in rat and beagle dog plasma. The results indicated that no significant differences were observed in the pharmacokinetic parameters of ginsenoside Rg1, while the others showed significant differences, which may be due to the different mechanisms of absorption and metabolism. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  19. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  20. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  1. Mixing quantitative with qualitative methods

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  2. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  3. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.

  4. Enhancing quantitative approaches for assessing community resilience

    Science.gov (United States)

    Chuang, W. C.; Garmestani, A.S.; Eason, T. N.; Spanbauer, T. L.; Fried-Peterson, H. B.; Roberts, C.P.; Sundstrom, Shana M.; Burnett, J.L.; Angeler, David G.; Chaffin, Brian C.; Gunderson, L.; Twidwell, Dirac; Allen, Craig R.

    2018-01-01

    Scholars from many different intellectual disciplines have attempted to measure, estimate, or quantify resilience. However, there is growing concern that lack of clarity on the operationalization of the concept will limit its application. In this paper, we discuss the theory, research development and quantitative approaches in ecological and community resilience. Upon noting the lack of methods that quantify the complexities of the linked human and natural aspects of community resilience, we identify several promising approaches within the ecological resilience tradition that may be useful in filling these gaps. Further, we discuss the challenges for consolidating these approaches into a more integrated perspective for managing social-ecological systems.

  5. Quantitative risk in radiation protection standards

    International Nuclear Information System (INIS)

    Bond, V.P.

    1978-01-01

    The bases for developing quantitative assessment of exposure risks in the human being, and the several problems that accompany the assessment and introduction of the risk of exposure to high and low LET radiation into radiation protection, will be evaluated. The extension of the pioneering radiation protection philosophies to the control of other hazardous agents that cannot be eliminated from the environment will be discussed, as will the serious misunderstandings and misuse of concepts and facts that have inevitably surrounded the application to one agent alone, of the protection philosophy that must in time be applied to a broad spectrum of potentially hazardous agents. (orig.) [de

  6. Development, Validation, and Interlaboratory Evaluation of a Quantitative Multiplexing Method To Assess Levels of Ten Endogenous Allergens in Soybean Seed and Its Application to Field Trials Spanning Three Growing Seasons.

    Science.gov (United States)

    Hill, Ryan C; Oman, Trent J; Wang, Xiujuan; Shan, Guomin; Schafer, Barry; Herman, Rod A; Tobias, Rowel; Shippar, Jeff; Malayappan, Bhaskar; Sheng, Li; Xu, Austin; Bradshaw, Jason

    2017-07-12

    As part of the regulatory approval process in Europe, comparison of endogenous soybean allergen levels between genetically engineered (GE) and non-GE plants has been requested. A quantitative multiplex analytical method using tandem mass spectrometry was developed and validated to measure 10 potential soybean allergens from soybean seed. The analytical method was implemented at six laboratories to demonstrate the robustness of the method and further applied to three soybean field studies across multiple growing seasons (including 21 non-GE soybean varieties) to assess the natural variation of allergen levels. The results show that environmental factors contribute more than genetic factors to the large variation in allergen abundance (2- to 50-fold between environmental replicates) as well as a large contribution of Gly m 5 and Gly m 6 to the total allergen profile, calling into question the scientific rationale for measurement of endogenous allergen levels between GE and non-GE varieties in the safety assessment.

  7. Development and validation of a sensitive UPLC-MS/MS method for the quantitation of [(13)C]sucrose in rat plasma, blood, and brain: Its application to the measurement of blood-brain barrier permeability.

    Science.gov (United States)

    Miah, Mohammad K; Bickel, Ulrich; Mehvar, Reza

    2016-03-15

    Accurate and reproducible measurement of blood-brain barrier (BBB) integrity is critical in the assessment of the pathophysiology of the central nervous system disorders and in monitoring therapeutic effects. The widely-used low molecular weight marker [(14)C]sucrose is non-specific in the absence of chromatographic separation. The purpose of this study was to develop and validate a sensitive and reproducible LC-MS/MS method for the analysis of stable isotope-modified [(13)C12]sucrose in brain, plasma, and blood to determine BBB permeability to sucrose. After addition of internal standard (IS, [(13)C6]sucrose), the marker and IS were recovered from diluted rat blood, plasma, and brain homogenate by protein precipitation using acetonitrile. The recovery of the marker and IS was almost quantitative (90-106%) for all three matrices. The recovered samples were directly injected into an isocratic UPLC system with a run time of 6 min. Mass spectrometry was conducted using multiple reaction monitoring in negative mode. The method was linear (r(2)≥0.99) in the concentration ranges tested for the diluted blood and plasma (10-1000 ng/mL) and brain homogenate (1-200 ng/mL). The lower limit of quantitation of the assay was 0.5 pg injected on column. The assay was validated (n=5) based on acceptable intra- and inter-run accuracy and precision values. The method was successfully used for the measurement of serial blood and plasma and terminal brain concentrations of [(13)C12]sucrose after a single intravenous dose (10 mg/kg) of the marker to rats. As expected, the apparent brain uptake clearance values of [(13)C12]sucrose were low in healthy rats. The method may be useful for determination of the BBB integrity in animal models. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Application of ultra-high-performance liquid chromatography coupled with LTQ-Orbitrap mass spectrometry for identification, confirmation and quantitation of illegal adulterated weight-loss drugs in plant dietary supplements.

    Science.gov (United States)

    Cheng, Qiaoyuan; Shou, Linjun; Chen, Cen; Shi, Shi; Zhou, Minghao

    2017-10-01

    In this paper, an ultra-high-performance liquid chromatography method coupled with linear ion trap quadrupole Orbitrap high-resolution mass spectrometry (UHPLC-LTQ-Orbitrap HRMS) was developed and validated for the identification, confirmation and quantitation of illegal adulterated weight-loss drugs in plant dietary supplements. Thirteen weight-loss drugs were well separated by gradient elution with 10 mmol/L ammonium acetate - 0.05% formic acid in water and acetonitrile at a flow rate of 0.2 mL/min within 12 min. The MS analysis was operated in positive ion mode using full MS/dd-MS2 (data-dependent MS2) acquisition. The full MS scan, with a resolution of 60,000 FWHM and narrow mass windows of 5 ppm, acquired data for identification and quantitation, while the dd-MS2 scan, with a resolution of 15,000 FWHM, provided product ions for confirmation. Method validation showed good linearity, with coefficients of determination (r2) higher than 0.9951 for all analytes. The LOD and LLOQ values were in the ranges of 0.3-2 and 1-9 ng/g, respectively. The accuracy and the intra- and inter-day precision were in the ranges of -1.7 to 3.4%, 1.7-5.0% and 1.9-4.4%, respectively. The mean recoveries ranged from 85.4 to 107.1%, while the absolute and relative matrix effects were in the ranges of 98.2-108.6% and 2.6-8.7%. Among 120 batches of weight-loss plant dietary supplements, sibutramine, fluoxetine, or both were detected in 29 samples. Overall, LTQ-Orbitrap HRMS technology is a powerful tool for the analysis of illegal ingredients in dietary supplements. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Quantitative assessment of breast density from mammograms

    International Nuclear Information System (INIS)

    Jamal, N.; Ng, K.H.

    2004-01-01

    Full text: It is known that breast density is increasingly used as a risk factor for breast cancer. This study was undertaken to develop and validate a semi-automated computer technique for the quantitative assessment of breast density from digitised mammograms. A computer technique was developed using MATLAB (Version 6.1) based GUI applications. This semi-automated image analysis tool consists of gradient correction, segmentation of the breast region from the background, segmentation of the fibroglandular and adipose regions within the breast area, and calculation of breast density. The density is defined as the percentage of fibroglandular tissue area divided by the total breast area in the mammogram. This technique was clinically validated with 122 normal mammograms; these were subjectively evaluated and classified according to the five parenchyma patterns of Tabar's scheme (Class I-V) by a consultant radiologist. There was a statistically significant correlation between the computer technique and the subjective classification (r2 = 0.84, p < 0.05), and 71.3% of the subjective classifications were correctly reproduced by the computer technique. We developed a computer technique for the quantitative assessment of breast density and validated its accuracy for computerized classification based on Tabar's scheme. This quantitative tool is useful for the evaluation of large datasets of mammograms to predict breast cancer risk based on density. Furthermore, it has the potential to provide an early marker of success or failure in chemoprevention studies such as hormonal replacement therapy. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine
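
    The density definition above (fibroglandular area as a percentage of total breast area) can be sketched generically; the thresholds and test image below are placeholders, not the MATLAB tool described in the record.

        # Minimal sketch of the percent-density calculation:
        # density = fibroglandular pixel area / total breast pixel area * 100.
        import numpy as np

        def percent_density(image, breast_thresh, fibro_thresh):
            breast_mask = image > breast_thresh          # breast vs. background
            fibro_mask = image > fibro_thresh            # dense (fibroglandular) tissue
            breast_area = breast_mask.sum()
            if breast_area == 0:
                return 0.0
            return 100.0 * (fibro_mask & breast_mask).sum() / breast_area

        # Hypothetical digitised mammogram: background ~0, fat ~80, dense tissue ~180.
        img = np.zeros((200, 200))
        img[20:180, 40:160] = 80
        img[60:120, 70:130] = 180
        print(f"breast density: {percent_density(img, 10, 120):.1f}%")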

  10. Stable isotope dimethyl labelling for quantitative proteomics and beyond

    Science.gov (United States)

    Hsu, Jue-Liang; Chen, Shu-Hui

    2016-01-01

    Stable-isotope reductive dimethylation, a cost-effective, simple, robust, reliable and easy-to- multiplex labelling method, is widely applied to quantitative proteomics using liquid chromatography-mass spectrometry. This review focuses on biological applications of stable-isotope dimethyl labelling for a large-scale comparative analysis of protein expression and post-translational modifications based on its unique properties of the labelling chemistry. Some other applications of the labelling method for sample preparation and mass spectrometry-based protein identification and characterization are also summarized. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644970

  11. Miniaturization of Fresnel lenses for solar concentration: a quantitative investigation.

    Science.gov (United States)

    Duerr, Fabian; Meuret, Youri; Thienpont, Hugo

    2010-04-20

    Sizing down the dimensions of solar concentrators for photovoltaic applications offers a number of promising advantages. It provides thinner modules and smaller solar cells, which reduces thermal issues. In this work a plane Fresnel lens design is introduced that is first analyzed with geometrical optics. Because of miniaturization, pure ray tracing may no longer be valid to determine the concentration performance. Therefore, a quantitative wave optical analysis of the miniaturization's influence on the obtained concentration performance is presented. This better quantitative understanding of the impact of diffraction in microstructured Fresnel lenses might help to optimize the design of several applications in nonimaging optics.
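
    A back-of-the-envelope comparison of the geometric and diffraction-limited spot sizes illustrates why pure ray tracing can break down for miniaturised lenses; the lens dimensions and wavelength below are assumed values, not taken from the paper.

        # Compare the geometric image of the sun with the Airy (diffraction) spot
        # for a small Fresnel lens; numbers are illustrative only.
        import math

        wavelength = 550e-9        # m, mid-visible
        aperture   = 1e-3          # m, miniaturised lens diameter
        focal      = 2e-3          # m, focal length
        sun_half_angle = 4.65e-3   # rad, angular radius of the sun

        # Geometric image of the sun at the focal plane (ray-optics limit).
        geometric_spot = 2 * focal * math.tan(sun_half_angle)

        # Airy-disk diameter (first zero) for the same aperture.
        airy_spot = 2.44 * wavelength * focal / aperture

        print(f"geometric spot: {geometric_spot*1e6:.1f} um")
        print(f"diffraction spot: {airy_spot*1e6:.1f} um")
        # When the Airy spot is no longer small compared with the geometric spot,
        # pure ray tracing overestimates the achievable concentration.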

  12. On the quantitativeness of EDS STEM

    Energy Technology Data Exchange (ETDEWEB)

    Lugg, N.R. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan); Kothleitner, G. [Institute for Electron Microscopy and Nanoanalysis, Graz University of Technology, Steyrergasse 17, 8010 Graz (Austria); Centre for Electron Microscopy, Steyrergasse 17, 8010 Graz (Austria); Shibata, N.; Ikuhara, Y. [Institute of Engineering Innovation, The University of Tokyo, 2-11-16, Yayoi, Bunkyo-ku, Tokyo 113-8656 (Japan)

    2015-04-15

    Chemical mapping using energy dispersive X-ray spectroscopy (EDS) in scanning transmission electron microscopy (STEM) has recently been shown to be a powerful technique in analyzing the elemental identity and location of atomic columns in materials at atomic resolution. However, most applications of EDS STEM have been used only to qualitatively map whether elements are present at specific sites. Obtaining calibrated EDS STEM maps so that they are on an absolute scale is a difficult task and even if one achieves this, extracting quantitative information about the specimen – such as the number or density of atoms under the probe – adds yet another layer of complexity to the analysis due to the multiple elastic and inelastic scattering of the electron probe. Quantitative information may be obtained by comparing calibrated EDS STEM with theoretical simulations, but in this case a model of the structure must be assumed a priori. Here we first theoretically explore how exactly elastic and thermal scattering of the probe confounds the quantitative information one is able to extract about the specimen from an EDS STEM map. We then show using simulation how tilting the specimen (or incident probe) can reduce the effects of scattering and how it can provide quantitative information about the specimen. We then discuss drawbacks of this method – such as the loss of atomic resolution along the tilt direction – but follow this with a possible remedy: precession averaged EDS STEM mapping. - Highlights: • Signal obtained in EDS STEM maps (of STO) compared to non-channelling signal. • Deviation from non-channelling signal occurs in on-axis experiments. • Tilting specimen: signal close to non-channelling case but atomic resolution is lost. • Tilt-precession series: non-channelling signal and atomic-resolution features obtained. • Associated issues are discussed.

  13. Quantitative Accelerated Life Testing of MEMS Accelerometers.

    Science.gov (United States)

    Bâzu, Marius; Gălăţeanu, Lucian; Ilian, Virgil Emil; Loicq, Jerome; Habraken, Serge; Collette, Jean-Paul

    2007-11-20

    Quantitative Accelerated Life Testing (QALT) is a solution for assessing the reliability of Micro Electro Mechanical Systems (MEMS). A procedure for QALT is shown in this paper and an attempt to assess the reliability level for a batch of MEMS accelerometers is reported. The testing plan is application-driven and contains combined tests: thermal (high temperature) and mechanical stress. Two variants of mechanical stress are used: vibration (at a fixed frequency) and tilting. Original equipment for testing at tilting and high temperature is used. Tilting is appropriate as an application-driven stress, because the tilt movement is a natural environment for devices used in automotive and aerospace applications. Also, tilting is used by MEMS accelerometers in anti-theft systems. The test results demonstrated the excellent reliability of the studied devices, the failure rate in the "worst case" being smaller than 10⁻⁷ h⁻¹.
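
    Typical QALT arithmetic combines an acceleration factor with a confidence bound on the use-condition failure rate. The sketch below uses an Arrhenius thermal model; the activation energy, temperatures, sample size and test duration are assumptions, not values from the paper.

        # Sketch of a standard QALT calculation: an Arrhenius thermal acceleration
        # factor and a zero-failure upper bound on the use-condition failure rate.
        import math

        K_B = 8.617e-5                 # eV/K, Boltzmann constant

        def arrhenius_af(ea_ev, t_use_c, t_stress_c):
            t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
            return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

        af = arrhenius_af(ea_ev=0.7, t_use_c=40.0, t_stress_c=125.0)

        # Upper confidence bound for zero observed failures (exponential model):
        # lambda <= -ln(1 - CL) / (n_devices * test_hours * AF).
        n_devices, test_hours, cl = 30, 1000.0, 0.60
        lambda_upper = -math.log(1.0 - cl) / (n_devices * test_hours * af)

        print(f"acceleration factor: {af:.1f}")
        print(f"failure-rate upper bound: {lambda_upper:.2e} per hour")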

  14. Plant and animal communities along the Swedish Baltic Sea coast - the building of a database of quantitative data collected by SCUBA divers, its use and some GIS applications in the Graesoe area

    International Nuclear Information System (INIS)

    Sandman, Antonia; Kautsky, Hans

    2004-06-01

    The aim of the project was to compile a single database with quantitative data collected by SCUBA divers from the whole Swedish Baltic Sea coast. Data of plant and animal biomass, together with position, depth and type of substrate from 19 areas along the Swedish coast from the county of Blekinge to Kalix in the Bothnian Bay were compiled in a single database. In all, the database contains 2,170 records (samples) from 179 different stations where in total 161 plant and 145 animal species have been found. The data were then illustrated by the geographical distribution of plant and animal biomass and by constructing a model to estimate future changes of the plant and animal communities in the Graesoe area in the Aaland Sea applying GIS-techniques. To illustrate the opportunities of the database the change of the composition of benthic plant and animal biomass with salinity was calculated. The proportion of marine species increased with increasing salinity and the benthic biomass was at its highest in the southern Baltic proper. Quantitative data from Grepen and the Graesoe-Singoe area were used to calculate present biomass in the Graesoe area. A scenario of the change in biomass distribution and total biomass caused by shore displacement was created using data from Raaneaa and Kalix in the Bothnian Bay. To map the biomass distribution the material was divided into different depth intervals. The change of biomass with time was calculated as a function of salinity change and reduction of the available area, caused by shore displacement. The total biomass for all plants and animals in the investigated area was 50,500 tonnes at present. In 2,000 years the total biomass will be 25,000 tonnes and in 4,000 years 3,600 tonnes due to shore displacement causing a decrease in both salinity and available substrate.To make an estimate of the species distribution and a rough estimate of their biomass in an unknown geographic area, the type of substrate, the depth and the wave

  15. Qualitative and quantitative evaluation of rigid and deformable motion correction algorithms using dual-energy CT images in view of application to CT perfusion measurements in abdominal organs affected by breathing motion.

    Science.gov (United States)

    Skornitzke, S; Fritz, F; Klauss, M; Pahn, G; Hansen, J; Hirsch, J; Grenacher, L; Kauczor, H-U; Stiller, W

    2015-02-01

    To compare six different scenarios for correcting for breathing motion in abdominal dual-energy CT (DECT) perfusion measurements. Rigid [RRComm(80 kVp)] and non-rigid [NRComm(80 kVp)] registration of commercially available CT perfusion software, custom non-rigid registration [NRCustom(80 kVp], demons algorithm) and a control group [CG(80 kVp)] without motion correction were evaluated using 80 kVp images. Additionally, NRCustom was applied to dual-energy (DE)-blended [NRCustom(DE)] and virtual non-contrast [NRCustom(VNC)] images, yielding six evaluated scenarios. After motion correction, perfusion maps were calculated using a combined maximum slope/Patlak model. For qualitative evaluation, three blinded radiologists independently rated motion correction quality and resulting perfusion maps on a four-point scale (4 = best, 1 = worst). For quantitative evaluation, relative changes in metric values, R(2) and residuals of perfusion model fits were calculated. For motion-corrected images, mean ratings differed significantly [NRCustom(80 kVp) and NRCustom(DE), 3.3; NRComm(80 kVp), 3.1; NRCustom(VNC), 2.9; RRComm(80 kVp), 2.7; CG(80 kVp), 2.7; all p VNC), 22.8%; RRComm(80 kVp), 0.6%; CG(80 kVp), 0%]. Regarding perfusion maps, NRCustom(80 kVp) and NRCustom(DE) were rated highest [NRCustom(80 kVp), 3.1; NRCustom(DE), 3.0; NRComm(80 kVp), 2.8; NRCustom(VNC), 2.6; CG(80 kVp), 2.5; RRComm(80 kVp), 2.4] and had significantly higher R(2) and lower residuals. Correlation between qualitative and quantitative evaluation was low to moderate. Non-rigid motion correction improves spatial alignment of the target region and fit of CT perfusion models. Using DE-blended and DE-VNC images for deformable registration offers no significant improvement. Non-rigid algorithms improve the quality of abdominal CT perfusion measurements but do not benefit from DECT post processing.
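
    The perfusion maps above are computed with a combined maximum slope/Patlak model; the Patlak step can be sketched generically as a linear fit on the linearised plot. The arterial and tissue curves and parameter values below are synthetic stand-ins, not the study's data.

        # Minimal sketch of a Patlak fit as used in CT perfusion modelling:
        # C_t(t)/C_a(t) = K * integral(C_a)/C_a(t) + V0, fitted by linear regression.
        import numpy as np

        def patlak_fit(t, c_art, c_tis):
            """Return (K, V0) from the linearised Patlak plot."""
            cum_art = np.concatenate(([0.0], np.cumsum(0.5 * (c_art[1:] + c_art[:-1]) * np.diff(t))))
            x = cum_art / c_art                      # Patlak abscissa
            y = c_tis / c_art                        # Patlak ordinate
            K, V0 = np.polyfit(x, y, 1)              # slope K, intercept V0
            return K, V0

        t = np.linspace(1.0, 60.0, 60)               # s
        c_art = 100.0 * np.exp(-t / 40.0)            # hypothetical arterial input
        true_K, true_V0 = 0.02, 0.1
        cum = np.concatenate(([0.0], np.cumsum(0.5 * (c_art[1:] + c_art[:-1]) * np.diff(t))))
        c_tis = true_K * cum + true_V0 * c_art       # tissue curve obeying the model

        print(patlak_fit(t, c_art, c_tis))           # recovers approx (0.02, 0.1)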

  16. Plant and animal communities along the Swedish Baltic Sea coast - the building of a database of quantitative data collected by SCUBA divers, its use and some GIS applications in the Graesoe area

    Energy Technology Data Exchange (ETDEWEB)

    Sandman, Antonia; Kautsky, Hans [Stockholm Univ. (Sweden). Dept. of Systems Ecology

    2005-03-01

    The aim of the project was to compile a single database with quantitative data collected by SCUBA divers from the whole Swedish Baltic Sea coast. Data of plant and animal biomass, together with position, depth and type of substrate from 19 areas along the Swedish coast from the county of Blekinge to Kalix in the Bothnian Bay were compiled in a single database. In all, the database contains 2,170 records (samples) from 179 different stations where in total 161 plant and 145 animal species have been found. The data were then illustrated by the geographical distribution of plant and animal biomass and by constructing a model to estimate future changes of the plant and animal communities in the Graesoe area in the Aaland Sea applying GIS-techniques. To illustrate the opportunities of the database the change of the composition of benthic plant and animal biomass with salinity was calculated. The proportion of marine species increased with increasing salinity and the benthic biomass was at its highest in the southern Baltic proper. Quantitative data from Grepen and the Graesoe-Singoe area were used to calculate present biomass in the Graesoe area. A scenario of the change in biomass distribution and total biomass caused by shore displacement was created using data from Raaneaa and Kalix in the Bothnian Bay. To map the biomass distribution the material was divided into different depth intervals. The change of biomass with time was calculated as a function of salinity change and reduction of the available area, caused by shore displacement. The total biomass for all plants and animals in the investigated area was 50,500 tonnes at present. In 2,000 years the total biomass will be 25,000 tonnes and in 4,000 years 3,600 tonnes due to shore displacement causing a decrease in both salinity and available substrate.To make an estimate of the species distribution and a rough estimate of their biomass in an unknown geographic area, the type of substrate, the depth and the wave

  17. Quantitative reconstruction from a single diffraction-enhanced image

    International Nuclear Information System (INIS)

    Paganin, D.M.; Lewis, R.A.; Kitchen, M.

    2003-01-01

    Full text: We develop an algorithm for using a single diffraction-enhanced image (DEI) to obtain a quantitative reconstruction of the projected thickness of a single-material sample which is embedded within a substrate of approximately constant thickness. This algorithm is used to quantitatively map inclusions in a breast phantom, from a single synchrotron DEI image. In particular, the reconstructed images quantitatively represent the projected thickness in the bulk of the sample, in contrast to DEI images which greatly emphasise sharp edges (high spatial frequencies). In the context of an ultimate aim of improved methods for breast cancer detection, the reconstructions are potentially of greater diagnostic value compared to the DEI data. Lastly, we point out that the methods of analysis presented here are also applicable to the quantitative analysis of differential interference contrast (DIC) images

  18. Quantitative mass spectrometry: an overview

    Science.gov (United States)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  19. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  20. Quantitative PCR high-resolution melting (qPCR-HRM) curve analysis, a new approach to simultaneously screen point mutations and large rearrangements: application to MLH1 germline mutations in Lynch syndrome.

    Science.gov (United States)

    Rouleau, Etienne; Lefol, Cédrick; Bourdon, Violaine; Coulet, Florence; Noguchi, Tetsuro; Soubrier, Florent; Bièche, Ivan; Olschwang, Sylviane; Sobol, Hagay; Lidereau, Rosette

    2009-06-01

    Several techniques have been developed to screen mismatch repair (MMR) genes for deleterious mutations. Until now, two different techniques were required to screen for both point mutations and large rearrangements. For the first time, we propose a new approach, called "quantitative PCR (qPCR) high-resolution melting (HRM) curve analysis (qPCR-HRM)," which combines qPCR and HRM to obtain a rapid and cost-effective method suitable for testing a large series of samples. We designed PCR amplicons to scan the MLH1 gene using qPCR HRM. Seventy-six patients were fully scanned in replicate, including 14 wild-type patients and 62 patients with known mutations (57 point mutations and five rearrangements). To validate the detected mutations, we used sequencing and/or hybridization on a dedicated MLH1 array-comparative genomic hybridization (array-CGH). All point mutations and rearrangements detected by denaturing high-performance liquid chromatography (dHPLC)+multiplex ligation-dependent probe amplification (MLPA) were successfully detected by qPCR HRM. Three large rearrangements were characterized with the dedicated MLH1 array-CGH. One variant was detected with qPCR HRM in a wild-type patient and was located within the reverse primer. One variant was not detected with qPCR HRM or with dHPLC due to its proximity to a T-stretch. With qPCR HRM, prescreening for point mutations and large rearrangements are performed in one tube and in one step with a single machine, without the need for any automated sequencer in the prescreening process. In replicate, its reagent cost, sensitivity, and specificity are comparable to those of dHPLC+MLPA techniques. However, qPCR HRM outperformed the other techniques in terms of its rapidity and amount of data provided.
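
    The HRM step rests on locating the melt peak in -dF/dT of the fluorescence melt curve. A minimal generic sketch is shown below; the synthetic curve and melting temperature are illustrative, not MLH1 data.

        # Sketch of the basic HRM computation: the melting temperature is taken as
        # the peak of -dF/dT from the raw fluorescence melt curve.
        import numpy as np

        def melting_temperature(temps, fluorescence):
            """Return Tm as the temperature of the maximum of -dF/dT."""
            dfdt = np.gradient(fluorescence, temps)
            return temps[np.argmin(dfdt)]            # steepest fluorescence drop

        temps = np.linspace(70.0, 95.0, 251)
        tm_true = 83.5
        # Sigmoidal loss of fluorescence around the true Tm (hypothetical amplicon).
        fluor = 1.0 / (1.0 + np.exp((temps - tm_true) / 0.4))
        print(f"estimated Tm: {melting_temperature(temps, fluor):.2f} C")
        # A point mutation or rearrangement shifts or reshapes this melt peak.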

  1. Potential of capillary-column-switching liquid chromatography-tandem mass spectrometry for the quantitative trace analysis of small molecules. Application to the on-line screening of drugs in water.

    Science.gov (United States)

    Pitarch, Elena; Hernandez, Felix; ten Hove, Jan; Meiring, Hugo; Niesing, Willem; Dijkman, Ellen; Stolker, Linda; Hogendoorn, Elbert

    2004-03-26

    We have investigated the potential of capillary-column-switching liquid chromatography coupled to tandem mass spectrometry (cLC-MS-MS) for the quantitative on-line trace analysis of target compounds in aqueous solutions. The technical design of the nano-scale cLC system developed at our Institute for peptide and protein identification has been tested and evaluated for the direct trace analysis of drugs in water samples. Sulphametoxazole, bezafibrate, metoprolol, carbamazepine and bisoprolol occurring frequently in Dutch waters, were selected as test compounds. Adequate conditions for trapping, elution and MS-MS detection were investigated by employing laboratory made 200 microm i.d. capillary columns packed with 5 microm aqua C18 material. In the final cLC-MS-MS conditions, a 1 cm length trapping column and a 4 cm length analytical column were selected. Under these conditions, the target compounds could be directly determined in water down to a level of around 50 ng/l employing only 25 microl of water sample. Validation was done by recovery experiments in ground-, surface- and drinking-water matrices as well as by the analysis of water samples with incurred residues and previously analyzed with a conventional procedure involving off-line solid-phase extraction and narrow-bore LC with MS-MS detection. The new methodology provided recoveries (50-500 ng/l level) between 50 and 114% with RSDs (n = 3, each level) below 20% for most of the compounds. Despite the somewhat less analytical performance in comparison to the conventional procedure, the on-line approach of the new methodology is very suitable for screening of drugs in aqueous samples.

  2. Application of Ultra-High-Performance Liquid Chromatography Coupled with LTQ-Orbitrap Mass Spectrometry for the Qualitative and Quantitative Analysis of Polygonum multiflorum Thunb. and Its Processed Products

    Directory of Open Access Journals (Sweden)

    Teng-Hua Wang

    2015-12-01

    Full Text Available In order to quickly and simultaneously obtain the chemical profiles and control the quality of the root of Polygonum multiflorum Thumb. and its processed form, a rapid qualitative and quantitative method, using ultra-high-performance liquid chromatography coupled with electrospray ionization-linear ion trap-Orbitrap hybrid mass spectrometry (UHPLC-LTQ-Orbitrap MSn has been developed. The analysis was performed within 10 min on an AcQuity UPLC™ BEH C18 column with a gradient elution of 0.1% formic acid-acetonitrile at flow rate of 400 μL/min. According to the fragmentation mechanism and high resolution MSn data, a diagnostic ion searching strategy was used for rapid and tentative identification of main phenolic components and 23 compounds were simultaneously identified or tentatively characterized. The difference in chemical profiles between P. multiflorum and its processed preparation were observed by comparing the ions abundances of main constituents in the MS spectra and significant changes of eight metabolite biomarkers were detected in the P. multiflorum samples and their preparations. In addition, four of the representative phenols, namely gallic acid, trans-2,3,5,4′-tetra-hydroxystilbene-2-O-β-d-glucopyranoside, emodin and emodin-8-O-β-d-glucopyranoside were quantified by the validated UHPLC-MS/MS method. These phenols are considered to be major bioactive constituents in P. multiflorum, and are generally regarded as the index for quality assessment of this herb. The method was successfully used to quantify 10 batches of P. multiflorum and 10 batches of processed P. multiflorum. The results demonstrated that the method is simple, rapid, and suitable for the discrimination and quality control of this traditional Chinese herb.
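
    The "diagnostic ion searching strategy" mentioned above amounts to matching fragment m/z values against expected diagnostic ions within a tight mass tolerance. A minimal generic sketch follows; the ion masses and spectrum are invented for illustration and are not from the paper.

        # Flag spectra whose fragment m/z values match a diagnostic ion within ppm.
        def ppm_error(measured, theoretical):
            return (measured - theoretical) / theoretical * 1e6

        def find_diagnostic_ions(spectrum_mz, diagnostic_ions, tol_ppm=5.0):
            hits = {}
            for name, theo in diagnostic_ions.items():
                for mz in spectrum_mz:
                    if abs(ppm_error(mz, theo)) <= tol_ppm:
                        hits[name] = mz
                        break
            return hits

        diagnostic_ions = {                 # hypothetical phenolic fragment ions, [M-H]-
            "gallic acid fragment": 125.0244,
            "emodin fragment": 225.0557,
        }
        spectrum = [125.0246, 197.0812, 225.0555, 269.0450]
        print(find_diagnostic_ions(spectrum, diagnostic_ions))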

  3. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  4. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...
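
    As a toy illustration of the class of stochastic models the book treats, the sketch below simulates a two-opinion system whose switching probabilities depend on the current opinion shares (a mean-field, master-equation-style picture sampled by Monte Carlo). The rate constants are arbitrary and purely illustrative.

        # Simple Monte Carlo simulation of opinion switching with share-dependent rates.
        import random

        def simulate(n_agents=1000, steps=200, base_rate=0.05, coupling=2.0, seed=1):
            random.seed(seed)
            state = [0] * (n_agents // 2) + [1] * (n_agents - n_agents // 2)
            shares = []
            for _ in range(steps):
                frac1 = sum(state) / n_agents
                for i in range(n_agents):
                    # Switching probability grows with the share holding the other opinion.
                    target_share = frac1 if state[i] == 0 else 1.0 - frac1
                    p_switch = base_rate * (1.0 + coupling * target_share)
                    if random.random() < min(p_switch, 1.0):
                        state[i] = 1 - state[i]
                shares.append(frac1)
            return shares

        trajectory = simulate()
        print(f"final share of opinion 1: {trajectory[-1]:.2f}")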

  5. Quantitative densitometry of neurotransmitter receptors

    International Nuclear Information System (INIS)

    Rainbow, T.C.; Bleisch, W.V.; Biegon, A.; McEwen, B.S.

    1982-01-01

    An autoradiographic procedure is described that allows the quantitative measurement of neurotransmitter receptors by optical density readings. Frozen brain sections are labeled in vitro with [³H]ligands under conditions that maximize specific binding to neurotransmitter receptors. The labeled sections are then placed against the ³H-sensitive LKB Ultrofilm to produce the autoradiograms. These autoradiograms resemble those produced by [¹⁴C]deoxyglucose autoradiography and are suitable for quantitative analysis with a densitometer. Muscarinic cholinergic receptors in rat and zebra finch brain and 5-HT receptors in rat brain were visualized by this method. When the proper combination of ligand concentration and exposure time is used, the method provides quantitative information about the amount and affinity of neurotransmitter receptors in brain sections. This was established by comparisons of densitometric readings with parallel measurements made by scintillation counting of sections. (Auth.)

  6. Energy Education: The Quantitative Voice

    Science.gov (United States)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives, solar in particular, that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative, especially with nonscience students or the general public, is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.
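
    In the spirit of the back-of-the-envelope approach described above, the following sketch reproduces the factor-of-10,000 comparison between solar input and human energy use, using round figures commonly quoted for such estimates.

        # Compare solar power intercepted by Earth with humankind's primary energy use.
        import math

        solar_constant = 1361.0                  # W/m^2 at top of atmosphere
        earth_radius = 6.371e6                   # m
        intercepted = solar_constant * math.pi * earth_radius**2   # W on Earth's disc

        human_power = 18e12                      # W, roughly 18 TW global primary energy use

        print(f"solar input: {intercepted:.2e} W")
        print(f"ratio solar / human: {intercepted / human_power:,.0f}")
        # The ratio comes out near 10,000, the figure quoted in the abstract.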

  7. Quantitative fluorescence microscopy and image deconvolution.

    Science.gov (United States)

    Swedlow, Jason R

    2013-01-01

    Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. A very common image-processing algorithm, image deconvolution, is used
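
    A restoration-type deconvolution of the kind discussed above can be sketched with a plain Richardson-Lucy iteration. The point-spread function and test image below are synthetic; a real analysis would use the measured PSF of the instrument and validated acquisition settings.

        # Richardson-Lucy restoration implemented with FFT-based convolution.
        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(image, psf, n_iter=25):
            psf_mirror = psf[::-1, ::-1]
            estimate = np.full(image.shape, image.mean())
            for _ in range(n_iter):
                blurred = fftconvolve(estimate, psf, mode="same") + 1e-12
                ratio = image / blurred
                estimate *= fftconvolve(ratio, psf_mirror, mode="same")
            return estimate

        # Hypothetical object: two point sources; PSF: small Gaussian blur.
        obj = np.zeros((64, 64)); obj[30, 28] = 1.0; obj[30, 36] = 1.0
        y, x = np.mgrid[-7:8, -7:8]
        psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2)); psf /= psf.sum()
        blurred = fftconvolve(obj, psf, mode="same")
        restored = richardson_lucy(blurred, psf)
        print(restored.max(), blurred.max())     # restoration re-sharpens the peaks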

  8. Quantitative nature of overexpression experiments

    Science.gov (United States)

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  9. Factors Influencing Students' Perceptions of Their Quantitative Skills

    Science.gov (United States)

    Matthews, Kelly E.; Hodgson, Yvonne; Varsavsky, Cristina

    2013-01-01

    There is international agreement that quantitative skills (QS) are an essential graduate competence in science. QS refer to the application of mathematical and statistical thinking and reasoning in science. This study reports on the use of the Science Students Skills Inventory to capture final year science students' perceptions of their QS across…

  10. Quantitative Indicators for Behaviour Drift Detection from Home Automation Data.

    Science.gov (United States)

    Veronese, Fabio; Masciadri, Andrea; Comai, Sara; Matteucci, Matteo; Salice, Fabio

    2017-01-01

    Smart Homes diffusion provides an opportunity to implement elderly monitoring, extending seniors' independence and avoiding unnecessary assistance costs. Information concerning the inhabitant behaviour is contained in home automation data, and can be extracted by means of quantitative indicators. The application of such approach proves it can evidence behaviour changes.

  11. Quantitative determination of phases by X-ray diffraction

    International Nuclear Information System (INIS)

    Azevedo, A.L.T.

    1979-01-01

    The internal standard method for the quantitative determination of phases by X-ray diffraction is presented. The method is applicable to multi-phase materials which may be treated as powder. A discussion on sample preparation and some examples follow. (Author) [pt
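
    The internal standard method reduces to a linear calibration between the analyte/standard peak-intensity ratio and the analyte weight fraction, which is then inverted for unknowns. A minimal sketch with invented calibration data follows.

        # Internal standard quantitative phase analysis: calibration line and inversion.
        import numpy as np

        # Known analyte weight fractions in calibration mixtures (with a fixed
        # amount of internal standard added) and the measured intensity ratios.
        w_known = np.array([0.10, 0.20, 0.30, 0.40])
        ratio_known = np.array([0.21, 0.40, 0.61, 0.79])

        slope, intercept = np.polyfit(w_known, ratio_known, 1)

        def weight_fraction(ratio_measured):
            """Invert the calibration line for an unknown sample."""
            return (ratio_measured - intercept) / slope

        print(f"unknown sample: {weight_fraction(0.50) * 100:.1f} wt% analyte")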

  12. Integrated quantitative pharmacology for treatment optimization in oncology

    NARCIS (Netherlands)

    van Hasselt, J.G.C.

    2014-01-01

    This thesis describes the development and application of quantitative pharmacological models in oncology for treatment optimization and for the design and analysis of clinical trials with respect to pharmacokinetics, toxicity, efficacy and cost-effectiveness. A recurring theme throughout this thesis

  13. Quantitative evaluation of the enamel caries which were treated with ...

    African Journals Online (AJOL)

    Objectives: The aim of this in vivo study was to quantitatively evaluate the remineralization of the enamel caries on smooth and occlusal surfaces using DIAGNOdent, after daily application of casein phosphopeptide‑amorphous calcium fluoride phosphate (CPP‑ACFP). Materials and Methods: Thirty volunteers, aged 18–30 ...

  14. Method for quantitative assessment of nuclear safety computer codes

    International Nuclear Information System (INIS)

    Dearien, J.A.; Davis, C.B.; Matthews, L.J.

    1979-01-01

    A procedure has been developed for the quantitative assessment of nuclear safety computer codes and tested by comparison of RELAP4/MOD6 predictions with results from two Semiscale tests. This paper describes the developed procedure, the application of the procedure to the Semiscale tests, and the results obtained from the comparison

  15. A potential quantitative method for assessing individual tree performance

    Science.gov (United States)

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  16. Quantitative measurement of mixtures by terahertz time–domain ...

    Indian Academy of Sciences (India)

    Administrator

    earth and space science, quality control of food and agricultural products, and global environmental monitoring. In quantitative applications, terahertz technology has been widely used for studying different kinds of mixtures, such as amino acids [8], ternary chemical mixtures [9], pharmaceuticals [10], and racemic compounds [11].

  17. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…
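
    The footprint calculation at the heart of such an exercise is a sum of activity levels multiplied by emission factors. A minimal sketch with placeholder factors (not the values used in the course) is shown below.

        # Multiply annual activity levels by emission factors and sum.
        activities = {                      # annual amounts (assumed)
            "car_km": 12000,
            "electricity_kwh": 3500,
            "short_flights": 2,
        }
        emission_factors = {                # kg CO2e per unit (assumed values)
            "car_km": 0.20,
            "electricity_kwh": 0.45,
            "short_flights": 250.0,
        }

        footprint_kg = sum(amount * emission_factors[name] for name, amount in activities.items())
        print(f"annual footprint: {footprint_kg / 1000:.1f} t CO2e")

        # Evaluating a behaviour choice: halve car travel and recompute.
        activities["car_km"] //= 2
        reduced = sum(amount * emission_factors[name] for name, amount in activities.items())
        print(f"after halving car travel: {reduced / 1000:.1f} t CO2e")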

  18. Methodologies for quantitative systems pharmacology (QSP) models : Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, Hp; Agoram, B.; Davies, M.R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, Ph.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  19. Methodologies for Quantitative Systems Pharmacology (QSP) Models: Design and Estimation

    NARCIS (Netherlands)

    Ribba, B.; Grimm, H. P.; Agoram, B.; Davies, M. R.; Gadkar, K.; Niederer, S.; van Riel, N.; Timmis, J.; van der Graaf, P. H.

    2017-01-01

    With the increased interest in the application of quantitative systems pharmacology (QSP) models within medicine research and development, there is an increasing need to formalize model development and verification aspects. In February 2016, a workshop was held at Roche Pharma Research and Early

  20. The Spectra Count Label-free Quantitation in Cancer Proteomics

    OpenAIRE

    Zhou, Weidong; Liotta, Lance A.; Petricoin, Emanuel F.

    2012-01-01

    Mass spectrometry is used routinely for large-scale protein identification from complex biological mixtures. Recently, relative quantitation approach on the basis of spectra count has been applied in several cancer proteomic studies. In this review, we examine the mechanism of this technique and highlight several important parameters associated with its application.
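
    Spectral-count quantitation is often reported as the normalized spectral abundance factor (NSAF), which corrects counts for protein length before comparing conditions. A minimal sketch with invented counts and lengths follows.

        # NSAF: SAF = spectral counts / protein length; NSAF = SAF / sum(SAF).
        def nsaf(spectral_counts, protein_lengths):
            saf = {p: spectral_counts[p] / protein_lengths[p] for p in spectral_counts}
            total = sum(saf.values())
            return {p: v / total for p, v in saf.items()}

        counts_tumor  = {"PROT_A": 45, "PROT_B": 10, "PROT_C": 120}
        counts_normal = {"PROT_A": 15, "PROT_B": 12, "PROT_C": 118}
        lengths       = {"PROT_A": 350, "PROT_B": 500, "PROT_C": 1200}

        nsaf_t, nsaf_n = nsaf(counts_tumor, lengths), nsaf(counts_normal, lengths)
        for p in lengths:
            print(p, f"tumor/normal NSAF ratio = {nsaf_t[p] / nsaf_n[p]:.2f}")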

  1. Development of an UPLC-MS/MS method for simultaneous quantitation of 11 d-amino acids in different regions of rat brain: Application to a study on the associations of d-amino acid concentration changes and Alzheimer's disease.

    Science.gov (United States)

    Li, Zhe; Xing, Yuping; Guo, Xingjie; Cui, Yan

    2017-07-15

    There are significant differences in d-amino acid concentrations between healthy people and Alzheimer's disease patients. In order to investigate the potential correlation between d-amino acids and Alzheimer's disease, a simple and sensitive ultra high performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method has been developed. The method was applied to simultaneous determination of 11 d-amino acids in different regions of rat brain. Rat brain homogenates were firstly pretreated with protein precipitation procedure and then derivatized with (S)-N-(4-nitrophenoxycarbonyl) phenylalanine methoxyethyl ester [(S)-NIFE]. Baseline separation of the derivatives was achieved on an ACQUITY UPLC BEH C 18 column (2.1 mm×50mm, 1.7μm). The mobile phase consisted of acetonitrile and water (containing 8mM ammonium hydrogen carbonate) and the flow rate was 0.6mLmin -1 . The derived analytes were sensitively detected by multiple reaction monitoring in the positive ion mode. The lower limits of quantitation ranged from 0.06 to 10ngmL -1 with excellent linearity (r≥0.9909). The intra- and inter-day RSD were in the range of 3.6-12% and 5.7-12%, respectively. The recovery rate was 82.5%-95.3%. With this UPLC-MS/MS method, the 11 d-amino acids in hippocampus, cerebral cortex, olfactory bulb and cerebellum from Alzheimer's disease rats and age-matched controls could be simultaneously determined. Compared with the normal controls, the concentrations of d-serine, d-alanine, d-leucine, and d-proline in hippocampus and cerebral cortex of Alzheimer's disease rat brain were significantly decreased, while no differences in olfactory bulb and cerebellum of all the d-amino acids were observed. The different amounts and distribution of d-amino acids in brain between the two groups, which regulated by particular pathological changes of Alzheimer's disease, would give new insights into further study in neuropathogenesis and provide novel therapeutic targets of Alzheimer

  2. Application of a Receptor-Binding Capture Quantitative Reverse Transcription-PCR Assay To Concentrate Human Norovirus from Sewage and To Study the Distribution and Stability of the Virus

    Science.gov (United States)

    Yang, David; Pan, Liangwen; Mandrell, Robert

    2012-01-01

    Water is an important route for human norovirus (HuNoV) transmission. Using magnetic beads conjugated with blood group-like antigens (HuNoV receptors), we developed a simple and rapid receptor-binding capture and magnetic sequestration (RBCMS) method and compared it to the existing negatively charged membrane absorption/elution (NCMAE) method for concentrating HuNoV from sewage effluent. RBCMS required 6-fold-less sample volume than the NCMAE method and also resulted in a significantly higher yield of HuNoV. The NCMAE and RBCMS concentrations of genogroup I (GI) HuNoV measured by quantitative reverse transcription-PCR (qRT-PCR) resulted in average threshold cycle (CT) values of 34.68 (8.68 copies, 252-fold concentration) versus 34.07 (13.05 copies, 477-fold concentration), respectively; the NCMAE and RBCMS concentrations of genogroup II (GII) HuNoV were measured as average CT values of 33.32 (24.7 copies, 239-fold concentration) versus 32.38 (46.9 copies, 333-fold concentration), respectively. The specificity of qRT-PCR was confirmed by traditional RT-PCR and an RNase I protection assay. The qRT-PCR signal from RBCMS-concentrated HuNoV treated with RNase I indicated that it was from encapsidated RNA and, probably, viable virus. In contrast, the qRT-PCR signal from NCMAE-concentrated HuNoV was not protected from RNase I and, likely, degradation. Both GI and GII HuNoV were detected from sewage effluent samples collected between April and July with average concentrations of 7.8 × 103 genomic copies per liter (gc/liter) and 4.3 × 104 gc/liter, respectively. No GI and sewage samples stored at room temperature for 4 weeks. We conclude that RBCMS requires less sample volume, has better recovery and sensitivity, and is faster than NCMAE for detection of HuNoV in sewage. PMID:22101044
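
    Copy numbers such as those reported above are obtained by converting qRT-PCR threshold cycles (CT) to genomic copies through a standard curve. A generic sketch with a hypothetical dilution series (not the study's calibration data) is shown below.

        # Convert CT values to copies per reaction via a log-linear standard curve.
        import numpy as np

        # Hypothetical standard curve: serial dilutions of known copy number vs CT.
        copies_std = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
        ct_std = np.array([37.1, 33.8, 30.4, 27.0, 23.6])

        slope, intercept = np.polyfit(np.log10(copies_std), ct_std, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency

        def copies_from_ct(ct):
            return 10 ** ((ct - intercept) / slope)

        print(f"PCR efficiency: {efficiency * 100:.0f}%")
        print(f"CT 32.4 -> {copies_from_ct(32.4):.0f} copies per reaction")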

  3. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    proteins involved in the Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide...

  4. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...
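
    Conventional 2D roughness parameters such as Ra and Rq are simple statistics of the measured profile about its mean line. A minimal sketch with a synthetic profile (arbitrary amplitudes, not from the paper) follows.

        # Ra (arithmetic mean deviation) and Rq (RMS deviation) of a profile.
        import numpy as np

        def roughness_parameters(z):
            """Ra and Rq of a profile after removing its mean line."""
            dev = z - z.mean()                 # deviations from the mean line
            ra = np.abs(dev).mean()
            rq = np.sqrt((dev ** 2).mean())
            return ra, rq

        x = np.linspace(0.0, 4.0, 2000)        # mm along the surface
        profile_um = 0.8 * np.sin(2 * np.pi * x / 0.25) + 0.1 * np.random.default_rng(0).normal(size=x.size)
        ra, rq = roughness_parameters(profile_um)
        print(f"Ra = {ra:.2f} um, Rq = {rq:.2f} um")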

  5. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a composition model checking methodology and its succesfull instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect; to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  6. La quantité en islandais moderne (Quantity in Modern Icelandic)

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    Full Text Available The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of modern Icelandic. From a phonological point of view, it seems that nothing new can be expected, since the theoretical possibilities have been practically exhausted, as we recalled in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is undoubtedly the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is nevertheless still premature to speak of true quantitative zones, since neither their limits nor their geographical extent are known.

  7. Quantitative Reasoning in Problem Solving

    Science.gov (United States)

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  8. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness , which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  9. The effect of plant growth promoting rhizobacteria (PGPR) on quantitative and qualitative characteristics of Sesamum indicum L. with application of cover crops of Lathyrus sp. and Persian clover (Trifolium resopinatum L.)

    Directory of Open Access Journals (Sweden)

    M. Jahan

    2016-05-01

    Full Text Available Cover crop cultivation and the application of plant growth promoting rhizobacteria are key factors in enhancing agroecosystem health. A field experiment was conducted at the Research Farm of the Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, during the 2009-2010 growing season. A split-plot arrangement based on a randomized complete block design with three replications was used. Cultivation versus no cultivation of Lathyrus sp. and Persian clover (Trifolium resopinatum) in autumn was assigned to the main plots. The sub-plot factor consisted of three types of biofertilizer plus a control: (1) nitroxin (containing Azotobacter sp. and Azospirillum sp.), (2) phosphate solubilizing bacteria (PSB) (containing Bacillus sp. and Pseudomonas sp.), (3) biosulfur (containing Thiobacillus spp.), and (4) control (no fertilizer). The results showed that the effect of cover crops on seed number and seed weight per plant, biological yield and seed yield was significant, with seed yield increasing by 9%. In general, the biofertilizers were superior to the control for most of the studied traits. Nitroxin, PSB and biosulfur increased biological yield by 44, 28 and 26% compared to the control, respectively. The cover crop by biofertilizer interaction had a significant effect on all studied traits; the highest and lowest harvest indices resulted from cover crops combined with biofertilizers (22.1%) and from cultivation or no cultivation of cover crops combined with the control (15.3%), respectively. The highest seed oil and protein contents resulted from cover crops plus biofertilizers (42.4%) and cover crops plus PSB (22.5%), respectively. Overall, the results showed that cover crop cultivation combined with biofertilizer application could be an ecological alternative to chemical fertilizers, in addition to providing the advantages of cover crops. According to the results, it should be possible to design an ecological cropping system and produce appropriate and healthy

  10. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  11. EFEITOS QUALITATIVO E QUANTITATIVO DE APLICAÇÃO DO ZINCO NO CAPIM TANZÂNIA-1 QUALITATIVE AND QUANTITATIVE EFFECTS OF THE ZINC SULPHATE APPLICATION ON TANZÂNIA-1 GRASS

    Directory of Open Access Journals (Sweden)

    Cideon Donizete de Faria

    2007-09-01

    latosol, with the objective of evaluating the effect of doses of 0, 10, 20, 40 and 80 kg ha-1 of zinc sulphate on the productivity, quality and leaf chemical composition of Tanzânia-1 grass. The soil was prepared with a heavy harrow at the beginning of the rainy season. As basic fertilization, 20 kg of N, 50 kg of P2O5 and 30 kg of K2O ha-1 were applied as ammonium sulphate, commercial Yoorin and potassium chloride, respectively. Plant height, number of tillers, green mass, dry matter, gross protein and fiber in neutral detergent were evaluated at 60 days after germination, and leaf mineral nutrients were determined just after the crop. Although not significant, the dose of 20 kg ha-1 of zinc sulphate influenced the forage produced both qualitatively and quantitatively.

    KEY-WORDS: Pasture; soil; fertility; tillering; dry matter.

  12. EFEITOS QUALITATIVO E QUANTITATIVO DA APLICAÇÃO DE FÓSFORO NO CAPIM TANZÂNIA-1 QUALITATIVE AND QUANTITATIVE EFFECTS OF PHOSPHORUS APPLICATION ON TANZÂNIA-1 GRASS

    Directory of Open Access Journals (Sweden)

    Renato Sérgio Mota dos Santos

    2007-09-01

    Full Text Available

    Correcting the soil's phosphorus deficiency is considered indispensable for raising the animal carrying capacity of a pasture. To determine the qualitative and quantitative effect of phosphorus on the forage Tanzânia-1, an experiment was carried out on a dark-red latosol in Santo Antônio de Goiás, in the State of Goiás. For this, the soil was prepared with one harrowing and one ploughing and, after one week, sowing was carried out. The treatments comprised 0, 50 and 100 kg of P ha-1 as commercial thermophosphate, applied as top dressing on land previously amended with 3 t ha-1 of dolomitic limestone. With increasing phosphorus, the mean values of plant height, number of tillers, green mass and dry matter increased, while crude protein content decreased and fiber concentrations increased. Phosphorus did not affect the potassium, calcium, zinc and manganese contents in the leaf tissue, but it reduced the phosphorus, copper and iron contents and increased the magnesium content.

    KEY-WORDS: Cerrado; animal nutrition; Panicum maximum; pasture.

    The amendment of phosphorus shortage is indispensable to elevate the animal support capacity of a pasture. To improve the P status of these soils and to know the qualitative and quantitative effect of phosphorus fertilizer on the forage Tanzânia-1, an experiment was conducted in a dark red latosol, at

  13. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to the actually observed sequences.
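
    As a hedged illustration of the kind of alignment-free, quantitative characterization described above (this is not the Quantiprot API; the function names and the example sequence below are assumptions for illustration only), the following Python sketch computes an overlapping n-gram distribution and estimates a Zipf's-law exponent for a protein sequence.

```python
# Illustrative sketch of alignment-free quantitative characterization of a protein
# sequence: overlapping n-gram frequencies and a Zipf's-law exponent.
# NOTE: this is not the Quantiprot API; names here are hypothetical.
from collections import Counter
import math

def ngram_distribution(sequence: str, n: int = 2) -> Counter:
    """Count overlapping n-grams (e.g. residue pairs) in an amino-acid sequence."""
    return Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))

def zipf_exponent(counts: Counter) -> float:
    """Least-squares slope of log(frequency) vs log(rank); the Zipf exponent is its negative."""
    freqs = sorted(counts.values(), reverse=True)
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mean_x, mean_y = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
            sum((x - mean_x) ** 2 for x in xs)
    return -slope

# Hypothetical example sequence (single-letter amino-acid codes)
seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQ"
dist = ngram_distribution(seq, n=2)
print(dist.most_common(5))
print(round(zipf_exponent(dist), 3))
```

    Feature vectors built from such characteristics can then be compared directly, for example by a simple distance measure, which is the general idea behind alignment-free similarity searches and clustering mentioned in the record.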

  14. Immune chromatography: a quantitative radioimmunological assay

    International Nuclear Information System (INIS)

    Davis, J.W.; Demetriades, M.; Bowen, J.M.

    1984-01-01

    Immune chromatography, a radioimmunological binding assay, employs paper chromatography to separate immune complexes from free antigen and antibodies. During chromatography, free antigen and antibodies become distributed throughout the paper, while immune complexes remain near the bottoms of the strips. The chromatographic differences can be made quantitative by using either iodinated antigens or antibodies. Under these conditions, nanogram quantities of antigen can be detected, or antibodies in sera diluted several thousand-fold. The immune chromatography assay can also be performed as an indirect assay, since the paper strips are cut from nitrocellulose paper. In this case the immune components are absorbed by the paper during chromatography. Antigen is then detected with an iodinated second antibody. The indirect immune chromatography assay is particularly useful for identifying different sera that react with the same antigen. Reaction with the first serum before chromatography reduces the amount of antigen available to the second serum following chromatography. In addition to characterizing the immune chromatography procedure, we discuss the possible applications of chromatography assays for the quantitation of other types of molecular binding interactions. (Auth.)

  15. Quantitative fluorescence nanoscopy for cancer biomedicine

    Science.gov (United States)

    Huang, Tao; Nickerson, Andrew; Peters, Alec; Nan, Xiaolin

    2015-08-01

    Cancer is a major health threat worldwide. Options for targeted cancer therapy, however, are often limited, in large part due to our incomplete understanding of how key processes, including oncogenesis and drug response, are mediated at the molecular level. New imaging techniques for visualizing biomolecules and their interactions at the nanometer and single molecule scales, collectively named fluorescence nanoscopy, hold the promise to transform biomedical research by providing direct mechanistic insight into cellular processes. We discuss the principles of quantitative single-molecule localization microscopy (SMLM), a subset of fluorescence nanoscopy, and their applications to cancer biomedicine. In particular, we will examine oncogenesis and drug resistance mediated by mutant Ras, which is associated with ~1/3 of all human cancers but has remained an intractable drug target. At ~20 nm spatial and single-molecule stoichiometric resolutions, SMLM clearly showed that mutant Ras must form dimers to activate its effector pathways and drive oncogenesis. SMLM further showed that the Raf kinase, one of the most important effectors of Ras, also forms dimers upon activation by Ras. Moreover, treatment of cells expressing wild type Raf with Raf inhibitors induces Raf dimer formation in a manner dependent on Ras dimerization. Together, these data suggest that Ras dimers mediate oncogenesis and drug resistance in tumors with hyperactive Ras and can potentially be targeted for cancer therapy. We also discuss recent advances in SMLM that enable simultaneous imaging of multiple biomolecules and their interactions at the nanoscale. Our work demonstrates the power of quantitative SMLM in cancer biomedicine.

  16. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a reliability quantitative analysis flow path for the ship construction process and fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
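
    The record does not reproduce the paper's FOQC formulations, so the sketch below is only a generic, assumed illustration of how expert-graded step reliabilities, represented as triangular fuzzy numbers (low, mode, high), might be combined for series and mixed connections of process steps; all names and numbers are hypothetical.

```python
# Generic sketch (not the paper's formulas): combining expert-graded process-step
# reliabilities given as triangular fuzzy numbers (low, mode, high).

def series(*steps):
    """Series connection: the process succeeds only if every step succeeds.
    Component-wise multiplication is the usual triangular approximation."""
    low = mode = high = 1.0
    for l, m, h in steps:
        low, mode, high = low * l, mode * m, high * h
    return (low, mode, high)

def parallel(*steps):
    """Parallel (redundant) connection: the process fails only if all branches fail."""
    fail_low = fail_mode = fail_high = 1.0
    for l, m, h in steps:
        fail_low *= (1.0 - h)    # most optimistic failure probability
        fail_mode *= (1.0 - m)
        fail_high *= (1.0 - l)   # most pessimistic failure probability
    return (1.0 - fail_high, 1.0 - fail_mode, 1.0 - fail_low)

# Hypothetical mixed connection: one alignment step in series with two redundant checks
alignment = (0.90, 0.95, 0.98)
check = (0.80, 0.85, 0.90)
print(series(alignment, parallel(check, check)))  # fuzzy reliability of the whole process
```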

  17. Quantitative diagnosis of skeletons with demineralizing osteopathy

    International Nuclear Information System (INIS)

    Banzer, D.

    1979-01-01

    The quantitative diagnosis of bone diseases must be assessed according to the accuracy of the applied method, the expense in apparatus, personnel and financial resources, and the comparability of results. Nuclide absorptiometry and, in the future, perhaps computed tomography represent the most accurate methods for determining the mineral content of bones. Their application is the clinics' prerogative because of the costs. Morphometry provides quantitative information, in particular in course control, and enables an objective judgement of visual pictures. It requires little expenditure and should be combined with microradioscopy. Direct comparability of the findings of different working groups is easiest in morphometry; it depends on the equipment in computed tomography and is still hardly possible in nuclide absorptiometry. For fundamental physical reasons, it will hardly be possible to produce a low-cost, fast and easy-to-handle instrument for the determination of the mineral salt concentration in bones. Instead, there is rather a trend towards more expensive equipment, e.g. CT instruments; the universal use of these instruments, however, will help to promote quantitative diagnoses. (orig.) [de

  18. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan
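
    A minimal sketch of the toxic-equivalence idea mentioned above (the dose-response function, the relative potency factor and the exposure value are illustrative assumptions, not values from the qAOP): exposure to an untested aromatase inhibitor is converted into fadrozole-equivalents so that the same model chain can be reused.

```python
# Hedged illustration of a toxic-equivalence (TEQ) calculation; all numbers are hypothetical.

def aromatase_inhibition(conc_ug_per_l: float, ic50: float = 10.0, hill: float = 1.5) -> float:
    """Hypothetical Hill-type dose-response for fractional inhibition of aromatase (the MIE)."""
    return conc_ug_per_l ** hill / (ic50 ** hill + conc_ug_per_l ** hill)

REL_POTENCY = 0.02          # assumed potency of the untested inhibitor relative to fadrozole
exposure_ug_per_l = 250.0   # assumed exposure concentration of the untested inhibitor

fadrozole_equivalents = exposure_ug_per_l * REL_POTENCY
print(f"TEQ: {fadrozole_equivalents:.1f} ug/L fadrozole-equivalents")
print(f"Predicted aromatase inhibition: {aromatase_inhibition(fadrozole_equivalents):.2f}")
```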

  19. Developments in quantitative electron probe microanalysis

    International Nuclear Information System (INIS)

    Tixier, R.

    1977-01-01

    A study of the range of validity of the formulae for corrections used with massive specimen analysis is made. The method used is original; we have shown that it was possible to use a property of invariability of corrected intensity ratios for standards. This invariance property provides a test for the self-consistency of the theory. The theoretical and experimental conditions required for quantitative electron probe microanalysis of thin transmission electron microscope specimens are examined. The correction formulae for atomic number, absorption and fluorescence effects are calculated. Several examples of experimental results are given, relative to the quantitative analysis of intermetallic precipitates and carbides in steels. Advances in applications of electron probe instruments related to the use of computers and the present development of fully automated instruments are reviewed. The necessary statistics for measurements of X-ray count data are studied. Estimation procedures and tests are developed. These methods are used to perform a statistical check of electron probe microanalysis measurements and to reject rogue values. An estimator of the confidence interval of the apparent concentration is derived. Formulae were also obtained to optimize the counting time in order to obtain the best precision in a minimum amount of time [fr

  20. Qualitative and quantitative descriptions of glenohumeral motion.

    Science.gov (United States)

    Hill, A M; Bull, A M J; Wallace, A L; Johnson, G R

    2008-02-01

    Joint modelling plays an important role in qualitative and quantitative descriptions of both normal and abnormal joints, as well as in predicting outcomes of alterations to joints in orthopaedic practice and research. Contemporary efforts in modelling have focussed upon the major articulations of the lower limb. Well-constrained arthrokinematics can form the basis of manageable kinetic and dynamic mathematical predictions. In order to contain computation of shoulder complex modelling, glenohumeral joint representations in both limited and complete shoulder girdle models have undergone a generic simplification. As such, glenohumeral joint models are often based upon kinematic descriptions of inadequate degrees of freedom (DOF) for clinical purposes and applications. Qualitative descriptions of glenohumeral motion range from the parody of a hinge joint to the complex realism of a spatial joint. In developing a model, a clear idea of intention is required in order to achieve the required application. Clinical applicability of a model requires both descriptive and predictive output potentials, and as such, a high level of validation is required. Without sufficient appreciation of the clinical intention of the arthrokinematic foundation to a model, error is all too easily introduced. Mathematical description of joint motion serves to quantify all relevant clinical parameters. Commonly, both the Euler angle and helical (screw) axis methods have been applied to the glenohumeral joint, although concordance between these methods and classical anatomical appreciation of joint motion is limited, resulting in miscommunication between clinician and engineer. Compounding these inconsistencies in motion quantification are gimbal lock and sequence dependency.

  1. A strategy for extending the applicability of a validated plasma calibration curve to quantitative measurements in multiple tissue homogenate samples: a case study from a rat tissue distribution study of JI-101, a triple kinase inhibitor.

    Science.gov (United States)

    Gurav, Sandip Dhondiram; Jeniffer, Sherine; Punde, Ravindra; Gilibili, Ravindranath Reddy; Giri, Sanjeev; Srinivas, Nuggehally R; Mullangi, Ramesh

    2012-04-01

    A general practice in bioanalysis is that, whatever the biological matrix the analyte is being quantified in, the validation is performed in the same matrix as per regulatory guidelines. In this paper, we present the applicability of a validated LC-MS/MS method in rat plasma for JI-101 to estimate the concentrations of JI-101 in various tissues that were harvested in a rat tissue distribution study. A simple protein precipitation technique was used to extract JI-101 and the internal standard from the tissue homogenates. The recovery of JI-101 in all the matrices was found to be >70%. Chromatographic separation was achieved using a binary gradient of mobile phase A (acetonitrile) and mobile phase B (0.2% formic acid in water) at a flow rate of 0.30 mL/min on a Prodigy ODS column with a total run time of 4.0 min. The MS/MS ion transitions monitored were 466.1 → 265 for JI-101 and 180.1 → 110.1 for the internal standard. The linearity range was 5.02-4017 ng/mL. The JI-101 levels were quantifiable in the various tissue samples harvested in this study. Therefore, the use of a previously validated JI-101 assay in plasma circumvented the tedious process of method development/validation in various tissue matrices. Copyright © 2011 John Wiley & Sons, Ltd.
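
    To illustrate the general workflow implied above (the calibration points and tissue responses below are invented for illustration and are not the study's data), a plasma calibration curve is fitted once and then used to back-calculate concentrations from analyte/internal-standard peak-area ratios measured in tissue homogenate extracts.

```python
# Generic sketch of back-calculating concentrations from a linear calibration curve;
# all numbers are hypothetical, not taken from the JI-101 study.
import numpy as np

nominal_ng_ml = np.array([5.02, 15.0, 50.0, 150.0, 500.0, 1500.0, 4017.0])  # plasma standards
area_ratio    = np.array([0.011, 0.032, 0.105, 0.310, 1.04, 3.10, 8.30])    # analyte/IS response

slope, intercept = np.polyfit(nominal_ng_ml, area_ratio, 1)  # simple unweighted linear fit

def back_calculate(ratio: float) -> float:
    """Concentration (ng/mL) corresponding to an observed analyte/IS peak-area ratio."""
    return (ratio - intercept) / slope

tissue_ratios = {"liver": 2.45, "kidney": 0.88, "brain": 0.019}  # hypothetical homogenate responses
for tissue, ratio in tissue_ratios.items():
    print(f"{tissue}: {back_calculate(ratio):.1f} ng/mL")
```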

  2. Quantitative models for sustainable supply chain management

    DEFF Research Database (Denmark)

    Brandenburg, M.; Govindan, Kannan; Sarkis, J.

    2014-01-01

    Sustainability, the consideration of environmental factors and social aspects, in supply chain management (SCM) has become a highly relevant topic for researchers and practitioners. The application of operations research methods and related models, i.e. formal modeling, for closed-loop SCM and reverse logistics has been effectively reviewed in previously published research. This situation is in contrast to the understanding and review of mathematical models that focus on environmental or social factors in forward supply chains (SC), which has seen less investigation. To evaluate developments and directions of this research area, this paper provides a content analysis of 134 carefully identified papers on quantitative, formal models that address sustainability aspects in the forward SC. It was found that a preponderance of the publications and models appeared in a limited set of six journals...

  3. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  4. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  5. Quantitative aspects of myocardial perfusion imaging

    International Nuclear Information System (INIS)

    Vogel, R.A.

    1980-01-01

    Myocardial perfusion measurements have traditionally been performed in a quantitative fashion using application of the Sapirstein, Fick, Kety-Schmidt, or compartmental analysis principles. Although global myocardial blood flow measurements have not proven clinically useful, regional determinations have substantially advanced our understanding of and ability to detect myocardial ischemia. With the introduction of thallium-201, such studies have become widely available, although these have generally undergone qualitative evaluation. Using computer-digitized data, several methods for the quantification of myocardial perfusion images have been introduced. These include orthogonal and polar coordinate systems and anatomically oriented region of interest segmentation. Statistical ranges of normal and time-activity analyses have been applied to these data, resulting in objective and reproducible means of data evaluation

  6. Quantitative Estimation for the Effectiveness of Automation

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun

    2012-01-01

    In advanced MCRs, various automation systems are applied to enhance human performance and reduce human errors in industrial fields. It is expected that automation provides greater efficiency, lower workload, and fewer human errors. However, these promises are not always fulfilled. As new types of events related to the application of imperfect and complex automation have occurred, it is necessary to analyze the effects of automation systems on the performance of human operators. Therefore, we suggest a quantitative estimation method to analyze the effectiveness of automation systems according to the Level of Automation (LOA) classification, which has been developed over 30 years. The estimation of the effectiveness of automation will be achieved by calculating the failure probability of human performance related to the cognitive activities

  7. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges, largely, on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework--applicable to any infant gaze experiment--and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  8. Safety culture management and quantitative indicator evaluation

    International Nuclear Information System (INIS)

    Mandula, J.

    2002-01-01

    This report discusses the relationship between safety culture and the evaluation of quantitative indicators. It shows how the systematic use of generally shared operational safety indicators may contribute to the formation and reinforcement of safety culture characteristics in routine plant operation. The report also briefly describes the system of operational safety indicators used at the Dukovany plant. It is a PC database application enabling effective work with the indicators and providing all users with an efficient tool for making synoptic overviews of indicator values in their links and hierarchical structure. Using color coding, the system allows quick indicator evaluation against predefined limits, considering indicator value trends. The system, which has resulted from several years of development, was completely established at the plant during the years 2001 and 2002. (author)

  9. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis fall in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  10. Quantitative (real-time) PCR

    International Nuclear Information System (INIS)

    Denman, S.E.; McSweeney, C.S.

    2005-01-01

    Many nucleic acid-based probe and PCR assays have been developed for the detection and tracking of specific microbes within the rumen ecosystem. Conventional PCR assays detect PCR products at the end stage of each PCR reaction, where exponential amplification is no longer being achieved. This approach can result in different end product (amplicon) quantities being generated. In contrast, using quantitative, or real-time PCR, quantification of the amplicon is performed not at the end of the reaction, but rather during exponential amplification, where theoretically each cycle will result in a doubling of the product being created. For real-time PCR, the cycle at which fluorescence is deemed to be detectable above the background during the exponential phase is termed the cycle threshold (Ct). The Ct values obtained are then used for quantitation, which will be discussed later
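
    As a brief worked example of how Ct values translate into quantities (the sample names and Ct values below are hypothetical; the record itself gives no numbers), the widely used comparative 2^-ΔΔCt approach assumes roughly one doubling of product per cycle during the exponential phase.

```python
# Relative quantification from real-time PCR cycle-threshold (Ct) values using the
# standard comparative 2^-ddCt method; assumes ~100% amplification efficiency.

def fold_change(ct_target_sample: float, ct_ref_sample: float,
                ct_target_control: float, ct_ref_control: float) -> float:
    """Fold change of the target relative to a reference gene and a control sample."""
    d_ct_sample = ct_target_sample - ct_ref_sample    # normalise to the reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control                # compare sample to control
    return 2.0 ** (-dd_ct)

# Hypothetical example: the target amplifies 3 cycles earlier in the treated sample
print(fold_change(ct_target_sample=24.0, ct_ref_sample=18.0,
                  ct_target_control=27.0, ct_ref_control=18.0))  # ~8-fold increase
```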

  11. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole springtail organism, Ceratophysella denticulata, mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  12. Qualitative discussion of quantitative radiography

    International Nuclear Information System (INIS)

    Berger, H.; Motz, J.W.

    1975-01-01

    Since radiography yields an image that can be easily related to the tested object, it is superior to many nondestructive testing techniques in revealing the size, shape, and location of certain types of discontinuities. The discussion is limited to a description of the radiographic process, examination of some of the quantitative aspects of radiography, and an outline of some of the new ideas emerging in radiography. The advantages of monoenergetic x-ray radiography and neutron radiography are noted

  13. Quantitative analysis of coupler tuning

    International Nuclear Information System (INIS)

    Zheng Shuxin; Cui Yupeng; Chen Huaibi; Xiao Liling

    2001-01-01

    On the basis of a coupling-cavity chain equivalent-circuit model, the author deduces equations for the coupler frequency deviation Δf and the coupling coefficient β, instead of only giving the adjusting direction in the process of matching the coupler. According to these equations, automatic measurement and quantitative display are realized on a measuring system. This contributes to the industrialization of traveling-wave accelerators for large container inspection systems

  14. Quantitative Methods for Teaching Review

    OpenAIRE

    Irina Milnikova; Tamara Shioshvili

    2011-01-01

    A new method for the quantitative evaluation of teaching processes is elaborated. On the basis of score data, the method makes it possible to evaluate the efficiency of teaching within one group of students and the comparative teaching efficiency of two or more groups. Heterogeneity, stability and total variability indices, both for a single group and for comparing different groups, are used as the basic characteristics of teaching efficiency. The method is easy to use and permits ranking the results of teaching review which...

  15. Computational complexity a quantitative perspective

    CERN Document Server

    Zimand, Marius

    2004-01-01

    There has been a common perception that computational complexity is a theory of "bad news" because its most typical results assert that various real-world and innocent-looking tasks are infeasible. In fact, "bad news" is a relative term, and, indeed, in some situations (e.g., in cryptography), we want an adversary to not be able to perform a certain task. However, a "bad news" result does not automatically become useful in such a scenario. For this to happen, its hardness features have to be quantitatively evaluated and shown to manifest extensively. The book undertakes a quantitative analysis of some of the major results in complexity that regard either classes of problems or individual concrete problems. The sizes of some important classes are studied using resource-bounded topological and measure-theoretical tools. In the case of individual problems, the book studies relevant quantitative attributes such as approximation properties or the number of hard inputs at each length. One chapter is dedicated to abs...

  16. In-vivo quantitative measurement

    International Nuclear Information System (INIS)

    Ito, Takashi

    1992-01-01

    So far, quantitative analyses of oxygen consumption rate, blood flow distribution, glucose metabolic rate and so on have been carried out by positron CT. The greatest merit of positron CT is that observation and verification in humans have become easy. Recently, accompanying the rapid development of mapping tracers for central nervous receptors, the observation of many central nervous receptors by positron CT has become feasible, and much expectation has been placed on the elucidation of brain functions. The conditions required for in vitro processes cannot be realized in a strict sense in vivo. Quantitative measurement with the in vivo tracer method is carried out by measuring the accumulation and movement of a tracer after its administration. The movement model of the mapping tracer for central nervous receptors is discussed. Quantitative analysis using a steady movement model, the measurement of dopamine receptors by the reference method, the measurement of D2 receptors using 11C-raclopride by the direct method, and the possibility of measuring dynamic bio-reactions are reported. (K.I.)