WorldWideScience

Sample records for elastique quantitative applications

  1. Quantitative elastic migration. Applications to 3D borehole seismic surveys; Migration elastique quantitative. Applications a la sismique de puits 3D

    Energy Technology Data Exchange (ETDEWEB)

    Clochard, V.

    1998-12-02

    3D VSP imaging is nowadays a strategic requirement for petroleum companies. It is used to delineate in detail the geology close to the well. Because of the lack of redundancy and the limited coverage of the data, this kind of technology is more restrictive than surface seismic, which allows investigation at a larger scale. Our contribution was to develop an elastic quantitative imaging method (GRT migration) which can be applied to 3-component borehole datasets. The method is similar to Kirchhoff migration but uses sophisticated weighting of the seismic amplitudes. In practice, GRT migration uses pre-calculated Green functions (travel time, amplitude, polarization). The maps are obtained by 3D ray tracing (wavefront construction) in the velocity model. The migration algorithm works with elementary and independent tasks, which is useful for processing different kinds of datasets (fixed or moving geophone antenna). The study was followed by validations against asymptotic analytical solutions. The ability to reconstruct a 3D borehole survey was tested on the Overthrust synthetic model. The application to a real circular 3D VSP raises various problems, such as velocity model building, the anisotropy factor and the preprocessing (deconvolution, wave mode separation), which can destroy seismic amplitudes. An isotropic 3-component preprocessing of the whole dataset allows a better lateral reconstruction. The choice of a large migration aperture can help the reconstruction of strong geological dips in spite of migration smiles. Finally, the methodology can be applied to PS converted waves. (author)
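
    For readers unfamiliar with this class of algorithms, the following is a minimal sketch of a Kirchhoff/GRT-style weighted diffraction stack of the kind the abstract describes; it is not the author's implementation, and the table names (tt_src, tt_rec, amp_src, amp_rec), the single-shot 2D geometry and the crude spreading compensation are illustrative assumptions only.

      import numpy as np

      def grt_migrate(traces, dt, tt_src, tt_rec, amp_src, amp_rec):
          # traces: (n_rec, n_t) seismograms for one shot
          # tt_src, amp_src: (nx, nz) travel time [s] / amplitude from the source to each image point
          # tt_rec, amp_rec: (n_rec, nx, nz) the same quantities from each receiver (pre-computed Green-function tables)
          n_rec, n_t = traces.shape
          image = np.zeros(tt_src.shape)
          for r in range(n_rec):
              t_total = tt_src + tt_rec[r]                    # two-way travel time to each image point
              weight = 1.0 / (amp_src * amp_rec[r] + 1e-12)   # crude geometrical-spreading compensation
              samples = np.clip(np.round(t_total / dt).astype(int), 0, n_t - 1)
              image += weight * traces[r, samples]            # weighted diffraction stack
          return image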

  2. Quantitative multi-waves migration in elastic anisotropic media; Migration quantitative multi-ondes en milieu elastique anisotrope

    Energy Technology Data Exchange (ETDEWEB)

    Borgne, H.

    2004-12-01

    modelling of wave propagation in anisotropic media. Within the approximations of ray theory, I develop an expression for the geometrical spreading, the amplitude, and their reciprocity relations. I set up imaging formulas in order to reconstruct the reflection coefficients of the subsurface in elastic anisotropic media. First, I solve the direct problem by expressing the integral relation between the scattered wave field recorded by the receivers and the subsurface reflection coefficients. Second, I apply an elastic anisotropic quantitative migration method, based on the properties of inverse Radon transforms (Beylkin's approach), in order to express the reflection coefficient in 2D, 2.5D and 3D media. I implemented these formulas in a new preserved-amplitude migration algorithm, where the images are sorted by angle classes. Finally, I apply these theoretical results to synthetic and real datasets. I show that migration is able to reconstruct the correct AVA behavior of anisotropic reflection coefficients if both modifications are made. Then I degrade the process by keeping an anisotropic ray tracing but using the classical isotropic imaging formula. For this commonly used configuration, I evaluate the error that can be expected in the AVA response of the migrated reflection coefficient. Methodological applications show the sensitivity of the migration results to the velocity model smoothing and to an error on the anisotropy axis. (author)

  3. Quantitative graph theory mathematical foundations and applications

    CERN Document Server

    Dehmer, Matthias

    2014-01-01

    The first book devoted exclusively to quantitative graph theory, Quantitative Graph Theory: Mathematical Foundations and Applications presents and demonstrates existing and novel methods for analyzing graphs quantitatively. Incorporating interdisciplinary knowledge from graph theory, information theory, measurement theory, and statistical techniques, this book covers a wide range of quantitative-graph theoretical concepts and methods, including those pertaining to real and random graphs such as:Comparative approaches (graph similarity or distance)Graph measures to characterize graphs quantitat

  4. Applications of Microfluidics in Quantitative Biology.

    Science.gov (United States)

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2017-10-04

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 The Authors. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  5. Quantitative Robust Control Engineering: Theory and Applications

    Science.gov (United States)

    2006-09-01

    …the D tank as a dashed line [mg/l] and the control input IR(t) is shown as a solid line [per unit of the influent flow rate] with the G22(z… (1992). Discrete quantitative feedback technique, Chapter 16 in the book: Digital Control Systems: Theory, Hardware, Software, 2nd edition, McGraw-Hill.

  6. Quantitative approach of Min protein researches and applications ...

    African Journals Online (AJOL)

    Quantitative approach of Min protein researches and applications: Experiments, mathematical modeling and computer simulations. W Ngamsaad, J Yojina, P Kanthang, C Modchang, C Krittanai, D Triampo, N Nuttawut, W Triampo ...

  7. Statistical Applications and Quantitative Design for Injury Prevention ...

    African Journals Online (AJOL)

    editor of the International Journal of Injury Control and Safety Promotion, conducted a five-day workshop on “Statistical applications and quantitative design for injury prevention research” from 18–21 August 2008 at the MRC in Cape Town, South Africa. The target audience for this workshop was researchers (with some ...

  8. Novel applications of quantitative MRI for the fetal brain

    Energy Technology Data Exchange (ETDEWEB)

    Clouchoux, Cedric [Children's National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); Limperopoulos, Catherine [Children's National Medical Center, Division of Diagnostic Imaging and Radiology, Washington, DC (United States); McGill University, McConnell Brain Imaging Center, Montreal Neurological Institute, Montreal (Canada); McGill University, Department of Neurology and Neurosurgery, Montreal (Canada); Children's National Medical Center, Division of Fetal and Transitional Medicine, Washington, DC (United States)

    2012-01-15

    The advent of ultrafast MRI acquisitions is offering vital insights into the critical maturational events that occur throughout pregnancy. Concurrent with the ongoing enhancement of ultrafast imaging has been the development of innovative image-processing techniques that are enabling us to capture and quantify the exuberant growth, and organizational and remodeling processes that occur during fetal brain development. This paper provides an overview of the role of advanced neuroimaging techniques to study in vivo brain maturation and explores the application of a range of new quantitative imaging biomarkers that can be used clinically to monitor high-risk pregnancies. (orig.)

  9. Automated quantitative micro-mineralogical characterization for environmental applications

    Science.gov (United States)

    Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.

    2013-01-01

    Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.

  10. Quantitative risk aspects with new working fluids in practical applications

    Energy Technology Data Exchange (ETDEWEB)

    Van Gerwen, R.J.M. [TNO Inst. of Environmental Sciences, Energy Research and Process Innovation, Apeldoorn (Netherlands)

    1995-12-01

    Natural working fluids like ammonia, butane and propane are excellent refrigerants. However, safety aspects arising from their flammable and toxic nature dominate the discussion of their application. The way of treating safety aspects still differs from country to country. In The Netherlands a quantitative risk approach is in common use, in contrast to most other countries. For most refrigeration and heat pump experts this approach is not well known, and many of them believe that it is not possible to calculate risks at all. However, we feel that risk quantification is essential in the discussion of the responsible application of natural refrigerants. It has to be realised that techniques for risk quantification have to be developed and used by specialists, preferably a combination of experts in the fields of industrial safety and refrigeration. In this paper the methods and backgrounds for assessing risks are described, including some practical examples for ammonia and propane applications. 1 fig., 2 tabs., 11 refs.

  11. Applications of quantitative biomarker analysis in petroleum geochemistry

    Energy Technology Data Exchange (ETDEWEB)

    Requejo, A.G. (Arco Oil and Gas Co., Plano, TX (USA))

    1989-03-01

    Traditionally, results of triterpenoid and steroid biomarker analyses in petroleum geochemistry have been presented using one of two formats: (1) mass-chromatogram fingerprints, illustrating the distribution of individual components within a specific biomarker compound class (e.g. m/z 217 for the steranes), or (2) ratios of chromatographic peak areas, which are most often used to depict the relative abundance of precursors and products in a geochemical reaction (e.g., the % 20S sterane isomerization parameter) or the proportions of different biomarker compounds present in a sample (e.g., hopane/sterane ratios). These approaches possess several inherent limitations. For example, it is often difficult to relate distributions of biomarkers isolated in different liquid chromatographic fractions (e.g., aromatized steranes vs saturated steranes). Comparisons of the quantities of these compounds in different samples can also be complicated. With the increasing commercial availability of high-purity, authentic steroid and triterpenoid hydrocarbon standards, the ability to accurately quantify the concentrations of biological marker compounds in geologic samples is being facilitated. Determination of absolute concentrations adds a new dimension to biomarker analysis which could not be adequately addressed using the conventional fingerprinting approaches. This presentation will briefly overview the analytical methodology employed in biomarker quantitation and will present applications of the technique in geochemical studies of oils and source rocks from various regions.

  12. Preparation of Homogeneous MALDI Samples for Quantitative Applications.

    Science.gov (United States)

    Ou, Yu-Meng; Tsao, Chien-Wei; Lai, Yin-Hung; Lee, Hsun; Chang, Huan-Tsung; Wang, Yi-Sheng

    2016-10-28

    This protocol demonstrates a simple sample preparation to reduce spatial heterogeneity in ion signals during matrix-assisted laser desorption/ionization (MALDI) mass spectrometry. The heterogeneity of ion signals is a severe problem in MALDI, which results in poor data reproducibility and makes MALDI unsuitable for quantitative analysis. By regulating sample plate temperature during sample preparation, thermally induced hydrodynamic flows inside droplets of sample solution are able to reduce the heterogeneity problem. A room-temperature sample preparation chamber equipped with a temperature-regulated copper base block that holds MALDI sample plates facilitates precise control of the sample drying condition. After the sample droplets have dried, the sample plates are returned to room temperature and removed from the chamber for subsequent mass spectrometric analysis. The areas of samples are examined with MALDI-imaging mass spectrometry to obtain the spatial distribution of all components in the sample. In comparison with the conventional dried-droplet method that prepares samples under ambient conditions without temperature control, the samples prepared with the method demonstrated herein show significantly better spatial distribution and signal intensity. According to observations using carbohydrate and peptide samples, decreasing substrate temperature while maintaining the surroundings at ambient temperature during the drying process can effectively reduce the heterogeneity of ion signals. This method is generally applicable to various combinations of samples and matrices.

  13. Quantitative NMR: an applicable method for quantitative analysis of medicinal plant extracts and herbal products.

    Science.gov (United States)

    Chauthe, Siddheshwar K; Sharma, Ram Jee; Aqil, Farrukh; Gupta, Ramesh C; Singh, Inder Pal

    2012-01-01

    Quantitative analysis and standardisation of plant extracts or herbal products is a tedious process requiring time-consuming sample preparation and analytical method development for the resolution of analyte peaks from the complex natural extract. Quantitative analysis by HPLC requires a pure authentic standard of the compound being quantified. We report here a quantitative NMR (qNMR) method for quantitative analysis of three medicinal plant extracts and their herbal products without the need of authentic standards. Quantitation can be done by using any commercially available pure sample as an internal reference standard. To develop a reliable method for standardisation and quantitative analysis of extracts from medicinal plants Eugenia jambolana, Withania somnifera and Aegle marmelos and their herbal products using qNMR. The (1)H-NMR spectra of known amounts of crude plant extracts with internal standards were recorded in deuterated solvents and quantitation was performed by calculating the relative ratio of the peak area of selected proton signals of the target compounds and the internal reference standard. Anthocyanins [delphinidin-3,5-diglucoside (1), petunidin-3,5-diglucoside (2) and malvidin-3,5-diglucoside (3)] for E. jambolana fruit extract and imperatorin (4) for A. marmelos fruit extract were selected as marker constituents for quantitation and 1,3,5-trimethoxybenzene (TMB) was used as an internal reference standard. Total withanolide content was determined for W. somnifera using 2,4-diformyl phloroglucinol as an internal reference standard. The (1)H-NMR gave a linear response for the marker constituents, anthocyanins, withaferin A and imperatorin. Using the described method, the amount of anthocyanins in Amberlite(R) XAD7HP and Sephadex enriched extracts of E. jambolana was 3.77% and 9.57% (delphinidin-3,5-diglucoside), 4.72% and 12.0% (petunidin-3,5-diglucoside), 6.55% and 15.70% (malvidin-3,5-diglucoside), respectively. The imperatorin content was 0
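
    As a point of reference, internal-standard qNMR of this kind generally rests on the relation below (the symbols are ours, not the authors'): the content P_x of the target compound follows from the integrated signal areas I, the numbers of contributing protons N, the molar masses M, and the weighed masses m of the analysed sample and of the internal standard,

      P_x = \frac{I_x}{I_{IS}} \cdot \frac{N_{IS}}{N_x} \cdot \frac{M_x}{M_{IS}} \cdot \frac{m_{IS}}{m_{sample}} \cdot P_{IS},

    where P_IS is the purity of the internal standard (e.g. TMB). The linear response reported above is what makes this single-ratio calculation reliable without an authentic standard of the analyte itself.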

  14. Quantitative 3-D imaging topogrammetry for telemedicine applications

    Science.gov (United States)

    Altschuler, Bruce R.

    1994-01-01

    The technology to reliably transmit high-resolution visual imagery over short to medium distances in real time has led to the serious considerations of the use of telemedicine, telepresence, and telerobotics in the delivery of health care. These concepts may involve, and evolve toward: consultation from remote expert teaching centers; diagnosis; triage; real-time remote advice to the surgeon; and real-time remote surgical instrument manipulation (telerobotics with virtual reality). Further extrapolation leads to teledesign and telereplication of spare surgical parts through quantitative teleimaging of 3-D surfaces tied to CAD/CAM devices and an artificially intelligent archival data base of 'normal' shapes. The ability to generate 'topogrames' or 3-D surface numerical tables of coordinate values capable of creating computer-generated virtual holographic-like displays, machine part replication, and statistical diagnostic shape assessment is critical to the progression of telemedicine. Any virtual reality simulation will remain in 'video-game' realm until realistic dimensional and spatial relational inputs from real measurements in vivo during surgeries are added to an ever-growing statistical data archive. The challenges of managing and interpreting this 3-D data base, which would include radiographic and surface quantitative data, are considerable. As technology drives toward dynamic and continuous 3-D surface measurements, presenting millions of X, Y, Z data points per second of flexing, stretching, moving human organs, the knowledge base and interpretive capabilities of 'brilliant robots' to work as a surgeon's tireless assistants becomes imaginable. The brilliant robot would 'see' what the surgeon sees--and more, for the robot could quantify its 3-D sensing and would 'see' in a wider spectral range than humans, and could zoom its 'eyes' from the macro world to long-distance microscopy. Unerring robot hands could rapidly perform machine-aided suturing with

  15. Quantitative descriptions of rice plant architecture and their application.

    Science.gov (United States)

    Li, Xumeng; Wang, Xiaohui; Peng, Yulin; Wei, Hailin; Zhu, Xinguang; Chang, Shuoqi; Li, Ming; Li, Tao; Huang, Huang

    2017-01-01

    Plant architecture is an important agronomic trait, and improving plant architecture has attracted the attention of scientists for decades, particularly studies to create desirable plant architecture for high grain yields through breeding and culture practices. However, many important structural phenotypic traits still lack quantitative description and modeling on structural-functional relativity. This study defined new architecture indices (AIs) derived from the digitalized plant architecture using the virtual blade method. The influences of varieties and crop management on these indices and the influences of these indices on biomass accumulation were analyzed using field experiment data at two crop growth stages: early and late panicle initiation. The results indicated that the vertical architecture indices (LAI, PH, 90%-DRI, MDI, 90%-LI) were significantly influenced by variety, water, nitrogen management and the interaction of water and nitrogen, and compact architecture indices (H-CI, Q-CI, 90%-LI, 50%-LI) were significantly influenced by nitrogen management and the interaction of variety and water. Furthermore, there were certain trends in the influence of variety, water, and nitrogen management on AIs. Biomass accumulation has a positive linear correlation with vertical architecture indices and has a quadratic correlation with compact architecture indices, respectively. Furthermore, the combination of vertical and compact architecture indices is the indicator for evaluating the effects of plant architecture on biomass accumulation.

  16. Quantitative 3D Optical Imaging: Applications in Dosimetry and Biophysics

    Science.gov (United States)

    Thomas, Andrew Stephen

    Optical-CT has been shown to be a potentially useful imaging tool for the two very different spheres of biologists and radiation therapy physicists, but it has yet to live up to that potential. In radiation therapy, researchers have used optical-CT for the readout of 3D dosimeters, but it is yet to be a clinically relevant tool as the technology is too slow to be considered practical. Biologists have used the technique for structural imaging, but have struggled with emission tomography as the reality of photon attenuation for both excitation and emission has made the images quantitatively irrelevant. Dosimetry. The DLOS (Duke Large field of view Optical-CT Scanner) was designed and constructed to make 3D dosimetry utilizing optical-CT a fast and practical tool while maintaining the accuracy of readout of the previous, slower readout technologies. Upon construction/optimization/implementation of several components including a diffuser, band pass filter, registration mount and fluid filtration system, the dosimetry system provides high quality data comparable to or exceeding that of commercial products. In addition, a stray light correction algorithm was tested and implemented. The DLOS in combination with the 3D dosimeter it was designed for, PRESAGE(TM), then underwent rigorous commissioning and benchmarking tests validating its performance against gold standard data including a set of 6 irradiations. DLOS commissioning tests resulted in sub-mm isotropic spatial resolution (MTF >0.5 for frequencies of 1.5 lp/mm) and a dynamic range of ~60 dB. Flood field uniformity was 10% and stable after 45 minutes. Stray light proved to be small, due to telecentricity, but even the residual can be removed through deconvolution. Benchmarking tests showed the mean 3D passing gamma rate (3%, 3 mm, 5% dose threshold) over the 6 benchmark data sets was 97.3% +/- 0.6% (range 96%-98%), with scans totaling ~10 minutes, indicating excellent ability to perform 3D dosimetry while improving the speed of
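
    The 3%/3 mm passing rates quoted above refer to the standard gamma-index test used in 3D dosimetry; written out (in generic notation, not taken from this work), a point r_m of the measured distribution D_m passes against the reference distribution D_r when

      \gamma(\mathbf{r}_m) = \min_{\mathbf{r}_r} \sqrt{ \frac{\lVert \mathbf{r}_m - \mathbf{r}_r \rVert^2}{(3\,\mathrm{mm})^2} + \frac{\left[ D_m(\mathbf{r}_m) - D_r(\mathbf{r}_r) \right]^2}{(0.03\, D_{ref})^2} } \le 1,

    and the passing rate is the fraction of points above the 5% dose threshold that satisfy this condition.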

  17. Clinical applications of a quantitative analysis of regional left ventricular wall motion

    Science.gov (United States)

    Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.

    1975-01-01

    Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.

  18. Domestication of smartphones and mobile applications: A quantitative mixed-method study

    NARCIS (Netherlands)

    de Reuver, G.A.; Nikou, S; Bouwman, W.A.G.A.

    2016-01-01

    Smartphones are finding their way into our daily lives. This paper examines the domestication of smartphones by looking at how the way we use mobile applications affects our everyday routines. Data is collected through an innovative quantitative mixed-method approach, combining log data from

  19. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of Abort Triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of Abort Triggers.

  20. Quantitative relationship of application rate and pesticide residues in greenhouse tomatoes.

    Science.gov (United States)

    Sadło, S

    2000-01-01

    The association between application rate of a pesticide and its residue in ripe tomatoes was studied. The average residue level (R) of any pesticide in ripe tomatoes remained in quantitative relation to its dose (D), expressed by the following regression equation: R = 0.24 D (mg/kg), where the numerical factor, 0.24, represents the average residue in mg/kg after application of 1 kg active ingredient per hectare, with a relative standard deviation of 23%. Quantitative association between these 2 factors enables evaluation of greenhouse tomato growers with respect to their observance of Good Agricultural Practice rules and the Plant Protection Act, obligatory in Poland since 1996, and thus may be a reliable basis for the registration of new agrochemicals.
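
    A worked example of the reported regression (our numbers, chosen purely for illustration): an application rate of D = 1.5 kg of active ingredient per hectare would be expected to leave

      R = 0.24 \times 1.5 = 0.36\ \mathrm{mg/kg}

    in ripe fruit, and the stated relative standard deviation of 23% puts the typical spread at roughly 0.28 to 0.44 mg/kg.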

  1. Clinical applications of quantitative proteomics using targeted and untargeted data-independent acquisition techniques.

    Science.gov (United States)

    Meyer, Jesse G; Schilling, Birgit

    2017-05-01

    While selected/multiple-reaction monitoring (SRM or MRM) is considered the gold standard for quantitative protein measurement, emerging data-independent acquisition (DIA) using high-resolution scans have opened a new dimension of high-throughput, comprehensive quantitative proteomics. These newer methodologies are particularly well suited for discovery of biomarker candidates from human disease samples, and for investigating and understanding human disease pathways. Areas covered: This article reviews the current state of targeted and untargeted DIA mass spectrometry-based proteomic workflows, including SRM, parallel-reaction monitoring (PRM) and untargeted DIA (e.g., SWATH). Corresponding bioinformatics strategies, as well as application in biological and clinical studies are presented. Expert commentary: Nascent application of highly-multiplexed untargeted DIA, such as SWATH, for accurate protein quantification from clinically relevant and disease-related samples shows great potential to comprehensively investigate biomarker candidates and understand disease.

  2. Quantitative Phase Imaging Techniques for the Study of Cell Pathophysiology: From Principles to Applications

    Directory of Open Access Journals (Sweden)

    Hyunjoo Park

    2013-03-01

    A cellular-level study of pathophysiology is crucial for understanding the mechanisms behind human diseases. Recent advances in quantitative phase imaging (QPI) techniques show promise for the cellular-level understanding of the pathophysiology of diseases. To provide important insight into how QPI techniques can potentially improve the study of cell pathophysiology, here we present the principles of QPI and highlight some of the recent applications of QPI, ranging from cell homeostasis to infectious diseases and cancer.
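
    The quantitative content of QPI referred to here is, in its simplest form, the optical path-length relation that is standard in the field (not specific to this article): the measured phase shift and the dry-mass surface density of a cell are

      \Delta\phi(x,y) = \frac{2\pi}{\lambda} \int \left[ n_{cell}(x,y,z) - n_{medium} \right] dz, \qquad \sigma(x,y) = \frac{\lambda\, \Delta\phi(x,y)}{2\pi\, \alpha},

    where \alpha \approx 0.2 ml/g is the specific refractive-index increment of protein; this is what turns label-free phase images into picogram-level dry-mass maps of single cells.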

  3. Application of magnetic carriers to two examples of quantitative cell analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States); Todd, Paul, E-mail: pwtodd@hotmail.com [Techshot, Inc., 7200 Highway 150, Greenville, IN 47124 (United States); Hanley, Thomas R. [Department of Chemical Engineering, 212 Ross Hall, Auburn University, Auburn, AL 36849 (United States)

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility. - Highlights: • Commercial particle tracking velocimetry measures magnetophoretic mobility of labeled cells. • Magnetically labeled tumor cells were shown to have adequate mobility for capture in a specific sorter. • The kinetics of nonspecific endocytosis of magnetic nanomaterials by CHO cells was characterized. • Magnetic labeling of cells can be used like fluorescence flow cytometry for quantitative cell analysis.
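
    For orientation, the cell-tracking-velocimetry literature usually writes the quantity measured here as follows (our notation, not taken from the paper):

      u_m = m\, S_m, \qquad S_m = \nabla\!\left( \frac{B^2}{2\mu_0} \right), \qquad m = \frac{\Delta\chi\, V}{3\pi \eta d},

    i.e. the magnetically induced velocity u_m per unit of magnetic energy-density gradient S_m defines the mobility m, which for a labeled cell of volume V and diameter d suspended in a medium of viscosity \eta scales with the susceptibility difference \Delta\chi contributed by the bound magnetic beads.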

  4. Magnetorelaxometry procedures for quantitative imaging and characterization of magnetic nanoparticles in biomedical applications.

    Science.gov (United States)

    Liebl, Maik; Wiekhorst, Frank; Eberbeck, Dietmar; Radon, Patricia; Gutkelch, Dirk; Baumgarten, Daniel; Steinhoff, Uwe; Trahms, Lutz

    2015-10-01

    Quantitative knowledge about the spatial distribution and local environment of magnetic nanoparticles (MNPs) inside an organism is essential for guidance and improvement of biomedical applications such as magnetic hyperthermia and magnetic drug targeting. Magnetorelaxometry (MRX) provides such quantitative information by detecting the magnetic response of MNPs following a fast change in the applied magnetic field. In this article, we review our MRX based procedures that enable both the characterization and the quantitative imaging of MNPs in a biomedical environment. MRX characterization supported the selection of an MNP system with colloidal stability and suitable cellular MNP uptake. Spatially resolved MRX, a procedure employing multi-channel MRX measurements allowed for in-vivo monitoring of the MNP distribution in a pre-clinical carcinoma animal model. Extending spatially resolved MRX by consecutive magnetization of distinct parts of the sample led to a demonstration of MRX tomography. With this tomography, we reconstructed the three dimensional MNP distribution inside animal sized phantoms with a sensitivity of milligrams of MNPs per cm3. In addition, the targeting efficiency of MNPs in whole blood was assessed using a flow phantom and MRX quantification. These MRX based measurement and analysis procedures have substantially supported the development of MNP based biomedical applications.

  5. Challenges to quantitative applications of Landsat observations for the urban thermal environment.

    Science.gov (United States)

    Chen, Feng; Yang, Song; Yin, Kai; Chan, Paul

    2017-09-01

    Since the launch of its first satellite in 1972, the Landsat program has operated continuously for more than forty years. A large data archive collected by the Landsat program significantly benefits both the academic community and society. Thermal imagery from Landsat sensors, provided at relatively high spatial resolution, is suitable for monitoring the urban thermal environment. Growing use of Landsat data in monitoring the urban thermal environment is demonstrated by increasing publications on this subject, especially over the last decade. The urban thermal environment is usually delineated by land surface temperature (LST). However, the quantitative and accurate estimation of LST from Landsat data is still a challenge, especially for urban areas. This paper will discuss the main challenges for urban LST retrieval, including urban surface emissivity, atmospheric correction, radiometric calibration, and validation. In addition, we will discuss general challenges confronting the continuity of quantitative applications of Landsat observations. These challenges arise mainly from the scan line corrector failure of the Landsat 7 ETM+ and channel differences among sensors. Based on these investigations, the concerns are to: (1) show general users the limitation and possible uncertainty of the retrieved urban LST from the single thermal channel of Landsat sensors; (2) emphasize efforts that should be made toward the quantitative applications of Landsat data; and (3) understand the potential challenges for the continuity of Landsat observation (i.e., thermal infrared) for global change monitoring, while several climate data record programs are in progress. Copyright © 2017. Published by Elsevier B.V.
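
    The single-thermal-channel retrieval whose uncertainties are discussed above is usually based on inverting the radiative transfer equation (standard formulation, not specific to this paper):

      L_{sensor} = \tau \left[ \varepsilon\, B(T_s) + (1-\varepsilon)\, L^{\downarrow} \right] + L^{\uparrow}, \qquad T_s = B^{-1}\!\left( \frac{L_{sensor} - L^{\uparrow} - \tau (1-\varepsilon) L^{\downarrow}}{\tau\, \varepsilon} \right),

    which makes explicit why the surface emissivity \varepsilon, the atmospheric transmittance \tau and path radiances L^{\uparrow}, L^{\downarrow}, and the radiometric calibration behind L_{sensor} are exactly the challenge areas listed in the abstract.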

  6. QUANTITATIVE MAGNETIC RESONANCE IMAGING OF ARTICULAR CARTILAGE AND ITS CLINICAL APPLICATIONS

    Science.gov (United States)

    Li, Xiaojuan; Majumdar, Sharmila

    2013-01-01

    Cartilage is one of the most essential tissues for healthy joint function and is compromised in degenerative and traumatic joint diseases. There have been tremendous advances during the past decade using quantitative MRI techniques as a non-invasive tool for evaluating cartilage, with a focus on assessing cartilage degeneration during osteoarthritis (OA). In this review, after a brief overview of cartilage composition and degeneration, we discuss techniques that grade and quantify morphologic changes as well as the techniques that quantify changes in the extracellular matrix. The basic principles, in vivo applications, advantages and challenges for each technique are discussed. Recent studies using the OA Initiative (OAI) data are also summarized. Quantitative MRI provides non-invasive measures of cartilage degeneration at the earliest stages of joint degeneration, which is essential for efforts towards prevention and early intervention in OA. PMID:24115571
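
    As background, the compositional (extracellular-matrix) techniques covered by such reviews, T2 and T1rho mapping in particular, reduce to voxel-wise mono-exponential fits of the form (standard practice rather than a formula taken from this paper):

      S(TE) = S_0\, e^{-TE/T_2}, \qquad S(TSL) = S_0\, e^{-TSL/T_{1\rho}},

    so that relaxation times, rather than raw signal intensities, become the quantitative biomarkers of early cartilage degeneration.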

  7. Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy

    Science.gov (United States)

    Sugiyama, Naruhisa; Shirakawa, Tomohiro

    2017-07-01

    The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.

  8. [The study of tomato fruit weight quantitative trait locus and its application in genetics teaching].

    Science.gov (United States)

    Wang, Hai-yan

    2015-08-01

    The classical research cases, which have greatly promoted the development of genetics in history, can be combined with the content of courses in genetics teaching to train students' ability of scientific thinking and genetic analysis. The localization and clone of gene controlling tomato fruit weight is a pioneer work in quantitative trait locus (QTL) studies and represents a complete process of QTL research in plants. Application of this integrated case in genetics teaching, which showed a wonderful process of scientific discovery and the fascination of genetic research, has inspired students' interest in genetics and achieved a good teaching effect.

  9. The current state of the art of quantitative phosphoproteomics and its applications to diabetes research

    Energy Technology Data Exchange (ETDEWEB)

    Chan, Chi Yuet X’avia; Gritsenko, Marina A.; Smith, Richard D.; Qian, Wei-Jun

    2016-03-17

    Protein phosphorylation is a fundamental regulatory mechanism in many cellular processes, and aberrant perturbation of phosphorylation has been revealed in various human diseases. Kinases and their cognate inhibitors have been hotspots for drug development. Therefore, emerging tools that enable system-wide quantitative profiling of the phosphoproteome would offer a powerful impetus in unveiling novel signaling pathways, drug targets and/or biomarkers for the disease of interest. In this review, we will highlight recent advances in phosphoproteomics, the current state-of-the-art of the technologies, and the challenges and future perspectives of this research area. Finally, we will underscore some exemplary applications of phosphoproteomics in diabetes research.

  10. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices.This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere
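
    For readers new to the topic, the factor models surveyed in books of this kind share the generic linear form (notation ours):

      r_{i,t} = \alpha_i + \sum_{k=1}^{K} \beta_{i,k}\, f_{k,t} + \varepsilon_{i,t},

    where the return r of asset i loads through the betas on K common factors f; the signal-processing angle lies in estimating the factors and loadings robustly from noisy, short samples.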

  11. Application of a nitrocellulose immunoassay for quantitation of proteins secreted in cultured media

    Energy Technology Data Exchange (ETDEWEB)

    LaDuca, F.M.; Dang, C.V.; Bell, W.R.

    1986-11-01

    A macro-dot immunoassay was developed to quantitate proteins (antigens) secreted in the culture media of primary rat hepatocytes. Dilutions of protein standards and undiluted spent culture media were applied to numbered sheets of nitrocellulose (NC) paper by vacuum filtration (in volumes up to 1 ml) through a specially designed macrofiltration apparatus constructed of plexiglas. Sequential incubation of the NC with bovine serum albumin blocking buffer, monospecific antibody, and (125)I-Protein A enabled quantitation of protein concentration by determination of NC-bound radioactivity. Linear and reproducible standard curves were obtained with fibrinogen, albumin, transferrin, and haptoglobin. A high coefficient of correlation between radioactivity (cpm) and protein concentration was found. Intra- and inter-test reproducibility was excellent. By using monospecific antibodies, single proteins (i.e., fibrinogen), as low as 32 ng/ml, could be quantified in heterogeneous protein mixtures and in spent culture media. The assay was sensitive to differences in fibrinogen secretion under nonstimulatory (serum-free hormonally defined medium, SFHD) and stimulatory (SFHD plus hydrocortisone) culture conditions. The procedure and techniques described are applicable to the quantitation of any protein in a suitable buffer.
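
    A minimal sketch of the standard-curve step described above, fitting bound radioactivity against known protein concentrations and back-calculating unknowns; the numbers and names below are purely illustrative, not data from the paper.

      import numpy as np

      def fit_standard_curve(conc_ng_ml, cpm):
          # linear dose-response of NC-bound radioactivity vs. protein concentration
          slope, intercept = np.polyfit(conc_ng_ml, cpm, 1)
          return slope, intercept

      def back_calculate(cpm_unknown, slope, intercept):
          # interpolate spent-medium samples on the standard curve
          return (np.asarray(cpm_unknown, dtype=float) - intercept) / slope

      # hypothetical fibrinogen standards (ng/ml) and counts per minute
      slope, intercept = fit_standard_curve([0, 32, 63, 125, 250, 500], [40, 210, 390, 760, 1480, 2900])
      print(back_calculate([500, 1800], slope, intercept))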

  12. Application of magnetic carriers to two examples of quantitative cell analysis

    Science.gov (United States)

    Zhou, Chen; Qian, Zhixi; Choi, Young Suk; David, Allan E.; Todd, Paul; Hanley, Thomas R.

    2017-04-01

    The use of magnetophoretic mobility as a surrogate for fluorescence intensity in quantitative cell analysis was investigated. The objectives of quantitative fluorescence flow cytometry include establishing a level of labeling for the setting of parameters in fluorescence activated cell sorters (FACS) and the determination of levels of uptake of fluorescently labeled substrates by living cells. Likewise, the objectives of quantitative magnetic cytometry include establishing a level of labeling for the setting of parameters in flowing magnetic cell sorters and the determination of levels of uptake of magnetically labeled substrates by living cells. The magnetic counterpart to fluorescence intensity is magnetophoretic mobility, defined as the velocity imparted to a suspended cell per unit of magnetic ponderomotive force. A commercial velocimeter available for making this measurement was used to demonstrate both applications. Cultured Gallus lymphoma cells were immunolabeled with commercial magnetic beads and shown to have adequate magnetophoretic mobility to be separated by a novel flowing magnetic separator. Phagocytosis of starch nanoparticles having magnetic cores by cultured Chinese hamster ovary cells, a CHO line, was quantified on the basis of magnetophoretic mobility.

  13. Application of quantitative PCR for the detection of microorganisms in water.

    Science.gov (United States)

    Botes, Marelize; de Kwaadsteniet, Michéle; Cloete, Thomas Eugene

    2013-01-01

    The occurrence of microorganisms in water due to contamination is a health risk and control thereof is a necessity. Conventional detection methods may be misleading and do not provide rapid results allowing for immediate action. The quantitative polymerase chain reaction (qPCR) method has proven to be an effective tool to detect and quantify microorganisms in water within a few hours. Quantitative PCR assays have recently been developed for the detection of specific adeno- and polyomaviruses, bacteria and protozoa in different water sources. The technique is highly sensitive and able to detect low numbers of microorganisms. Quantitative PCR can be applied for microbial source tracking in water sources, to determine the efficiency of water and wastewater treatment plants and act as a tool for risk assessment. Different qPCR assays exist depending on whether an internal control is used or whether measurements are taken at the end of the PCR reaction (end-point qPCR) or in the exponential phase (real-time qPCR). Fluorescent probes are used in the PCR reaction to hybridise within the target sequence to generate a signal and, together with specialised systems, quantify the amount of PCR product. Quantitative reverse transcription polymerase chain reaction (q-RT-PCR) is a more sensitive technique that detects low copy number RNA and can be applied to detect, e.g. enteric viruses and viable microorganisms in water, and measure specific gene expression. There is, however, a need to standardise qPCR protocols if this technique is to be used as an analytical diagnostic tool for routine monitoring. This review focuses on the application of qPCR in the detection of microorganisms in water.
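
    The quantification step such qPCR assays rely on can be sketched as follows: a dilution series of known copy numbers gives a standard curve of quantification cycle (Cq) versus log10 copies, from which unknown samples and the amplification efficiency follow. The numbers and names below are illustrative assumptions, not values from the review.

      import numpy as np

      def qpcr_standard_curve(log10_copies, cq):
          slope, intercept = np.polyfit(log10_copies, cq, 1)   # Cq = slope*log10(N0) + intercept
          efficiency = 10 ** (-1.0 / slope) - 1.0               # ~1.0 corresponds to doubling every cycle
          return slope, intercept, efficiency

      def copies_from_cq(cq, slope, intercept):
          return 10 ** ((np.asarray(cq, dtype=float) - intercept) / slope)

      slope, intercept, eff = qpcr_standard_curve([1, 2, 3, 4, 5, 6],
                                                  [36.2, 32.9, 29.5, 26.2, 22.8, 19.5])
      print(eff, copies_from_cq([30.1, 24.7], slope, intercept))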

  14. Fibrosis assessment: impact on current management of chronic liver disease and application of quantitative invasive tools.

    Science.gov (United States)

    Wang, Yan; Hou, Jin-Lin

    2016-05-01

    Fibrosis, a common pathogenic pathway of chronic liver disease (CLD), has long been recognized as strongly associated with poor prognosis. Nowadays, with remarkable advances in the understanding and treatment of major CLDs such as hepatitis C, hepatitis B, and nonalcoholic fatty liver disease, there is an unprecedented need for the diagnosis and assessment of liver fibrosis or cirrhosis in various clinical settings. Among the available approaches, liver biopsy remains the one that possibly provides the most direct and reliable information regarding fibrosis patterns and changes in the parenchyma at different clinical stages and with different etiologies. Thus, many endeavors have been undertaken to develop methodologies based on quantitation for the invasive assessment. Here, we analyze the impact of fibrosis assessment on CLD patient care based on data from recent clinical studies. We discuss and update the current invasive tools regarding their technological features and potential for particular clinical applications. Furthermore, we propose potential resolutions, based on quantitative invasive tools, for some major issues in fibrosis assessment that remain obstacles to the current rapid progress in CLD medicine.

  15. Leaf anatomy of emerald grass submitted to quantitative application of herbicides

    Directory of Open Access Journals (Sweden)

    Renata Pereira Marques

    2016-08-01

    The aim of this work was to evaluate the selectivity of herbicides applied post-emergence on Zoysia japonica Steud (Poaceae) and to determine associations with the leaf anatomy of this grass. The experimental design was randomized blocks with four replications. The treatments were the application of the herbicides bentazon (720 g ha-1), nicosulfuron (50 g ha-1), halosulfuron (112.5 g ha-1), oxadiazon (875 g ha-1) and 2,4-D (698 g ha-1), plus a control treatment without herbicide application. Phytotoxicity was assessed every seven days after application (DAA) of the herbicides until the symptoms disappeared. Foliar anatomical analyses of the collected grass leaves were conducted until the 35th DAA. The quantitative characters of the keel and wing region of the blade of Z. japonica were assessed, as well as the biometric characters, which were submitted to an analysis of variance (F test), and the averages were compared by Tukey's test at a probability of 5%. The values of the anatomical characters of the foliar blade were tested by cluster analysis. The application of herbicides did not negatively influence the height of the plants but did reduce their dry mass. Toxic symptoms disappeared after 21 DAA, with the only symptoms of injury observed in plants treated with the herbicides oxadiazon and nicosulfuron. In addition, the cluster analysis indicated the formation of a unique discriminatory group. Thus, the results show that the herbicides applied to Z. japonica were selective for the species.

  16. Contribution to the study of proton elastic and inelastic scattering on ¹²C; Contribution a l'etude des diffusions elastiques et inelastiques des protons sur le carbone 12

    Energy Technology Data Exchange (ETDEWEB)

    Sadeghi, A

    1966-07-01

    The results of absolute measurements of cross sections for the scattering of protons by ¹²C to the first two excited levels are given. The measurements were made from 4.6 to 11.4 MeV at 17 angles for (p,p), at 15 angles for (p,p') (first excited level) and at 8 angles for (p,p'') (second excited level). A gaseous target with differential pumping was used. The elastic scattering was analyzed using R-matrix theory together with the optical model. A new analysis of both (p,p) and (p,p') was then carried out using a coupled-equations formalism. The earlier results on the levels of the compound nucleus were confirmed and completed. (author)

  17. Quantitative analysis of sharp-force trauma: an application of scanning electron microscopy in forensic anthropology.

    Science.gov (United States)

    Bartelink, E J; Wiersema, J M; Demaree, R S

    2001-11-01

    Scanning electron microscopy (SEM) has occasionally been used by anthropologists and forensic scientists to look at morphological characteristics that certain implements leave on bone. However, few studies have addressed techniques or protocols for assessing quantitative differences between tool marks made on bone by different bladed implements. In this study, the statistical variation in cut mark width was examined between control and test samples on bone using a scalpel blade, a paring knife, and a kitchen utility knife. Statistically significant differences were found between cut marks made by the same knife under control and test conditions for all three knife types used in the study. When the control sample and test samples were examined individually for differences in mean variation between knife types, significant differences were also found. Although differences in cut mark width were found, caution should be used in trying to classify individual cut marks as being inflicted by a particular implement, due to the overlap in cut mark width that exists between different knife types. When combined, quantitative and qualitative analyses of cut marks should prove more useful in trying to identify a suspect weapon. Furthermore, the application of SEM can be particularly useful for assessing many of these features.

  18. Cement application techniques in luting implant-supported crowns: a quantitative and qualitative survey.

    Science.gov (United States)

    Wadhwani, Chandur; Hess, Timothy; Piñeyro, Alfonso; Opler, Richard; Chung, Kwok-Hung

    2012-01-01

    To investigate different techniques used by dentists when luting an implant-supported crown and to evaluate the application of cement quantitatively and qualitatively. Participants were given a bag containing cement sachet, mixing pad, spatula, a variety of application instruments, and a polycarbonate crown form. The participants were instructed with a standardized audio-video presentation to proportion the cement, mix it, and apply it to the intaglio of the crown as they would if they were to cement it onto an implant abutment in a clinical situation. The crowns were weighed, first unfilled and then again once the applied cement had set. The mean weights of fully-loaded crowns (n = 10) were used as a control group. The patterns of cement loading were recorded. The weights of collected cement-loaded crowns were compared to those of the control group and analyzed statistically. Four hundred and one dentists in several different geographic locations were surveyed. Three distinct cement loading patterns were observed: gross application (GA), brush-on application (BA), and margin application (MA). The mean weights for each cement loading pattern were 242.2 mg for the GA group, 59.9 mg for the BA group, and 59.0 mg for the MA group. The weight of cement in the GA group was significantly higher than that in the other groups. No statistically significant difference between groups BA and MA was seen. The diversity of the cement loading patterns disclosed in this study indicates that there is a lack of uniformity and precision in methods and a lack of consensus in the dental community regarding the appropriate quantity of cement and placement method for a cement-retained implant crown.

  19. [Application and Integration of Qualitative and Quantitative Research Methods in Intervention Studies in Rehabilitation Research].

    Science.gov (United States)

    Wirtz, M A; Strohmer, J

    2016-06-01

    In order to develop and evaluate interventions in rehabilitation research a wide range of empirical research methods may be adopted. Qualitative research methods emphasize the relevance of an open research focus and a natural proximity to research objects. Accordingly, using qualitative methods special benefits may arise if researchers strive to identify and organize unknown information aspects (inductive purpose). Particularly, quantitative research methods require a high degree of standardization and transparency of the research process. Furthermore, a clear definition of efficacy and effectiveness exists (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be regarded if researchers aim to select or combine those approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.

  20. Web Applications Vulnerability Management using a Quantitative Stochastic Risk Modeling Method

    Directory of Open Access Journals (Sweden)

    Sergiu SECHEL

    2017-01-01

    The aim of this research is to propose a quantitative risk modeling method that reduces the guesswork and uncertainty in the vulnerability and risk assessment activities of web-based applications, while giving users the flexibility to assess risk according to their risk appetite and tolerance with a high degree of assurance. The research method is based on work done by the OWASP Foundation on this subject, but their risk rating methodology needed debugging and updates in key areas that are presented in this paper. The modified risk modeling method uses Monte Carlo simulations to model risk characteristics that cannot be determined without guesswork. It was tested in vulnerability assessment activities on real production systems, and in theory by assigning discrete uniform assumptions to all risk characteristics (risk attributes) and evaluating the results after 1.5 million rounds of Monte Carlo simulations.
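
    A toy version of the Monte Carlo idea described above; it follows the abstract in drawing every risk attribute from a discrete uniform distribution and running 1.5 million rounds, but the 0-9 factor scale, the eight likelihood and eight impact factors, and the likelihood-times-impact aggregation are assumptions borrowed from the generic OWASP risk rating scheme rather than the author's exact model.

      import numpy as np

      rng = np.random.default_rng(seed=1)
      ROUNDS = 1_500_000
      N_LIKELIHOOD, N_IMPACT = 8, 8                     # threat-agent/vulnerability and technical/business factors

      likelihood = rng.integers(0, 10, size=(ROUNDS, N_LIKELIHOOD)).mean(axis=1)
      impact = rng.integers(0, 10, size=(ROUNDS, N_IMPACT)).mean(axis=1)
      risk = likelihood * impact                        # one simulated overall risk score per round

      print(np.percentile(risk, [5, 50, 95]))           # a risk distribution instead of a single guessed rating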

  1. In-focal-plane characterization of excitation distribution for quantitative fluorescence microscopy applications

    Science.gov (United States)

    Dietrich, Klaus; Brülisauer, Martina; ćaǧin, Emine; Bertsch, Dietmar; Lüthi, Stefan; Heeb, Peter; Stärker, Ulrich; Bernard, André

    2017-06-01

    The applications of fluorescence microscopy span medical diagnostics, bioengineering and biomaterial analytics. Full exploitation of fluorescent microscopy is hampered by imperfections in illumination, detection and filtering. Mainly, errors stem from deviations induced by real-world components inducing spatial or angular variations of propagation properties along the optical path, and they can be addressed through consistent and accurate calibration. For many applications, uniform signal to noise ratio (SNR) over the imaging area is required. Homogeneous SNR can be achieved by quantifying and compensating for the signal bias. We present a method to quantitatively characterize novel reference materials as a calibration reference for biomaterials analytics. The reference materials under investigation comprise thin layers of fluorophores embedded in polymer matrices. These layers are highly homogeneous in their fluorescence response, where cumulative variations do not exceed 1% over the field of view (1.5 x 1.1 mm). An automated and reproducible measurement methodology, enabling sufficient correction for measurement artefacts, is reported. The measurement setup is equipped with an autofocus system, ensuring that the measured film quality is not artificially increased by out-of-focus reduction of the system modulation transfer function. The quantitative characterization method is suitable for analysis of modified bio-materials, especially through patterned protein decoration. The imaging method presented here can be used to statistically analyze protein patterns, thereby increasing both precision and throughput. Further, the method can be developed to include a reference emitter and detector pair on the image surface of the reference object, in order to provide traceable measurements.
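
    One common way such a homogeneous reference layer is put to use is a flat-field (shading) correction of the sample images; the sketch below is a generic illustration of that idea, not the authors' pipeline, and all names are placeholders.

      import numpy as np

      def flat_field_correct(raw, reference, dark):
          # reference: image of the homogeneous fluorescent layer; dark: camera dark frame
          shading = reference.astype(float) - dark
          shading /= shading.mean()                     # normalized illumination/detection map
          return (raw.astype(float) - dark) / np.clip(shading, 1e-6, None)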

  2. Effect of Biofertilizers Application on the Quantitative and Qualitative Characteristics of Linseed (Linum usitatissimum L.) Lines

    Directory of Open Access Journals (Sweden)

    B. Motalebizadeh

    2015-09-01

    In order to investigate the effect of bio-fertilizers on the yield and yield components of flax lines, a study was conducted during the 2010 growing season at the Agricultural Research Station of Saatlo in Urmia. A split-plot design based on randomized complete blocks with four replications was used in this study. The main factor (a) consisted of fertilizer application form (a1 = control without nitrogen fertilizer, a2 = nitrogen fertilizer, a3 = nitroxin + N, a4 = phosphate barvar 2 + N, and a5 = nitroxin + phosphate barvar 2 + N) and the sub factor (b) consisted of five lines of oily flax (b1 = 97-26, b2 = 97-14, b3 = 97-3, b4 = 97-21, b5 = 97-19). Quantitative and qualitative traits such as number of sub-stems, leaf weight, capsule weight per main stem and sub-stems, seed yield, and oil and protein content were calculated or estimated. Results showed that the main factor (fertilizer form) had a significant effect (at the α=0.01 probability level) on all the parameters studied in this experiment. The sub factor (linseed lines) and the interaction between the two factors had statistically significant effects on all traits. The highest seed yield (4781 kg ha-1) and the highest seed oil content (36.5%) were obtained from applying nitroxin + phosphate barvar 2 + N on the 97-14 and 97-3 lines. Results showed that use of the nitroxin and phosphate barvar 2 biofertilizers could be effective in increasing the grain yield of linseed. Therefore, application of nitroxin and phosphate barvar 2 biofertilizers could be used to improve soil physico-chemical properties and to increase quantitative and qualitative yield parameters of linseed.

  3. A Mechatronic System for Quantitative Application and Assessment of Massage-Like Actions in Small Animals

    Science.gov (United States)

    Wang, Qian; Zeng, Hansong; Best, Thomas M.; Haas, Caroline; Heffner, Ned T.; Agarwal, Sudha; Zhao, Yi

    2013-01-01

    Massage therapy has a long history and has been widely believed effective in restoring tissue function, relieving pain and stress, and promoting overall well-being. However, the application of massage-like actions and the efficacy of massage are largely based on anecdotal experiences that are difficult to define and measure. This leads to a somewhat limited evidence-based interface of massage therapy with modern medicine. In this study, we introduce a mechatronic device that delivers highly reproducible massage-like mechanical loads to the hind limbs of small animals (rats and rabbits), where various massage-like actions are quantified by the loading parameters (magnitude, frequency and duration) of the compressive and transverse forces on the subject tissues. The effect of massage is measured by the difference in passive viscoelastic properties of the subject tissues before and after mechanical loading, both obtained by the same device. Results show that this device is useful in identifying the loading parameters that are most conducive to a change in tissue mechanical properties, and can determine the range of loading parameters that result in sustained changes in tissue mechanical properties and function. This device presents the first step in our effort for quantifying the application of massage-like actions used clinically and measurement of their efficacy that can readily be combined with various quantitative measures (e.g., active mechanical properties and physiological assays) for determining the therapeutic and mechanistic effects of massage therapies. PMID:23943071

  4. Quantitative material analysis by dual-energy computed tomography for industrial NDT applications

    Science.gov (United States)

    Nachtrab, F.; Weis, S.; Keßling, P.; Sukowski, F.; Haßler, U.; Fuchs, T.; Uhlmann, N.; Hanke, R.

    2011-05-01

    Dual-energy computed tomography (DECT) is an established method in the field of medical CT to obtain quantitative information on a material of interest instead of mean attenuation coefficients only. In the field of industrial X-ray imaging, dual-energy techniques have been used to solve special problems on a case-by-case basis rather than as a standard tool. Our goal is to develop an easy-to-use dual-energy solution that can be handled by the average industrial operator without the need for a specialist. We are aiming at providing dual-energy CT as a measurement tool for those cases where qualitative images are not enough and one needs additional quantitative information (e.g. mass density ρ and atomic number Z) about the sample at hand. Our solution is based on an algorithm proposed by Heismann et al. (2003) [1] for application in medical CT. As input data this algorithm needs two CT data sets, one with low (LE) and one with high effective energy (HE). A first-order linearization is applied to the raw data, and two volumes are reconstructed thereafter. The dual-energy analysis is done voxel by voxel, using a pre-calculated function F(Z) that incorporates the parameters of the low- and high-energy measurements (such as tube voltage, filtration and detector sensitivity). As a result, two volume data sets are obtained, one providing information about the mass density ρ in each voxel, the other providing the effective atomic number Z of the material therein. One main difference between medical and industrial CT is that the range of materials that can be contained in a sample is much wider and can cover the whole range of elements, from hydrogen to uranium. Heismann's algorithm is limited to the range of elements Z=1-30, because for Z>30 the function F(Z) as given by Heismann is no longer bijective. While this still seems very suitable for medical application, it is not enough to cover the complete range of industrial applications. We therefore investigated the
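
    The voxel-by-voxel ρ-Z decomposition described above can be sketched as follows. This is a minimal illustration, not Heismann's published implementation: the tables f_table (the pre-calculated F(Z) ratio for the chosen tube voltages, filtration and detector sensitivity) and kappa_LE (mass attenuation coefficients at the low effective energy) are assumed inputs shown here with placeholder values.

```python
import numpy as np

# Assumed, pre-calculated tables for Z = 1..30 (placeholder values only):
# f_table[i]  ~ expected mu_LE/mu_HE ratio for element Z_axis[i]
# kappa_LE[i] ~ mass attenuation coefficient (cm^2/g) at the low effective energy
Z_axis = np.arange(1, 31)
f_table = np.linspace(1.05, 3.2, 30)
kappa_LE = np.linspace(0.15, 0.60, 30)

def decompose_voxelwise(mu_LE, mu_HE, eps=1e-6):
    """Map two reconstructed attenuation volumes to (rho, Z_eff) volumes."""
    ratio = mu_LE / np.maximum(mu_HE, eps)       # observed F(Z) per voxel
    Z_eff = np.interp(ratio, f_table, Z_axis)    # invert the monotonic F(Z) table
    kappa = np.interp(Z_eff, Z_axis, kappa_LE)   # mass attenuation at Z_eff
    rho = mu_LE / kappa                          # mu = rho * kappa  =>  rho
    return rho, Z_eff
```

    The table inversion only works while F(Z) is monotonic, which is exactly the Z ≤ 30 restriction discussed above.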

  5. Light-emitting-diode Lambertian light sources as low-radiant-flux standards applicable to quantitative luminescence-intensity imaging

    Science.gov (United States)

    Yoshita, Masahiro; Kubota, Hidehiro; Shimogawara, Masahiro; Mori, Kaneo; Ohmiya, Yoshihiro; Akiyama, Hidefumi

    2017-09-01

    Planar-type Lambertian light-emitting diodes (LEDs) with a circular aperture of several tens of μm to a few mm in diameter were developed for use as radiant-flux standard light sources, which have been in strong demand for applications such as quantitative or absolute intensity measurements of weak luminescence from solid-state materials and devices. Via pulse-width modulation, time-averaged emission intensity of the LED devices was controlled linearly to cover a wide dynamic range of about nine orders of magnitude, from 10 μW down to 10 fW. The developed planar LED devices were applied as the radiant-flux standards to quantitative measurements and analyses of photoluminescence (PL) intensity and PL quantum efficiency of a GaAs quantum-well sample. The results demonstrated the utility and applicability of the LED standards in quantitative luminescence-intensity measurements in Lambertian-type low radiant-flux level sources.
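
    The nine-decade dynamic range rests on a simple piece of pulse-width-modulation arithmetic: the time-averaged radiant flux is the on-state (peak) flux multiplied by the duty cycle. A minimal sketch follows; the 10 μW peak flux is taken from the range quoted above and should be treated as illustrative.

```python
def duty_cycle_for_target(target_flux_w, peak_flux_w=10e-6):
    """Duty cycle giving a desired time-averaged radiant flux.

    Assumes the LED emits a constant peak flux (here 10 uW) while on and
    nothing while off, so <flux> = peak_flux * duty_cycle.
    """
    return target_flux_w / peak_flux_w

# Spanning roughly nine orders of magnitude, 10 uW down to 10 fW:
for target in (10e-6, 10e-9, 10e-12, 10e-15):
    print(f"{target:.0e} W -> duty cycle {duty_cycle_for_target(target):.0e}")
```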

  6. On detector linearity and precision of beam shift detection for quantitative differential phase contrast applications

    Energy Technology Data Exchange (ETDEWEB)

    Zweck, Josef, E-mail: josef.zweck@ur.de; Schwarzhuber, Felix; Wild, Johannes; Galioit, Vincent

    2016-09-15

    Differential phase contrast is a STEM imaging mode where minute sideways deflections of the electron probe are monitored, usually by using a position sensitive device (Chapman, 1984 [1]; Lohr et al., 2012 [2]) or, alternatively in some cases, a fast camera (Müller et al., 2012 [3,4]; Yang et al., 2015 [5]; Pennycook et al., 2015 [6]) as a pixelated detector. While traditionally differential phase contrast electron microscopy was mainly focused on investigations of micro-magnetic domain structures and their specific features, such as domain wall widths, etc. (Chapman, 1984 [1]; Chapman et al., 1978, 1981, 1985 [7–9]; Sannomiya et al., 2004 [10]), its usage has recently been extended to mesoscopic (Lohr et al., 2012, 2016 [2,12]; Bauer et al., 2014 [11]; Shibata et al., 2015 [13]) and nano-scale electric fields (Shibata et al., 2012 [14]; Mueller et al., 2014 [15]). In this paper, the various interactions which can cause a beam deflection are reviewed and expanded by two so far undiscussed mechanisms which may be important for biological applications. As differential phase contrast microscopy strongly depends on the ability to detect minute beam deflections we first treat the linearity problem for an annular four quadrant detector and then determine the factors which limit the minimum measurable deflection angle, such as S/N ratio, current density, dwell time and detector geometry. Knowing these factors enables the experimenter to optimize the set-up for optimum performance of the microscope and to get a clear figure for the achievable field resolution error margins. - Highlights: • Detector linearity range determined. • Quantitative treatment of measurement precision for differential phase contrast. • Optimization strategy for detector geometry. • Possible application of differential phase contrast in biology.
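
    For the annular four-quadrant detector discussed above, a small beam deflection is commonly estimated with a difference-over-sum expression; a minimal sketch is given below. The quadrant labelling and the calibration factors k_x, k_y are assumptions (they must be measured for the actual detector geometry), and the estimate is only meaningful inside the linear range treated in the paper.

```python
import numpy as np

def beam_shift(A, B, C, D, k_x=1.0, k_y=1.0):
    """Difference-over-sum estimate of a small probe deflection.

    A, B are the upper quadrant signals (left, right) and C, D the lower
    ones (left, right); k_x, k_y are assumed calibration factors converting
    the normalised signal difference into a deflection angle.
    """
    total = A + B + C + D
    shift_x = k_x * ((B + D) - (A + C)) / total   # left-right asymmetry
    shift_y = k_y * ((A + B) - (C + D)) / total   # top-bottom asymmetry
    return shift_x, shift_y
```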

  7. Clinical Application of Quantitative Foetal Fibronectin for the Prediction of Preterm Birth in Symptomatic Women.

    Science.gov (United States)

    Radford, Samara K; Da Silva Costa, Fabricio; Araujo Júnior, Edward; Sheehan, Penelope M

    2017-11-29

    To evaluate the clinical application of the new Hologic quantitative foetal fibronectin (qfFN) bedside test for the prediction of spontaneous preterm birth (sPTB) in patients with symptoms suggestive of spontaneous threatened preterm labour (sPTL). A prospective observational study with 154 pregnant women presenting signs and symptoms of sPTL was conducted. These women underwent a qfFN test between 22 and 35 weeks of gestation. The ability of each cut-off threshold to predict sPTB within 14 days of the test was assessed; results above 200 ng/mL produced a 50.0% PPV, and qfFN thus enhanced discrimination between high- and low-risk patients. The overall rate of sPTB (<37 weeks) was 13.3% (16/120), which increased progressively with increasing levels of fFN, with rates of 9.8% (8/81), 11.5% (3/26), 14.2% (1/7) and 50% (3/6) within the 4 categories (fFN 0-9, 10-49, 50-200 and 200+ ng/mL) respectively. The use of qfFN testing in symptomatic patients allowed for more accurate identification of women at risk of sPTB and thus more directed management. © 2017 S. Karger AG, Basel.

  8. Application of quantitative structure-property relationship analysis to estimate the vapor pressure of pesticides.

    Science.gov (United States)

    Goodarzi, Mohammad; Coelho, Leandro dos Santos; Honarparvar, Bahareh; Ortiz, Erlinda V; Duchowicz, Pablo R

    2016-06-01

    The application of molecular descriptors in describing Quantitative Structure Property Relationships (QSPR) for the estimation of vapor pressure (VP) of pesticides is of ongoing interest. In this study, QSPR models were developed using multiple linear regression (MLR) methods to predict the vapor pressure values of 162 pesticides. Several feature selection methods, namely the replacement method (RM), genetic algorithms (GA), stepwise regression (SR) and forward selection (FS), were used to select the most relevant molecular descriptors from a pool of variables. The optimum subset of molecular descriptors was used to build a QSPR model to estimate the vapor pressures of the selected pesticides. The replacement method improved the prediction of vapor pressures and was more reliable for feature selection on these pesticides. The resulting MLR models had satisfactory predictive ability and will be important for predicting vapor pressure values of compounds with unknown values. This study may open new opportunities for designing and developing new pesticides. Copyright © 2016 Elsevier Inc. All rights reserved.
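
    As an illustration of the descriptor-selection step, here is a minimal greedy forward-selection loop feeding an ordinary-least-squares MLR model. It is a generic sketch: the replacement method, GA and stepwise variants used in the study are not reproduced, and the training R² criterion shown is only a stand-in for whatever objective the authors optimised.

```python
import numpy as np

def fit_r2(X, y):
    """Ordinary least-squares MLR fit; returns the training R^2."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - resid.var() / y.var()

def forward_selection(X, y, n_keep=5):
    """Greedily add the descriptor column that most improves R^2."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < n_keep:
        r2_best, j_best = max((fit_r2(X[:, selected + [j]], y), j) for j in remaining)
        selected.append(j_best)
        remaining.remove(j_best)
    return selected
```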

  9. Quantitative surface electromyography (qEMG): applications in anaesthesiology and critical care.

    Science.gov (United States)

    Paloheimo, M

    1990-01-01

    During general anaesthesia and in lowered vigilance states such as after major trauma and during heavy sedation or analgesic medication, patients' ability to communicate with their surroundings is limited. Subjective intuitional interpretation may be the only means to ascertain a patient's emotional state, mood, and pain perception. Electromyographic detection and quantification of minimal and covert facial mimic muscle activity in anaesthesiology and critical care was an interesting concept worth further evaluation. In this study, the behaviour of quantitative surface-detected electromyographic activity (qEMG) was investigated during common anaesthetic events, post-operatively, and in volunteers as well as in experimental animals. A review of the methodology includes the necessary details for reproduction of the studies, including computerized processing of numerical data available in the commercial equipment. Results from the monitoring of 218 patients, seven volunteers and 31 rats are discussed. Conclusions are based on 32 testable null-hypotheses, the earlier documented literature and the author's own experience. The qEMG signal was derived from two electrodes placed on the frontal area and on the mastoid process behind the ipsilateral ear. After amplification, the signal was filtered to obtain a portion containing electrical activity between 60-300 Hz, which was considered to represent electromyographic activity. The signals were thereafter full-wave rectified and averaged with a 1-s time constant. The output of the processing unit consisted of a graphics display and a numeric computer output. A variety of clinical conditions and drug effects were studied in order to evaluate the method's applicability in research and in routine anaesthetic practice. The facial muscles turned out to be less sensitive to the effects of neuromuscular blocking drugs than the hand muscles, the normal monitoring site of neuromuscular transmission. Although muscle relaxants had a
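
    The signal chain described (60-300 Hz band-pass filtering, full-wave rectification, averaging with a 1-s time constant) can be sketched as follows; the sampling rate and the 4th-order Butterworth filter are assumptions not stated in the abstract.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def qemg_envelope(raw, fs=1000.0, band=(60.0, 300.0), tau=1.0):
    """Band-pass filter, rectify and average a frontal surface-EMG trace."""
    b, a = butter(4, band, btype="bandpass", fs=fs)   # assumed filter order
    emg = filtfilt(b, a, np.asarray(raw, dtype=float))
    rectified = np.abs(emg)                           # full-wave rectification
    alpha = 1.0 / (tau * fs)                          # first-order averager, tau = 1 s
    env = np.empty_like(rectified)
    acc = rectified[0]
    for i, x in enumerate(rectified):
        acc += alpha * (x - acc)
        env[i] = acc
    return env
```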

  10. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    , the abort triggers must have low false negative rates to be sure that real crew-threatening failures are detected, and also low false positive rates to ensure that the crew does not abort from non-crew-threatening launch vehicle behaviors. The analysis process described in this paper is a compilation of over six years of lessons learned and refinements from experiences developing abort triggers for NASA's Constellation Program (Ares I Project) and the SLS Program, as well as the simultaneous development of SHM/FM theory. The paper will describe the abort analysis concepts and process, developed in conjunction with SLS Safety and Mission Assurance (S&MA) to define a common set of mission phase, failure scenario, and Loss of Mission Environment (LOME) combinations upon which the SLS Loss of Mission (LOM) Probabilistic Risk Assessment (PRA) models are built. This abort analysis also requires strong coordination with the Multi-Purpose Crew Vehicle (MPCV) and SLS Structures and Environments (STE) to formulate a series of abortability tables that encapsulate explosion dynamics over the ascent mission phase. The design and assessment of abort conditions and triggers to estimate their Loss of Crew (LOC) Benefits also requires in-depth integration with other groups, including Avionics, Guidance, Navigation and Control(GN&C), the Crew Office, Mission Operations, and Ground Systems. The outputs of this analysis are a critical input to SLS S&MA's LOC PRA models. The process described here may well be the first full quantitative application of SHM/FM theory to the selection of a sensor suite for any aerospace system.

  11. Quantitative imaging of magnetic nanoparticles by magneto-relaxometric tomography for biomedical applications; Quantitative Bildgebung magnetischer Nanopartikel mittels magnetrelaxometrischer Tomographie fuer biomedizinische Anwendungen

    Energy Technology Data Exchange (ETDEWEB)

    Liebl, Maik

    2016-11-18

    Current biomedical research focuses on the development of novel biomedical applications based on magnetic nanoparticles (MNPs), e.g. for local cancer treatment. These therapy approaches employ MNPs as remotely controlled drug carriers or local heat generators. Since location and quantity of MNPs determine drug enrichment and heat production, quantitative knowledge of the MNP distribution inside a body is essential for the development and success of these therapies. Magnetorelaxometry (MRX) can provide such quantitative information based on the specific response of the MNPs after switching off an applied magnetic field. Applying a uniform (homogeneous) magnetic field to a MNP distribution and measuring the MNP response with multiple sensors at different locations allows for spatially resolved MNP quantification. However, to reconstruct the MNP distribution from this spatially resolved MRX data, an ill-posed inverse problem has to be solved. So far, the solution of this problem was stabilized by incorporating a priori knowledge in the forward model, e.g. by setting priors on the vertical position of the distribution using a 2D reconstruction grid or setting priors on the number and geometry of the MNP sources inside the body. MRX tomography represents a novel approach for quantitative 3D imaging of MNPs, where the inverse solution is stabilized by a series of MRX measurements. In MRX tomography, only parts of the MNP distribution are sequentially magnetized by the use of inhomogeneous magnetic fields. Each magnetization step is followed by detection of the response of the corresponding part of the distribution by multiple sensors. The 3D reconstruction of the MNP distribution is then accomplished by a joint evaluation of the distinct MRX measurement series. In this thesis the first experimental setup for MRX tomography was developed for quantitative 3D imaging of biomedical MNP distributions. It is based on a multi-channel magnetizing unit which has been engineered to
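
    Because each MRX measurement is linear in the local MNP amount, stacking the measurement series from the sequentially applied inhomogeneous magnetizing fields yields one joint linear system; a minimal Tikhonov-regularized sketch is shown below. The forward ("lead-field") matrices are assumed to be pre-computed from the coil and sensor geometry; this is not the thesis' actual reconstruction code.

```python
import numpy as np

def reconstruct_mnp(lead_fields, measurements, lam=1e-3):
    """Regularised least-squares estimate of the MNP amount per voxel.

    lead_fields  : list of (n_sensors x n_voxels) forward matrices, one per
                   magnetising coil pattern (assumed pre-computed).
    measurements : list of matching MRX amplitude vectors, one per pattern.
    lam          : Tikhonov weight stabilising the ill-posed inversion.
    """
    A = np.vstack(lead_fields)
    b = np.concatenate(measurements)
    n = A.shape[1]
    c = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
    return np.clip(c, 0.0, None)   # MNP content cannot be negative
```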

  12. Preparation of reversible colorimetric temperature nanosensors and their application in quantitative two-dimensional thermo-imaging.

    Science.gov (United States)

    Wang, Xu-dong; Song, Xin-hong; He, Chun-yan; Yang, Chaoyong James; Chen, Guonan; Chen, Xi

    2011-04-01

    Reversible colorimetric temperature nanosensors were prepared using a very simple precipitation method to encapsulate two color luminescent dyes. These nanosensors exhibited a clearly reversible temperature response and enabled both rapid colorimetric temperature estimation by eye and quantitative two-dimensional thermo-imaging. Heat-exchange-induced fluid motion was, for the first time, rapidly, precisely, and quantitatively imaged simply by taking color pictures, with good temporal and spatial resolution for studying heat-driven hydrodynamics. These nanosensors should find wide application in micro/nanoscale research and can also be fabricated into films for macroscopic studies.

  13. Quantitative Morphometric Analysis of Terrestrial Glacial Valleys and the Application to Mars

    Science.gov (United States)

    Allred, Kory

    Although the current climate on Mars is very cold and dry, it is generally accepted that past environments on the planet were very different. Paleo-environments may have been warm and wet with oceans and rivers, and there is abundant evidence of water ice and glaciers on the surface as well. However, much of that evidence comes from visual interpretation of imagery and other remote sensing data. For example, some of the characteristics that have been used to distinguish glacial forms are the presence of landscape features that appear similar to terrestrial glacial landforms, constraining surrounding topography, evidence of flow, orientation, elevation and valley shape. The main purpose of this dissertation is to develop a model that uses quantitative variables extracted from elevation data to accurately categorize a valley basin as either glacial or non-glacial. The application of this model will limit the inherent subjectivity of image analysis by human interpretation. The model uses hypsometric attributes (the elevation-area relationship), a newly defined variable similar to the equilibrium line altitude of an alpine glacier, and two neighborhood search functions intended to describe the valley cross-sectional curvature, all based on a digital elevation model (DEM) of a region. The classification model uses data-mining techniques trained on several terrestrial mountain ranges in varied geologic and geographic settings. It was applied to a select set of previously catalogued locations on Mars that resemble terrestrial glaciers. The results suggest that the landforms do have a glacial origin, thus supporting much of the previous research that has identified glacial landforms. This implies that the paleo-environment of Mars was at least episodically cold and wet, probably during a period of increased planetary obliquity. Furthermore, the results of this research and the implications thereof add to the body of knowledge for the current and past
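
    One of the hypsometric attributes that can be extracted from a basin DEM is the hypsometric integral (elevation-relief ratio); a minimal sketch is given below. The dissertation's classifier combines this with the equilibrium-line-like variable and the cross-sectional curvature measures, none of which are reproduced here.

```python
import numpy as np

def hypsometric_integral(dem):
    """Elevation-relief ratio of one valley basin.

    dem : 2-D array of elevations; cells outside the basin are NaN.
    Returns (mean - min) / (max - min), one of several attributes a
    basin classifier could use.
    """
    z = dem[np.isfinite(dem)]
    return (z.mean() - z.min()) / (z.max() - z.min())
```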

  14. New journal selection for quantitative survey of infectious disease research: application for Asian trend analysis

    Science.gov (United States)

    2009-01-01

    Background Quantitative survey of research articles, as an application of bibliometrics, is an effective tool for grasping overall trends in various medical research fields. This type of survey has been also applied to infectious disease research; however, previous studies were insufficient as they underestimated articles published in non-English or regional journals. Methods Using a combination of Scopus™ and PubMed, the databases of scientific literature, and English and non-English keywords directly linked to infectious disease control, we identified international and regional infectious disease journals. In order to ascertain whether the newly selected journals were appropriate to survey a wide range of research articles, we compared the number of original articles and reviews registered in the selected journals to those in the 'Infectious Disease Category' of the Science Citation Index Expanded™ (SCI Infectious Disease Category) during 1998-2006. Subsequently, we applied the newly selected journals to survey the number of original articles and reviews originating from 11 Asian countries during the same period. Results One hundred journals, written in English or 7 non-English languages, were newly selected as infectious disease journals. The journals published 14,156 original articles and reviews of Asian origin and 118,158 throughout the world, more than those registered in the SCI Infectious Disease Category (4,621 of Asian origin and 66,518 of the world in the category). In Asian trend analysis of the 100 journals, Japan had the highest percentage of original articles and reviews in the area, and no noticeable increase in articles was revealed during the study period. China, India and Taiwan had relatively large numbers and a high increase rate of original articles among Asian countries. When adjusting the publication of original articles according to the country population and the gross domestic product (GDP), Singapore and Taiwan were the most

  15. Application of a Bayesian dominance model improves power in quantitative trait genome-wide association analysis.

    Science.gov (United States)

    Bennewitz, Jörn; Edel, Christian; Fries, Ruedi; Meuwissen, Theo H E; Wellmann, Robin

    2017-01-14

    Multi-marker methods, which fit all markers simultaneously, were originally tailored for genomic selection purposes, but have proven to be useful also in association analyses, especially the so-called BayesC Bayesian methods. In a recent study, BayesD extended BayesC towards accounting for dominance effects and improved prediction accuracy and persistence in genomic selection. The current study investigated the power and precision of BayesC and BayesD in genome-wide association studies by means of stochastic simulations and applied these methods to a dairy cattle dataset. The simulation protocol was designed to mimic the genetic architecture of quantitative traits as realistically as possible. Special emphasis was put on the joint distribution of the additive and dominance effects of causative mutations. Additive marker effects were estimated by BayesC and additive and dominance effects by BayesD. The dependencies between additive and dominance effects were modelled in BayesD by choosing appropriate priors. A sliding-window approach was used. For each window, the R. Fernando window posterior probability of association was calculated and this was used for inference purpose. The power to map segregating causal effects and the mapping precision were assessed for various marker densities up to full sequence information and various window sizes. Power to map a QTL increased with higher marker densities and larger window sizes. This held true for both methods. Method BayesD had improved power compared to BayesC. The increase in power was between -2 and 8% for causative genes that explained more than 2.5% of the genetic variance. In addition, inspection of the estimates of genomic window dominance variance allowed for inference about the magnitude of dominance at significant associations, which remains hidden in BayesC analysis. Mapping precision was not substantially improved by BayesD. BayesD improved power, but precision only slightly. Application of BayesD needs large

  16. Advancing genetic theory and application by metabolic quantitative trait loci analysis.

    Science.gov (United States)

    Kliebenstein, Daniel J

    2009-06-01

    This review describes recent advances in the analysis of metabolism using quantitative genetics. It focuses on how recent metabolic quantitative trait loci (QTL) studies enhance our understanding of the genetic architecture underlying naturally variable phenotypes and the impact of this fundamental research on agriculture, specifically crop breeding. In particular, the role of whole-genome duplications in generating quantitative genetic variation within a species is highlighted and the potential uses of this phenomenon presented. Additionally, the review describes how new observations from metabolic QTL mapping analyses are helping to shape and expand the concepts of genetic epistasis.

  17. Application of quantitative XRD on the precipitation of struvite from Brine Water

    Science.gov (United States)

    Heraldy, E.; Rahmawati, F.; Heryanto; Putra, D. P.

    2017-02-01

    The present study was conducted to quantify the various phases formed during struvite precipitation from brine water as the magnesium source. The quantitative X-ray diffraction (QXRD) method was used to determine the crystalline phases and the amorphous content of the struvite samples. Samples with substantial phase content were subjected to quantitative analysis and calibrated against known phase composition information by Rietveld refinement of the powder XRD data. The results showed that brine water can be considered as a magnesium source for the formation of struvite products. The study demonstrated that, in general, a high N:P molar ratio (at both pH 9 and 10) can lead to significant formation of struvite.

  18. Clinical application of quantitative spect in patient specific dosimetry and beta cell quantification

    NARCIS (Netherlands)

    Woliner-van der Weg, Wietske

    2015-01-01

    This thesis demonstrates how, despite different challenges, quantitative analysis of single photon emission computed tomography (SPECT) images, facilitates in answering specific clinical questions that cannot (yet) be answered with another method. First, different technical and practical challenges

  19. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    OpenAIRE

    Esposito, Alessandro

    2006-01-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the invest...

  20. Quantitative biokinetics of titanium dioxide nanoparticles after oral application in rats: Part 2.

    Science.gov (United States)

    Kreyling, Wolfgang G; Holzwarth, Uwe; Schleh, Carsten; Kozempel, Ján; Wenk, Alexander; Haberl, Nadine; Hirn, Stephanie; Schäffler, Martin; Lipka, Jens; Semmler-Behnke, Manuela; Gibson, Neil

    2017-05-01

    The biokinetics of a size-selected fraction (70 nm median size) of commercially available and 48V-radiolabeled [48V]TiO2 nanoparticles has been investigated in female Wistar-Kyoto rats at retention timepoints 1 h, 4 h, 24 h and 7 days after oral application of a single dose of an aqueous [48V]TiO2-nanoparticle suspension by intra-esophageal instillation. A completely balanced quantitative body clearance and biokinetics in all organs and tissues was obtained by applying typical [48V]TiO2-nanoparticle doses in the range of 30-80 μg•kg-1 bodyweight, making use of the high sensitivity of the radiotracer technique. The [48V]TiO2-nanoparticle content was corrected for nanoparticles in the residual blood retained in organs and tissue after exsanguination and for 48V-ions not bound to TiO2-nanoparticles. Beyond predominant fecal excretion about 0.6% of the administered dose passed the gastro-intestinal-barrier after one hour and about 0.05% were still distributed in the body after 7 days, with quantifiable [48V]TiO2-nanoparticle organ concentrations present in liver (0.09 ng•g-1), lungs (0.10 ng•g-1), kidneys (0.29 ng•g-1), brain (0.36 ng•g-1), spleen (0.45 ng•g-1), uterus (0.55 ng•g-1) and skeleton (0.98 ng•g-1). Since chronic, oral uptake of TiO2 particles (including a nano-fraction) by consumers has continuously increased in the past decades, the possibility of chronic accumulation of such biopersistent nanoparticles in secondary organs and the skeleton raises questions about the responsiveness of their defense capacities, and whether these could be leading to adverse health effects in the population at large. After normalizing the fractions of retained [48V]TiO2-nanoparticles to the fraction that passed the gastro-intestinal-barrier and reached systemic circulation, the biokinetics was compared to the biokinetics determined after IV-injection (Part 1). Since the biokinetics patterns differ largely, IV-injection is not an

  1. [Application of multiplex quantitative fluorescent PCR with non-polymorphic loci in prenatal diagnosis].

    Science.gov (United States)

    Zhu, Xiang-Yu; Hu, Ya-Li; Wang, Ya-Ping; Zhu, Hai-Yan; Li, Jie; Zhu, Rui-Fang; Zhang, Ying; Wu, Xing; Yang, Ying

    2008-11-01

    To explore the feasibility of applying multiplex quantitative fluorescent PCR (QF-PCR) with non-polymorphic loci in the prenatal diagnosis of aneuploidies. From Mar 2006 to Nov 2007, a total of 63 samples were collected from the Department of Obstetrics and Gynecology, Affiliated Drum Tower Hospital of Medical College, Nanjing University, including 54 villous samples obtained for karyotyping because of spontaneous abortion, six second-trimester amniotic fluid samples and three third-trimester umbilical cord blood samples. Blood samples from 60 healthy adults (30 males and 30 females) were obtained at the same time as a control group. Non-polymorphic QF-PCR was performed on both the testing group and the control group for the detection of aneuploidies. The amelogenin gene (AMXY) was selected as an internal control, and the dosage quotient (DQ) of each locus was calculated according to the known formula. If DQ was between 0.7 and 1.3, the sample was considered normal; a DQ above 1.3 or below 0.7 was considered abnormal. Cell culture and karyotyping were carried out for every sample simultaneously. The results of non-polymorphic QF-PCR were checked against the karyotypes. (1) In the control group, all female samples presented only an AMX peak for the sex chromosomes while all males showed AMX and AMY amplified peaks. The AMY/AMX ratios were between 0.7-1.3, and SD was between 0.05-0.12. (2) Among 19 QF-PCR abnormal cases, 13 cases were confirmed by karyotyping. Of the six conflicting cases, one case of trisomy 18 shown by karyotyping was not completely detected by QF-PCR: one locus on chromosome 18 implied trisomy, while another appeared normal (DQ = 1.28). Four cases were detected by non-polymorphic QF-PCR as trisomies but showed a normal female karyotype because of maternal contamination during cell culture. One case karyotyped as '46, XY' did not present an AMY peak. Thirty-six out of 44 (82%) normal results implied by non-polymorphic QF-PCR were in accordance with
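
    The dosage-quotient rule quoted above ("normal" for 0.7 ≤ DQ ≤ 1.3) can be sketched as follows. The abstract only cites the DQ formula, so the definition used here (the sample's target/reference peak-area ratio normalized by the same ratio in normal controls) is a common convention, not necessarily the exact published one.

```python
def dosage_quotient(target_area, reference_area,
                    control_target_area, control_reference_area):
    """DQ: sample target/reference peak-area ratio normalised by the control ratio."""
    return (target_area / reference_area) / (control_target_area / control_reference_area)

def classify_dq(dq, low=0.7, high=1.3):
    """Apply the 0.7-1.3 rule quoted in the abstract."""
    if low <= dq <= high:
        return "normal (disomic)"
    return "suggestive of trisomy" if dq > high else "suggestive of monosomy"
```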

  2. Technical advances and clinical applications of quantitative myocardial blood flow imaging with cardiac MRI.

    Science.gov (United States)

    Heydari, Bobak; Kwong, Raymond Y; Jerosch-Herold, Michael

    2015-01-01

    The recent FAME 2 study highlights the importance of myocardial ischemia assessment, particularly in the post-COURAGE trial era of managing patients with stable coronary artery disease. Qualitative assessment of myocardial ischemia by stress cardiovascular magnetic resonance imaging (CMR) has gained widespread clinical acceptance and utility. Despite the high diagnostic and prognostic performance of qualitative stress CMR, the ability to quantitatively assess myocardial perfusion reserve and absolute myocardial blood flow remains an important and ambitious goal for non-invasive imagers. Quantitative perfusion by stress CMR remains a research technique that has yielded progressively more encouraging results in more recent years. The ability to safely, rapidly, and precisely procure quantitative myocardial perfusion data would provide clinicians with a powerful tool that may substantially alter clinical practice and improve downstream patient outcomes and the cost effectiveness of healthcare delivery. This may also provide a surrogate endpoint for clinical trials, reducing study population sizes and costs through increased power. This review will cover emerging quantitative CMR techniques for myocardial perfusion assessment by CMR, including novel methods, such as 3-dimensional quantitative myocardial perfusion, and some of the challenges that remain before more widespread clinical adoption of these techniques may take place. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Effects of Single and Combined Application of Organic and Biological Fertilizers on Quantitative and Qualitative Yield of Anisum (Pimpinella anisum

    Directory of Open Access Journals (Sweden)

    N Kamayestani

    2015-07-01

    In order to study the effects of single and combined applications of biofertilizer and organic fertilizers on quantitative and qualitative characteristics of anisum (Pimpinella anisum), an experiment was conducted based on a randomized complete block design with three replications and fifteen treatments at the Research Station, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2011. Treatments were: (1) mycorrhiza (Glomus intraradices), (2) mycorrhiza + cow manure, (3) mycorrhiza + vermicompost, (4) mycorrhiza + compost, (5) mycorrhiza + chemical fertilizer, (6) biosulfur (Thiobacillus sp.) + Bentonite, (7) biosulfur + chemical fertilizer, (8) biosulfur + cow manure, (9) biosulfur + vermicompost, (10) biosulfur + compost, (11) cow manure, (12) vermicompost, (13) chemical fertilizer (NPK), (14) compost and (15) control. The results showed that application of the fertilizer treatments had a significant effect on most characteristics of anisum. The highest number of seeds per umbellet (7.24) and economic yield (1263.4 kg/ha) were obtained from the biosulfur treatment. The highest dry matter yield (4504.1 kg/ha) resulted from the combined application of biosulfur + chemical fertilizer, and the highest harvest index (25.97%) was observed with biosulfur + cow manure. The combined application of mycorrhiza affected some qualitative traits: the highest number of umbels per plant (65.7), 1000-seed weight (3.24 g) and essential oil percentage (5.3%) resulted from the combined application of mycorrhiza + chemical fertilizer. In general, it can be concluded that application of organic and biological fertilizers, particularly mycorrhiza and biosulfur, had a significant effect on improving the quantitative and qualitative characteristics of anisum. Furthermore, the combined application of organic and biological fertilizers had greater positive effects than their single application.

  4. The Application of the Semi-quantitative Risk Assessment Method to Urban Natural Gas Pipelines

    Directory of Open Access Journals (Sweden)

    YongQiang BAI

    2013-07-01

    This paper provides a method of semi-quantitative risk assessment for urban gas pipelines, obtained by modifying the Kent analysis method. The factors influencing fault frequency and consequence for urban gas pipelines are analyzed, and the corresponding grading rules are studied. Grading rules for fault frequency and consequence for urban natural gas pipelines are provided. Using a semi-quantitative risk matrix, the risk grade of the urban gas pipelines is obtained and a preliminary risk ranking of the gas pipelines can be accomplished, so as to identify high-risk pipeline units.
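
    The way graded fault frequency and consequence combine into a semi-quantitative risk matrix can be sketched as below; the 5x5 grading and the labels in the matrix are illustrative, not the grading rules derived in the paper.

```python
# Illustrative 5x5 matrix: rows = frequency grade 1..5 (rare .. frequent),
# columns = consequence grade 1..5 (negligible .. catastrophic).
RISK_MATRIX = [
    ["low",    "low",    "low",       "medium",    "medium"],
    ["low",    "low",    "medium",    "medium",    "high"],
    ["low",    "medium", "medium",    "high",      "high"],
    ["medium", "medium", "high",      "high",      "very high"],
    ["medium", "high",   "high",      "very high", "very high"],
]

def risk_grade(frequency_grade, consequence_grade):
    """Qualitative risk grade of one pipeline unit (grades are 1-based)."""
    return RISK_MATRIX[frequency_grade - 1][consequence_grade - 1]

# Sorting units by their matrix grade gives the preliminary risk ranking
# mentioned above.
```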

  5. Functional neuroimaging and quantitative electroencephalography in adult traumatic head injury: clinical applications and interpretive cautions.

    Science.gov (United States)

    Ricker, J H; Zafonte, R D

    2000-04-01

    Functional neuroimaging and quantitative electroencephalographic procedures are being used increasingly in brain injury research and clinical care. These procedures are also seeing increased use in the context of forensic evaluations, particularly in cases of mild head trauma. This article provides an overview of the use of procedures such as positron emission tomography, single photon emission computed tomography, and quantitative electroencephalogram in adults. Also discussed are the clinical limitations of each procedure within the context of myriad interpretive confounds that can interfere with accurate differential diagnosis of mild head trauma.

  6. Theory of quantitative trend analysis and its application to the South African elections

    CSIR Research Space (South Africa)

    Greben, JM

    2006-02-28

    In this paper the author discusses a quantitative theory of trend analysis. Often trends are based on qualitative considerations and subjective assumptions. In the current approach the author makes use of extensive databases to optimise the so...

  7. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    NARCIS (Netherlands)

    Esposito, Alessandro

    2006-01-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These

  8. Quantitative Surface Analysis by Xps (X-Ray Photoelectron Spectroscopy: Application to Hydrotreating Catalysts

    Directory of Open Access Journals (Sweden)

    Beccat P.

    1999-07-01

    XPS is an ideal technique for providing the chemical composition of the extreme surface of solid materials, and it is widely applied to the study of catalysts. In this article, we show that a quantitative approach, based upon the fundamental expression of the XPS signal, has enabled us to obtain a consistent set of response factors for the elements of the periodic table. In-depth preliminary work was necessary to determine precisely the transmission function of the spectrometer used at IFP. The set of response factors obtained enables routine quantitative analysis with approximately 20% relative accuracy, which is quite acceptable for this type of analysis. Using this quantitative approach, we have developed an analytical method specific to hydrotreating catalysts that yields the degree of molybdenum sulphidation reliably and reproducibly. The use of this method is illustrated by two examples in which XPS spectroscopy provided information sufficiently accurate and quantitative to help explain the reactivity differences between certain MoS2/Al2O3 or NiMoS/Al2O3-type hydrotreating catalysts.
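
    The quantitative use of the response factors amounts to the standard atomic-concentration formula C_i = (I_i/S_i) / Σ_j (I_j/S_j); a small sketch follows, with peak areas and response factors given as illustrative numbers rather than values from the article.

```python
def atomic_concentrations(peak_areas, response_factors):
    """Relative atomic concentrations from XPS peak areas.

    peak_areas       : {"element/transition": measured peak area I_i}
    response_factors : {"element/transition": response factor S_i calibrated
                        for the spectrometer transmission function}
    """
    normalised = {k: peak_areas[k] / response_factors[k] for k in peak_areas}
    total = sum(normalised.values())
    return {k: v / total for k, v in normalised.items()}

# Illustrative call (numbers are not from the article):
# atomic_concentrations({"Mo 3d": 1200.0, "S 2p": 800.0, "Al 2p": 5000.0},
#                       {"Mo 3d": 9.5,   "S 2p": 1.7,   "Al 2p": 0.54})
```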

  9. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    Science.gov (United States)

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data, however, much of the data available or even acquirable are not quantitative. Data that is not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. APPLICATION OF TEXTURE ANALYSIS TECHNIQUES TO NMR IMAGES FOR QUANTITATIVE ASSESSMENT OF MUSCLE DISORDERS

    Directory of Open Access Journals (Sweden)

    E. V. Snezhko

    2014-01-01

    Results on the development of the basic concept of a medical image analysis methodology are presented. The techniques used in the research are based on adaptation and utilization of generalized image features for quantitative interpretation of muscles in nuclear magnetic resonance images.

  11. Semi-quantitative digital analysis of polymerase chain reaction-electrophoresis gel: Potential applications in low-income veterinary laboratories

    Directory of Open Access Journals (Sweden)

    John F. Antiabong

    2016-09-01

    Aim: The interpretation of conventional polymerase chain reaction (PCR) assay results is often limited to either positive or negative (non-detectable). The more robust quantitative PCR (qPCR) method is mostly reserved for quantitation studies and is not a readily accessible technology in laboratories across developing nations. The aim of this study was to evaluate a semi-quantitative method for conventional PCR amplicons using digital image analysis of electrophoretic gels. The potential applications are also discussed. Materials and Methods: This study describes standard conditions for the digital image analysis of PCR amplicons using the freely available ImageJ software, confirmed using the qPCR assay. Results and Conclusion: Comparison of the ImageJ analysis of PCR-electrophoresis gels and the qPCR method showed similar trends in the Fusobacterium necrophorum DNA concentrations associated with healthy and periodontal-disease-infected wallabies (p≤0.03). Based on these empirical data, this study adds descriptive attributes (“more” or “less”) to the interpretation of conventional PCR results. Potential applications in low-income veterinary laboratories are suggested, and guidelines for the adoption of the method are also highlighted.

  12. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    Science.gov (United States)

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of the computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with an accuracy of 93.4%. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
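
    At its core the change-rate map is a voxel-wise relative difference between co-registered baseline and follow-up SPECT volumes; a minimal sketch is given below. The registration, intensity normalization and index extraction that the CAA pipeline also performs are not reproduced.

```python
import numpy as np

def change_rate_map(baseline, follow_up, mask=None, eps=1e-6):
    """Voxel-wise rCBF change rate (%) between two co-registered volumes."""
    crm = 100.0 * (follow_up - baseline) / np.maximum(baseline, eps)
    if mask is not None:
        crm = np.where(mask, crm, 0.0)
    return crm

# Voxels whose change rate exceeds a chosen threshold (the simulations above
# report reliable recovery for changes above about 20%) can then be segmented
# as recovered regions and summarised into quantitative indexes.
```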

  13. Elastic and plastic properties of iron-aluminium alloys. Special problems raised by the brittleness of alloys of high aluminium content; Proprietes elastiques et plastiques des alliages fer-aluminium. Problemes particuliers poses par la fragilite des alliages a forte teneur en aluminium

    Energy Technology Data Exchange (ETDEWEB)

    Mouturat, P. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1966-06-01

    The present study presents the results obtained with iron-aluminium alloys whose compositions run from 0 to nearly 50 atomic per cent aluminium. The conditions of preparation and transformation were studied first, followed by the Young's modulus and the flow stress; the last chapter is devoted to a study of the Portevin-le-Chatelier effect in alloys with 40 atomic per cent aluminium. I) The principal difficulty to overcome was the intergranular brittleness of the ordered alloys; this brittleness was considerably reduced with appropriate conditions of preparation and transformation. II) The Young's modulus studies on the iron-aluminium alloys clearly reveal the transformation temperatures. The formation of covalent bonds at and above 25 atomic per cent aluminium gives the highest values of the modulus. III) The analysis of the variation of the flow stress with temperature shows a connection with the ordered structures, the existence of antiphase domains and the existence of superlattice dislocations. IV) In the ordered Fe Al domain, the kinetics of the Portevin-le-Chatelier effect could be explained by a vacancy-diffusion mechanism. The role of vacancies was clarified through the influence they exert on the dislocations; this led us to the inhomogeneous order of Rudman, which could explain the shape of the tensile curves. (author)

  14. Application of quantitative second-harmonic generation microscopy to posterior cruciate ligament for crimp analysis studies

    Science.gov (United States)

    Lee, Woowon; Rahman, Hafizur; Kersh, Mariana E.; Toussaint, Kimani C.

    2017-04-01

    We use second-harmonic generation (SHG) microscopy to quantitatively characterize collagen fiber crimping in the posterior cruciate ligament (PCL). The obtained SHG images are utilized to define three distinct categories of crimp organization in the PCL. Using our previously published spatial-frequency analysis, we develop a simple algorithm to quantitatively distinguish the various crimp patterns. In addition, SHG microscopy reveals both the three-dimensional structural variation in some PCL crimp patterns as well as an underlying helicity in these patterns that have mainly been observed using electron microscopy. Our work highlights how SHG microscopy could potentially be used to link the fibrous structural information in the PCL to its mechanical properties.

  15. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    Science.gov (United States)

    Esposito, Alessandro

    2006-05-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson's and Alzheimer's disease, respectively. The high inter-disciplinarity of this project required the merging of the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was conducted also with the support and the collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson's Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.

  16. Stewart's quantitative acid-base chemistry: applications in biology and medicine.

    Science.gov (United States)

    Fencl, V; Leith, D E

    1993-01-01

    We review P.A. Stewart's quantitative approach to acid-base chemistry, starting with its historical context. We outline its implications for cellular and membrane processes in acid-base physiology; discuss its contributions to the understanding and analysis of acid-base phenomena; show how it can be applied in clinical problems; and propose a classification of clinical acid-base disturbances based on this general approach.

  17. From genetical genomics to systems genetics: potential applications in quantitative genomics and animal breeding.

    Science.gov (United States)

    Kadarmideen, Haja N; von Rohr, Peter; Janss, Luc L G

    2006-06-01

    This article reviews methods of integration of transcriptomics (and equally proteomics and metabolomics), genetics, and genomics in the form of systems genetics into existing genome analyses and their potential use in animal breeding and quantitative genomic modeling of complex traits. Genetical genomics or the expression quantitative trait loci (eQTL) mapping method and key findings in this research are reviewed. Various procedures and potential uses of eQTL mapping, global linkage clustering, and systems genetics are illustrated using actual analysis on recombinant inbred lines of mice with data on gene expression (for diabetes- and obesity-related genes), pathway, and single nucleotide polymorphism (SNP) linkage maps. Experimental and bioinformatics difficulties and possible solutions are discussed. The main uses of this systems genetics approach in quantitative genomics were shown to be in refinement of the identified QTL, candidate gene and SNP discovery, understanding gene-environment and gene-gene interactions, detection of candidate regulator genes/eQTL, discriminating multiple QTL/eQTL, and detection of pleiotropic QTL/eQTL, in addition to its use in reconstructing regulatory networks. The potential uses in animal breeding are direct selection on heritable gene expression measures, termed "expression assisted selection," and genetical genomic selection of both QTL and eQTL based on breeding values of the respective genes, termed "expression-assisted evaluation."

  18. Overview of quantitative measurement methods. Equivalence, invariance, and differential item functioning in health applications.

    Science.gov (United States)

    Teresi, Jeanne A

    2006-11-01

    Reviewed in this article are issues relating to the study of invariance and differential item functioning (DIF). The aim of factor analyses and DIF, in the context of invariance testing, is the examination of group differences in item response conditional on an estimate of disability. Discussed are parameters and statistics that are not invariant and cannot be compared validly in crosscultural studies with varying distributions of disability in contrast to those that can be compared (if the model assumptions are met) because they are produced by models such as linear and nonlinear regression. The purpose of this overview is to provide an integrated approach to the quantitative methods used in this special issue to examine measurement equivalence. The methods include classical test theory (CTT), factor analytic, and parametric and nonparametric approaches to DIF detection. Also included in the quantitative section is a discussion of item banking and computerized adaptive testing (CAT). Factorial invariance and the articles discussing this topic are introduced. A brief overview of the DIF methods presented in the quantitative section of the special issue is provided together with a discussion of ways in which DIF analyses and examination of invariance using factor models may be complementary. Although factor analytic and DIF detection methods share features, they provide unique information and can be viewed as complementary in informing about measurement equivalence.

  19. [Comparison between the application of a quantitative fecal occult blood testing instrument and the colloidal gold strip method in colorectal cancer screening].

    Science.gov (United States)

    Huang, Yan-qin; Zhang, Meng-wen; Shen, Yong-zhou; Ma, Hao-qing; Cai, Shan-rong; Zhang, Su-zhan; Zheng, Shu

    2013-08-01

    To compare the performance of a quantitative fecal occult blood testing instrument and the colloidal gold strip method in colorectal cancer screening. A representative random population of 9000 subjects aged between 40 and 74 years was selected from Xuxiang, Haining city, Zhejiang province, by the random cluster sampling method in 2011. The fecal samples from each subject were tested separately by the two methods, namely the quantitative fecal occult blood testing instrument and the colloidal gold strip method. A positive result was defined as a hemoglobin concentration (HGB) ≥ 100 ng/ml with the quantitative testing instrument, or color development with the colloidal gold strip method. Subjects positive by either method were offered a further colonoscopy examination for pathological diagnosis. The positive rate and consistency of the two methods were compared, as well as the positive predictive value and population detection rate of colorectal cancer and adenoma. A total of 6475 of the 9000 subjects (71.9%) submitted their two fecal samples as required. There were 319 positive cases (4.9%) with the quantitative testing instrument and 146 positive cases (2.3%) with the colloidal gold strip method, including 45 positive in both tests (Kappa = 0.168, 95%CI: 0.119-0.217). 184 out of the 319 positive cases (57.7%) in the test by the quantitative testing instrument and 89 out of 146 positive cases (61.0%) in the test by the colloidal gold strip method received the colonoscopy examination. There were no statistically significant differences between the two methods in the positive predictive values for colorectal cancer (P > 0.05), developing adenoma and non-developing adenoma. However, the population detection rates of colorectal cancer and developing adenoma were higher in the test by the quantitative testing instrument (26 cases, 0.402%) than in the test by the colloidal gold strip method (10 cases, 0
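
    The agreement statistic quoted above (Kappa = 0.168) can be reproduced from the counts in the abstract; a small sketch follows, where the 2x2 cell counts are inferred from the stated totals (45 double positives, 319 and 146 single-test positives, 6475 subjects).

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table:
    a = positive in both tests, b = positive only in test 1,
    c = positive only in test 2, d = negative in both."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

# Cells inferred from the abstract's totals:
kappa = cohens_kappa(45, 319 - 45, 146 - 45, 6475 - 319 - 146 + 45)
print(round(kappa, 3))   # ~0.168, matching the reported value
```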

  20. The use of digital PCR to improve the application of quantitative molecular diagnostic methods for tuberculosis.

    Science.gov (United States)

    Devonshire, Alison S; O'Sullivan, Denise M; Honeyborne, Isobella; Jones, Gerwyn; Karczmarczyk, Maria; Pavšič, Jernej; Gutteridge, Alice; Milavec, Mojca; Mendoza, Pablo; Schimmel, Heinz; Van Heuverswyn, Fran; Gorton, Rebecca; Cirillo, Daniela Maria; Borroni, Emanuele; Harris, Kathryn; Barnard, Marinus; Heydenrych, Anthenette; Ndusilo, Norah; Wallis, Carole L; Pillay, Keshree; Barry, Thomas; Reddington, Kate; Richter, Elvira; Mozioğlu, Erkan; Akyürek, Sema; Yalçınkaya, Burhanettin; Akgoz, Muslum; Žel, Jana; Foy, Carole A; McHugh, Timothy D; Huggett, Jim F

    2016-08-03

    Real-time PCR (qPCR) based methods, such as the Xpert MTB/RIF, are increasingly being used to diagnose tuberculosis (TB). While qualitative methods are adequate for diagnosis, the therapeutic monitoring of TB patients requires quantitative methods currently performed using smear microscopy. The potential use of quantitative molecular measurements for therapeutic monitoring has been investigated but findings have been variable and inconclusive. The lack of an adequate reference method and reference materials is a barrier to understanding the source of such disagreement. Digital PCR (dPCR) offers the potential for an accurate method for quantification of specific DNA sequences in reference materials which can be used to evaluate quantitative molecular methods for TB treatment monitoring. To assess a novel approach for the development of quality assurance materials we used dPCR to quantify specific DNA sequences in a range of prototype reference materials and evaluated accuracy between different laboratories and instruments. The materials were then also used to evaluate the quantitative performance of qPCR and Xpert MTB/RIF in eight clinical testing laboratories. dPCR was found to provide results in good agreement with the other methods tested and to be highly reproducible between laboratories without calibration even when using different instruments. When the reference materials were analysed with qPCR and Xpert MTB/RIF by clinical laboratories, all laboratories were able to correctly rank the reference materials according to concentration, however there was a marked difference in the measured magnitude. TB is a disease where the quantification of the pathogen could lead to better patient management and qPCR methods offer the potential to rapidly perform such analysis. However, our findings suggest that when precisely characterised materials are used to evaluate qPCR methods, the measurement result variation is too high to determine whether molecular quantification

  1. Off-axis quantitative phase imaging processing using CUDA: toward real-time applications.

    Science.gov (United States)

    Pham, Hoa; Ding, Huafeng; Sobh, Nahil; Do, Minh; Patel, Sanjay; Popescu, Gabriel

    2011-07-01

    We demonstrate real time off-axis Quantitative Phase Imaging (QPI) using a phase reconstruction algorithm based on NVIDIA's CUDA programming model. The phase unwrapping component is based on Goldstein's algorithm. By mapping the process of extracting phase information and unwrapping to GPU, we are able to speed up the whole procedure by more than 18.8× with respect to CPU processing and ultimately achieve video rate for mega-pixel images. Our CUDA implementation also supports processing of multiple images simultaneously. This enables our imaging system to support high speed, high throughput, and real-time image acquisition and visualization.
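
    The core of such a pipeline is a per-frame Fourier-domain demodulation followed by phase unwrapping. The NumPy sketch below illustrates the generic off-axis steps (sideband cropping and wrapped-phase extraction) on the CPU; it is not the authors' CUDA implementation, the carrier offset and window size are assumed instrument-specific inputs, and Goldstein's unwrapping step is omitted.

```python
import numpy as np

def extract_wrapped_phase(interferogram, carrier_shift):
    """Recover the wrapped phase map from an off-axis interferogram.

    interferogram : 2D array of recorded intensity
    carrier_shift : (row, col) offset of the +1 diffraction order in the
                    Fourier plane (an assumed, instrument-specific value)
    """
    F = np.fft.fftshift(np.fft.fft2(interferogram))
    rows, cols = F.shape
    cy, cx = rows // 2, cols // 2

    # Crop a window around the +1 order (sideband) carrying the object field.
    dy, dx = carrier_shift
    half = min(rows, cols) // 8                      # window half-size (assumption)
    sideband = F[cy + dy - half:cy + dy + half,
                 cx + dx - half:cx + dx + half]

    # Re-centre the sideband and transform back to obtain the complex field.
    field = np.fft.ifft2(np.fft.ifftshift(sideband))

    # Wrapped phase in (-pi, pi]; Goldstein's algorithm (used in the paper)
    # would be applied next to unwrap it.
    return np.angle(field)
```

    In a GPU port, the FFTs and the per-pixel operations above are the pieces that would be mapped to CUDA kernels.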

  2. Real-time quantitative PCR: application to the quantification of GMOs

    Directory of Open Access Journals (Sweden)

    Alary Rémi

    2002-11-01

    Full Text Available Following the introduction of mandatory labelling, at a 1% threshold, of foods containing authorised GMOs, reliable quantification methods are required. To meet this requirement, real-time quantitative PCR currently appears to be the best-suited technique. Its principle, its advantages and its implementation for determining the GMO content of soy flours are presented. Simplex and duplex PCR are compared.
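
    As an illustration of the quantification step such assays rely on, the sketch below converts Ct values to copy numbers via standard curves and expresses GMO content as the transgene-to-reference-gene ratio; the curve parameters and Ct values are illustrative assumptions, not data from the article.

```python
import numpy as np

def copies_from_ct(ct, slope, intercept):
    """Convert a Ct value to a copy number via a standard curve
    Ct = slope * log10(copies) + intercept (slope and intercept are
    assumed to come from a dilution series of a certified reference)."""
    return 10 ** ((ct - intercept) / slope)

def gmo_content_percent(ct_transgene, ct_reference,
                        curve_transgene, curve_reference):
    """Estimate GMO content as the ratio of transgene copies to
    species-reference-gene copies, expressed in percent."""
    tg = copies_from_ct(ct_transgene, *curve_transgene)
    ref = copies_from_ct(ct_reference, *curve_reference)
    return 100.0 * tg / ref

# Illustrative numbers only (not from the article):
print(gmo_content_percent(30.1, 24.0,
                          curve_transgene=(-3.32, 38.0),
                          curve_reference=(-3.35, 37.5)))
```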

  3. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    Science.gov (United States)

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    Growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within the countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence clearly is highly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items or text produced in interviews or via open-ended questions have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods make it possible to respect the semantic space of notions in cultures and language groups, so that the resulting similarities or differences between cultures can be better understood and interpreted.

  4. The application of topology optimization on the quantitative description of the external shape of bone structure.

    Science.gov (United States)

    Xinghua, Zhu; He, Gong; Bingzhao, Gao

    2005-08-01

    The aim of this paper was to introduce the idea of topology optimization in engineering to the simulation of bone morphology. The external shape of bone structure was predicted with the quantitative bone functional adaptation theory. The high-order nonlinear equation of bone remodeling proposed by Zhu et al. (J. Biomech. 35(7)(2002)951), combined with the finite element method, was applied to a rectangular design domain, which occupies a larger space than the external shape of the bone structure. It was at this point that we imported the idea of topology optimization from engineering. The proximal femur was used here as an example, whose external shape and internal density distribution were simultaneously simulated quantitatively to validate that the external shape of bone structure could be successfully predicted in this way. Then the growth of the vertebral body from young to old was simulated numerically in its coronal section to discuss the significance of predicting the external shape. The study in this paper provides a computational basis for further studies on osteophyte formation, osteoporosis, osteoarthritis, bone growth and even bone evolution, etc.
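
    A minimal sketch of the general idea, under strong assumptions: a site-specific remodelling rule drives element densities on a rectangular design domain toward equilibrium with a mechanical stimulus, and elements resorbed to the minimum density are interpreted as lying outside the bone. This is a generic density-update loop, not the high-order nonlinear law of Zhu et al., and the stimulus field below stands in for a finite element solution that a full simulation would recompute at every iteration.

```python
import numpy as np

def remodel(stimulus, density, k_ref=0.02, rate=1.0,
            rho_min=0.01, rho_max=1.74, n_steps=50):
    """Generic site-specific remodelling update: density grows where the
    stimulus per unit mass exceeds a reference level and resorbs below it.

    stimulus : per-element strain-energy density (from an FE solve in a real
               implementation; held fixed here for brevity)
    density  : per-element apparent density over the rectangular design domain
    """
    for _ in range(n_steps):
        density = density + rate * (stimulus / density - k_ref)
        density = np.clip(density, rho_min, rho_max)
    return density

# Toy design domain: a synthetic stimulus field stands in for the FE solution.
ny, nx = 40, 60
y, x = np.mgrid[0:ny, 0:nx]
stimulus = 0.03 * np.exp(-((x - nx / 2) ** 2 + (y - ny / 2) ** 2) / 200.0)
rho = remodel(stimulus, np.full((ny, nx), 0.8))

# Elements driven to rho_min are read as "no bone": the boundary of the
# remaining material defines the predicted external shape.
print((rho > 0.05).sum(), "of", rho.size, "elements retain bone")
```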

  5. Application of microcomputed tomography for quantitative analysis of dental root canal obturations

    Directory of Open Access Journals (Sweden)

    Małgorzata Jaworska

    2014-03-01

    Full Text Available Introduction: The aim of the study was to apply microcomputed tomography to quantitative evaluation of voids and to test any specific location of voids in tooth's root canal obturations. Materials and Methods: Twenty root canals were prepared and obturated with gutta-percha and Tubli-Seal sealer using the thermoplastic compaction method (System B + Obtura II). Roots were scanned and three-dimensional visualization was obtained. The volume and Feret's diameter of I-voids (at the filling/dentine interface) and S-voids (surrounded by filling material) were measured. Results: The results revealed that none of the scanned root canal fillings were void-free. For I-voids, the volume fraction was significantly larger, but their number was lower (P = 0.0007) than for S-voids. Both types of voids occurred in characteristic regions (P < 0.001): I-voids occurred mainly in the apical third, while S-voids occurred mainly in the coronal third of the canal filling. Conclusions: Within the limitations of this study, our results indicate that microtomography, with the proposed semi-automatic algorithm, is a useful tool for three-dimensional quantitative evaluation of dental root canal fillings. In canals filled with thermoplastic gutta-percha and Tubli-Seal, voids at the interface between the filling and canal dentine deserve special attention due to their periapical location, which might promote apical microleakage. Further studies might help to elucidate the clinical relevance of these results.

  7. Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite

    Science.gov (United States)

    Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.

    2014-01-01

    SHM/FM theory has been successfully applied to the selection of the baseline set of Abort Triggers for the NASA SLS.
    - Quantitative assessment played a useful role in the decision process.
    - M&FM, which is new within NASA MSFC, required the most "new" work, as this quantitative analysis had never been done before: it required development of the methodology and tool to mechanize the process, and it established new relationships to the other groups.
    - The process is now an accepted part of the SLS design process, and will likely be applied to similar programs in the future at NASA MSFC.
    - Future improvements include improving technical accuracy (differentiating crew survivability due to an abort from survivability even when no immediate abort occurs, e.g. a small explosion with little debris; accounting for contingent dependence of secondary triggers on primary triggers; allocating the "Δ LOC Benefit" of each trigger when added to the previously selected triggers) and reducing future costs through the development of a specialized tool.
    - The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.

  8. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT

    Science.gov (United States)

    Isola, A. A.; Schmitt, H.; van Stevendaal, U.; Begemann, P. G.; Coulon, P.; Boussel, L.; Grass, M.

    2011-09-01

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.
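
    A sketch of what such an alignment step can look like, assuming the SimpleITK library (the paper describes its own elastic registration scheme, so this is a generic B-spline/mutual-information stand-in): each volume of the dynamic series is deformably registered to a chosen reference volume and resampled onto its grid.

```python
import SimpleITK as sitk

def register_to_reference(reference, volumes, mesh_size=(8, 8, 8)):
    """Deformably align each volume of a dynamic series to a reference volume.

    reference : sitk.Image used as the fixed image
    volumes   : list of sitk.Image, one per cardiac cycle
    """
    aligned = []
    for moving in volumes:
        tx = sitk.BSplineTransformInitializer(reference, list(mesh_size))
        reg = sitk.ImageRegistrationMethod()
        # Mutual information copes with the contrast changes over the series.
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5,
                                 numberOfIterations=100)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetInitialTransform(tx, inPlace=True)
        final_tx = reg.Execute(reference, moving)
        aligned.append(sitk.Resample(moving, reference, final_tx,
                                     sitk.sitkLinear, 0.0))
    return aligned
```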

  9. Understanding responder neurobiology in schizophrenia using a quantitative systems pharmacology model: application to iloperidone.

    Science.gov (United States)

    Geerts, Hugo; Roberts, Patrick; Spiros, Athan; Potkin, Steven

    2015-04-01

    The concept of targeted therapies remains a holy grail for the pharmaceutical drug industry for identifying responder populations or new drug targets. Here we provide quantitative systems pharmacology as an alternative to the more traditional approach of retrospective responder pharmacogenomics analysis and applied this to the case of iloperidone in schizophrenia. This approach implements the actual neurophysiological effect of genotypes in a computer-based biophysically realistic model of human neuronal circuits, is parameterized with human imaging and pathology, and is calibrated by clinical data. We keep the drug pharmacology constant, but allowed the biological model coupling values to fluctuate in a restricted range around their calibrated values, thereby simulating random genetic mutations and representing variability in patient response. Using hypothesis-free Design of Experiments methods the dopamine D4 R-AMPA (receptor-alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid) receptor coupling in cortical neurons was found to drive the beneficial effect of iloperidone, likely corresponding to the rs2513265 upstream of the GRIA4 gene identified in a traditional pharmacogenomics analysis. The serotonin 5-HT3 receptor-mediated effect on interneuron gamma-aminobutyric acid conductance was identified as the process that moderately drove the differentiation of iloperidone versus ziprasidone. This paper suggests that reverse-engineered quantitative systems pharmacology is a powerful alternative tool to characterize the underlying neurobiology of a responder population and possibly identifying new targets. © The Author(s) 2015.

  10. A Combinational Strategy of Model Disturbance and Outlier Comparison to Define Applicability Domain in Quantitative Structural Activity Relationship.

    Science.gov (United States)

    Yan, Jun; Zhu, Wei-Wei; Kong, Bo; Lu, Hong-Bing; Yun, Yong-Huan; Huang, Jian-Hua; Liang, Yi-Zeng

    2014-08-01

    In order to define an applicability domain for quantitative structure-activity relationship modeling, a combinational strategy of model disturbance and outlier comparison is developed. An indicator named the model disturbance index was defined to estimate the prediction error. Moreover, the information on outliers in the training set was used to filter unreliable samples in the test set based on "structural similarity". Chromatographic retention index data were used to investigate this approach. A relationship between the model disturbance index and the prediction error could be established. Also, the comparison between the outlier set and the test set could provide additional information about which unknown samples should receive more attention. A novel technique based on model population analysis was used to evaluate the validity of the applicability domain. Finally, three commonly used methods, i.e. leverage, descriptor range-based and model perturbation methods, were compared with the proposed approach. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
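
    For orientation, the leverage method mentioned above as one of the comparison approaches can be sketched in a few lines; this is the classical applicability-domain criterion, not the authors' model disturbance index, and the descriptor matrices below are random placeholders rather than the retention-index data set.

```python
import numpy as np

def leverage_ad(X_train, X_test):
    """Classical leverage-based applicability domain: a test compound is
    inside the domain if its leverage h stays below the conventional
    warning limit h* = 3 (p + 1) / n."""
    n, p = X_train.shape
    XtX_inv = np.linalg.pinv(X_train.T @ X_train)
    # diag(X_test @ XtX_inv @ X_test.T) computed row-wise
    h = np.einsum('ij,jk,ik->i', X_test, XtX_inv, X_test)
    h_star = 3.0 * (p + 1) / n
    return h, h_star, h <= h_star

# Illustrative descriptor matrices (random data, not from the article).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 5))
X_test = rng.normal(size=(10, 5))
h, h_star, inside = leverage_ad(X_train, X_test)
print(h_star, inside)
```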

  11. Development and Application of Quantitative Detection Method for Viral Hemorrhagic Septicemia Virus (VHSV) Genogroup IVa

    Science.gov (United States)

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-01-01

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R2 values of the primer set developed in this study were −0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID50) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID50, making it a very useful tool for VHSV diagnosis. PMID:24859343
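
    For context, the conventional relation between a standard-curve slope (Ct versus log10 template copies) and amplification efficiency is E = 10^(-1/slope) - 1; the sketch below fits an illustrative dilution series (values are not from the study) and reports the implied efficiency.

```python
import numpy as np

def qpcr_efficiency(slope):
    """Amplification efficiency implied by a standard curve of
    Ct versus log10(template copies): E = 10**(-1/slope) - 1."""
    return 10 ** (-1.0 / slope) - 1.0

def fit_standard_curve(log10_copies, ct_values):
    """Least-squares slope/intercept of the dilution series."""
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    return slope, intercept

# Illustrative dilution series (values are not from the article).
log10_copies = np.array([3, 4, 5, 6, 7], dtype=float)
ct = np.array([33.2, 29.8, 26.5, 23.1, 19.8])
slope, intercept = fit_standard_curve(log10_copies, ct)
print(slope, qpcr_efficiency(slope))   # slope near -3.35 -> ~99% efficiency
```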

  13. The application of high-speed cinematography for the quantitative analysis of equine locomotion.

    Science.gov (United States)

    Fredricson, I; Drevemo, S; Dalin, G; Hjertën, G; Björne, K

    1980-04-01

    Locomotor disorders constitute a serious problem in horse racing which will only be rectified by a better understanding of the causative factors associated with disturbances of gait. This study describes a system for the quantitative analysis of the locomotion of horses at speed. The method is based on high-speed cinematography with semi-automatic analysis of the films. The recordings are made with a 16 mm high-speed camera run at 500 frames per second (fps) and the films are analysed by special film-reading equipment and a mini-computer. The time and linear gait variables are presented in tabular form and the angles and trajectories of the joints and body segments are presented graphically.

  14. Quantitative analysis of 17O exchange and T1 relaxation data: application to zirconium tungstate.

    Science.gov (United States)

    Hodgkinson, Paul; Hampson, Matthew R

    2006-09-01

    The theoretical basis behind a recent quantitative analysis of 17O exchange in ZrW2O8 [M.R. Hampson, J.S.O. Evans, P. Hodgkinson, J. Am. Chem. Soc. 127 (2005) 15175-15181] is set out. Despite the complexities of combining the multi-exponential relaxation of half-integer quadrupolar nuclei with chemical exchange, it is shown how magnetisation transfer experiments can be analysed to obtain estimates of absolute exchange rates. The multi-exponential relaxation is best modelled using a magnetic mechanism, i.e. the rapid T1 relaxation observed, particularly at high temperatures, can be directly related to the relatively high degree of 17O labelling employed. The combination of the 1D EXSY results with T1 values as a function of temperature provides exchange rates and activation barriers over a wide temperature range (40-226 degrees C).
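
    The final step described above, extracting an activation barrier from exchange rates measured over a temperature range, amounts to an Arrhenius fit; the sketch below shows this with synthetic rates spanning roughly the study's 40-226 °C window (all numbers illustrative).

```python
import numpy as np

def arrhenius_fit(temps_celsius, rates):
    """Fit ln k = ln A - Ea/(R T) to exchange rates measured at several
    temperatures; returns the activation barrier Ea (kJ/mol) and the
    pre-exponential factor A."""
    R = 8.314  # J mol^-1 K^-1
    T = np.asarray(temps_celsius, dtype=float) + 273.15
    slope, intercept = np.polyfit(1.0 / T, np.log(rates), 1)
    Ea_kJ = -slope * R / 1000.0
    return Ea_kJ, np.exp(intercept)

# Synthetic exchange rates spanning roughly the 40-226 degC range (illustrative).
temps = [40, 80, 120, 160, 200, 226]
rates = [0.02, 0.3, 2.5, 15.0, 70.0, 160.0]   # s^-1
print(arrhenius_fit(temps, rates))
```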

  15. Application of Organosilane Monolayer Template to Quantitative Evaluation of Cancer Cell Adhesive Ability

    Science.gov (United States)

    Tanii, Takashi; Sasaki, Kosuke; Ichisawa, Kota; Demura, Takanori; Beppu, Yuichi; Vu, Hoan Anh; Thanh Chi, Hoan; Yamamoto, Hideaki; Sato, Yuko

    2011-06-01

    The adhesive ability of two human pancreatic cancer cell lines was evaluated using organosilane monolayer templates (OMTs). Using the OMT, the spreading area of adhered cells can be limited, and this enables us to focus on the initial attachment process of adhesion. Moreover, it becomes possible to arrange the cells in an array and to quantitatively evaluate the number of attached cells. The adhesive ability of the cancer cells cultured on the OMT was controlled by adding (-)-epigallocatechin-3-gallate (EGCG), which blocks a receptor that mediates cell adhesion and is overexpressed in cancer cells. Measurement of the relative ability of the cancer cells to attach to the OMT revealed that the ability for attachment decreased with increasing EGCG concentration. The results agreed well with the western blot analysis, indicating that the OMT can potentially be employed to evaluate the adhesive ability of various cancer cells.

  16. Ethanol determination in frozen fruit pulps: an application of quantitative nuclear magnetic resonance.

    Science.gov (United States)

    da Silva Nunes, Wilian; de Oliveira, Caroline Silva; Alcantara, Glaucia Braz

    2016-04-01

    This study reports the chemical composition of five types of industrial frozen fruit pulps (acerola, cashew, grape, passion fruit and pineapple fruit pulps) and compares them with homemade pulps at two different stages of ripening. The fruit pulps were characterized by analyzing their metabolic profiles and determining their ethanol content using quantitative Nuclear Magnetic Resonance (qNMR). In addition, principal component analysis (PCA) was applied to extract more information from the NMR data. We detected ethanol in all industrial and homemade pulps; and acetic acid in cashew, grape and passion fruit industrial and homemade pulps. The ethanol content in some industrial pulps is above the level recommended by regulatory agencies and is near the levels of some post-ripened homemade pulps. This study demonstrates that qNMR can be used to rapidly detect ethanol content in frozen fruit pulps and food derivatives. Copyright © 2015 John Wiley & Sons, Ltd.
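
    The quantification step in qNMR rests on signal areas scaling with concentration times the number of equivalent protons; the sketch below applies this relation with an internal standard, using illustrative integrals and concentrations rather than values from the article.

```python
def qnmr_concentration(i_analyte, n_analyte, i_standard, n_standard,
                       conc_standard_mM):
    """Concentration of an analyte from quantitative NMR with an internal
    standard: signal areas scale with (concentration x number of equivalent
    protons), so C_a = C_std * (I_a / I_std) * (N_std / N_a)."""
    return conc_standard_mM * (i_analyte / i_standard) * (n_standard / n_analyte)

# Illustrative numbers (not from the article): ethanol CH3 triplet (3 H)
# integrated against a 9 H reference singlet of known concentration.
print(qnmr_concentration(i_analyte=1.8, n_analyte=3,
                         i_standard=1.0, n_standard=9,
                         conc_standard_mM=0.5))   # -> 2.7 mM ethanol
```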

  17. Quantitative intracellular flux modeling and applications in biotherapeutic development and production using CHO cell cultures.

    Science.gov (United States)

    Huang, Zhuangrong; Lee, Dong-Yup; Yoon, Seongkyu

    2017-12-01

    Chinese hamster ovary (CHO) cells have been widely used for producing many recombinant therapeutic proteins. Constraint-based modeling, such as flux balance analysis (FBA) and metabolic flux analysis (MFA), has been developing rapidly for the quantification of intracellular metabolic flux distribution at a systematic level. Such methods would produce detailed maps of flows through metabolic networks, which contribute significantly to better understanding of metabolism in cells. Although these approaches have been extensively established in microbial systems, their application to mammalian cells is sparse. This review brings together the recent development of constraint-based models and their applications in CHO cells. The further development of constraint-based modeling approaches driven by multi-omics datasets is discussed, and a framework of potential modeling application in cell culture engineering is proposed. Improved cell culture system understanding will enable robust developments in cell line and bioprocess engineering thus accelerating consistent process quality control in biopharmaceutical manufacturing. © 2017 Wiley Periodicals, Inc.
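
    The core computation behind FBA is a linear program: maximise an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The toy three-reaction network below (not a CHO genome-scale model) shows the mechanics with scipy.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix (rows: metabolites A, B; columns: reactions):
#   R1: -> A        R2: A -> B        R3: B -> (biomass)
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]], dtype=float)

lb = [0.0, 0.0, 0.0]         # irreversible reactions
ub = [10.0, np.inf, np.inf]  # uptake of A limited to 10 units

# FBA: maximise the "biomass" flux v3 subject to steady state S v = 0.
c = [0.0, 0.0, -1.0]         # linprog minimises, so negate the objective
res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
              bounds=list(zip(lb, ub)), method="highs")
print(res.x)                 # optimal flux distribution, here [10, 10, 10]
```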

  18. Quantitative measures of corrosion and prevention: application to corrosion in agriculture

    NARCIS (Netherlands)

    Schouten, J.C.; Gellings, P.J.

    1987-01-01

    The corrosion protection factor (c.p.f.) and the corrosion condition (c.c.) are simple instruments for the study and evaluation of the contribution and efficiency of several methods of corrosion prevention and control. The application of c.p.f. and c.c. to corrosion and prevention in agriculture in

  19. Benchmarking PET for geoscientific applications: 3D quantitative diffusion coefficient determination in clay rock

    Science.gov (United States)

    Lippmann-Pipke, J.; Gerasch, R.; Schikora, J.; Kulenkampff, J.

    2017-04-01

    The 3D diagonal anisotropic effective diffusion coefficient of Na+, Deff = (Dxx, Dyy, Dzz), was quantified in a clay material in a single experiment/simulation. This is possible through the combination of non-invasive observation of Na+ diffusion in Opalinus clay by means of the GeoPET method (PET: positron emission tomography), followed by quantitative 3D+t data evaluation using finite element modelling (FEM). The extracted anisotropic effective diffusion coefficients parallel (||) and normal (⊥) to the bedding of the clay rock, Deff = (D||, D⊥, D||), are comparable to those obtained in earlier experimental studies on the same clay material with different methods. We consider this study a benchmark for the long-standing development of our GeoPET method, which explicitly includes a physics-based attenuation and Compton scatter correction algorithm (Kulenkampff, J., M. Gründig, A. Zakhnini and J. Lippmann-Pipke (2016). "Geoscientific process monitoring with positron emission tomography (GeoPET)." Solid Earth 7: 1217-1231). We suggest GeoPET-based visualization of fluid flow and transport, combined with computer-based process simulation, as a qualified approach for the quantification of three-dimensional effective transport parameters in the geosciences.
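
    A minimal forward model for the quantity being fitted, under simplifying assumptions (regular grid, periodic boundaries via np.roll, explicit time stepping rather than the FEM solver used in the study): diffusion with a diagonal anisotropic tensor D = (Dxx, Dyy, Dzz). In the actual workflow, such a forward model is run repeatedly and D is adjusted until the simulated tracer distribution matches the GeoPET frames; all numbers below are illustrative.

```python
import numpy as np

def diffuse(c, D, dx, dt, n_steps):
    """Explicit finite-difference forward model for diffusion with a
    diagonal anisotropic tensor D = (Dxx, Dyy, Dzz).

    c : 3D concentration (activity) field
    D : effective diffusion coefficients along x, y, z (m^2/s)
    """
    Dx, Dy, Dz = D
    # Explicit scheme is stable only if dt <= dx^2 / (2*(Dx+Dy+Dz)).
    assert dt <= dx**2 / (2.0 * (Dx + Dy + Dz)), "time step too large"
    for _ in range(n_steps):
        lap_x = np.roll(c, 1, 0) - 2*c + np.roll(c, -1, 0)
        lap_y = np.roll(c, 1, 1) - 2*c + np.roll(c, -1, 1)
        lap_z = np.roll(c, 1, 2) - 2*c + np.roll(c, -1, 2)
        c = c + dt / dx**2 * (Dx*lap_x + Dy*lap_y + Dz*lap_z)
    return c

# Point-like tracer spot spreading anisotropically (illustrative values).
c0 = np.zeros((32, 32, 32)); c0[16, 16, 16] = 1.0
c = diffuse(c0, D=(2e-11, 1e-11, 2e-11), dx=1e-3, dt=5e3, n_steps=100)
print(round(c.sum(), 6), c.max())   # mass is conserved while the peak spreads
```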

  20. Quantitative Clinical Chemistry Proteomics (qCCP) using mass spectrometry: general characteristics and application.

    Science.gov (United States)

    Lehmann, Sylvain; Hoofnagle, Andrew; Hochstrasser, Denis; Brede, Cato; Glueckmann, Matthias; Cocho, José A; Ceglarek, Uta; Lenz, Christof; Vialaret, Jérôme; Scherl, Alexander; Hirtz, Christophe

    2013-05-01

    Proteomics studies typically aim to exhaustively detect peptides/proteins in a given biological sample. Over the past decade, the number of publications using proteomics methodologies has exploded. This was made possible due to the availability of high-quality genomic data and many technological advances in the fields of microfluidics and mass spectrometry. Proteomics in biomedical research was initially used in 'functional' studies for the identification of proteins involved in pathophysiological processes, complexes and networks. Improved sensitivity of instrumentation facilitated the analysis of even more complex sample types, including human biological fluids. It is at that point the field of clinical proteomics was born, and its fundamental aim was the discovery and (ideally) validation of biomarkers for the diagnosis, prognosis, or therapeutic monitoring of disease. Eventually, it was recognized that the technologies used in clinical proteomics studies [particularly liquid chromatography-tandem mass spectrometry (LC-MS/MS)] could represent an alternative to classical immunochemical assays. Prior to deploying MS in the measurement of peptides/proteins in the clinical laboratory, it seems likely that traditional proteomics workflows and data management systems will need to adapt to the clinical environment and meet in vitro diagnostic (IVD) regulatory constraints. This defines a new field, as reviewed in this article, that we have termed quantitative Clinical Chemistry Proteomics (qCCP).

  1. Quantitative T2 mapping of white matter: applications for ageing and cognitive decline

    Science.gov (United States)

    Knight, Michael J.; McCann, Bryony; Tsivos, Demitra; Dillon, Serena; Coulthard, Elizabeth; Kauppinen, Risto A.

    2016-08-01

    In MRI, the coherence lifetime T2 is sensitive to the magnetic environment imposed by tissue microstructure and biochemistry in vivo. Here we explore the possibility that the use of T2 relaxometry may provide information complementary to that provided by diffusion tensor imaging (DTI) in ageing of healthy controls (HC), Alzheimer’s disease (AD) and mild cognitive impairment (MCI). T2 and diffusion MRI metrics were quantified in HC and patients with MCI and mild AD using multi-echo MRI and DTI. We used tract-based spatial statistics (TBSS) to evaluate quantitative MRI parameters in white matter (WM). A prolonged T2 in WM was associated with AD, and was able to distinguish AD from MCI, and AD from HC. Shorter WM T2 was associated with better cognition and younger age in general. In no case was a reduction in T2 associated with poorer cognition. We also applied principal component analysis, showing that WM volume changes independently of T2, MRI diffusion indices and cognitive performance indices. Our data add to the evidence that age-related and AD-related decline in cognition is in part attributable to WM tissue state, and much less to WM quantity. These observations suggest that WM is involved in AD pathology, and that T2 relaxometry is a potential imaging modality for detecting and characterising WM in cognitive decline and dementia.
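
    The T2 metric discussed above is typically obtained voxel-wise from the multi-echo data by an exponential fit; the sketch below shows a simple mono-exponential version with synthetic white-matter-like values (the study's exact fitting pipeline may differ).

```python
import numpy as np
from scipy.optimize import curve_fit

def monoexp(te, s0, t2):
    """Mono-exponential decay S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

def fit_t2(echo_times_ms, signals):
    """Voxel-wise T2 estimate from a multi-echo acquisition."""
    p0 = (signals[0], 80.0)                      # rough initial guess
    (s0, t2), _ = curve_fit(monoexp, echo_times_ms, signals, p0=p0)
    return t2

# Illustrative white-matter-like decay with noise (not study data).
te = np.arange(10, 330, 20, dtype=float)          # ms
rng = np.random.default_rng(1)
sig = monoexp(te, 1000.0, 75.0) + rng.normal(0, 5, te.size)
print(fit_t2(te, sig))                            # close to 75 ms
```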

  2. Quantitative Investigation of Roasting-magnetic Separation for Hematite Oolitic-ores: Mechanisms and Industrial Application

    Directory of Open Access Journals (Sweden)

    Peng Tiefeng

    2017-12-01

    Full Text Available Natural high-grade iron ore can be fed directly to the pyrometallurgical process; however, the availability of such ores has decreased steadily because of exploitation. This research reports a systematic approach using reduction roasting and magnetic separation for oolitic iron ores from west Hubei Province. Firstly, a mineralogical study was performed which showed that the oolitic particles were mainly composed of hematite, with some silicon-quartz inside the oolitic particles. Then, the roasting temperature was examined and shown to have a significant influence on both Fe recovery and the Fe content of the concentrate, with the Fe content gradually increasing as the temperature rose from 700 to 850 °C. The most important aspects are the quantitative investigation of the change of mineral phases and of the reduction area (with ratio) during the reduction roasting process. The results showed that Fe2O3 decreased with temperature, while Fe3O4 (magnetite) increased considerably from 600 to 800 °C. The reduction reaction was found to proceed from the outside in; the original oolitic structure and the embedding relationship among the minerals did not change after roasting. Finally, 5% surrounding rock was added to mimic real industrial iron beneficiation. This study could provide useful insight and practical support for the utilization of such iron ores.

  3. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques, including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, as well as parametric survival and accelerated failure time (AFT) models with log-normal, log-logistic and Weibull distributions, were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening as the proportion of missing data increases. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
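
    The key idea, treating intensities below the detection limit as left-censored rather than discarding them, can be illustrated with a log-normal likelihood in which censored points contribute the probability of falling below the limit. The sketch below is a simplified one-sample version of that idea, not the full AFT regression models of the paper, and uses simulated intensities.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def lognormal_censored_nll(params, log_intensity, censored, detection_limit):
    """Negative log-likelihood of a log-normal model for peak intensities
    with left-censoring at a detection limit."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    obs = ~censored
    nll = -norm.logpdf(log_intensity[obs], mu, sigma).sum()
    # Each censored observation contributes P(X <= detection limit).
    nll -= censored.sum() * norm.logcdf(np.log(detection_limit), mu, sigma)
    return nll

# Simulated peptide intensities; values below the limit are reported missing.
rng = np.random.default_rng(2)
x = rng.lognormal(mean=10.0, sigma=1.0, size=200)
limit = np.exp(9.0)
censored = x < limit
logx = np.where(censored, np.nan, np.log(x))
res = minimize(lognormal_censored_nll, x0=[9.0, 0.0],
               args=(logx, censored, limit), method="Nelder-Mead")
print(res.x)   # mu close to 10, log(sigma) close to 0
```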

  4. Quantitative assessment of the probability of bluetongue virus overwintering by horizontal transmission: application to Germany.

    Science.gov (United States)

    Napp, Sebastian; Gubbins, Simon; Calistri, Paolo; Allepuz, Alberto; Alba, Anna; García-Bocanegra, Ignacio; Giovannini, Armando; Casal, Jordi

    2011-01-11

    Even though bluetongue virus (BTV) transmission is apparently interrupted during winter, bluetongue outbreaks often reappear in the next season (overwintering). Several mechanisms for BTV overwintering have been proposed, but to date, their relative importance remains unclear. In order to assess the probability of BTV overwintering by persistence in adult vectors, ruminants (through prolonged viraemia) or a combination of both, a quantitative risk assessment model was developed. Furthermore, the model allowed the role played by the residual number of vectors present during winter to be examined, and the effect of a proportion of Culicoides living inside buildings (endophilic behaviour) to be explored. The model was then applied to a real scenario: overwintering in Germany between 2006 and 2007. The results showed that the limited number of vectors active during winter seemed to allow the transmission of BTV during this period, and that while transmission was favoured by the endophilic behaviour of some Culicoides, its effect was limited. Even though transmission was possible, the likelihood of BTV overwintering by the mechanisms studied seemed too low to explain the observed re-emergence of the disease. Therefore, other overwintering mechanisms not considered in the model are likely to have played a significant role in BTV overwintering in Germany between 2006 and 2007.

  5. Application of infrared lock-in thermography for the quantitative evaluation of bruises on pears

    Science.gov (United States)

    Kim, Ghiseok; Kim, Geon-Hee; Park, Jongmin; Kim, Dae-Yong; Cho, Byoung-Kwan

    2014-03-01

    An infrared lock-in thermography technique was adjusted for the detection of early bruises on pears. This mechanical damage is usually difficult to detect in the early stage after harvest using conventional visual sorting or CCD sensor-based image processing methods. We measured the thermal emission signals from pears using a highly sensitive mid-infrared thermal camera. These images were post-processed using a lock-in method that utilized the periodic thermal energy input to the pear. By applying the lock-in method to infrared thermography, the detection sensitivity and signal to noise ratio were enhanced because of the phase-sensitive narrow-band filtering effect. It was also found that the phase information of thermal emission from pears provides good metrics with which to identify quantitative information about both damage size and damage depth. Additionally, a photothermal model was implemented to investigate the behavior of thermal waves on pears under convective conditions. Theoretical results were compared to experimental results. These results suggested that the proposed lock-in thermography technique and resultant phase information can be used to detect mechanical damage to fruit, especially in the early stage of bruising.
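
    Lock-in processing itself reduces to demodulating each pixel's time series against sine/cosine references at the modulation frequency; the averaged in-phase and quadrature components give amplitude and phase images. The sketch below shows the generic scheme for a single pixel with illustrative sampling and modulation frequencies (not the study's acquisition parameters).

```python
import numpy as np

def lock_in(signal, fs, f_mod):
    """Digital lock-in demodulation: mix the signal with sine/cosine
    references at the modulation frequency and low-pass by averaging
    over an integer number of periods; returns amplitude and phase."""
    t = np.arange(signal.shape[-1]) / fs
    ref_i = np.sin(2 * np.pi * f_mod * t)
    ref_q = np.cos(2 * np.pi * f_mod * t)
    I = 2.0 * np.mean(signal * ref_i, axis=-1)
    Q = 2.0 * np.mean(signal * ref_q, axis=-1)
    return np.hypot(I, Q), np.arctan2(Q, I)

# One pixel's emission: a modulated component buried in offset and noise.
fs, f_mod = 50.0, 0.5                       # Hz (illustrative)
t = np.arange(0, 40, 1 / fs)                # 40 s = 20 full modulation periods
rng = np.random.default_rng(3)
pixel = 5.0 + 0.2 * np.sin(2 * np.pi * f_mod * t + 0.6) \
        + rng.normal(0, 0.05, t.size)
amp, ph = lock_in(pixel, fs, f_mod)
print(amp, ph)                              # close to 0.2 and 0.6 rad
```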

  6. Application of Western blot analysis for the diagnosis of Encephalitozoon cuniculi infection in rabbits: example of a quantitative approach.

    Science.gov (United States)

    Desoubeaux, Guillaume; Pantin, Ana; Peschke, Roman; Joachim, Anja; Cray, Carolyn

    2017-02-01

    Diagnosis of Encephalitozoon cuniculi infection in rabbits remains a major veterinary issue. ELISA or immunofluorescence assays are the current reference standards of serological tests. However, these conventional techniques suffer from a lack of accuracy for distinguishing active from past infections, as a positive serostatus is common in clinically normal rabbits. In this study, we assessed the diagnostic performance of Western blot (WB) to detect both anti-E. cuniculi immunoglobulin G (IgG) and immunoglobulin M (IgM) in comparison with ELISA and to address the intensity of the immune response through a quantitative approach. Positive WB results were highly correlated with the E. cuniculi-related diseased status (P < 0.0001). Although it was more labor intensive and less standardized, quantitative WB provided detailed comparable analysis regarding the humoral response and diagnostic performance similar to ELISA testing with statistically higher sensitivity (88.4 vs. 76.1% for IgG detection and 84.3 vs. 70.4% for IgM, P < 0.01). Several specific WB bands were shown to be significantly associated with concomitant clinical signs, like the one located at 50 kDa (OR = 8.2, [2.4-27.7], P = 0.0008) for IgG and (OR = 27.9, [4.2-187.9], P = 0.0006) for IgM. Therefore, the quantitative WB may have application in veterinary diagnostic laboratories to increase the accuracy of the clinical diagnosis of E. cuniculi infection. In addition, this tool may help to further understand the development and function of the humoral immune response to this infectious agent.

  7. Applications of an Automated and Quantitative CE-Based Size and Charge Western Blot for Therapeutic Proteins and Vaccines.

    Science.gov (United States)

    Rustandi, Richard R; Hamm, Melissa; Lancaster, Catherine; Loughney, John W

    2016-01-01

    Capillary Electrophoresis (CE) is a versatile and indispensable analytical tool that can be applied to characterize proteins. In recent years, labor-intensive SDS-PAGE and IEF slab gels have been replaced with CE-SDS (CGE) and CE-IEF methods, respectively, in the biopharmaceutical industry. These two CE-based methods are now an industry standard and are an expectation of the regulatory agencies for biologics characterization. Another important and traditional slab gel technique is the western blot, which detects proteins using immuno-specific reagents after SDS-PAGE separation. This technique is widely used across industrial and academic laboratories, but it is very laborious, manual, time-consuming, and only semi-quantitative. Here, we describe the applications of a relatively new CE-based western blot technology which is automated, fast, and quantitative. We have used this technology for both charge- and size-based CE westerns to analyze biotherapeutic and vaccine products. The size-based capillary western can be used for fast antibody screening, clone selection, product titer, identity, and degradation while the charge-based capillary western can be used to study product charge heterogeneity. Examples using this technology for monoclonal antibody (mAb), Enbrel, CRM197, and Clostridium difficile (C. difficile) vaccine proteins are presented here to demonstrate the utility of the capillary western techniques. Details of sample preparation and experimental conditions for each capillary western mode are described in this chapter.

  8. Novel Application of Fluorescence Lifetime and Fluorescence Microscopy Enables Quantitative Access to Subcellular Dynamics in Plant Cells

    Science.gov (United States)

    Elgass, Kirstin; Caesar, Katharina; Schleifenbaum, Frank; Stierhof, York-Dieter; Meixner, Alfred J.; Harter, Klaus

    2009-01-01

    Background Optical and spectroscopic technologies working at subcellular resolution with quantitative output are required for a deeper understanding of molecular processes and mechanisms in living cells. Such technologies are prerequisite for the realisation of predictive biology at cellular and subcellular level. However, although established in the physical sciences, these techniques are rarely applied to cell biology in the plant sciences. Principal Findings Here, we present a combined application of one-chromophore fluorescence lifetime microscopy and wavelength-selective fluorescence microscopy to analyse the function of a GFP fusion of the Brassinosteroid Insensitive 1 Receptor (BRI1-GFP) with high spatial and temporal resolution in living Arabidopsis cells in their tissue environment. We show a rapid, brassinolide-induced cell wall expansion and a fast BR-regulated change in the BRI1-GFP fluorescence lifetime in the plasmamembrane in vivo. Both cell wall expansion and changes in fluorescence lifetime reflect early BR-induced and BRI1-dependent physiological or signalling processes. Our experiments also show the potential of one-chromophore fluorescence lifetime microscopy for the in vivo monitoring of the biochemical and biophysical subcellular environment using GFP fusion proteins as probes. Significance One-chromophore fluorescence lifetime microscopy, combined with wavelength-specific fluorescence microscopy, opens up new frontiers for in vivo dynamic and quantitative analysis of cellular processes at high resolution which are not addressable by pure imaging technologies or transmission electron microscopy. PMID:19492078

  9. Application of in vitro skin penetration measurements to confirm and refine the quantitative skin sensitization risk assessment of methylisothiazolinone.

    Science.gov (United States)

    Rothe, Helga; Ryan, Cindy A; Page, Leanne; Vinall, Joanne; Goebel, Carsten; Scheffler, Heike; Toner, Frank; Roper, Clive; Kern, Petra S

    2017-12-01

    Use of quantitative risk assessment (QRA) for assessing the skin sensitization potential of chemicals present in consumer products requires an understanding of hazard and product exposure. In the absence of data, consumer exposure is based on relevant habits and practices and assumes 100% skin uptake of the applied dose. To confirm and refine the exposure, a novel design for in vitro skin exposure measurements was conducted with the preservative, methylisothiazolinone (MI), in beauty care (BC) and household care (HHC) products using realistic consumer exposure conditions. A difference between measured exposure levels (MELs) for MI in leave-on versus rinse-off BC products, and lower MELs for MI in HHC rinse-off compared to BC products was demonstrated. For repeated product applications, the measured exposure was lower than estimations based on summation of applied amounts. Compared to rinse-off products, leave-on applications resulted in higher MELs, correlating with the higher incidences of allergic contact dermatitis associated with those product types. Lower MELs for MI in rinse-off products indicate a lower likelihood to induce skin sensitization, also after multiple daily applications. These in vitro skin exposure measurements indicate conservatism of default exposure estimates applied in skin sensitization QRA and might be helpful in future risk assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Effect of Nitrogen and Zinc Foliar Application on Quantitative Traits of Tea Roselle (Hibiscus sabdariffa) in Jiroft Zone

    Directory of Open Access Journals (Sweden)

    abdolreza raisi sarbijan

    2017-02-01

    Full Text Available Introduction: Nitrogen is an essential element for plants and, in combination with elements such as carbon, oxygen, hydrogen and sulfur, yields even more valuable materials such as amino acids, nucleic acids and alkaloids. Hibiscus tea (Hibiscus sabdariffa), of the Malvaceae family, is known by different names in different parts of the world; in Iran it is called Maki tea, Mecca tea or red tea. As an important plant, it was decided to investigate its growth and development in Jiroft. Materials and Methods: The experiment was conducted as a factorial based on a randomized complete block design with three replications at the research farm of the Islamic Azad University of Jiroft during 2010. The first factor was nitrogen foliar application at four levels (0, 1, 2 and 3 percent) and the second factor was foliar application of zinc at two levels (0 and 1 percent). The measured quantitative characteristics were stem diameter, plant height, calycle fresh weight, calycle dry weight, plant fresh weight, plant dry weight, leaf fresh weight, leaf dry weight, mucilage percentage and mucilage yield. Results and Discussion: The results of ANOVA showed that nitrogen foliar application had a significant effect on leaf dry weight and on calycle fresh and dry weight. Plant fresh weight, plant dry weight, stem diameter, plant height, mucilage percentage and mucilage yield also showed significant effects. Zinc foliar application significantly affected leaf fresh weight, leaf dry weight, calycle fresh weight, plant fresh weight, plant dry weight, mucilage percentage and mucilage yield. The interaction effect of nitrogen and zinc on leaf dry weight, plant fresh weight and plant dry weight was also significant. The mean comparison of the studied characteristics revealed that by increasing the amount of nitrogen up to the N2 level, stem diameter, plant height, leaf dry weight, calycle dry weight, mucilage percentage and mucilage yield increased, but there was no significant difference between the N2 and N3 levels. Plant fresh weight and plant dry weight

  11. Nano-graphene oxide carboxylation for efficient bioconjugation applications: a quantitative optimization approach

    Energy Technology Data Exchange (ETDEWEB)

    Imani, Rana; Emami, Shahriar Hojjati, E-mail: semami@aut.ac.ir [Amirkabir University of Technology, Department of Biomedical Engineering (Iran, Islamic Republic of); Faghihi, Shahab, E-mail: shahabeddin.faghihi@mail.mcgill.ca, E-mail: sfaghihi@nigeb.ac.ir [National Institute of Genetic Engineering and Biotechnology, Tissue Engineering and Biomaterials Division (Iran, Islamic Republic of)

    2015-02-15

    A method for carboxylation of graphene oxide (GO) with chloroacetic acid that precisely optimizes and controls the efficacy of the process for bioconjugation applications is proposed. Quantification of COOH groups on nano-graphene oxide sheets (NGOS) is performed by novel colorimetric methylene blue (MB) assay. The GO is synthesized and carboxylated by chloroacetic acid treatment under strong basic condition. The size and morphology of the as-prepared NGOS are characterized by scanning electron microscopy, transmission electron microscopy (TEM), and atomic force microscopy (AFM). The effect of acid to base molar ratio on the physical, chemical, and morphological properties of NGOS is analyzed by Fourier-transformed infrared spectrometry (FTIR), UV–Vis spectroscopy, X-ray diffraction (XRD), AFM, and zeta potential. For evaluation of bioconjugation efficacy, the synthesized nano-carriers with different carboxylation ratios are functionalized by octaarginine peptide sequence (R8) as a biomolecule model containing amine groups. The quantification of attached R8 peptides to graphene nano-sheets’ surface is performed with a colorimetric-based assay which includes the application of 2,4,6-Trinitrobenzene sulfonic acid (TNBS). The results show that the thickness and lateral size of nano-sheets are dramatically decreased to 0.8 nm and 50–100 nm after carboxylation process, respectively. X-ray analysis shows the nano-sheets interlaying space is affected by the alteration of chloroacetic acid to base ratio. The MB assay reveals that the COOH groups on the surface of NGOS are maximized at the acid to base ratio of 2 which is confirmed by FTIR, XRD, and zeta potential. The TNBS assay also shows that bioconjugation of the optimized carboxylated NGOS sample with octaarginine peptide is 2.5 times more efficient compared to bare NGOS. The present work provides evidence that treatment of GO by chloroacetic acid under an optimized condition would create a functionalized high

  12. Improving the quantitative accuracy of optical-emission computed tomography by incorporating an attenuation correction: application to HIF1 imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kim, E; Bowsher, J; Thomas, A S; Sakhalkar, H; Dewhirst, M; Oldham, M [Department of Radiation Oncology, Duke University Medical Center, Durham, NC (United States)

    2008-10-07

    Optical computed tomography (optical-CT) and optical-emission computed tomography (optical-ECT) are new techniques for imaging the 3D structure and function (including gene expression) of whole unsectioned tissue samples. This work presents a method of improving the quantitative accuracy of optical-ECT by correcting for the 'self'-attenuation of photons emitted within the sample. The correction is analogous to a method commonly applied in single-photon-emission computed tomography reconstruction. The performance of the correction method was investigated by application to a transparent cylindrical gelatin phantom, containing a known distribution of attenuation (a central ink-doped gelatine core) and a known distribution of fluorescing fibres. Attenuation corrected and uncorrected optical-ECT images were reconstructed on the phantom to enable an evaluation of the effectiveness of the correction. Significant attenuation artefacts were observed in the uncorrected images where the central fibre appeared approximately 24% less intense due to greater attenuation from the surrounding ink-doped gelatin. This artefact was almost completely removed in the attenuation-corrected image, where the central fibre was within approximately 4% of the others. The successful phantom test enabled application of attenuation correction to optical-ECT images of an unsectioned human breast xenograft tumour grown subcutaneously on the hind leg of a nude mouse. This tumour cell line had been genetically labelled (pre-implantation) with fluorescent reporter genes such that all viable tumour cells expressed constitutive red fluorescent protein and hypoxia-inducible factor 1 transcription-produced green fluorescent protein. In addition to the fluorescent reporter labelling of gene expression, the tumour microvasculature was labelled by a light-absorbing vasculature contrast agent delivered in vivo by tail-vein injection. Optical-CT transmission images yielded high-resolution 3D images of the

  13. Design of the stereoscopic eye-tracking system for quantitative remote sensing applications

    Science.gov (United States)

    Sergeyev, Aleksandr; Levin, Eugene; Roggemann, Michael C.; Gienko, Gennady

    2008-08-01

    Spatial and temporal data derived from eye movements, compiled while the human eye observes geospatial imagery, retain meaningful and usable information. When a human perceives the stereo effect, a virtual three-dimensional (3D) model resulting from eye-brain interaction is generated in the mind. If the eye movements are recorded while the virtual model is observed, it is possible to reconstruct a 3D geometrical model almost identical to the one generated in the human brain. Information obtained from eye movements can be utilized in many ways for remote sensing applications such as geospatial image analysis and interpretation. There are various eye-tracking systems available on the market; however, none of them is designed to work with stereoscopic imagery. We explore different approaches and designs of the most suitable and non-intrusive scheme for stereoscopic image viewing in eye-tracking systems to observe and analyze 3D visual models. The design of the proposed system is based on the optical separation method, which provides a visually comfortable environment for perception of stereoscopic imagery. A proof-of-concept solution is based on a multiple mirror-lens assembly that provides a significant reduction of geometrical constraints in eye-frame capturing. Two proposed solutions, one for wide-angle viewing and one for a helmet-integrated eye-tracker, are also discussed here.

  14. Quantitative methods and detection techniques in hyperspectral imaging involving medical and other applications

    Science.gov (United States)

    Roy, Ankita

    2007-12-01

    This research on hyperspectral imaging involves recognizing targets through spatial and spectral matching and spectral unmixing of data ranging from remote sensing to medical imaging kernels for clinical studies, based on hyperspectral data sets generated using the VFTHSI [Visible Fourier Transform Hyperspectral Imager], whose high-resolution Si detector makes the analysis achievable. The research may be broadly classified into (I) a Physically Motivated Correlation Formalism (PMCF), which places both spatial and spectral data on an equivalent mathematical footing in the context of a specific kernel; (II) an application to RF plasma species detection during the carbon nanotube growth process; and (III) hyperspectral analysis for assessing the density and distribution of retinopathies such as age-related macular degeneration (ARMD), with error estimation enabling the early recognition of ARMD, treated as an ill-conditioned inverse imaging problem. The broad statistical scope of this research is twofold: target recognition problems and spectral unmixing problems. Experimental and computational analysis of the hyperspectral data sets is presented for all processes; the imager is based on the principle of a Sagnac interferometer and calibrated to obtain high SNR levels. PMCF computes spectral, spatial and cross moments, answers the question of how optimally the entire hypercube should be sampled, and finds precisely how many spatial-spectral pixels are required for a particular target recognition task. Spectral analysis of RF plasma radicals, typically methane plasma and argon plasma, using the VFTHSI has enabled better process monitoring during the growth of vertically aligned multi-walled carbon nanotubes by instantly registering chemical composition or density changes over time, which is key since a significant correlation can be found between the plasma state and structural properties. A vital focus of this dissertation is medical hyperspectral imaging applied to retinopathies

  15. QUANTITATIVE METHODS FOR RESERVOIR CHARACTERIZATION AND IMPROVED RECOVERY: APPLICATION TO HEAVY OIL SANDS

    Energy Technology Data Exchange (ETDEWEB)

    James W. Castle; Fred J. Molz; Ronald W. Falta; Cynthia L. Dinwiddie; Scott E. Brame; Robert A. Bridges

    2002-10-30

    Improved prediction of interwell reservoir heterogeneity has the potential to increase productivity and to reduce recovery cost for California's heavy oil sands, which contain approximately 2.3 billion barrels of remaining reserves in the Temblor Formation and in other formations of the San Joaquin Valley. This investigation involves application of advanced analytical property-distribution methods conditioned to continuous outcrop control for improved reservoir characterization and simulation, particularly in heavy oil sands. The investigation was performed in collaboration with Chevron Production Company U.S.A. as an industrial partner, and incorporates data from the Temblor Formation in Chevron's West Coalinga Field. Observations of lateral variability and vertical sequences observed in Temblor Formation outcrops has led to a better understanding of reservoir geology in West Coalinga Field. Based on the characteristics of stratigraphic bounding surfaces in the outcrops, these surfaces were identified in the subsurface using cores and logs. The bounding surfaces were mapped and then used as reference horizons in the reservoir modeling. Facies groups and facies tracts were recognized from outcrops and cores of the Temblor Formation and were applied to defining the stratigraphic framework and facies architecture for building 3D geological models. The following facies tracts were recognized: incised valley, estuarine, tide- to wave-dominated shoreline, diatomite, and subtidal. A new minipermeameter probe, which has important advantages over previous methods of measuring outcrop permeability, was developed during this project. The device, which measures permeability at the distal end of a small drillhole, avoids surface weathering effects and provides a superior seal compared with previous methods for measuring outcrop permeability. The new probe was used successfully for obtaining a high-quality permeability data set from an outcrop in southern Utah

  16. Propagation of error from parameter constraints in quantitative MRI: Example application of multiple spin echo T2 mapping.

    Science.gov (United States)

    Lankford, Christopher L; Does, Mark D

    2018-02-01

    Quantitative MRI may require correcting for nuisance parameters which can or must be constrained to independently measured or assumed values. The noise and/or bias in these constraints propagate to fitted parameters. For example, the case of refocusing pulse flip angle constraint in multiple spin echo T2 mapping is explored. An analytical expression for the mean-squared error of a parameter of interest was derived as a function of the accuracy and precision of an independent estimate of a nuisance parameter. The expression was validated by simulations and then used to evaluate the effects of flip angle (θ) constraint on the accuracy and precision of the estimated T2 for a variety of multi-echo T2 mapping protocols. Constraining θ improved the precision of the T2 estimate when the θ-map signal-to-noise ratio was greater than approximately one-half that of the first spin echo image. For many practical scenarios, constrained fitting was calculated to reduce not just the variance but the full mean-squared error of the T2 estimate, for bias in the constrained θ of ≲6%. The analytical expression derived in this work can be applied to inform experimental design in quantitative MRI. The example application to T2 mapping provided specific cases, depending on the accuracy and precision of the θ estimate, in which θ measurement and constraint would be beneficial to the variance or mean-squared error of T2. Magn Reson Med 79:673-682, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
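
    As a rough illustration of the idea (not the paper's multi-echo/EPG signal model or its analytical expression), the mean-squared error of a constrained fit can be probed by Monte Carlo simulation: synthesize decay curves, perturb the constrained nuisance parameter with a chosen bias and standard deviation, refit, and accumulate the squared error. A minimal Python sketch, with all parameter values purely illustrative:

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(0)

        def signal(TE, T2, flip_scale):
            # Toy model: mono-exponential decay scaled by an effective refocusing
            # efficiency (the paper uses a full multi-echo signal model).
            return flip_scale * np.exp(-TE / T2)

        TE = np.arange(10.0, 330.0, 10.0)      # echo times in ms (illustrative)
        T2_true, flip_true = 80.0, 0.95        # ground-truth values (illustrative)
        sigma = 0.02                           # image noise level
        flip_bias, flip_sd = 0.02, 0.01        # accuracy/precision of the flip-angle map

        sq_err = []
        for _ in range(2000):
            y = signal(TE, T2_true, flip_true) + rng.normal(0, sigma, TE.size)
            flip_hat = flip_true + flip_bias + rng.normal(0, flip_sd)  # constrained value
            # one-parameter fit for T2 with the flip angle fixed at its constrained value
            popt, _ = curve_fit(lambda te, T2: signal(te, T2, flip_hat), TE, y, p0=[60.0])
            sq_err.append((popt[0] - T2_true) ** 2)
        print("Monte Carlo MSE of constrained T2 estimate:", np.mean(sq_err))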

  17. Growth Factor Release from Lyophilized Porcine Platelet-Rich Plasma: Quantitative Analysis and Implications for Clinical Applications.

    Science.gov (United States)

    Pan, Long; Yong, Zhang; Yuk, Kim Sun; Hoon, Kim Young; Yuedong, Shi; Xu, Jianwei

    2016-02-01

    Freeze-dried platelet-rich plasma (FD PRP) is of potential value for clinical applications. However, growth factors released from FD PRP have not been well studied. Our study investigates growth factor release from FD PRP preparations, compared with other PRP samples, to further facilitate such clinical use. We used four experimental groups: (1) Fresh porcine PRP (PRP), (2) PRP activated by calcium chloride (CaCl2) (Ca PRP), (3) PRP activated by CaCl2, followed by freeze drying (Ca-FD PRP), and (4) PRP freeze-dried first, then activated by CaCl2 (FD-Ca PRP). All FD PRP samples were kept for up to 4 weeks at room temperature (22 °C) and reconstituted prior to analysis. Transforming growth factor-β1 (TGF-β1), platelet-derived growth factor AB (PDGF-AB), and vascular endothelial growth factor (VEGF) were quantitated by ELISA at 15 min and 1 h incubation times. The concentrations of all growth factors in Ca PRP, measured at 1 h, were significantly higher than those in PRP (p < 0.05). Levels of VEGF in Ca-FD PRP were not significantly different than in Ca PRP (p > 0.05). However, TGF-β1 concentrations in Ca-FD PRP, measured at 15 min, were higher than those in Ca PRP (p < 0.05). FD PRP retained growth factors after storage for 4 weeks at room temperature, indicating its ease of use and wider possibilities for clinical applications. This journal requires that authors assign a level of evidence to each submission to which Evidence-Based Medicine rankings are applicable. This excludes Review Articles, Book Reviews, and manuscripts that concern Basic Science, Animal Studies, Cadaver Studies, and Experimental Studies. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  18. FANSe2: a robust and cost-efficient alignment tool for quantitative next-generation sequencing applications.

    Directory of Open Access Journals (Sweden)

    Chuan-Le Xiao

    Full Text Available Correct and bias-free interpretation of the deep sequencing data is inevitably dependent on the complete mapping of all mappable reads to the reference sequence, especially for quantitative RNA-seq applications. Seed-based algorithms are generally slow but robust, while Burrows-Wheeler Transform (BWT) based algorithms are fast but less robust. To have both advantages, we developed an algorithm, FANSe2, with an iterative mapping strategy based on the statistics of real-world sequencing error distribution to substantially accelerate the mapping without compromising the accuracy. Its sensitivity and accuracy are higher than the BWT-based algorithms in the tests using both prokaryotic and eukaryotic sequencing datasets. The gene identification results of FANSe2 are experimentally validated, while the previous algorithms have false positives and false negatives. FANSe2 showed remarkably better consistency with the microarray than most other algorithms in terms of gene expression quantifications. We implemented a scalable and almost maintenance-free parallelization method that can utilize the computational power of multiple office computers, a novel feature not present in any other mainstream algorithm. With three normal office computers, we demonstrated that FANSe2 mapped an RNA-seq dataset generated from an entire Illumina HiSeq 2000 flowcell (8 lanes, 608 M reads) to the masked human genome within 4.1 hours with higher sensitivity than Bowtie/Bowtie2. FANSe2 thus provides robust accuracy, full indel sensitivity, fast speed, versatile compatibility and economical computational utilization, making it a useful and practical tool for deep sequencing applications. FANSe2 is freely available at http://bioinformatics.jnu.edu.cn/software/fanse2/.

  19. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    Science.gov (United States)

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
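
    A common way to turn such reference spectra into concentrations is classical least squares: the measured absorbance spectrum is modeled as a linear combination of unit-concentration reference spectra (analyte, water vapour, impurities), exploiting Beer-Lambert linearity. The sketch below is a generic illustration, not the referee method itself; the file names and array shapes are assumptions.

        import numpy as np

        # Hypothetical inputs on a common wavenumber grid:
        # ref_spectra[:, j] = reference absorbance spectrum of component j at unit
        # concentration; y = measured absorbance spectrum of the vapour stream.
        ref_spectra = np.loadtxt("reference_spectra.txt")   # shape (n_points, n_components)
        y = np.loadtxt("measured_spectrum.txt")             # shape (n_points,)

        # Beer-Lambert linearity: y ≈ ref_spectra @ c, solved in the least-squares sense.
        c, residuals, rank, sv = np.linalg.lstsq(ref_spectra, y, rcond=None)
        print("Estimated concentrations relative to the references:", c)
        # Large residuals flag unmodeled interferents (e.g., water condensation),
        # which is why additional reference spectra are included in the method.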

  20. A Quantitative Structure Activity Relationship for acute oral toxicity of pesticides on rats: Validation, domain of application and prediction.

    Science.gov (United States)

    Hamadache, Mabrouk; Benkortbi, Othmane; Hanini, Salah; Amrane, Abdeltif; Khaouane, Latifa; Si Moussa, Cherif

    2016-02-13

    Quantitative Structure Activity Relationship (QSAR) models are expected to play an important role in the risk assessment of chemicals on humans and the environment. In this study, we developed a validated QSAR model to predict the acute oral toxicity of 329 pesticides to rats, because only a few QSAR models have been devoted to predicting the Lethal Dose 50 (LD50) of pesticides in rats. This QSAR model is based on 17 molecular descriptors, and is robust, externally predictive and characterized by a good applicability domain. The best results were obtained with a 17/9/1 Artificial Neural Network model trained with the quasi-Newton back propagation (BFGS) algorithm. The prediction accuracy for the external validation set was estimated by the Q(2)ext and the root mean square error (RMSE), which are equal to 0.948 and 0.201, respectively. 98.6% of the external validation set is correctly predicted, and the present model proved to be superior to models previously published. Accordingly, the model developed in this study provides excellent predictions and can be used to predict the acute oral toxicity of pesticides, particularly for those that have not been tested as well as new pesticides. Copyright © 2015 Elsevier B.V. All rights reserved.
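
    For illustration, this kind of descriptor-based neural-network regression can be sketched with scikit-learn. This is not the authors' model: the data files and hyperparameters below are assumptions, and scikit-learn's 'lbfgs' solver is a limited-memory quasi-Newton method comparable to, but not identical to, the BFGS training reported.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score, mean_squared_error

        # Hypothetical data: X = 17 molecular descriptors per pesticide,
        # y = experimental log-transformed oral LD50 values.
        X, y = np.load("descriptors.npy"), np.load("log_ld50.npy")
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=1)

        # 17-9-1 architecture (17 inputs, 9 hidden neurons, 1 output).
        model = make_pipeline(
            StandardScaler(),
            MLPRegressor(hidden_layer_sizes=(9,), solver="lbfgs",
                         max_iter=5000, random_state=1))
        model.fit(X_tr, y_tr)
        pred = model.predict(X_te)
        print("external Q2  :", r2_score(y_te, pred))
        print("external RMSE:", mean_squared_error(y_te, pred) ** 0.5)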

  1. An UPLC-MS/MS method for the quantitation of vortioxetine in rat plasma: Application to a pharmacokinetic study.

    Science.gov (United States)

    Gu, Er-min; Huang, Chengke; Liang, Bingqing; Yuan, Lingjing; Lan, Tian; Hu, Guoxin; Zhou, Hongyu

    2015-08-01

    In this work, a simple, sensitive and fast ultra performance liquid chromatography with tandem mass spectrometry (UPLC-MS/MS) method was developed and validated for the quantitative determination of vortioxetine in rat plasma. Plasma samples were processed by protein precipitation. The separation was achieved on an Acquity UPLC BEH C18 column (2.1mm×50mm, 1.7μm) with a gradient mobile phase consisting of 0.1% formic acid in water and acetonitrile. Detection was carried out using positive-ion electrospray tandem mass spectrometry via multiple reaction monitoring (MRM). The validated method showed excellent linearity in the range of 0.05-20ng/mL (R(2)>0.997) with a lower limit of quantification of 0.05ng/mL. The extraction recovery was in the range of 78.3-88.4% for vortioxetine and 80.3% for carbamazepine (internal standard, IS). The intra- and inter-day precision was below 8.5% and accuracy was from -11.2% to 9.5%. No notable matrix effect or instability was observed for vortioxetine. The method has been successfully applied to a pharmacokinetic study of vortioxetine in rats for the first time, which provides the basis for the further development and application of vortioxetine. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Application of light sheet microscopy for qualitative and quantitative analysis of bronchus-associated lymphoid tissue in mice.

    Science.gov (United States)

    Mzinza, David Twapokera; Fleige, Henrike; Laarmann, Kristin; Willenzon, Stefanie; Ristenpart, Jasmin; Spanier, Julia; Sutter, Gerd; Kalinke, Ulrich; Valentin-Weigand, Peter; Förster, Reinhold

    2018-02-12

    Bronchus-associated lymphoid tissue (BALT) develops at unpredictable locations around lung bronchi following pulmonary inflammation. The formation and composition of BALT have primarily been investigated by immunohistology which, due to the size of the investigated organ, is usually restricted to a limited number of histological sections. To assess the entire BALT of the lung, other approaches are urgently needed. Here, we introduce a novel light sheet microscopy-based approach for assessing lymphoid tissue in the lung. Using antibody staining of whole lung lobes and optical clearing by organic solvents, we present a method that allows in-depth visualization of the entire bronchial tree, the lymphatic vasculature and the immune cell composition of the induced BALT. Furthermore, three-dimensional analysis of the entire lung allows the qualitative and quantitative enumeration of the induced BALT. Using this approach, we show that a single intranasal application of the replication-deficient poxvirus MVA induces BALT that constitutes up to 8% of the entire lung volume in mice deficient in CCR7, in contrast to wild type (WT) mice. Furthermore, BALT induced by heat-inactivated E. coli is dominated by a pronounced T cell infiltration in Cxcr5-deficient mice, in contrast to WT mice. Cellular and Molecular Immunology advance online publication, 12 February 2018; doi:10.1038/cmi.2017.150.

  3. The quantitative real-time PCR applications in the monitoring of marine harmful algal bloom (HAB) species.

    Science.gov (United States)

    Penna, Antonella; Galluzzi, Luca

    2013-10-01

    In the last decade, various molecular methods (e.g., fluorescent hybridization assay, sandwich hybridization assay, automatized biosensor detection, real-time PCR assay) have been developed and implemented for accurate and specific identification and estimation of marine toxic microalgal species. This review focuses on the recent quantitative real-time PCR (qrt-PCR) technology developed for the control and monitoring of the most important taxonomic phytoplankton groups producing biotoxins with relevant negative impact on human health, the marine environment, and related economic activities. The high specificity and sensitivity of the qrt-PCR methods determined by the adequate choice of the genomic target gene, nucleic acid purification protocol, quantification through the standard curve, and type of chemical detection method make them highly efficient and therefore applicable to harmful algal bloom phenomena. Recent development of qrt-PCR-based assays using the target gene of toxins, such as saxitoxin compounds, has allowed more precise quantification of toxigenic species (i.e., Alexandrium catenella) abundance. These studies focus only on toxin-producing species in the marine environment. Therefore, qrt-PCR technology seems to offer the advantages of understanding the ecology of harmful algal bloom species and facilitating the management of their outbreaks.
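
    The quantification step shared by these qrt-PCR assays is a standard curve: quantification cycle (Cq) values measured on serial dilutions of a known standard are regressed against log copy number, and unknown environmental samples are read off the fitted line. A generic sketch with made-up numbers, not data from any of the reviewed assays:

        import numpy as np

        # Hypothetical standard-curve data: Cq values for serial dilutions of a
        # standard (plasmid or cultured cells) of known copy number per reaction.
        log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)
        cq_std       = np.array([16.1, 19.4, 22.8, 26.1, 29.5])

        # Linear regression: Cq = slope * log10(copies) + intercept
        slope, intercept = np.polyfit(log10_copies, cq_std, 1)
        efficiency = 10 ** (-1.0 / slope) - 1.0    # ideal amplification gives ~1.0 (100 %)

        def copies_from_cq(cq):
            return 10 ** ((cq - intercept) / slope)

        print(f"PCR efficiency ≈ {efficiency:.2f}")
        print("Unknown sample (Cq = 24.3):", f"{copies_from_cq(24.3):.2e} copies/reaction")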

  4. The selected reaction monitoring/multiple reaction monitoring-based mass spectrometry approach for the accurate quantitation of proteins: clinical applications in the cardiovascular diseases.

    Science.gov (United States)

    Gianazza, Erica; Tremoli, Elena; Banfi, Cristina

    2014-12-01

    Selected reaction monitoring, also known as multiple reaction monitoring, is a powerful targeted mass spectrometry approach for a confident quantitation of proteins/peptides in complex biological samples. In recent years, its optimization and application have become pivotal and of great interest in clinical research to derive useful outcomes for patient care. Thus, selected reaction monitoring/multiple reaction monitoring is now used as a highly sensitive and selective method for the evaluation of protein abundances and biomarker verification with potential applications in medical screening. This review describes technical aspects for the development of a robust multiplex assay and discusses its recent applications in cardiovascular proteomics: verification of promising disease candidates to select only the highest quality peptides/proteins for a preclinical validation, as well as quantitation of protein isoforms and post-translational modifications.

  5. Going Beyond, Going Further. Quantitative Application of Thin-Layer Chromatography in the Analysis of Organic Compounds.

    Science.gov (United States)

    Giuliano, Vincenzo; Rieck, John Paul

    1987-01-01

    Discusses the use of thin-layer chromatography (TLC) in the chemical laboratory as a quantitative method for determining the molecular weights of organic compounds. Describes a simple method which provides an illustration of the importance of polarity on solubility and demonstrates the effectiveness of TLC as a quantitative tool. (TW)

  6. Quantitative Segmentation of Fluorescence Microscopy Images of Heterogeneous Tissue: Application to the Detection of Residual Disease in Tumor Margins.

    Directory of Open Access Journals (Sweden)

    Jenna L Mueller

    Full Text Available To develop a robust tool for quantitative in situ pathology that allows visualization of heterogeneous tissue morphology and segmentation and quantification of image features. Tissue excised from a genetically engineered mouse model of sarcoma was imaged using a subcellular-resolution microendoscope after topical application of a fluorescent anatomical contrast agent, acriflavine. An algorithm based on sparse component analysis (SCA) and the circle transform (CT) was developed for image segmentation and quantification of distinct tissue types. The accuracy of our approach was quantified through simulations of tumor and muscle images. Specifically, tumor, muscle, and tumor+muscle tissue images were simulated because these tissue types were most commonly observed in sarcoma margins. Simulations were based on tissue characteristics observed in pathology slides. The potential clinical utility of our approach was evaluated by imaging excised margins and the tumor bed in a cohort of mice after surgical resection of sarcoma. Simulation experiments revealed that SCA+CT achieved the lowest errors for larger nuclear sizes and for higher contrast ratios (nuclei intensity/background intensity). For imaging of tumor margins, SCA+CT effectively isolated nuclei from tumor, muscle, adipose, and tumor+muscle tissue types. Differences in density were correctly identified with SCA+CT in a cohort of ex vivo and in vivo images, thus illustrating the diagnostic potential of our approach. The combination of a subcellular-resolution microendoscope, acriflavine staining, and SCA+CT can be used to accurately isolate nuclei and quantify their density in anatomical images of heterogeneous tissue.

  7. [Application of Cationic Aluminum Phthalocyanine, a Red-Emitting Fluorescent Probe, for Sensitive Quantitative Analysis of RNA at Nanogram Level].

    Science.gov (United States)

    Guo, Meng-lin; Yang, Hui-qing; Huang, Ping; Chen, Lin; Li, Dong-hui

    2016-03-01

    Tetrasubstituted trimethyl ammonium iodide aluminum phthalocyanine (TTMAAlPc), a positively charged phthalocyanine compound, is an emerging and potentially useful red-emitting fluorescence probe. The study showed that the fluorescence of TTMAAlPc could be quenched by RNA with high efficiency in weakly alkaline media, and the degree of quenching has a linear relationship with RNA over a wide concentration range. The mechanism of the quenching behavior of RNA on TTMAAlPc was discussed; it was attributed to the static interaction between RNA and TTMAAlPc and to the assembly of TTMAAlPc induced by RNA. Based on this new finding, a novel method for quantitative determination of RNA at the nanogram level has been established. The factors affecting the determination, including the pH of the medium, buffer system, reaction time, reaction temperature, the amount of TTMAAlPc, and interferences, were investigated and discussed. Under optimum conditions, the linear range of the calibration curve was 7.71-1705.57 ng x mL(-1). The detection limit for RNA was 1.55 ng x mL(-1). The method has been applied to the analysis of practical samples with satisfactory results. The constructed method is of high sensitivity and has a wide linear range; it also shows strong tolerance of foreign substances such as anions, cations, surfactants and vitamins, all of which are common interferences encountered in the determination of RNA. In addition, this work reports for the first time the fluorescence quantum yield of TTMAAlPc measured at different pH values by the reference method. The data indicate that the fluorescence quantum yield of TTMAAlPc is larger than 20% and remains constant over a wide range of acidity, implying that TTMAAlPc is a high-quality red-emitting fluorescence probe with great potential for practical applications and is worthy of further study. This work expands the application of phthalocyanine compounds in the analytical sciences.

  8. Application of quantitative 19F and 1H NMR for reaction monitoring and in situ yield determinations for an early stage pharmaceutical candidate.

    Science.gov (United States)

    Do, Nga M; Olivier, Mark A; Salisbury, John J; Wager, Carrie B

    2011-11-15

    Quantitative NMR spectrometry (qNMR) is an attractive, viable alternative to traditional chromatographic techniques. It is a fast, easy, accurate, and nondestructive technique which allows an analyst to gain quantitative information about a component mixture without the necessity of authentic reference materials, as is the case with most other analytical techniques. This is ideal for the synthesis of active pharmaceutical ingredients (API) that are in the early stages of development where authentic standards of the analytes may not be available. In this paper, the application of (19)F and (1)H qNMR for reaction monitoring and in situ potency determinations will be discussed for an early stage pharmaceutical candidate with several analytical challenges. These challenges include low UV absorption, low ionization, thermal instability, and lack of authentic reference standards. Quantitative NMR provided quick, fit-for-purpose solutions for process development where conventional separation techniques were limited.
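
    The quantitative core of such qNMR determinations is that integrated signal area is proportional to the number of nuclei giving rise to the signal. In its generic, textbook form (not a formula quoted from this paper), the molar ratio of analyte to an internal standard of known concentration follows from the integrals I and the numbers of equivalent nuclei N behind the compared signals:

        \frac{n_{\mathrm{analyte}}}{n_{\mathrm{std}}} = \frac{I_{\mathrm{analyte}}/N_{\mathrm{analyte}}}{I_{\mathrm{std}}/N_{\mathrm{std}}}

    so an in situ concentration or potency can be read from a single integral ratio, which is what removes the need for an authentic reference standard of the analyte itself.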

  9. Evaluation of Quantitative Exposure Assessment Method for Nanomaterials in Mixed Dust Environments: Application in Tire Manufacturing Facilities

    National Research Council Canada - National Science Library

    Kreider, Marisa L; Cyrs, William D; Tosiano, Melissa A; Panko, Julie M

    2015-01-01

    ...) and amorphous silica (AS) from tire manufacturing as an example. This method combined air sampling with a low pressure cascade impactor with analysis of elemental composition by size to quantitatively assess potential exposures in the workplace...

  10. Development and application of a quantitative loop-mediated isothermal amplification method for detecting genetically modified maize MON863.

    Science.gov (United States)

    Huang, Sicong; Xu, Yuancong; Yan, Xinghua; Shang, Ying; Zhu, Pengyu; Tian, Wenying; Xu, Wentao

    2015-01-01

    A SYBR Green I-based quantitative loop-mediated isothermal amplification (LAMP) assay was developed for the rapid detection of genetically modified maize MON863. A set of primers was designed based on the integration region of the Cry3Bb1 and tahsp17 genes. The qualitative and quantitative reaction conditions (dNTPs, betaine, primers, Mg(2+), Bst polymerase, temperature, reaction time) were optimized. The concentrations of Mg(2+) and betaine were found to be important to the LAMP assay. The detection limits of both qualitative and quantitative LAMP for MON863 were as low as 4 copies of haploid genomic DNA, and the LAMP reactions can be completed within 1 h at an isothermal temperature of 65 °C. The results of this study demonstrate that this new SYBR Green I-based quantitative LAMP assay system is reliable, sensitive and accurate. © 2014 Society of Chemical Industry.

  11. Sigmoidal curve-fitting redefines quantitative real-time PCR with the prospective of developing automated high-throughput applications

    OpenAIRE

    Rutledge, R. G.

    2004-01-01

    Quantitative real-time PCR has revolutionized many aspects of genetic research, biomedical diagnostics and pathogen detection. Nevertheless, the full potential of this technology has yet to be realized, primarily due to the limitations of the threshold-based methodologies that are currently used for quantitative analysis. Prone to errors caused by variations in reaction preparation and amplification conditions, these approaches necessitate construction of standard curves for each target sequence...
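
    The idea can be illustrated with a generic four-parameter logistic fit to a raw amplification profile; the model form, parameter names and data file below are illustrative assumptions, not the author's implementation:

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(c, f_max, c_half, k, f_b):
            # four-parameter logistic: fluorescence as a function of cycle number
            return f_max / (1.0 + np.exp(-(c - c_half) / k)) + f_b

        cycles = np.arange(1, 41, dtype=float)
        fluor = np.loadtxt("amplification_curve.txt")   # hypothetical raw fluorescence data

        p0 = [fluor.max(), 25.0, 2.0, fluor.min()]
        (f_max, c_half, k, f_b), _ = curve_fit(sigmoid, cycles, fluor, p0=p0)

        # c_half (the cycle of half-maximal fluorescence) and f_max are obtained per
        # reaction, so quantification does not depend on a user-chosen threshold.
        print(f"plateau = {f_max:.1f}, half-max cycle = {c_half:.2f}, slope = {k:.2f}")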

  12. Application of computer-assisted three-dimensional quantitative assessment and a surgical planning tool for living donor liver transplantation.

    Science.gov (United States)

    Wei, Lin; Zhu, Zhi-Jun; Lü, Yi; Jiang, Wen-Tao; Gao, Wei; Zeng, Zhi-Gui; Shen, Zhong-Yang

    2013-04-01

    Precise evaluation of the live donor's liver is the most important factor for the donor's safety and the recipient's prognosis in living donor liver transplantation (LDLT). Our study assessed the clinical value of computer-assisted three-dimensional quantitative assessment and a surgical planning tool for donor evaluation in LDLT. Computer-assisted three-dimensional (3D) quantitative assessment was used to prospectively provide quantitative assessment of the graft volume for 123 consecutive donors of LDLT, and its accuracy and efficiency were compared with those of the standard manual-traced method. A case of reduced monosegmental LDLT was also assessed, and a surgical planning tool displayed the precise surgical plan to avoid large-for-size syndrome. There was no statistically significant difference between the graft volumes detected by the computer-assisted 3D quantitative assessment and manual-traced approaches ((856.76 ± 162.18) cm³ vs. (870.64 ± 172.54) cm³, P = 0.796). Estimated volumes by either method had good correlation with the actual graft weight (r = 0.921 for the manual-traced method and r = 0.896 for the 3D quantitative assessment method; both correlations statistically significant). The computer-assisted 3D quantitative assessment approach was significantly more efficient, taking about half the time of the manual-traced method ((16.91 ± 1.375) minutes vs. (39.27 ± 2.102) minutes). Computer-assisted 3D quantitative assessment provided precise evaluation of the graft volume. It also assisted surgeons with a better understanding of the hepatic 3D anatomy and was useful for individual surgical planning.

  13. Cellular Phone-Based Image Acquisition and Quantitative Ratiometric Method for Detecting Cocaine and Benzoylecgonine for Biological and Forensic Applications

    Directory of Open Access Journals (Sweden)

    Brian A. Cadle

    2010-01-01

    Full Text Available Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA) is an automated method requiring end-users to utilize inexpensive (~ $1 USD each) immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is described whereby a central computer server and freely available IMAGEJ image analysis software records and analyzes the incoming image data with time-stamp and geo-tag information and performs the QRPDA using custom JAVA-based macros (http://www.neurocloud.org). To demonstrate QRPDA we developed a standardized method using rapid immunotest strips directed against cocaine and its major metabolite, benzoylecgonine. Images from standardized samples were acquired using several devices, including a mobile phone camera, web cam, and scanner. We performed image analysis of three brands of commercially available dye-conjugated anti-cocaine/benzoylecgonine (COC/BE) antibody test strips in response to three different series of cocaine concentrations ranging from 0.1 to 300 ng/ml and BE concentrations ranging from 0.003 to 0.1 ng/ml. These data were then used to create standard curves to allow quantification of COC/BE in biological samples. Across all devices, QRPDA quantification of COC and BE proved to be a sensitive, economical, and faster alternative to more costly methods, such as gas chromatography-mass spectrometry, tandem mass spectrometry, or high pressure liquid chromatography. The limit of detection was determined to be between 0.1 and 5 ng/ml. To simulate conditions in the field, QRPDA was found to be robust under a variety of image acquisition and testing conditions that varied temperature, lighting, resolution, magnification and concentrations of biological fluid in a sample.

  14. Cellular phone-based image acquisition and quantitative ratiometric method for detecting cocaine and benzoylecgonine for biological and forensic applications.

    Science.gov (United States)

    Cadle, Brian A; Rasmus, Kristin C; Varela, Juan A; Leverich, Leah S; O'Neill, Casey E; Bachtell, Ryan K; Cooper, Donald C

    2010-01-01

    Here we describe the first report of using low-cost cellular or web-based digital cameras to image and quantify standardized rapid immunoassay strips as a new point-of-care diagnostic and forensics tool with health applications. Quantitative ratiometric pixel density analysis (QRPDA) is an automated method requiring end-users to utilize inexpensive (∼ $1 USD/each) immunotest strips, a commonly available web or mobile phone camera or scanner, and internet or cellular service. A model is described whereby a central computer server and freely available IMAGEJ image analysis software records and analyzes the incoming image data with time-stamp and geo-tag information and performs the QRPDA using custom JAVA based macros (http://www.neurocloud.org). To demonstrate QRPDA we developed a standardized method using rapid immunotest strips directed against cocaine and its major metabolite, benzoylecgonine. Images from standardized samples were acquired using several devices, including a mobile phone camera, web cam, and scanner. We performed image analysis of three brands of commercially available dye-conjugated anti-cocaine/benzoylecgonine (COC/BE) antibody test strips in response to three different series of cocaine concentrations ranging from 0.1 to 300 ng/ml and BE concentrations ranging from 0.003 to 0.1 ng/ml. This data was then used to create standard curves to allow quantification of COC/BE in biological samples. Across all devices, QRPDA quantification of COC and BE proved to be a sensitive, economical, and faster alternative to more costly methods, such as gas chromatography-mass spectrometry, tandem mass spectrometry, or high pressure liquid chromatography. The limit of detection was determined to be between 0.1 and 5 ng/ml. To simulate conditions in the field, QRPDA was found to be robust under a variety of image acquisition and testing conditions that varied temperature, lighting, resolution, magnification and concentrations of biological fluid in a sample.
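
    The ratiometric measurement itself reduces to comparing band darkness against local background within fixed regions of interest of the strip image. The sketch below is a simplified stand-in for the ImageJ/JAVA macros described above; the file name and ROI coordinates are assumptions:

        import numpy as np
        from PIL import Image

        img = np.asarray(Image.open("strip_photo.png").convert("L"), dtype=float)

        # Hypothetical regions of interest (rows, columns) for the test band,
        # control band, and a nearby background patch of the immunotest strip.
        test_band    = img[120:150, 40:200]
        control_band = img[220:250, 40:200]
        background   = img[300:330, 40:200]

        # Band darkness relative to background; the test/control ratio is the
        # ratiometric quantity read against a standard curve built from known
        # COC/BE concentrations to estimate the concentration in an unknown sample.
        signal_test    = background.mean() - test_band.mean()
        signal_control = background.mean() - control_band.mean()
        print("ratiometric signal (test/control):", signal_test / signal_control)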

  15. A novel multicolor flow-cytometry application for quantitative detection of receptors on vascular smooth muscle cells

    DEFF Research Database (Denmark)

    Radziwon-Balicka, Aneta; Degn, Matilda; Johansson, Sara E

    2017-01-01

    There is a need to develop new techniques for quantitative measurement of receptor expression on particular vasculature cell types. Here, we describe and demonstrate a novel method to measure quantitatively and simultaneously the expression of the endothelin B receptor (ETB) on vascular smooth muscle cells (VSMC). We isolated cells from male rat tissues such as brain pial, brain intraparenchymal and retina vessels. To analyze solid tissues, a single-cell suspension was prepared by a combined mechanical and enzymatic process. The cells were stained with Fixable Viability Dye, followed by fixation... a quantitative measurement of ETB receptor expression on VSMC, and we identified two subpopulations of VSMC based on their expression of the smooth muscle cell marker SM22α. The results obtained from pial vessels are statistically significant (38.4% ± 4% vs 9.8% ± 3.32%) between the two subpopulations of VSMC...

  16. Nondestructive application of laser-induced fluorescence spectroscopy for quantitative analyses of phenolic compounds in strawberry fruits (Fragaria x ananassa).

    Science.gov (United States)

    Wulf, J S; Rühmann, S; Rego, I; Puhl, I; Treutter, D; Zude, M

    2008-05-14

    Laser-induced fluorescence spectroscopy (LIFS) was applied nondestructively to strawberries (EX = 337 nm, EM = 400-820 nm) to test the feasibility of quantitatively determining native phenolic compounds in strawberries. Eighteen phenolic compounds were identified in the fruit skin by UV and MS spectroscopy and quantitatively determined by use of rp-HPLC for separation and diode-array or chemical reaction detection. Partial least-squares calibration models were built for single phenolic compounds by means of nondestructively recorded fluorescence spectra in the blue-green wavelength range using different data preprocessing methods. The direct orthogonal signal correction resulted in r² = 0.99 and a low RMSEP for the fruits.

  17. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    Science.gov (United States)

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as sample injector. The accurate amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.

  18. Application of quantitative real-time PCR compared to filtration methods for the enumeration of Escherichia coli in surface waters within Vietnam.

    Science.gov (United States)

    Vital, Pierangeli G; Van Ha, Nguyen Thi; Tuyet, Le Thi Hong; Widmer, Kenneth W

    2017-02-01

    Surface water samples were collected from the Saigon River, rural and suburban canals, and urban runoff canals in Ho Chi Minh City, Vietnam, and were processed to enumerate Escherichia coli. Quantification was done through membrane filtration and quantitative real-time polymerase chain reaction (PCR). Mean E. coli counts in the dry season for river/suburban canals and urban canals were log 2.8 and log 3.7 CFU/100 ml, respectively, using the membrane filtration method, while using Taqman quantitative real-time PCR they were log 2.4 and log 2.8 for river/suburban canals and urban canals, respectively. For the wet season, the membrane filtration method gave mean counts of log 3.7 and log 4.1 for river/suburban canal and urban canal samples, respectively, while mean counts using quantitative PCR were log 3 and log 2, respectively. Additionally, for the wet season the quantitative PCR results for the urban canal samples were significantly lower than those determined by conventional culture methods. These results show that while quantitative real-time PCR can be used to determine levels of fecal indicator bacteria in surface waters, there are some limitations to its application and it may be impacted by sources of runoff based on surveyed samples.

  19. Alternate strategies to obtain mass balance without the use of radiolabeled compounds: application of quantitative fluorine (19F) nuclear magnetic resonance (NMR) spectroscopy in metabolism studies.

    Science.gov (United States)

    Mutlib, Abdul; Espina, Robert; Atherton, James; Wang, Jianyao; Talaat, Rasmy; Scatina, JoAnn; Chandrasekaran, Appavu

    2012-03-19

    These studies demonstrate that quantitative (19)F-NMR could be used as an alternate technique to obtain an estimate of the mass balance of fluorinated compounds, especially in early drug development where attrition of the compounds is high, and cost savings could be realized through the use of such a technique rather than employing radioactive compounds. The potential application of qNMR in conducting early human ADME studies with fluorinated compounds is also discussed. © 2012 American Chemical Society

  20. Quantitation of Crocins and picrocrocin in saffron by HPLC: application to quality control and phytochemical differentiation from other crocus taxa.

    Science.gov (United States)

    Koulakiotis, Nikolaos Stavros; Gikas, Evangelos; Iatrou, Gregoris; Lamari, Fotini N; Tsarbopoulos, Anthony

    2015-05-01

    A chromatographic method was developed and fully validated for the determination of the major saffron constituents, i.e., picrocrocin and five major crocins. Dried samples (styles of Crocus sativus and other Crocus taxa) were extracted with MeOH : water (1 : 1, v/v), and chromatographic separation of the analytes was achieved by reversed-phase chromatography using a gradient elution. Full validation was performed using samples spiked with the analytes, which were isolated, purified, and characterized by MS due to a lack of commercial standards. The method showed a good fit (r2 > 0.999) for all analytes, with limit of quantitation values in the range of 1-15 µg/mL, and demonstrated adequate intra- and inter-day precision. It was applied to saffron samples and to indigenous Crocus taxa, and allowed for the first time the absolute quantitation of several Crocus components. Georg Thieme Verlag KG Stuttgart · New York.

  1. Baseline correction combined partial least squares algorithm and its application in on-line Fourier transform infrared quantitative analysis.

    Science.gov (United States)

    Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping

    2011-04-01

    In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Application of terahertz time-domain spectroscopy combined with chemometrics to quantitative analysis of imidacloprid in rice samples

    Science.gov (United States)

    Chen, Zewei; Zhang, Zhuoyong; Zhu, Ruohua; Xiang, Yuhong; Yang, Yuping; Harrington, Peter B.

    2015-12-01

    Terahertz time-domain spectroscopy (THz-TDS) has been utilized as an effective tool for quantitative analysis of imidacloprid in rice powder samples. Unlike previous studies, our method for sample preparation was to mix imidacloprid with rice powder instead of polyethylene. Terahertz time-domain transmission spectra of these mixed samples were measured and the absorption coefficient spectra of the samples, covering a frequency range extending from 0.3 to 1.7 THz, were obtained. The asymmetric least squares (AsLS) method was utilized to correct the sloping baselines present in the THz absorption coefficient spectra and to improve their signal-to-noise ratio. Chemometrics methods, including partial least squares (PLS), support vector regression (SVR), interval partial least squares (iPLS), and backward interval partial least squares (biPLS), were used for quantitative model building and prediction. To achieve a reliable and unbiased estimation, bootstrapped Latin partitions were chosen as the approach for statistical cross-validation. Results showed that the mean root mean square error of prediction (RMSEP) for PLS (0.5%) is smaller than that for SVR (0.7%); these two methods were based on the whole absorption coefficient spectra. In addition, PLS performed better, with a lower RMSEP (0.3%), when based on the THz absorption coefficient spectra after AsLS baseline correction. Alternatively, two methods for variable selection, namely iPLS and biPLS, yielded models with improved predictions: compared with conventional PLS and SVR, the mean values of RMSEP were 0.4% (iPLS) and 0.3% (biPLS) when the informative frequency ranges were selected. The results demonstrated that an accurate quantitative analysis of imidacloprid in rice powder samples could be achieved by terahertz time-domain transmission spectroscopy combined with chemometrics, and suggest that THz time-domain spectroscopy can be used for quantitative determinations of other compounds as well.
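
    The AsLS baseline correction referred to above follows a widely published scheme (Eilers-style asymmetric least squares); a compact Python version is sketched below as an illustration, with illustrative smoothness (lam) and asymmetry (p) parameters rather than the values used in this study:

        import numpy as np
        from scipy import sparse
        from scipy.sparse.linalg import spsolve

        def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
            """Asymmetric least squares baseline estimate for a 1-D spectrum."""
            n = y.size
            D = sparse.diags([1.0, -2.0, 1.0], [0, -1, -2], shape=(n, n - 2))
            P = lam * (D @ D.T)                       # second-derivative smoothness penalty
            w = np.ones(n)
            for _ in range(n_iter):
                W = sparse.diags(w)
                z = spsolve((W + P).tocsc(), w * y)   # weighted penalized least squares
                w = p * (y > z) + (1 - p) * (y < z)   # asymmetric reweighting
            return z

        # spectrum = measured THz absorption-coefficient spectrum (1-D array);
        # the corrected data (spectrum - asls_baseline(spectrum)) would then
        # feed the PLS/SVR calibration models.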

  3. Three-dimensional histology: tools and application to quantitative assessment of cell-type distribution in rabbit heart.

    OpenAIRE

    Burton, RA; Lee, P; R. Casero; Garny, A.; Siedlecka, U; Schneider, JE; Kohl, P.; Grau, V

    2014-01-01

    AIMS: Cardiac histo-anatomical organization is a major determinant of function. Changes in tissue structure are a relevant factor in normal and disease development, and form targets of therapeutic interventions. The purpose of this study was to test tools aimed to allow quantitative assessment of cell-type distribution from large histology and magnetic resonance imaging- (MRI) based datasets. METHODS AND RESULTS: Rabbit heart fixation during cardioplegic arrest and MRI were followed by serial...

  4. Three-dimensional histology: tools and application to quantitative assessment of cell-type distribution in rabbit heart

    OpenAIRE

    Burton, Rebecca A. B.; Lee, Peter; Casero, Ramón; Garny, Alan; Siedlecka, Urszula; Schneider, Jürgen E; Kohl, Peter; Grau, Vicente

    2014-01-01

    Aims Cardiac histo-anatomical organization is a major determinant of function. Changes in tissue structure are a relevant factor in normal and disease development, and form targets of therapeutic interventions. The purpose of this study was to test tools aimed to allow quantitative assessment of cell-type distribution from large histology and magnetic resonance imaging- (MRI) based datasets. Methods and results Rabbit heart fixation during cardioplegic arrest and MRI were followed by serial s...

  5. Development and application of a quantitative method for determination of flavonoids in orange peel: Influence of sample pretreatment on composition.

    Science.gov (United States)

    Molina-Calle, María; Priego-Capote, Feliciano; Luque de Castro, María D

    2015-11-01

    Peel, a part of the citrus rich in compounds with high-added value, constitutes the bulk of the waste generated in citrus juice industries. Flavonoids are a class of these high-added value compounds characterized by their bioactivity. In this research, a method for analysis of flavonoids, based on LC-MS/MS by using a triple quadrupole detector, has been developed and applied to the quantitative analysis of 16 flavonoids in extracts obtained by maceration of citrus peel. The parameters involved in the ionization and fragmentation of the target analytes were optimized to develop a selected reaction monitoring (SRM) method, which reported detection and quantitation limits ranging from 0.005 to 5 ng/mL and from 0.01 to 10 ng/mL, respectively. The raw materials for flavonoids extraction were fresh, oven-dried and lyophilized peel of 8 different orange varieties, and the proposed quantitation method was applied to the analysis of the obtained extracts. Evaluation of the two methods of water removal showed that lyophilization preserves the concentration of the flavonoids, while oven-dried peel presented a decrease of glycosylated flavonoids and an increase of aglycone forms. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Using quantitative image analysis to classify axillary lymph nodes on breast MRI: A new application for the Z 0011 Era

    Energy Technology Data Exchange (ETDEWEB)

    Schacht, David V., E-mail: dschacht@radiology.bsd.uchicago.edu; Drukker, Karen, E-mail: kdrukker@uchicago.edu; Pak, Iris, E-mail: irisgpak@gmail.com; Abe, Hiroyuki, E-mail: habe@radiology.bsd.uchicago.edu; Giger, Maryellen L., E-mail: m-giger@uchicago.edu

    2015-03-15

    Highlights: •Quantitative image analysis showed promise in evaluating axillary lymph nodes. •13 of 28 features performed better than guessing at metastatic status. •When all features were used together, a considerably higher AUC was obtained. -- Abstract: Purpose: To assess the performance of computer-extracted feature analysis of dynamic contrast enhanced (DCE) magnetic resonance images (MRI) of axillary lymph nodes, and to determine which quantitative features best predict nodal metastasis. Methods: This institutional board-approved, HIPAA compliant study, in which informed patient consent was waived, collected enhanced T1 images of the axilla from patients with breast cancer. Lesion segmentation and feature analysis were performed on 192 nodes using a laboratory-developed quantitative image analysis (QIA) workstation. The importance of 28 features was assessed. Classification used the features as input to a neural net classifier in a leave-one-case-out cross-validation and was evaluated with receiver operating characteristic (ROC) analysis. Results: The area under the ROC curve (AUC) values for features in the task of distinguishing between positive and negative nodes ranged from just over 0.50 to 0.70. Five features yielded AUCs greater than 0.65: two morphological and three textural features. In cross-validation, the neural net classifier obtained an AUC of 0.88 (SE 0.03) for the task of distinguishing between positive and negative nodes. Conclusion: QIA of DCE MRI demonstrated promising performance in discriminating between positive and negative axillary nodes.
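
    A generic version of the classification step (features in, leave-one-out neural-network prediction, ROC analysis out) can be sketched with scikit-learn; the file names, network size and treatment of each node as its own case are assumptions for illustration, not the laboratory workstation's implementation:

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import roc_auc_score

        # Hypothetical inputs: X = 28 morphological/textural features per node,
        # y = nodal status (1 = metastatic/positive, 0 = negative).
        X, y = np.load("node_features.npy"), np.load("node_labels.npy")

        clf = make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(5,), max_iter=2000, random_state=0))

        # Leave-one-out cross-validated probabilities, then ROC analysis.
        proba = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")
        print("cross-validated AUC:", roc_auc_score(y, proba[:, 1]))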

  7. [Research advances in application of isobaric tags for relative and absolute quantitation in proteomics of hepatocellular carcinoma].

    Science.gov (United States)

    Qi, Y Z; Chen, L S; Xu, P

    2016-12-20

    Due to the features of strong heterogeneity, difficult early diagnosis, poor prognosis, and high fatality rate, hepatocellular carcinoma (HCC) has become an important disease which threatens the health of the Chinese population. Accurate early diagnosis is crucial to improving the success rate of liver cancer resection and reducing postoperative recurrence and metastasis, and its core is the screening and validation of biomarkers for early diagnosis. Isobaric tags for relative and absolute quantitation (iTRAQ) and stable isotope labeling with amino acids in cell culture are important parts of proteomics technology, and iTRAQ has become the most important technique in quantitative proteomics technology due to its advantages of high throughput, high quantitative accuracy, and no limitation by sample source. This article reviews the research advances in molecular mechanism of the development and progression of HCC and screening of markers, in order to establish a theoretical foundation for in-depth understanding of the molecular mechanisms of the development and progression of HCC and the development of new biomarkers.

  8. Electrochemical detection of magnetically-entrapped DNA sequences from complex samples by multiplexed enzymatic labelling: Application to a transgenic food/feed quantitative survey.

    Science.gov (United States)

    Manzanares-Palenzuela, C L; Martín-Clemente, J P; Lobo-Castañón, M J; López-Ruiz, B

    2017-03-01

    Monitoring of genetically modified organisms in food and feed demands molecular techniques that deliver accurate quantitative results. Electrochemical DNA detection has been widely described in this field, yet most reports convey qualitative data and application in processed food and feed samples is limited. Herein, the applicability of an electrochemical multiplex assay for DNA quantification in complex samples is assessed. The method consists of the simultaneous magnetic entrapment via sandwich hybridisation of two DNA sequences (event-specific and taxon-specific) onto the surface of magnetic microparticles, followed by bienzymatic labelling. As proof-of-concept, we report its application in a transgenic food/feed survey where relative quantification (two-target approach) of Roundup Ready Soybean® (RRS) was performed in food and feed. Quantitative coupling to end-point PCR was performed and calibration was achieved from 22 and 243 DNA copies spanning two orders of magnitude for the event and taxon-specific sequences, respectively. We collected a total of 33 soybean-containing samples acquired in local supermarkets, four out of which were found to contain undeclared presence of genetically modified soybean. A real-time PCR method was used to verify these findings. High correlation was found between results, indicating the suitability of the proposed multiplex method for food and feed monitoring. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Precision and accuracy in the quantitative analysis of biological samples by accelerator mass spectrometry: application in microdose absolute bioavailability studies.

    Science.gov (United States)

    Gao, Lan; Li, Jing; Kasserra, Claudia; Song, Qi; Arjomand, Ali; Hesk, David; Chowdhury, Swapan K

    2011-07-15

    Determination of the pharmacokinetics and absolute bioavailability of an experimental compound, SCH 900518, following an 89.7 nCi (100 μg) intravenous (iv) dose of (14)C-SCH 900518 given 2 h post 200 mg oral administration of nonradiolabeled SCH 900518 to six healthy male subjects has been described. The plasma concentration of SCH 900518 was measured using a validated LC-MS/MS system, and accelerator mass spectrometry (AMS) was used for quantitative plasma (14)C-SCH 900518 concentration determination. Calibration standards and quality controls were included for every batch of sample analysis by AMS to ensure acceptable quality of the assay. Plasma (14)C-SCH 900518 concentrations were derived from the regression function established from the calibration standards, rather than directly from isotopic ratios from AMS measurement. The precision and accuracy of quality controls and calibration standards met the requirements of bioanalytical guidance (U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Center for Veterinary Medicine. Guidance for Industry: Bioanalytical Method Validation (ucm070107), May 2001. http://www.fda.gov/downloads/Drugs/GuidanceCompilanceRegulatoryInformation/Guidances/ucm070107.pdf ). The AMS measurement had a linear response range from 0.0159 to 9.07 dpm/mL for plasma (14)C-SCH 900518 concentrations. The CV and accuracy were 3.4-8.5% and 94-108% (82-119% for the lower limit of quantitation (LLOQ)), respectively, with a correlation coefficient of 0.9998. The absolute bioavailability was calculated from the dose-normalized area under the curve of iv and oral doses after the plasma concentrations were plotted vs the sampling time post oral dose. The mean absolute bioavailability of SCH 900518 was 40.8% (range 16.8-60.6%). The typical accuracy and standard deviation in AMS quantitative analysis of drugs from human plasma samples have been reported for the first time.

  10. Application of a series of artificial neural networks to on-site quantitative analysis of lead into real soil samples by laser induced breakdown spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    El Haddad, J. [Univ. Bordeaux, LOMA, CNRS UMR 5798, F-33400 Talence (France); Bruyère, D. [BRGM, Service Métrologie, Monitoring et Analyse, 3 av. C. Guillemin, B.P 36009, 45060 Orléans Cedex (France); Ismaël, A.; Gallou, G. [IVEA Solution, Centre Scientifique d' Orsay, Bât 503, 91400 Orsay (France); Laperche, V.; Michel, K. [BRGM, Service Métrologie, Monitoring et Analyse, 3 av. C. Guillemin, B.P 36009, 45060 Orléans Cedex (France); Canioni, L. [Univ. Bordeaux, LOMA, CNRS UMR 5798, F-33400 Talence (France); Bousquet, B., E-mail: bruno.bousquet@u-bordeaux.fr [Univ. Bordeaux, LOMA, CNRS UMR 5798, F-33400 Talence (France)

    2014-07-01

    Artificial neural networks were applied to process data from on-site LIBS analysis of soil samples. A first artificial neural network allowed retrieval of the relative amounts of silicate, calcareous and ore matrices in soils. As a consequence, each soil sample was correctly located inside the ternary diagram characterized by these three matrices, as verified by ICP-AES. Then a series of artificial neural networks were applied to quantify lead in the soil samples. More precisely, two models were designed for classification purposes according to both the type of matrix and the range of lead concentrations. Then, three quantitative models were locally applied to three data subsets. This complete approach allowed reaching a relative error of prediction close to 20%, considered satisfactory in the case of on-site analysis. - Highlights: • Application of a series of artificial neural networks (ANN) to quantitative LIBS • Matrix-based classification of the soil samples by ANN • Concentration-based classification of the soil samples by ANN • Series of quantitative ANN models dedicated to the analysis of data subsets • Relative error of prediction lower than 20% for LIBS analysis of soil samples.
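
    The classify-then-quantify structure described here (route each spectrum to a matrix class, then apply a local regression model trained only on that subset) can be sketched generically as below; the data files, network sizes and use of scikit-learn are illustrative assumptions, not the authors' implementation:

        import numpy as np
        from sklearn.neural_network import MLPClassifier, MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Hypothetical data: X = LIBS spectra (one row per soil sample),
        # matrix_class = dominant matrix label, pb = lead concentration (ppm).
        X = np.load("libs_spectra.npy")
        matrix_class = np.load("matrix_class.npy")
        pb = np.load("pb_ppm.npy")

        # Stage 1: classify each sample (by matrix here; the paper also splits
        # by lead-concentration range).
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(20,), max_iter=3000,
                                          random_state=0)).fit(X, matrix_class)

        # Stage 2: one local quantitative model per class, trained on that subset only.
        local_models = {}
        for cls in np.unique(matrix_class):
            idx = matrix_class == cls
            local_models[cls] = make_pipeline(
                StandardScaler(),
                MLPRegressor(hidden_layer_sizes=(10,), max_iter=3000,
                             random_state=0)).fit(X[idx], pb[idx])

        def predict_pb(spectrum):
            cls = clf.predict(spectrum.reshape(1, -1))[0]
            return local_models[cls].predict(spectrum.reshape(1, -1))[0]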

  11. Quantitative linear and nonlinear resonance inspection techniques and analysis for material characterization: application to concrete thermal damage.

    Science.gov (United States)

    Payan, C; Ulrich, T J; Le Bas, P Y; Saleh, T; Guimaraes, M

    2014-08-01

    Developed in the late 1980s, Nonlinear Resonant Ultrasound Spectroscopy (NRUS) has been widely employed in the field of material characterization. Most studies assume the measured amplitude to be proportional to the strain amplitude which drives the nonlinear phenomena. In 1D resonant bar experiments, the configuration for which NRUS was initially developed, this assumption holds. However, it is not true for samples of general shape, which exhibit several resonance mode shapes. This paper proposes a methodology based on linear resonant ultrasound spectroscopy, numerical simulations and nonlinear resonant ultrasound spectroscopy to provide quantitative values of nonlinear elastic moduli, taking into account the 3D nature of the samples. In the context of license renewal in the field of nuclear energy, this study aims at providing quantitative information related to the degree of micro-cracking of concrete and cement-based materials in the presence of thermal damage. The resonance-based method is validated with regard to concrete microstructure evolution during thermal exposure.
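
    For reference, NRUS analyses usually extract a hysteretic nonlinearity parameter from the downward, amplitude-proportional shift of a resonance frequency; in the commonly used form (a generic relation, not this paper's 3D formulation),

        \frac{f(\Delta\varepsilon) - f_0}{f_0} = \alpha_h \, \Delta\varepsilon

    where f_0 is the low-amplitude (linear) resonance frequency, Δε the driving strain amplitude and α_h the nonlinear parameter tracked as a damage indicator. The point of the methodology above is that relating the measured vibration amplitude to the true Δε of a 3D sample requires the simulated mode shapes.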

  12. Quantitative geochemical modelling using leaching tests: application for coal ashes produced by two South African thermal processes.

    Science.gov (United States)

    Hareeparsad, Shameer; Tiruta-Barna, Ligia; Brouckaert, Chris J; Buckley, Chris A

    2011-02-28

    The present work focuses on the reactivity of coal fly ash in aqueous solutions, studied through geochemical modelling. The coal fly ashes studied originate from South African industrial sites. The adopted methodology is based on mineralogical analysis, laboratory leaching tests and geochemical modelling. A quantitative modelling approach is developed here in order to determine the quantities of the different solid phases composing the coal fly ash. It employs a geochemical code (PHREEQC) and a numerical optimisation tool developed under MATLAB, linked via a coupling program. The experimental conditions are those of the laboratory leaching test, i.e. a liquid/solid ratio of 10 L/kg and a 48 h contact time. The simulation results compared with the experimental data demonstrate the feasibility of such an approach, which is the scope of the present work. The perspective opened by the quantitative geochemical modelling is the prediction of waste reactivity under different leaching conditions and time frames. This work is part of a larger research project initiated by the Sasol and Eskom companies, the largest South African coal consumers, aiming to address the issue of waste management of coal combustion residues and the environmental impact assessment of coal ash disposal on land. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Quantitative Y2H screening: cloning and signal peptide engineering of a fungal secretory LacA gene and its application to yeast two-hybrid system as a quantitative reporter.

    Science.gov (United States)

    Kamiya, Takuma; Ojima, Teruyo; Sugimoto, Kanoko; Nakano, Hideo; Kawarasaki, Yasuaki

    2010-04-15

    A quantitative protein/peptide screening system amenable to high-throughput screening has been developed by furnishing conventional yeast two-hybrid (Y2H) system with an engineered fungal secretory beta-galactosidase gene (designated LacA3). We describe the molecular cloning and signal peptide-optimization of the original fungal LacA gene of which extracellular expression was initially toxic to the host cell. The engineered LacA, LacA3, showed less toxicity, resulting in improved cultural properties of the host. The release of the enzyme to the medium was constant to the cell density under a certain induction condition and independent of the growth phase. The released enzyme kept the wild type properties, was highly glycosylated, stable in a wide pH range and high temperature, and had an acidic pH optimum. In the Y2H system with the novel reporter in combination with the conventional Y2H reporters, the yeast colonies are visibly stained in blue, white or red in the growth context, according to the interaction intensity. The clones with the more stable interactions are easily found as colonies with the larger blue halos, due to the increased extracellular LacA3 expression. A quantitative, high-throughput Y2H screening of cDNA library based on the novel reporter was demonstrated. An application of the novel Y2H system to directed evolution of a peptide fragment was also exemplified. (c) 2010 Elsevier B.V. All rights reserved.

  14. Quantitative Structure-Use Relationship Model thresholds for Model Validation, Domain of Applicability, and Candidate Alternative Selection

    Data.gov (United States)

    U.S. Environmental Protection Agency — This file contains values of the model training set confusion matrix, domain of applicability evaluation based on training set to predicted chemicals structural...

  15. Quantitation of clevidipine in dog blood by liquid chromatography tandem mass spectrometry: application to a pharmacokinetic study.

    Science.gov (United States)

    Wei, Huihui; Gu, Yuan; Liu, Yanping; Chen, Yong; Liu, Changxiao; Si, Duanyun

    2014-11-15

    Clevidipine, a vascular selective calcium channel antagonist of the dihydropyridine class, is rapidly metabolized by ester hydrolysis because of the incorporation of an ester linkage into the drug molecule. To characterize its pharmacokinetic profiles in dogs, a simple, rapid and sensitive liquid chromatography-tandem mass spectrometry (LC-MS/MS) method was developed and validated for quantitation of clevidipine in dog blood. After one-step protein precipitation with methanol, the chromatographic separation was carried out on an Ecosil C18 column (150 mm × 4.6 mm, 5 μm) with a gradient mobile phase consisting of methanol and 5 mM ammonium formate at a flow rate of 0.5 mL/min. The quantitation analysis was performed using multiple reaction monitoring (MRM) at the specific ion transitions of m/z 454.1 [M-H](-)→m/z 234.1 for clevidipine and m/z 256.1 [M-H](-)→m/z 227.1 for elofesalamide (internal standard) in the negative ion mode with an electrospray ionization (ESI) source. This validated LC-MS/MS method showed good linearity over the range 0.5-100 ng/mL with a lower limit of quantitation (LLOQ) of 0.5 ng/mL, together with satisfactory intra- and inter-day precision, accuracy, extraction recovery and matrix effect. Stability testing indicated that clevidipine in dog blood with the addition of denaturant methanol was stable on the workbench for 1 h, at -80°C for up to 30 days, and after three freeze-thaw cycles. Extracted samples were also observed to be stable over 24 h in an auto-sampler at 4°C. The validated method has been successfully applied to a pharmacokinetic study of clevidipine injection in 8 healthy Beagle dogs following intravenous infusion at a rate of 5 mg/h for 0.5 h. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Preliminary Discussion On The Three Dimensional Space Quantitative Analysis Of Erythrocytes By SEMP And Some Applications On The Clinic And Research Of Blood Disease.

    Science.gov (United States)

    Lian-Huang, Lu; Wen-Meng, Tong; Zhi-Jun, Zhang; Gui-Huan, He; Su-Hui, Huan

    1989-04-01

    Abnormality in the quality and quantity of erythrocytes is one of the important changes in blood disease. It reflects an abnormal blood-forming function of the human body. Therefore, the study of changes in the shape of erythrocytes is an indispensable and important basis of reference in the clinical practice, diagnosis and research of blood disease. In this paper, a preliminary discussion is made on the acquisition of scanning stereographs of erythrocytes, the application of the theory of photographic measurement to the three-dimensional quantitative analysis of erythrocytes, the drawing of isoline maps and section maps of various erythrocytes for normal persons, paroxysmal nocturnal hemoglobinuria (PNH) patients and aplastic anemia patients, the study of the shape characteristics of normal erythrocytes and various abnormal erythrocytes, and the applications in clinical practice, diagnosis and research. This research is a combination of microphotogrammetry and erythrocyte morphology. It is possible to push forward the study of erythrocyte morphology from LM and SEM to the higher stage of scanning electron micrographic photogrammetry (SEMP) for stereographic observation and three-dimensional quantitative analysis, to explore a new path for the further study of the shape of erythrocytes.

  17. Qualitative and Quantitative Analysis of Congested Marine Traffic Environment – An Application Using Marine Traffic Simulation System

    Directory of Open Access Journals (Sweden)

    Kazuhiko Hasegawa

    2013-06-01

    Full Text Available Difficulty of sailing is a quite subjective matter. It depends on various factors. Using the Marine Traffic Simulation System (MTSS) developed by Osaka University, this challenging subject is discussed. In this system realistic traffic flow including collision avoidance manoeuvres can be reproduced in a given area. Simulation is done for the area southward of Tokyo Bay, the Strait of Singapore and the off-Shanghai area, changing traffic volume from 5 or 50 to 150 or 200% of the present volume. As a result, a strong proportional relation between near-miss ratio and traffic density per hour per sailed area is found, independent of traffic volume, area size and configuration. The quantitative evaluation index of the difficulty of sailing, here called the risk rate of the area, is defined using the traffic density and near-miss ratio so defined.
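
    One possible reading of the index defined above is sketched below, under the assumptions that traffic density is ships per hour per unit of sailed area, the near-miss ratio is near misses per simulated ship, and the risk rate combines the two multiplicatively; the paper's exact formula may differ, and all numbers are hypothetical.

    ```python
    # Illustrative combination of traffic density and near-miss ratio into an
    # area "risk rate"; values below are invented.
    def traffic_density(n_ships: int, hours: float, sailed_area_nm2: float) -> float:
        return n_ships / (hours * sailed_area_nm2)

    def near_miss_ratio(n_near_misses: int, n_ships: int) -> float:
        return n_near_misses / n_ships

    def risk_rate(n_ships: int, hours: float, sailed_area_nm2: float,
                  n_near_misses: int) -> float:
        return (traffic_density(n_ships, hours, sailed_area_nm2)
                * near_miss_ratio(n_near_misses, n_ships))

    # e.g. a hypothetical 150 %-volume scenario for one simulated area
    print(risk_rate(n_ships=900, hours=24.0, sailed_area_nm2=350.0, n_near_misses=54))
    ```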

  18. An operational weather radar-based Quantitative Precipitation Estimation and its application in catchment water resources modeling

    DEFF Research Database (Denmark)

    He, Xin; Vejen, Flemming; Stisen, Simon

    2011-01-01

    The Danish Meteorological Institute operates a radar network consisting of five C-band Doppler radars. Quantitative precipitation estimation (QPE) using radar data is performed on a daily basis. Radar QPE is considered to have the potential to significantly improve the spatial representation...... of precipitation compared with rain-gauge-based methods, thus providing the basis for better water resources assessments. The radar QPE algorithm called ARNE is a distance-dependent areal estimation method that merges radar data with ground surface observations. The method was applied to the Skjern River catchment...... in western Denmark where alternative precipitation estimates were also used as input to an integrated hydrologic model. The hydrologic responses from the model were analyzed by comparing radar- and ground-based precipitation input scenarios. Results showed that radar QPE products are able to generate...

  19. A network-based approach for semi-quantitative knowledge mining and its application to yield variability

    Science.gov (United States)

    Schauberger, Bernhard; Rolinski, Susanne; Müller, Christoph

    2016-12-01

    Variability of crop yields is detrimental for food security. Under climate change its amplitude is likely to increase, so it is essential to understand the underlying causes and mechanisms. Crop models are the primary tool to project future changes in crop yields under climate change. A systematic overview of drivers and mechanisms of crop yield variability (YV) can thus inform crop model development and facilitate improved understanding of climate change impacts on crop yields. Yet there is a vast body of literature on crop physiology and YV, which makes a prioritization of mechanisms for implementation in models challenging. Therefore, this paper takes a novel approach to systematically mine and organize existing knowledge from the literature. The aim is to identify important mechanisms lacking in models, which can help to set priorities in model improvement. We structure knowledge from the literature in a semi-quantitative network. This network consists of complex interactions between growing conditions, plant physiology and crop yield. We utilize the resulting network structure to assign relative importance to causes of YV and related plant physiological processes. As expected, our findings confirm existing knowledge, in particular on the dominant role of temperature and precipitation, but also highlight other important drivers of YV. More importantly, our method allows for identifying the relevant physiological processes that transmit variability in growing conditions to variability in yield. We can identify explicit targets for the improvement of crop models. The network can additionally guide model development by outlining complex interactions between processes and by easily retrieving quantitative information for each of the 350 interactions. We show the validity of our network method as a structured, consistent and scalable dictionary of literature. The method can easily be applied to many other research fields.
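
    A minimal sketch of the network idea described above: drivers, physiological processes and yield are nodes of a weighted directed graph, and a node's importance for yield variability is scored by accumulating edge weights along all simple paths to the yield node. This path-product score is an illustrative proxy, not the paper's exact metric, and the edges and weights are invented.

    ```python
    # Toy semi-quantitative knowledge network and a simple importance score.
    import networkx as nx

    G = nx.DiGraph()
    edges = [("temperature", "photosynthesis", 0.9),
             ("temperature", "grain filling", 0.7),
             ("precipitation", "water status", 0.8),
             ("water status", "photosynthesis", 0.6),
             ("photosynthesis", "yield", 0.9),
             ("grain filling", "yield", 0.8)]
    G.add_weighted_edges_from(edges)

    def importance(node: str, target: str = "yield") -> float:
        """Sum of edge-weight products over all simple paths from node to target."""
        total = 0.0
        for path in nx.all_simple_paths(G, node, target):
            w = 1.0
            for u, v in zip(path, path[1:]):
                w *= G[u][v]["weight"]
            total += w
        return total

    for n in G.nodes:
        if n != "yield":
            print(f"{n:15s} {importance(n):.2f}")
    ```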

  20. Three-dimensional histology: tools and application to quantitative assessment of cell-type distribution in rabbit heart.

    Science.gov (United States)

    Burton, Rebecca A B; Lee, Peter; Casero, Ramón; Garny, Alan; Siedlecka, Urszula; Schneider, Jürgen E; Kohl, Peter; Grau, Vicente

    2014-11-01

    Cardiac histo-anatomical organization is a major determinant of function. Changes in tissue structure are a relevant factor in normal and disease development, and form targets of therapeutic interventions. The purpose of this study was to test tools aimed at allowing quantitative assessment of cell-type distribution from large histology and magnetic resonance imaging (MRI)-based datasets. Rabbit heart fixation during cardioplegic arrest and MRI were followed by serial sectioning of the whole heart and light-microscopic imaging of trichrome-stained tissue. Segmentation techniques developed specifically for this project were applied to segment myocardial tissue in the MRI and histology datasets. In addition, histology slices were segmented into myocytes, connective tissue, and undefined. A bounding surface, containing the whole heart, was established for both MRI and histology. Volumes contained in the bounding surface (called 'anatomical volume'), as well as those identified as containing any of the above tissue categories (called 'morphological volume'), were calculated. The anatomical volume was 7.8 cm³ in MRI, and this reduced to 4.9 cm³ after histological processing, representing an 'anatomical' shrinkage of 37.2%. The morphological volume decreased by 48% between MRI and histology, highlighting the presence of additional tissue-level shrinkage (e.g. an increase in interstitial cleft space). The ratio of pixels classified as containing myocytes to pixels identified as non-myocytes was roughly 6:1 (61.6 vs. 9.8%; the remaining fraction of 28.6% was 'undefined'). Qualitative and quantitative differentiation between myocytes and connective tissue, using state-of-the-art high-resolution serial histology techniques, allows identification of cell-type distribution in whole-heart datasets. Comparison with MRI illustrates a pronounced reduction in anatomical and morphological volumes during histology processing. © The Author 2014. Published by Oxford University Press

  1. Application of Programmable Bio-Nano-Chip System for the Quantitative Detection of Drugs of Abuse in Oral Fluids*

    Science.gov (United States)

    Christodoulides, Nicolaos; De La Garza, Richard; Simmons, Glennon W.; McRae, Michael P.; Wong, Jorge; Newton, Thomas F.; Smith, Regina; Mahoney, James J.; Hohenstein, Justin; Gomez, Sobeyda; Floriano, Pierre N.; Talavera, Humberto; Sloan, Daniel J.; Moody, David E.; Andrenyak, David M.; Kosten, Thomas R.; Haque, Ahmed; McDevitt, John T.

    2015-01-01

    Objective There is currently a gap in on-site drug-of-abuse monitoring. Current detection methods involve invasive sampling of blood and urine specimens, or collection of oral fluid, followed by qualitative screening tests using immunochromatographic cartridges. While remote laboratories may then provide confirmation and quantitative assessment of a presumptive positive, this instrumentation is expensive and decoupled from the initial sampling, making the current drug-screening program inefficient and costly. The authors applied a noninvasive oral fluid sampling approach integrated with the in-development chip-based Programmable Bio-Nano-Chip (p-BNC) platform for the detection of drugs of abuse. Method The p-BNC assay methodology was applied for the detection of tetrahydrocannabinol, morphine, amphetamine, methamphetamine, cocaine, methadone and benzodiazepines, initially using spiked buffered samples and, ultimately, using oral fluid specimens collected from consented volunteers. Results Rapid (~10 minutes), sensitive detection (~ng/ml) and quantitation of 12 drugs of abuse were demonstrated on the p-BNC platform. Furthermore, the system provided visibility into the time course of select drug and metabolite profiles in oral fluids; for the drug cocaine, three regions of slope were observed that, when combined with concentration measurements from this and prior impairment studies, may reveal information about cocaine-induced impairment. Conclusions This chip-based p-BNC detection modality has significant potential to be used in the future by law enforcement officers for roadside drug testing and to serve a variety of other settings, including outpatient and inpatient drug rehabilitation centers, emergency rooms, prisons, schools, and the workplace. PMID:26048639

  2. Two novel quantitative trait linkage analysis statistics based on the posterior probability of linkage: application to the COGA families.

    Science.gov (United States)

    Bartlett, Christopher W; Vieland, Veronica J

    2005-12-30

    In this paper we apply two novel quantitative trait linkage statistics based on the posterior probability of linkage (PPL) to chromosome 4 from the GAW 14 COGA dataset. Our approaches are advantageous since they use the full likelihood, use full phenotypic information, and do not assume normality at the population level or require population/sample parameter estimates; and, like other forms of the PPL, they are specifically tailored to accumulate linkage evidence, either for or against linkage, across multiple sets of heterogeneous data. The first statistic uses all quantitative trait (QT) information from the pedigree (the QT posterior probability of linkage, QT-PPL); we applied the QT-PPL to the trait ecb21 (resting electroencephalogram). The second statistic allows simultaneous incorporation of dichotomous trait data into the QT analysis via a threshold model (QTT-PPL); we applied the QTT-PPL to combined data on ecb21 and ALDX1. We obtained a QT-PPL of 96% at GABRB1 and a QT-PPL of 18% at FABP2, while the QTT-PPL was 4% and 2% at the same two loci, respectively. By comparison, the variance-components (VC) method, as implemented in SOLAR, yielded multipoint VC LOD scores of 2.05 and 2.21 at GABRB1 and FABP2, respectively; no other VC LODs were greater than 2. The QTT-PPL was only 4% at GABRB1, which might suggest that the underlying ecb21 gene does not also cause ALDX1, although features of the data complicate interpretation of this result.

  3. System Establishment and Method Application for Quantitatively Evaluating the Green Degree of the Products in Green Public Procurement

    Directory of Open Access Journals (Sweden)

    Shengguo Xu

    2016-09-01

    Full Text Available Government green purchasing is widely considered to be an effective means of promoting sustainable consumption. However, how to identify the greener product is the biggest obstacle to government green purchasing, and it has not been well solved. A quantitative evaluation method is provided to measure the green degree of different products of the same use function, with an indicator system established that includes fundamental indicators, general indicators, and leading indicators. It can clearly show the products' green extent by rating the scores of different products, which provides the government a tool to compare the green degree of different products and select greener ones. A comprehensive evaluation case of a project purchasing 1635 desktop computers in the Tianjin government procurement center was conducted using the green degree evaluation system. The environmental performance of the products was assessed quantitatively, and the evaluation price was the bid price minus a discount (the discount rate was set according to the total score attained for environmental performance); the final evaluation prices, ranked from low to high, were those of suppliers C, D, E, A, and B. The winner, supplier C, had neither the lowest bid price nor the best environmental performance, but it performed well on both, so it deserved the project. This shows that the green degree evaluation system can help distinguish different products by evaluating their environmental performance, including structure and connection technology, selection of materials and marks, prolonged use, hazardous substances, energy consumption, recyclability rate, etc., together with price, so that it can help to choose the greener products.

  4. The Effects of Time of Manure Application and Different Biological Fertilizers on Quantitative and Qualitative Characteristics of Cucurbita pepo L.

    Directory of Open Access Journals (Sweden)

    m Jahan

    2011-02-01

    Full Text Available Abstract To study the response of summer squash as a medicinal plant to two manure application times and to the utilization of different biofertilizers, a split-plot arrangement of factors based on a randomized complete block design with three replications was used in the 2008-09 growing season. Two manure application times (autumn and spring) were allocated to the main plots, and four biofertilizers, including 1- Nitragin (containing Azotobacter sp., Azospirillum sp. and Pseudomonas sp.), 2- phosphate solubilizing bacteria PSB (containing Pseudomonas sp. and Bacillus sp.), 3- Nitragin+PSB, and 4- control, were assigned to the sub plots. The results showed a significant effect of spring manure application on fruit and seed yield. Nitragin significantly increased fruit and seed yield. The superiority of spring manure application was also revealed in seed and fruit number. A positive correlation (R² = 0.92) was found between fruit and seed yield, with a linear trend in the range of 10 to 20 t ha⁻¹ and leveling off at fruit yields above 20 t ha⁻¹. The seed oil and protein content were not affected by the treatments; however, the biofertilizers increased oil and protein yield compared to the control. Overall, the biofertilizers could be an appropriate alternative to chemical fertilizers to achieve ecological production of summer squash. Keywords: Schneider Squash, Biofertilizers, Seed yield, Seed oil

  5. Recent technologic developments on high-resolution beta imaging systems for quantitative autoradiography and double labeling applications

    CERN Document Server

    Barthe, N; Chatti, K; Coulon, P; Maitrejean, S; 10.1016/j.nima.2004.03.014

    2004-01-01

    Two novel beta imaging systems, particularly interesting in the field of radiopharmacology and molecular biology research, have been developed in recent years. (1) A beta imager was derived from research conducted by Prof. Charpak at CERN. This parallel plate avalanche chamber is a direct detection system for beta radioactivity, which is particularly adapted for qualitative and quantitative autoradiography. With this detector, autoradiographic techniques can be performed with emitters such as 99mTc, because this radionuclide emits many low-energy electrons and the detector has a very low sensitivity to low-range gamma-rays. Its sensitivity (smallest activity detected: 0.007 cpm/mm² for ³H and 0.01 for ¹⁴C), linearity (over a dynamic range of 10⁴) and spatial resolution (50 μm for ³H or 99mTc to 150 μm for ³²P or ¹⁸F (β⁺)) give a real interest to this system as a new imaging device. Its principle of detection is based on the analysis of light emitte...

  6. [Development and application of TaqMan-MGB real-time quantitative PCR assay for detection of goat pox virus].

    Science.gov (United States)

    Cheng, Zhentao; Yue, Jun; Li, Yongming; Xu, Leren; Wang, Kaigong; Zhou, Bijun; Chen, Junyi; Li, Jun; Jiang, Nan

    2009-03-01

    The complete gene sequences of eight capripoxvirus strains in GenBank were aligned and analyzed with DNAStar software. We selected a 64 bp gene fragment located in the gp064 region of the goat pox virus (GPV) genome, and designed a pair of primers and a TaqMan-MGB probe against the gene fragment with Primer Express 2.0 software. Then, the fluorescence quantitative PCR (FQ-PCR) assay was developed and the standard curve for different dilution series was described. We extracted DNA samples from clinical skin pox and scab specimens and from GPV-infected materials of artificially challenged animals. The FQ-PCR assay was performed on all of the DNA samples. The results showed that the FQ-PCR assay was sensitive, specific and stable, and could be used for clinical diagnosis. This method provides an important tool for rapid clinical diagnosis of goat pox, and for studying GPV pathogenesis in the course of disease occurrence, development and convalescence.
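
    A minimal sketch of how a standard curve such as the one described is typically built and used: Ct values are regressed on log10 of the template copy number for a dilution series, the amplification efficiency is derived from the slope, and unknowns are quantified from their Ct. The Ct values below are invented.

    ```python
    # Real-time PCR standard curve: Ct vs log10(copies), efficiency from slope.
    import numpy as np

    copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3, 1e2])   # standard dilution series
    ct     = np.array([14.8, 18.2, 21.6, 25.0, 28.4, 31.9])  # invented Ct values

    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0              # ~1.0 means 100 % efficiency

    def quantify(ct_unknown: float) -> float:
        """Copy number of an unknown sample from its Ct via the standard curve."""
        return 10 ** ((ct_unknown - intercept) / slope)

    print(f"slope={slope:.2f}, efficiency={efficiency:.2%}, "
          f"unknown ~ {quantify(23.0):.2e} copies")
    ```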

  7. Application of quantitative real-time PCR for enumeration of total bacterial, archaeal, and yeast populations in kimchi.

    Science.gov (United States)

    Park, Eun-Jin; Chang, Ho-Won; Kim, Kyoung-Ho; Nam, Young-Do; Roh, Seong Woon; Bae, Jin-Woo

    2009-12-01

    Kimchi is a Korean traditional fermented food made of brined vegetables with a variety of spices. Various microorganisms are associated with the kimchi fermentation process. This study was undertaken in order to apply quantitative real-time PCR targeting the 16S and 26S rRNA genes to the investigation of the dynamics of bacterial, archaeal, and yeast communities during fermentation of various types of kimchi. Although the total bacterial and archaeal rRNA gene copy numbers increased during kimchi fermentation, the number of yeasts was not significantly altered. In 1 ng of bulk DNA, the mean number of rRNA gene copies for all strains of bacteria was 5.45 × 10⁶, which was 360 and 50 times greater than those for archaea and yeast, respectively. The total gene copy number for each group of microorganisms differed among the different types of kimchi, although the relative ratios among them were similar. The common dominance of bacteria in the whole microbial communities of various types of kimchi suggests that bacteria play a principal role in the kimchi fermentation process.

  8. Novel atomic absorption spectrometric and rapid spectrophotometric methods for the quantitation of paracetamol in saliva: application to pharmacokinetic studies.

    Science.gov (United States)

    Issa, M M; Nejem, R M; El-Abadla, N S; Al-Kholy, M; Saleh, Akila A

    2008-01-01

    A novel atomic absorption spectrometric method and two highly sensitive spectrophotometric methods were developed for the determination of paracetamol. These techniques are based on the oxidation of paracetamol by iron (III) (method I) and the oxidation of p-aminophenol after the hydrolysis of paracetamol (method II). Iron (II) then reacts with potassium ferricyanide to form a Prussian blue color with a maximum absorbance at 700 nm. The atomic absorption method was accomplished by extracting the excess iron (III) in method II and aspirating the aqueous layer into an air-acetylene flame to measure the absorbance of iron (II) at 302.1 nm. The reactions were spectrometrically evaluated to attain optimum experimental conditions. Linear responses were exhibited over the ranges 1.0-10, 0.2-2.0 and 0.1-1.0 μg/ml for method I, method II and the atomic absorption spectrometric method, respectively. High sensitivity was recorded for the proposed methods I and II and the atomic absorption spectrometric method, with values of 0.05, 0.022 and 0.012 μg/ml, respectively. The limits of quantitation of paracetamol by method II and the atomic absorption spectrometric method were 0.20 and 0.10 μg/ml. Method II and the atomic absorption spectrometric method were applied to a pharmacokinetic study by means of salivary samples from normal volunteers who received 1.0 g paracetamol. Intra- and inter-day precision did not exceed 6.9%.

  9. Linear regression analysis and its application to multivariate chromatographic calibration for the quantitative analysis of two-component mixtures.

    Science.gov (United States)

    Dinç, Erdal; Ozdemir, Abdil

    2005-01-01

    A multivariate chromatographic calibration technique was developed for the quantitative analysis of binary mixtures of enalapril maleate (EA) and hydrochlorothiazide (HCT) in tablets in the presence of losartan potassium (LST). The mathematical algorithm of the multivariate chromatographic calibration technique is based on the use of linear regression equations constructed from the relationship between concentration and peak area at a five-wavelength set. The algorithm of this calibration model, which has a simple mathematical content, is briefly described. This approach is a powerful mathematical tool for optimum chromatographic multivariate calibration and the elimination of fluctuations arising from instrumental and experimental conditions. The multivariate chromatographic calibration involves the reduction of multivariate linear regression functions to a univariate data set. The validation of the model was carried out by analyzing various synthetic binary mixtures and by using the standard addition technique. The developed calibration technique was applied to the analysis of real pharmaceutical tablets containing EA and HCT. The obtained results were compared with those obtained by a classical HPLC method. It was observed that the proposed multivariate chromatographic calibration gives better results than classical HPLC.
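
    A simplified sketch of the calibration idea, under the assumption that, for one analyte, the peak areas recorded at the five-wavelength set are regressed on the standard concentrations and an unknown is quantified by inverting each wavelength's regression and pooling the five estimates into a single univariate value. All numbers are invented for illustration.

    ```python
    # One least-squares line per wavelength, then reduction to a single estimate.
    import numpy as np

    conc_std = np.array([5.0, 10.0, 15.0, 20.0, 25.0])        # mg/L standards
    # rows: standards, columns: peak areas at the 5 wavelengths (invented)
    areas_std = np.array([[ 52,  48,  55,  50,  47],
                          [103,  97, 109, 101,  95],
                          [155, 146, 164, 151, 142],
                          [206, 195, 219, 201, 190],
                          [258, 244, 273, 252, 237]], dtype=float)

    # area = slope * concentration + intercept, fitted per wavelength
    coeffs = [np.polyfit(conc_std, areas_std[:, j], 1)
              for j in range(areas_std.shape[1])]

    def quantify(areas_unknown):
        """Invert each wavelength's regression and average the estimates."""
        estimates = [(a - b) / m for a, (m, b) in zip(areas_unknown, coeffs)]
        return float(np.mean(estimates))

    print(f"predicted concentration: {quantify([128, 121, 136, 125, 118]):.1f} mg/L")
    ```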

  10. Quantitative easing

    OpenAIRE

    Faustino, Rui Alexandre Rodrigues Veloso

    2012-01-01

    A Work Project, presented as part of the requirements for the Award of a Masters Degree in Economics from the NOVA – School of Business and Economics. Since November 2008, the Federal Reserve of the United States has pursued a series of large-scale asset purchases, known as Quantitative Easing. In this Work Project, I describe the context, the objectives and the implementation of Quantitative Easing. Additionally, I discuss its expected effects. Finally, I present empirical evidence of the ...

  11. Development and application of a backscatter lidar forward operator for quantitative validation of aerosol dispersion models and future data assimilation

    Science.gov (United States)

    Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Strohbach, Jens; Förstner, Jochen; Potthast, Roland

    2017-12-01

    A new backscatter lidar forward operator was developed which is based on the distinct calculation of the aerosols' backscatter and extinction properties. The forward operator was adapted to the COSMO-ART ash dispersion simulation of the Eyjafjallajökull eruption in 2010. While the particle number concentration was provided as a model output variable, the scattering properties of each individual particle type were determined by dedicated scattering calculations. Sensitivity studies were performed to estimate the uncertainties related to the assumed particle properties. Scattering calculations for several types of non-spherical particles required the use of T-matrix routines. Owing to the distinct calculation of the backscatter and extinction properties of the model's volcanic ash size classes, the sensitivity studies could be made for each size class individually, which is not the case for forward models based on a fixed lidar ratio. Finally, the forward-modeled lidar profiles were compared to automated ceilometer lidar (ACL) measurements both qualitatively and quantitatively, with the attenuated backscatter coefficient chosen as a suitable physical quantity. As the ACL measurements were not calibrated automatically, their calibration had to be performed using satellite lidar and ground-based Raman lidar measurements. A slight overestimation of the model-predicted volcanic ash number density was observed. Major requirements for future assimilation of data from ACLs have been identified, namely, the availability of calibrated lidar measurement data, a scattering database for atmospheric aerosols, a better representation and coverage of aerosols by the ash dispersion model, and further investigation of backscatter lidar forward operators which calculate the backscatter coefficient directly for each individual aerosol type. The introduced forward operator offers the flexibility to be adapted to a multitude of model systems and measurement setups.
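
    A schematic version of such a forward operator is sketched below, assuming each ash size class contributes a backscatter and an extinction coefficient and the attenuated backscatter is beta_att(z) = beta(z)·exp(-2·∫₀ᶻ alpha(z′) dz′). The cross-sections and the number-density profile are invented placeholders for what would come from T-matrix calculations and COSMO-ART output.

    ```python
    # Per-class backscatter/extinction summed into an attenuated backscatter profile.
    import numpy as np

    z = np.linspace(0.0, 10_000.0, 501)                  # height grid [m]
    dz = z[1] - z[0]

    # per-class cross-sections [m^2] (invented)
    sigma_back = np.array([2.0e-13, 8.0e-13, 3.0e-12])
    sigma_ext  = np.array([1.0e-11, 4.0e-11, 1.5e-10])

    # number density per class [m^-3]: toy Gaussian ash layer around 4 km
    layer = np.exp(-0.5 * ((z - 4000.0) / 500.0) ** 2)
    n = np.outer(np.array([5e6, 2e6, 5e5]), layer)       # shape (classes, levels)

    beta  = (sigma_back[:, None] * n).sum(axis=0)        # backscatter coefficient
    alpha = (sigma_ext[:,  None] * n).sum(axis=0)        # extinction coefficient
    tau   = np.cumsum(alpha) * dz                        # one-way optical depth
    beta_att = beta * np.exp(-2.0 * tau)                 # attenuated backscatter

    print(f"max attenuated backscatter: {beta_att.max():.3e} m^-1 sr^-1")
    ```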

  12. Genetic neural networks for quantitative structure-activity relationships: improvements and application of benzodiazepine affinity for benzodiazepine/GABAA receptors.

    Science.gov (United States)

    So, S S; Karplus, M

    1996-12-20

    A novel tool, called a genetic neural network (GNN), has been developed for obtaining quantitative structure-activity relationships (QSAR) for high-dimensional data sets (J. Med. Chem. 1996, 39, 1521-1530). The GNN method uses a neural network to correlate activity with descriptors that are preselected by a genetic algorithm. To provide an extended test of the GNN method, the data on 57 benzodiazepines given by Maddalena and Johnston (MJ; J. Med. Chem. 1995, 38, 715-724) have been examined with an enhanced version of GNN, and the results are compared with the excellent QSAR of MJ. The problematic steepest descent training has been replaced by the scaled conjugate gradient algorithm. This leads to a substantial gain in performance in both robustness of prediction and speed of computation. The cross-validation GNN simulation and the subsequent run based on an unbiased and more efficient protocol led to the discovery of other 10-descriptor QSARs that are superior to the best model of MJ based on backward elimination selection and neural network training. Results from a series of GNNs with a different number of inputs showed that a neural network with fewer inputs can produce QSARs as good as or even better than those with higher dimensions. The top-ranking models from a GNN simulation using only six input descriptors are presented, and the chemical significance of the chosen descriptors is discussed. The statistical significance of these GNN QSARs is validated. The best QSARs are used to provide a graphical tool that aids the design of new drug analogues. By replacing functional groups at the 7- and 2'-positions with ones that have optimal substituent parameters, a number of new benzodiazepines with high potency are predicted.

  13. A Strategy to Find Suitable Reference Genes for miRNA Quantitative PCR Analysis and Its Application to Cervical Specimens.

    Science.gov (United States)

    Babion, Iris; Snoek, Barbara C; van de Wiel, Mark A; Wilting, Saskia M; Steenbergen, Renske D M

    2017-09-01

    miRNAs represent an emerging class of promising biomarkers for cancer diagnostics. To perform reliable miRNA expression analysis using quantitative PCR, adequate data normalization is essential to remove nonbiological, technical variations. Ideal reference genes should be biologically stable and reduce technical variability of miRNA expression analysis. Herein is a new strategy for the identification and evaluation of reference genes that can be applied for miRNA-based diagnostic tests without entailing excessive additional experiments. We analyzed the expression of 11 carefully selected candidate reference genes in different types of cervical specimens [ie, tissues, scrapes, and self-collected cervicovaginal specimens (self-samples)]. To identify the biologically most stable reference genes, three commonly used algorithms (GeNorm, NormFinder, and BestKeeper) were combined. Signal-to-noise ratios and P values between control and disease groups were calculated to validate the reduction in technical variability on expression analysis of two marker miRNAs. miR-423 was identified as a suitable reference gene for all sample types, to be used in combination with RNU24 in cervical tissues, RNU43 in scrapes, and miR-30b in self-samples. These findings demonstrate that the choice of reference genes may differ between different types of specimens, even when originating from the same anatomical source. More important, it is shown that adequate normalization increases the signal-to-noise ratio, which is not observed when normalizing to commonly used reference genes. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
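
    For context, the sketch below computes a geNorm-style stability value of the kind such reference-gene screens rely on: for gene j, M_j is the average over all other candidates k of the standard deviation across samples of log2(expression_j/expression_k), with lower M meaning more stable. The expression matrix is invented, and the actual geNorm/NormFinder implementations differ in detail.

    ```python
    # geNorm-style stability value M for candidate reference genes.
    import numpy as np

    genes = ["miR-423", "RNU24", "RNU43", "miR-30b"]
    expr = np.array([[1050,  980, 2100, 400],     # rows: samples (invented values)
                     [1100, 1010, 1800, 520],     # columns: candidate genes
                     [ 990, 1050, 2500, 380],
                     [1080,  940, 1500, 610]], dtype=float)

    def genorm_m(expr: np.ndarray) -> np.ndarray:
        logs = np.log2(expr)
        n_genes = expr.shape[1]
        m = np.empty(n_genes)
        for j in range(n_genes):
            # std across samples of the log ratios to every other gene, averaged
            ratios = logs[:, j][:, None] - np.delete(logs, j, axis=1)
            m[j] = ratios.std(axis=0, ddof=1).mean()
        return m

    for g, m in sorted(zip(genes, genorm_m(expr)), key=lambda t: t[1]):
        print(f"{g:8s} M = {m:.3f}")
    ```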

  14. Quantitative phosphoproteomics using acetone-based peptide labeling: Method evaluation and application to a cardiac ischemia/reperfusion model

    Science.gov (United States)

    Wijeratne, Aruna B.; Manning, Janet R.; Schultz, Jo El J.; Greis, Kenneth D.

    2013-01-01

    Mass spectrometry (MS) techniques to globally profile protein phosphorylation in cellular systems that are relevant to physiological or pathological changes have been of significant interest in biological research. In this report, an MS-based strategy utilizing an inexpensive acetone-based peptide labeling technique known as reductive alkylation by acetone (RABA) for quantitative phosphoproteomics was explored to evaluate its capacity. Since the chemistry of RABA labeling for phosphorylation profiling had not been previously reported, it was first validated using a standard phosphoprotein and identical phosphoproteomes from cardiac tissue extracts. A workflow was then utilized to compare cardiac tissue phosphoproteomes from mouse hearts not expressing FGF2 vs. hearts expressing low molecular weight fibroblast growth factor-2 (LMW FGF2), in order to relate LMW FGF2-mediated cardioprotective phenomena induced by ischemia/reperfusion (I/R) injury of hearts with downstream phosphorylation changes in LMW FGF2 signaling cascades. Statistically significant phosphorylation changes were identified at 14 different sites on 10 distinct proteins, including some with mechanisms already established for LMW FGF2-mediated cardioprotective signaling (e.g. connexin-43), some with new details linking LMW FGF2 to the cardioprotective mechanisms (e.g. cardiac myosin binding protein C or cMyBPC), and also several new downstream effectors not previously recognized for cardioprotective signaling by LMW FGF2. Additionally, one of the phosphopeptides, cMyBPC/pSer-282, was further verified with site-specific quantification using an SRM (selected reaction monitoring)-based approach that also relies on isotope labeling of a synthetic phosphopeptide with deuterated acetone as an internal standard. Overall, this study confirms that the inexpensive acetone-based peptide labeling can be used in both exploratory and targeted quantification

  15. Identification and validation of reference genes for quantitative real-time PCR normalization and its applications in lycium.

    Directory of Open Access Journals (Sweden)

    Shaohua Zeng

    Full Text Available Lycium barbarum and L. ruthenicum are extensively used as traditional Chinese medicinal plants. Next generation sequencing technology provides a powerful tool for analyzing transcriptomic profiles of gene expression in non-model species. Such gene expression can then be confirmed with quantitative real-time polymerase chain reaction (qRT-PCR). Therefore, use of systematically identified suitable reference genes is a prerequisite for obtaining reliable gene expression data. Here, we calculated the expression stability of 18 candidate reference genes across samples from different tissues and samples grown under salt stress using the geNorm and NormFinder procedures. The geNorm-determined rank of reference genes was similar to that defined by NormFinder, with some differences. Both procedures confirmed that the single most stable reference gene was ACNTIN1 for L. barbarum fruits, H2B1 for L. barbarum roots, and EF1α for L. ruthenicum fruits. PGK3, H2B2, and PGK3 were identified as the best stable reference genes for salt-treated L. ruthenicum leaves, roots, and stems, respectively. H2B1 and GAPDH1+PGK1 for L. ruthenicum and SAMDC2+H2B1 for L. barbarum were the best single and/or combined reference genes across all samples. Finally, the expression of the salt-responsive gene NAC, the fruit ripening candidate gene LrPG, and anthocyanin genes was investigated to confirm the validity of the selected reference genes. The suitable reference genes identified in this study provide a foundation for accurately assessing gene expression and for a better understanding of novel gene function, helping to elucidate molecular mechanisms behind particular biological/physiological processes in Lycium.

  16. Identification and validation of reference genes for quantitative real-time PCR normalization and its applications in lycium.

    Science.gov (United States)

    Zeng, Shaohua; Liu, Yongliang; Wu, Min; Liu, Xiaomin; Shen, Xiaofei; Liu, Chunzhao; Wang, Ying

    2014-01-01

    Lycium barbarum and L. ruthenicum are extensively used as traditional Chinese medicinal plants. Next generation sequencing technology provides a powerful tool for analyzing transcriptomic profiles of gene expression in non-model species. Such gene expression can then be confirmed with quantitative real-time polymerase chain reaction (qRT-PCR). Therefore, use of systematically identified suitable reference genes is a prerequisite for obtaining reliable gene expression data. Here, we calculated the expression stability of 18 candidate reference genes across samples from different tissues and samples grown under salt stress using the geNorm and NormFinder procedures. The geNorm-determined rank of reference genes was similar to that defined by NormFinder, with some differences. Both procedures confirmed that the single most stable reference gene was ACNTIN1 for L. barbarum fruits, H2B1 for L. barbarum roots, and EF1α for L. ruthenicum fruits. PGK3, H2B2, and PGK3 were identified as the best stable reference genes for salt-treated L. ruthenicum leaves, roots, and stems, respectively. H2B1 and GAPDH1+PGK1 for L. ruthenicum and SAMDC2+H2B1 for L. barbarum were the best single and/or combined reference genes across all samples. Finally, the expression of the salt-responsive gene NAC, the fruit ripening candidate gene LrPG, and anthocyanin genes was investigated to confirm the validity of the selected reference genes. The suitable reference genes identified in this study provide a foundation for accurately assessing gene expression and for a better understanding of novel gene function, helping to elucidate molecular mechanisms behind particular biological/physiological processes in Lycium.

  17. ImmunoRatio: a publicly available web application for quantitative image analysis of estrogen receptor (ER), progesterone receptor (PR), and Ki-67.

    Science.gov (United States)

    Tuominen, Vilppu J; Ruotoistenmäki, Sanna; Viitanen, Arttu; Jumppanen, Mervi; Isola, Jorma

    2010-01-01

    Accurate assessment of estrogen receptor (ER), progesterone receptor (PR), and Ki-67 is essential in the histopathologic diagnostics of breast cancer. Commercially available image analysis systems are usually bundled with dedicated analysis hardware and, to our knowledge, no easily installable, free software for immunostained slide scoring has been described. In this study, we describe a free, Internet-based web application for quantitative image analysis of ER, PR, and Ki-67 immunohistochemistry in breast cancer tissue sections. The application, named ImmunoRatio, calculates the percentage of positively stained nuclear area (labeling index) by using a color deconvolution algorithm for separating the staining components (diaminobenzidine and hematoxylin) and adaptive thresholding for nuclear area segmentation. ImmunoRatio was calibrated using cell counts defined visually as the gold standard (training set, n = 50). Validation was done using a separate set of 50 ER, PR, and Ki-67 stained slides (test set, n = 50). In addition, Ki-67 labeling indexes determined by ImmunoRatio were studied for their prognostic value in a retrospective cohort of 123 breast cancer patients. The labeling indexes by calibrated ImmunoRatio analyses correlated well with those defined visually in the test set (correlation coefficient r = 0.98). Using the median Ki-67 labeling index (20%) as a cutoff, a hazard ratio of 2.2 was obtained in the survival analysis (n = 123, P = 0.01). ImmunoRatio was shown to adapt to various staining protocols, microscope setups, digital camera models, and image acquisition settings. The application can be used directly with web browsers running on modern operating systems (e.g., Microsoft Windows, Linux distributions, and Mac OS). No software downloads or installations are required. ImmunoRatio is open source software, and the web application is publicly accessible on our website. We anticipate that free web applications, such as ImmunoRatio, will make the
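
    A simplified, illustrative version of the labeling-index computation such an application performs: colour deconvolution separates the haematoxylin and DAB staining components, each channel is thresholded, and the index is the DAB-positive fraction of the total nuclear area. This is a sketch using Otsu thresholding, not the ImmunoRatio implementation itself (which uses adaptive thresholding and its own calibration).

    ```python
    # Colour deconvolution + thresholding to estimate a labeling index (%).
    import numpy as np
    from skimage import data
    from skimage.color import rgb2hed
    from skimage.filters import threshold_otsu

    def labeling_index(rgb: np.ndarray) -> float:
        hed = rgb2hed(rgb)                      # channels: haematoxylin, eosin, DAB
        haem, dab = hed[..., 0], hed[..., 2]
        haem_mask = haem > threshold_otsu(haem)
        dab_mask = dab > threshold_otsu(dab)
        total = np.logical_or(haem_mask, dab_mask).sum()
        return 100.0 * dab_mask.sum() / total if total else 0.0

    ihc = data.immunohistochemistry()           # example IHC image bundled with skimage
    print(f"labeling index: {labeling_index(ihc):.1f} %")
    ```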

  18. Development and Applications of a Laboratory Micro X-ray Fluorescence (μXRF) Spectrometer Using Monochromatic Excitation for Quantitative Elemental Analysis.

    Science.gov (United States)

    Garrevoet, Jan; Vekemans, Bart; Bauters, Stephen; Demey, Arne; Vincze, Laszlo

    2015-07-07

    The analytical characterization and an application example of a novel laboratory X-ray fluorescence (μXRF) microprobe are presented; the instrument combines monochromatic, focused X-ray beam excitation with a high-performance silicon drift detector (SDD) and two-dimensional/three-dimensional (2D/3D) scanning capability. Because of the monochromatic excitation, the XRF spectra obtained with this laboratory spectrometer have, below the (multiple) Compton/Rayleigh scattering peak region, peak-to-background ratios similarly high to those which can be obtained at synchrotron sources. However, the flux density of the proposed laboratory instrument is several orders of magnitude lower than that of current synchrotron end stations. As a result, sub-ppm minimum detection limits (MDL) for transition metals are obtained for a variety of sample matrices. The monochromatic excitation also allows the efficient use of an iterative Monte Carlo simulation algorithm to obtain quantitative information on the analyzed samples. The analytical characteristics of this instrument and quantitative results, in combination with an iterative reverse Monte Carlo simulation algorithm, are demonstrated using measurements conducted on an iron-containing meteorite.

  19. Application of Near-Infrared Spectroscopy to Quantitatively Determine Relative Content of Puccinia striiformis f. sp. tritici DNA in Wheat Leaves in Incubation Period

    Directory of Open Access Journals (Sweden)

    Yaqiong Zhao

    2017-01-01

    Full Text Available Stripe rust caused by Puccinia striiformis f. sp. tritici (Pst) is a devastating wheat disease worldwide. The potential application of near-infrared spectroscopy (NIRS) to the detection of pathogen amounts in latently Pst-infected wheat leaves was investigated for disease prediction and control. A total of 300 near-infrared spectra were acquired from the Pst-infected leaf samples in the incubation period, and the relative contents of Pst DNA in the samples were obtained using duplex TaqMan real-time PCR arrays. Determination models of the relative contents of Pst DNA in the samples were built using quantitative partial least squares (QPLS), support vector regression (SVR), and a method integrating QPLS and SVR. The results showed that the best model was the kQPLS-SVR model built on the original spectra with a training-to-testing set ratio of 3:1, when the number of randomly selected wavelength points was 700, the number of principal components was 8, and the number of QPLS models built was 5. The results indicated that quantitative detection of Pst DNA in leaves in the incubation period can be implemented using NIRS. A novel method for the determination of latent infection levels of Pst and the early detection of stripe rust was provided.
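
    A reduced sketch of the chemometric step described above: a partial least-squares model with 8 components regresses the relative Pst DNA content on NIR spectra, with a 3:1 training-to-testing split. The kQPLS-SVR ensemble of the paper is more elaborate, and the spectra and reference values below are synthetic placeholders.

    ```python
    # PLS regression of a reference value on synthetic NIR spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 700))                 # 300 spectra x 700 wavelength points
    w = rng.normal(size=700) / 700.0                # synthetic "true" spectral weights
    y = X @ w + rng.normal(scale=0.02, size=300)    # synthetic relative Pst DNA content

    # 3:1 training-to-testing split, as in the study design described above
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
    print(f"R^2 on the held-out quarter: {pls.score(X_te, y_te):.3f}")
    ```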

  20. PETROGRAPHY AND APPLICATION OF THE RIETVELD METHOD TO THE QUANTITATIVE ANALYSIS OF PHASES OF NATURAL CLINKER GENERATED BY COAL SPONTANEOUS COMBUSTION

    Directory of Open Access Journals (Sweden)

    Pinilla A. Jesús Andelfo

    2010-06-01

    Full Text Available

    Fine-grained, mainly reddish, compact and slightly brecciated and vesicular pyrometamorphic rocks (natural clinker) are associated with the spontaneous combustion of coal seams of the Cerrejón Formation exploited by Carbones del Cerrejón Limited in the La Guajira Peninsula (Caribbean Region of Colombia). These rocks constitute the remaining inorganic materials derived from claystones, mudstones and sandstones originally associated with the coal and are essentially a complex mixture of various amorphous and crystalline inorganic constituents. In this paper, a petrographic characterization of natural clinker, as well as the application of the X-ray diffraction (Rietveld) method to the quantitative analysis of its mineral phases, was carried out. The RIQAS program was used for the refinement of X-ray powder diffraction profiles, analyzing the importance of using the correct isostructural models for each of the existing phases, which were obtained from the Inorganic Crystal Structure Database (ICSD). The results obtained in this investigation show that the Rietveld method can be used as a powerful tool in the quantitative analysis of phases in polycrystalline samples, which has been a traditional problem in geology.

  1. Mining Discriminative Patterns from Graph Data with Multiple Labels and Its Application to Quantitative Structure-Activity Relationship (QSAR) Models.

    Science.gov (United States)

    Shao, Zheng; Hirayama, Yuya; Yamanishi, Yoshihiro; Saigo, Hiroto

    2015-12-28

    Graph data are becoming increasingly common in machine learning and data mining, and their application fields extend to bioinformatics and cheminformatics. Accordingly, graph mining, as a method to extract patterns from graph data, has recently been studied and developed rapidly. Since the number of patterns in graph data is huge, a central issue is how to efficiently collect informative patterns suitable for subsequent tasks such as classification or regression. In this paper, we consider mining discriminative subgraphs from graph data with multiple labels. The resulting task has important applications in cheminformatics, such as finding common functional groups that trigger multiple drug side effects, or identifying ligand functional groups that hit multiple targets. In computational experiments, we first verify the effectiveness of the proposed approach on synthetic data, then we apply it to a drug adverse effect prediction problem. In the latter dataset, we compared the proposed method with L1-norm logistic regression in combination with the PubChem/Open Babel fingerprint, and the proposed method showed superior performance with a much smaller number of subgraph patterns. Software is available from https://github.com/axot/GLP.

  2. A guide through the computational analysis of isotope-labeled mass spectrometry-based quantitative proteomics data: an application study

    Directory of Open Access Journals (Sweden)

    Haußmann Ute

    2011-06-01

    Full Text Available Abstract Background: Mass spectrometry-based proteomics has reached a stage where it is possible to comprehensively analyze the whole proteome of a cell in one experiment. Here, the employment of stable isotopes has become a standard technique to yield relative abundance values of proteins. In recent times, more and more experiments are conducted that do not merely depict a static image of the up- or down-regulated proteins at a distinct time point but instead compare developmental stages of an organism or varying experimental conditions. Results: Although the scientific questions behind these experiments are of course manifold, there are, nevertheless, two questions that commonly arise: (1) which proteins are differentially regulated regarding the selected experimental conditions, and (2) are there groups of proteins that show similar abundance ratios, indicating that they have a similar turnover? We give advice on how these two questions can be answered and comprehensively compare a variety of commonly applied computational methods and their outcomes. Conclusions: This work provides guidance through the jungle of computational methods to analyze mass spectrometry-based isotope-labeled datasets and recommends an effective and easy-to-use evaluation strategy. We demonstrate our approach with three recently published datasets on Bacillus subtilis and Corynebacterium glutamicum. Special focus is placed on the application and validation of cluster analysis methods. All applied methods were implemented within the rich internet application QuPE. Results can be found at http://qupe.cebitec.uni-bielefeld.de.

  3. From HYSOMA to ENSOMAP - A new open source tool for quantitative soil properties mapping based on hyperspectral imagery from airborne to spaceborne applications

    Science.gov (United States)

    Chabrillat, Sabine; Guillaso, Stephane; Rabe, Andreas; Foerster, Saskia; Guanter, Luis

    2016-04-01

    Soil spectroscopy from the visible-near infrared to the short wave infrared has been shown to be a proven method for the quantitative prediction of key soil surface properties in laboratory, field, and up to airborne studies for exposed soils in appropriate surface conditions. With the upcoming launch of the next generation of spaceborne hyperspectral sensors within the next 3 to 5 years (EnMAP, HISUI, PRISMA, SHALOM), great potential is emerging for the global mapping and monitoring of soil properties. This potential can be realized only if adequate software tools are available, as shown by the increasing demand for the availability/accessibility of hyperspectral soil products from the geoscience community, which has neither the capacity nor the expertise to deliver these soil products itself. In this context, many recent international efforts have been directed toward the development of robust and easy-to-access soil algorithms that allow non-remote-sensing experts to obtain geoscience information based on inexpensive software packages, where repeatability of the results is an important prerequisite. In particular, several algorithms for geological and mineral mapping were recently released, such as the U.S. Geological Survey Processing Routines in IDL for Spectroscopic Measurements (PRISM) software, or the GFZ EnMAP Geological Mapper. For quantitative soil mapping and monitoring, the HYSOMA (Hyperspectral Soil Mapper) software interface was developed at GFZ under the EUFAR (www.eufar.net) and EnMAP (www.enmap.org) programs. HYSOMA was specifically oriented toward digital soil mapping applications and has been distributed for free since 2012 as IDL plug-ins under the IDL virtual machine at www.gfz-potsdam.de/hysoma under a closed-source license. The HYSOMA interface focuses on fully automatic generation of semi-quantitative soil maps such as soil moisture, soil organic matter, iron oxide, clay content, and carbonate content. With more than 100 users around the world

  4. A validated high-resolution accurate mass LC-MS assay for quantitative determination of metoprolol and α-hydroxymetoprolol in human serum for application in pharmacokinetics

    Directory of Open Access Journals (Sweden)

    Sjoukje Postma-Kunnen

    2017-06-01

    Full Text Available To determine metoprolol and its metabolite α-hydroxymetoprolol in human serum, we validated a method on an LC system with an Exactive® Orbitrap mass spectrometer (Thermo Scientific) as detector and isotope-labelled metoprolol-d7 as internal standard. A simple sample preparation was used, with water-acetonitrile (15:85, v/v) as precipitation reagent. This method has a chromatographic run time of 15 min and linear calibration curves in the range of 5.0-250 μg/L for both metoprolol and α-hydroxymetoprolol. Validation showed the method to be accurate, precise and selective, with lower limits of quantitation of 2.0 μg/L for metoprolol and 1.0 μg/L for α-hydroxymetoprolol. This validated LC-Orbitrap MS analysis for metoprolol and α-hydroxymetoprolol can be used for application in human pharmacokinetics.

  5. Application of a Multiplex Quantitative PCR to Assess Prevalence and Intensity Of Intestinal Parasite Infections in a Controlled Clinical Trial.

    Directory of Open Access Journals (Sweden)

    Stacey Llewellyn

    2016-01-01

    Full Text Available Accurate quantitative assessment of infection with soil-transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large-scale treatment efficacy and effectiveness studies. As morbidity and transmission of helminth infections are directly related to both the prevalence and the intensity of infection, there is a particular need for improved techniques for assessment of infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays to determine the prevalence and intensity of intestinal parasite infections, and to compare them to standard microscopy. Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subjected to two multiplex real-time PCR reactions, the first targeting Necator americanus, Ancylostoma spp., Ascaris spp., and Trichuris trichiura, and the second Entamoeba histolytica, Cryptosporidium spp., Giardia duodenalis, and Strongyloides stercoralis. Samples were also subjected to sodium nitrate flotation for identification and quantification of STH eggs, and zinc sulphate centrifugal flotation for detection of protozoan parasites. Higher parasite prevalence was detected by multiplex PCR (hookworms 2.9 times higher, Ascaris 1.2, Giardia 1.6), along with superior polyparasitism detection, with this effect magnified as the number of parasites present increased (one: 40.2% vs. 38.1%; two: 30.9% vs. 12.9%; three: 7.6% vs. 0.4%; four: 0.4% vs. 0%). Although all STH-positive samples were low-intensity infections by microscopy as defined by WHO guidelines, the DNA load detected by multiplex PCR suggested higher-intensity infections. Multiplex PCR, in addition to superior sensitivity, enabled more accurate determination of infection intensity for Ascaris, hookworms and Giardia compared to microscopy, especially in samples

  6. Potential application of quantitative microbiological risk assessment techniques to an aseptic-UHT process in the food industry.

    Science.gov (United States)

    Pujol, Laure; Albert, Isabelle; Johnson, Nicholas Brian; Membré, Jeanne-Marie

    2013-04-01

    Aseptic ultra-high-temperature (UHT)-type processed food products (e.g., milk or soup) are ready-to-eat products which are consumed extensively worldwide owing to a combination of their comparatively high quality and long shelf life, with no cold chain or other preservation requirements. Due to the inherent microbial vulnerability of aseptic-UHT product formulations, the safety- and stability-related performance objectives (POs) required at the end of the manufacturing process are the most demanding found in the food industry. The key determinants of achieving sterility, which also differentiate aseptic-UHT from in-pack sterilised products, are the challenges associated with the processes of aseptic filling and sealing. This is a complex process that has traditionally been run using deterministic or empirical process settings. Quantifying the risk of microbial contamination and recontamination along the aseptic-UHT process, using scientifically based quantitative microbial risk assessment (QMRA), offers the possibility to improve on the currently tolerable sterility failure rate (i.e., 1 defect per 10,000 units). In addition, the benefits of applying QMRA are (i) to implement process settings in a transparent and scientific manner; (ii) to develop a uniform common structure whatever the production line, leading to a harmonisation of these process settings; and (iii) to bring elements of a cost-benefit analysis of the management measures. The objective of this article is to explore how QMRA techniques and risk management metrics may be applied to aseptic-UHT-type processed food products. In particular, the aseptic-UHT process should benefit from a number of novel mathematical and statistical concepts that have been developed in the field of QMRA. Probabilistic techniques such as Monte Carlo simulation, Bayesian inference and sensitivity analysis should help in assessing compliance with the safety- and stability-related POs set at the end of the manufacturing
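
    A toy Monte Carlo illustration of the QMRA reasoning outlined above: the probability that a single aseptically filled unit is non-sterile is modelled from uncertain inputs (spore load before UHT, log reduction of the UHT step, recontamination probability at filling/sealing), and the resulting defect rate is compared with the tolerable 1-per-10,000-units target. All distributions and parameters are invented.

    ```python
    # Monte Carlo estimate of the sterility failure rate per 10,000 units.
    import numpy as np

    rng = np.random.default_rng(1)
    n_iter = 100_000

    # uncertain inputs per production unit (invented distributions)
    spores_pre_uht   = rng.lognormal(mean=np.log(100.0), sigma=1.0, size=n_iter)
    log_reduction    = rng.normal(loc=9.0, scale=0.5, size=n_iter)      # UHT step
    p_recontaminated = rng.beta(a=2, b=20_000, size=n_iter)             # filling/sealing

    survivors = spores_pre_uht * 10.0 ** (-log_reduction)               # expected survivors
    # P(unit defective) = 1 - P(no survivors) * P(no recontamination), Poisson assumption
    p_defect = 1.0 - np.exp(-survivors) * (1.0 - p_recontaminated)

    defects_per_10k = 10_000 * p_defect.mean()
    print(f"expected defects per 10,000 units: {defects_per_10k:.2f} "
          f"(95th percentile of unit risk: {np.quantile(p_defect, 0.95):.2e})")
    ```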

  7. Numerical techniques for quantitative evaluation of chemical reaction systems with volatile species and their applications to water radiolysis in BWRs

    Science.gov (United States)

    Ibe, Eishi; Uchida, Shunsuke

    1985-02-01

    A mass transfer model in boiling flow was proposed for computer simulation of chemical reaction systems. The model can be applied to a wide variety of chemical engineering applications, including nuclear reactor plants. A statistical treatment was also proposed for deriving simple formulae that estimate the distribution of chemical reagents in any plant from the simulated results of a specific plant. These two mathematical techniques were applied to water radiolysis in BWR primary systems (Oskarshamn-2 and Dresden-2) to evaluate the distributions of oxidizing reagents in the systems. Simulated results from the computer program agreed within a 20% error with the measured hydrogen and oxygen concentrations. Hydrogen and oxygen concentrations in Dresden-2 estimated by means of the simplified formulae agreed within a 26% error with the direct simulation results.

  8. A Quantitative Approach for Examining Female Status and Development Interrelationships: With Application to Pre-Beijing Data from the Philippines

    Directory of Open Access Journals (Sweden)

    Nimfa B. Ogena

    2000-12-01

    Full Text Available This paper addresses an important policy question, which has been taken for granted in most research: Does development enhance or worsen the status of women? The applicability of the Threshold Hypothesis, which posits a non-linear relationship between development and women's status, was tested using province-level data from a developing country, the Philippines. A contextual measure of female status relative to men, which is measured as gender inequality in education, health, work status, occupation, and industry for each province across time, accounts for the multidimensionality, heterogeneity and time- and context-variability inherent in the female status concept. Development measures spanning the 1960s to the 1980s include year and development level. Composite development level indices, which were comparable across decades, were constructed using factor analysis. A change in development level over two decades was also measured. Pooled multiple-regression and correlation analyses, and regression standardization were employed. Results revealed that women were better off than men in health status, but women fell behind men in the other four domains. The Threshold Hypothesis was applicable for education and health. Although thresholds still apply, the reverse pattern was found for work status, occupation, and industry. In addition, the unexpected second threshold found at the extreme right of the development scale for education and health further challenges the Modernization view on the positive linkage between the status of women and development. Although findings in the study justify policy calls for continued development improvements for more gender-equitable environments, it is proposed that policies and efforts directed toward improving the status of women be guided by more detailed information on the critical linkages between various dimensions of development and women's status.

  9. Application of a rapid and efficient quantitative analysis method for traditional Chinese medicines: the case study of quality assessment of Salvia miltiorrhiza Bunge.

    Science.gov (United States)

    Jing, Wen-Guang; Zhang, Jun; Zhang, Li-Yan; Wang, Dong-Zhe; Wang, Yue-Sheng; Liu, An

    2013-06-13

    A reference extractive, containing multiple active known compounds, has been considered to be an alternative to individual reference standards. However, in the Chinese Pharmacopoeia (ChP) the great majority of reference extractives have been primarily used for qualitative identification by thin-layer chromatography (TLC) and few studies on the applicability of reference extractives for quantitative analysis have been presented. Using Salvia miltiorrhiza Bunge as an example in this paper, we first present a preliminary discussion on the feasibility and applicability of reference extractives for the quantitative analysis of TCMs. The reference extractive of S. miltiorrhiza Bunge, comprised of three pharmacological marker compounds, namely cryptotanshinone, tanshinone I and tanshinone IIA, was prepared from purchased Salvia miltiorrhiza Bunge by extraction with acetone under reflux, followed by silica gel column chromatography with stepwise elution with petroleum ether-ethyl acetate (25:1, v/v, 4.5 BV) to remove the non-target components and chloroform-methanol (10:1, v/v; 3 BV) to yield a crude reference extractive solution. After concentration, the solution was further purified by preparative reversed-phase HPLC on a C18 column with isocratic elution with 77% methanol aqueous solution to yield the total reference extractive of S. miltiorrhiza Bunge. Thereafter, the reference extractive was applied to the quality assessment of S. miltiorrhiza Bunge using high-performance liquid chromatography (HPLC) coupled with diode array detection (DAD). The validation of the method, including linearity, sensitivity, repeatability, stability and recovery testing, indicated that this method was valid, reliable and sensitive, with good reproducibility. The developed method was successfully applied to quantify seven batches of samples collected from different regions in China and the results were also similar to those obtained using reference standards, with relative standard

  10. Quantitative research.

    Science.gov (United States)

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  11. Application of a quantitative weight of evidence approach for ranking and prioritising occupational exposure scenarios for titanium dioxide and carbon nanomaterials.

    Science.gov (United States)

    Hristozov, Danail R; Gottardo, Stefania; Cinelli, Marco; Isigonis, Panagiotis; Zabeo, Alex; Critto, Andrea; Van Tongeren, Martie; Tran, Lang; Marcomini, Antonio

    2014-03-01

    Substantial limitations and uncertainties hinder the exposure assessment of engineered nanomaterials (ENMs). The present deficit of reliable measurements and models will inevitably lead in the near term to qualitative and uncertain exposure estimations, which may fail to support adequate risk assessment and management. Therefore, it is necessary to complement the current toolset with user-friendly methods for near-term nanosafety evaluation. This paper proposes an approach for relative exposure screening of ENMs. For the first time, an exposure model explicitly implements quantitative weight of evidence (WoE) methods and utilises expert judgement for filling data gaps in the available evidence base. Application of the framework is illustrated for screening of exposure scenarios for nanoscale titanium dioxide, carbon nanotubes and fullerenes, but it is applicable to other nanomaterials as well. The results show that the WoE-based model overestimates exposure for scenarios where expert judgement was substantially used to fill data gaps, which suggests its conservative nature. In order to test how variations in input data influence the obtained results, probabilistic Monte Carlo sensitivity analysis was applied to demonstrate that the model performs in a stable manner.
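
    The Monte Carlo sensitivity-analysis step can be illustrated with a small sketch: sample the inputs of a weight-of-evidence score and rank them by their Spearman correlation with the output. The evidence lines and weights below are hypothetical, not those of the published model.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        n = 10_000
        # Hypothetical evidence lines scored on 0-1 scales, with assumed weights.
        inputs = {
            "emission_potential": rng.uniform(0, 1, n),
            "amount_handled":     rng.uniform(0, 1, n),
            "containment_level":  rng.uniform(0, 1, n),
        }
        weights = {"emission_potential": 0.5, "amount_handled": 0.3, "containment_level": 0.2}
        score = sum(w * inputs[k] for k, w in weights.items())

        for name, values in inputs.items():
            rho, _ = spearmanr(values, score)
            print(f"{name:20s} Spearman rho = {rho:.2f}")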

  12. Quantitative determination of antidepressants and their select degradates by liquid chromatography/electrospray ionization tandem mass spectrometry in biosolids destined for land application.

    Science.gov (United States)

    Niemi, Lydia M; Stencel, Katherine A; Murphy, Madigan J; Schultz, Melissa M

    2013-08-06

    Antidepressants are one of the most widely dispensed classes of pharmaceuticals in the United States. As wastewater treatment plants are a primary source of pharmaceuticals in the environment, the use of biosolids as fertilizer is a potential route for antidepressants to enter the terrestrial environment. A microsolvent extraction method, utilizing green chemistry, was developed for extraction of the target antidepressants and degradation products from biosolids, or more specifically lagoon biosolids. Liquid chromatography/tandem mass spectrometry was used for quantitative determination of antidepressants in the lagoon biosolid extracts. Recoveries from matrix spiking experiments for the individual antidepressants had an average of 96%. The limits of detection for antidepressant pharmaceuticals and degradates ranged from 0.36 to 8.0 ng/kg wet weight. The method was applied to biosolids destined for land application. A suite of antidepressants was consistently detected in the lagoon biosolid samples, and thus antidepressants are being introduced to terrestrial environments through the land application of these biosolids. Sertraline and norsertraline were the most abundant antidepressant and degradation product detected in the biosolid samples. Detected, individual antidepressant concentrations ranged from 8.5 ng/kg (norfluoxetine) to 420 ng/kg wet weight (norsertraline).

  13. Quantitative Literacy.

    Science.gov (United States)

    Daniele, Vincent A.

    1993-01-01

    Quantitative literacy for students with deafness is addressed, noting work by the National Council of Teachers of Mathematics to establish curriculum standards for grades K-12. The standards stress problem solving, communication, reasoning, making mathematical connections, and the need for educators of the deaf to pursue mathematics literacy with…

  14. Determination of quantitative trait variants by concordance via application of the a posteriori granddaughter design to the U.S. Holstein population

    Science.gov (United States)

    Experimental designs that exploit family information can provide substantial predictive power in quantitative trait variant discovery projects. Concordance between quantitative trait locus genotype as determined by the a posteriori granddaughter design and marker genotype was determined for 29 trai...

  15. Biomization and quantitative climate reconstruction techniques in northwestern Mexico—With an application to four Holocene pollen sequences

    Science.gov (United States)

    Ortega-Rosas, C. I.; Guiot, J.; Peñalba, M. C.; Ortiz-Acosta, M. E.

    2008-04-01

    New paleovegetation and paleoclimatic reconstructions from the Sierra Madre Occidental (SMO) in northwestern Mexico are presented. This work involves climate and biome reconstruction using Plant Functional Types (PFT) assigned to pollen taxa. We used fossil pollen data from four Holocene peat bogs located at different altitudes (1500-2000 m) at the border region of Sonora and Chihuahua at around 28° N latitude (Ortega-Rosas, C.I. 2003. Palinología de la Ciénega de Camilo: datos para la historia de la vegetación y el clima del Holoceno medio y superior en el NW de la Sierra Madre Occidental, Sonora, Mexico. Master Thesis, Universidad Nacional Autónoma de México, México D.F.; Ortega-Rosas, C.I., Peñalba, M.C., Guiot, J. Holocene altitudinal shifts in vegetation belts and environmental changes in the Sierra Madre Occidental, Northwestern Mexico. Submitted for publication of Palaeobotany and Palynology). The closest modern pollen data come from pollen analysis across an altitudinal transect from the Sonoran Desert towards the highlands of the temperate SMO at the same latitude (Ortega-Rosas, C.I. 2003. Palinología de la Ciénega de Camilo: datos para la historia de la vegetación y el clima del Holoceno medio y superior en el NW de la Sierra Madre Occidental, Sonora, Mexico. Master Thesis, Universidad Nacional Autónoma de México, México D.F.). An additional modern pollen dataset of 400 sites across NW Mexico and the SW United States was compiled from different sources (Davis, O.K., 1995. Climate and vegetation pattern in surface samples from arid western U.S.A.: application to Holocene climatic reconstruction. Palynology 19, 95-119, North American Pollen Database, Latin-American Pollen Database, personal data, and different scientific papers). For the biomization method (Prentice, I.C., Guiot, J., Huntley, B., Jolly, D., Cheddadi, R., 1996. Reconstructing biomes from paleoecological data: a general method and its application to European pollen data at 0 and

  16. Effectiveness, acceptability and usefulness of mobile applications for cardiovascular disease self-management: Systematic review with meta-synthesis of quantitative and qualitative data.

    Science.gov (United States)

    Coorey, Genevieve M; Neubeck, Lis; Mulley, John; Redfern, Julie

    2018-01-01

    Background Mobile technologies are innovative, scalable approaches to reducing risk of cardiovascular disease but evidence related to effectiveness and acceptability remains limited. We aimed to explore the effectiveness, acceptability and usefulness of mobile applications (apps) for cardiovascular disease self-management and risk factor control. Design Systematic review with meta-synthesis of quantitative and qualitative data. Methods Comprehensive search of multiple databases (Medline, Embase, CINAHL, SCOPUS and Cochrane CENTRAL) and grey literature. Studies were included if the intervention was primarily an app aimed at improving at least two lifestyle behaviours in adults with cardiovascular disease. Meta-synthesis of quantitative and qualitative data was performed to review and evaluate findings. Results Ten studies of varying designs including 607 patients from five countries were included. Interventions targeted hypertension, heart failure, stroke and cardiac rehabilitation populations. Factors that improved among app users were rehospitalisation rates, disease-specific knowledge, quality of life, psychosocial well-being, blood pressure, body mass index, waist circumference, cholesterol and exercise capacity. Improved physical activity, medication adherence and smoking cessation were also characteristic of app users. Appealing app features included tracking healthy behaviours, self-monitoring, disease education and personalised, customisable content. Small samples, short duration and selection bias were noted limitations across some studies, as was the relatively low overall scientific quality of evidence. Conclusions Multiple behaviours and cardiovascular disease risk factors appear modifiable in the shorter term with use of mobile apps. Evidence for effectiveness requires larger, controlled studies of longer duration, with emphasis on process evaluation data to better understand important system- and patient-level characteristics.

  17. Workshop on quantitative dynamic stratigraphy

    Energy Technology Data Exchange (ETDEWEB)

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  18. Quantitative Pathology: Historical Background, Clinical Research ...

    African Journals Online (AJOL)

    Quantitative Pathology: Historical Background, Clinical Research and Application of Nuclear Morphometry and DNA Image Cytometry. A Buhmeida. Abstract. No Abstract Keywords: quantitative, pathology, nuclear, morphometry, cytometry, histogram. Libyan Journal of Medicine Vol. 1 (2) 2006: pp. 126-139.

  19. A quantitative microbial risk assessment model for total coliforms and E. coli in surface runoff following application of biosolids to grassland.

    Science.gov (United States)

    Clarke, Rachel; Peyton, Dara; Healy, Mark G; Fenton, Owen; Cummins, Enda

    2017-05-01

    In Ireland, the land application of biosolids is the preferred option for disposing of municipal sewage waste. Biosolids provide nutrients in the form of nitrogen, phosphorus and potassium, and increase organic matter. It is also an economical way for a country to dispose of its municipal waste. However, biosolids may potentially contain a wide range of pathogens and, following rainfall events, may be transported in surface runoff and pose a potential risk to human health. Thus, a quantitative risk assessment model was developed to estimate potential pathogens in surface water and the environmental fate of the pathogens following dilution, residence time in a stream, die-off, drinking water treatment and human exposure. Surface runoff water quality data were provided by project partners. Three types of biosolids (anaerobically digested (AD), lime stabilised (LS), and thermally dried (TD)) were applied to micro-plots. Rainfall was simulated at three time intervals (24, 48 and 360 h) following land application. It was assumed that this water entered a nearby stream and was directly abstracted for drinking water. Consumption data for drinking water and body weight were obtained from an Irish study and assigned distributions. Two dose-response models for the probability of illness were considered for total and faecal coliform exposure, incorporating two different exposure scenarios (healthy populations and immuno-compromised populations). The simulated annual risk of illness for healthy populations was below the US EPA and World Health Organisation tolerable levels of risk (10^-4 and 10^-6, respectively). However, immuno-compromised populations may still be at risk, as levels were greater than the tolerable level of risk for that subpopulation. The sensitivity analysis highlighted the influence of residence time in a stream on the bacterial die-off rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
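
    A minimal sketch of how such an annual probability of illness can be derived from a daily ingested dose with an exponential dose-response model is given below; the dose distribution and dose-response parameter are illustrative assumptions, not values from the study.

        import math
        import random

        r = 2e-3          # assumed exponential dose-response parameter
        n_sims = 50_000
        total = 0.0
        for _ in range(n_sims):
            dose = random.lognormvariate(math.log(5), 1.0)   # organisms ingested per day (assumed)
            p_day = 1.0 - math.exp(-r * dose)                # exponential dose-response model
            total += 1.0 - (1.0 - p_day) ** 365              # independent daily exposures over a year

        print(f"mean annual risk of illness ~ {total / n_sims:.2e} "
              f"(tolerable levels cited: 1e-4 US EPA, 1e-6 WHO)")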

  20. Application of a qualitative and quantitative real-time polymerase chain reaction method for detecting genetically modified papaya line 55-1 in papaya products.

    Science.gov (United States)

    Nakamura, Kosuke; Akiyama, Hiroshi; Takahashi, Yuki; Kobayashi, Tomoko; Noguchi, Akio; Ohmori, Kiyomi; Kasahara, Masaki; Kitta, Kazumi; Nakazawa, Hiroyuki; Kondo, Kazunari; Teshima, Reiko

    2013-01-15

    Genetically modified (GM) papaya (Carica papaya L.) line 55-1 (55-1), which is resistant to papaya ringspot virus infection, has been marketed internationally. Many countries have mandatory labeling regulations for GM foods, and there is a need for specific methods for detecting 55-1. Here, an event- and construct-specific real-time polymerase chain reaction (PCR) method was developed for detecting 55-1 in papaya products. Quantitative detection was possible for fresh papaya fruit up to dilutions of 0.001% and 0.01% (weight per weight [w/w]) for homozygous SunUp and heterozygous Rainbow cultivars, respectively, in non-GM papaya. The limit of detection and quantification was as low as 250 copies of the haploid genome according to a standard reference plasmid. The method was applicable to qualitative detection of 55-1 in eight types of processed products (canned papaya, pickled papaya, dried fruit, papaya-leaf tea, jam, puree, juice, and frozen dessert) containing papaya as a main ingredient. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Effects of Single and Combined Application of Organic, Biological and Chemical Fertilizers on Quantitative and Qualitative Yield of Coriander (Coriandrum sativum

    Directory of Open Access Journals (Sweden)

    M. Aghhavani Shajari

    2016-07-01

    Full Text Available Introduction: Medicinal plants have been one of the main natural resources of Iran since ancient times. Coriander (Coriandrum sativum L.), of the Apiaceae family, is cultivated extensively throughout the world. Management and environmental factors, such as nutritional management, have a significant impact on the quantity and quality of plants. Application of organic fertilizers in conventional farming systems is not common, and most of the nutritional needs of plants are supplied through chemical fertilizers over short periods. Excessive and unbalanced use of fertilizers over long periods reduces crop yield and soil biological activity, leads to accumulation of nitrates and heavy metals, and finally causes negative environmental effects and increases the cost of production. The use of bio-fertilizers and organic matter is therefore considered a way to reduce the use of chemical fertilizers and increase the quality of most crops. Organic fertilizers are important for soil stability and fertility because they contain most of the elements required by plants and have beneficial effects on the physical, chemical and biological properties of the soil. Therefore, the aim of this research was to evaluate the effects of organic, biological and chemical fertilizers on the quality and quantity characteristics of coriander. Materials and Methods: In order to study the effects of single and combined applications of organic, biological and chemical fertilizers on the quantitative and qualitative characteristics of coriander (Coriandrum sativum), an experiment was conducted based on a randomized complete block design with three replications and 12 treatments at the Research Station, Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, in 2011. Treatments included: (1) mycorrhizae (Glomus mosseae), (2) biosulfur (Thiobacillus sp.), (3) chemical fertilizer (NPK), (4) cow manure, (5) vermicompost, (6) mycorrhizae + chemical fertilizer, (7) mycorrhizae + cow manure, (8) mycorrhizae + vermicompost, (9) biosulfur

  2. The Application of Two-Eyed Seeing Decolonizing Methodology in Qualitative and Quantitative Research for the Treatment of Intergenerational Trauma and Substance Use Disorders

    National Research Council Canada - National Science Library

    Marsh, Teresa Naseba; Cote-Meek, Sheila; Toulouse, Pamela; Najavits, Lisa M; Young, Nancy L

    2015-01-01

    .... Using both qualitative and quantitative methods, the authors systematically discuss the research methodology with the hope to inspire other health researchers who are attempting to incorporate...

  3. Resonant elastic scattering of {sup 15}O and a new reaction path in the CNO cycle; Spectroscopie par diffusion elastique resonante d' {sup 15}O et nouveau chemin de reaction dans le cycle CNO

    Energy Technology Data Exchange (ETDEWEB)

    Stefan, Gheorghe Iulian [Ecole doctorale SIMEM, U.F.R. Sciences, Universite de Caen Basse-Normandie, 14032 Caen Cedex (France)

    2006-12-15

    This work presents a very accurate experimental method based on radioactive beams for the study of the spectroscopic properties of unbound states. It makes use of inverse-kinematics elastic scattering of the ions of a radioactive beam from a target of stable nuclei. An application of the method to the study of radioactive nuclei of astrophysical interest is given, namely of the {sup 19}Ne and {sup 16}F nuclei. It is shown that, on the basis of the properties of the proton-emitting unbound levels of {sup 19}Ne, one can develop a method for the experimental study of nova explosions. It is based on observation of gamma emissions following the decays of the radionuclides generated in the explosion. The most interesting radioactive nucleus involved in this process is {sup 18}F, the yield of which depends strongly on the rate of the {sup 18}F(p,{alpha}){sup 15}O reaction. This yield depends in turn on the properties of the states of the ({sup 18}F + p) compound nucleus, i.e. the {sup 19}Ne nucleus. In addition, the unbound {sup 16}F nucleus, also of astrophysical significance in {sup 15}O-rich environments, was studied. Since {sup 16}F is an unbound nucleus, the reaction of {sup 15}O with protons, although abundant in most astrophysical media, appears to be negligible. Thus the question that was posed was whether the exotic {sup 15}O(p,{beta}{sup +}){sup 16}O resonant reaction acquires some importance in various astrophysical media. This work describes a novel approach to studying the reaction mechanisms which could drastically change the role of unbound nuclei in stellar processes. This mechanism is applied to the (p,{gamma})({beta}{sup +}) and (p,{gamma})(p,{gamma}) processes within {sup 15}O-rich media. The experimental studies of {sup 19}Ne and {sup 16}F were carried out with a radioactive beam of {sup 15}O ions of very low energy produced by SPIRAL at GANIL. To improve the energy resolution, thin targets were used with a 0° angle of observation relative to the beam

  4. The Relationship between Student's Quantitative Skills, Application of Math, Science Courses, and Science Marks at Single-Sex Independent High Schools

    Science.gov (United States)

    Cambridge, David

    2012-01-01

    For independent secondary schools who offer rigorous curriculum to attract students, integration of quantitative skills in the science courses has become an important definition of rigor. However, there is little research examining students' quantitative skills in relation to high school science performance within the single-sex independent school…

  5. Effect of Sodium Chloride Concentrations and Its Foliar Application Time on Quantitative and Qualitative Characteristics of Pomegranate Fruit (Punica granatum L. CV. “Malas Saveh”

    Directory of Open Access Journals (Sweden)

    V. Rouhi

    2016-02-01

    Full Text Available Introduction: Pomegranate (Punica granatum L.), belonging to the Punicaceae family, is native to Iran and grown extensively in arid and semi-arid regions worldwide. Pomegranate is also important in human medicine, and its components have a wide range of clinical applications. Cracking causes major fruit loss, which is a serious commercial loss to farmers. Fruit cracking seems to be a problem that greatly lessens marketability, and it is one of the physiological disorders found wherever pomegranate trees are grown. It may be due to moisture imbalances, as this fruit is very sensitive to variation in soil moisture: prolonged drought causes hardening of the skin, and if this is followed by heavy irrigation the pulp grows faster than the skin, which then cracks. Many factors, i.e., climate, soil and irrigation, varieties, pruning, insects and nutrition status, influence the growth and production of fruit trees. Deficiencies of various nutrients are related to soil types, plants and even to various cultivars. Most nutrients are readily fixed in soils of different pH, and plant roots are unable to absorb these nutrients adequately from dry topsoil. Foliar fertilization is particularly useful under conditions where absorption of nutrients through the soil is difficult, as is the case for nutrients such as calcium. Since calcium is required, spraying at the right time is an effective way to meet the plant's requirements. Therefore, research was conducted on the effect of sodium chloride concentrations and foliar application time on the quantitative and qualitative characteristics of pomegranate fruit (Punica granatum L. cv. “Malas Saveh”). Materials and Methods: An experiment was conducted at Jarghoyeh, Esfahan, Iran in 2012. The factors were sodium chloride (0, 5 and 10 g/L) and time of spraying (15, 45 and 75 days before harvest). The study was a factorial experiment based on a randomized complete block design with three replications

  6. Strategies for quantitation of phosphoproteomic data

    DEFF Research Database (Denmark)

    Palmisano, Giuseppe; Thingholm, Tine Engberg

    2010-01-01

    Recent developments in phosphoproteomic sample-preparation techniques and sensitive mass spectrometry instrumentation have led to large-scale identifications of phosphoproteins and phosphorylation sites from highly complex samples. This has facilitated the implementation of different quantitation...... will be on different quantitation strategies. Methods for metabolic labeling, chemical modification and label-free quantitation and their applicability or inapplicability in phosphoproteomic studies are discussed....

  7. Quantitative Computertomographie

    Directory of Open Access Journals (Sweden)

    Engelke K

    2002-01-01

    Full Text Available Quantitative computed tomography (QCT) is, alongside dual X-ray absorptiometry (DXA), a standard method in bone densitometry. The most important measurement sites, for which commercial solutions also exist, are the lumbar spine and the distal forearm; examinations of the tibial or femoral shaft are of minor importance. Lumbar spine examinations are performed with clinical whole-body scanners, for which dedicated acquisition and evaluation protocols exist. For QCT measurements at peripheral sites (pQCT), in particular the distal forearm, compact CT scanners have been developed and are now offered as tabletop devices. Decisive advantages of QCT compared with DXA are the exact three-dimensional localisation of the measurement volume, the isolated assessment of this volume without superposition of the surrounding tissue, and the separation of trabecular and cortical bone. QCT determines the concentration of bone mineral within a defined evaluation region (ROI, region of interest). This concentration is typically referred to as bone mineral density (BMD) and is given in g/cm3. By contrast, the projective DXA technique determines only an areal concentration in g/cm2, which, by analogy with QCT, is referred to as areal density. The difference between density (QCT) and areal density (DXA) is, however, mostly neglected in the literature.

  8. Quantitative Analysen

    Science.gov (United States)

    Hübner, Philipp

    The holy grail of any analytical discipline is to be able to determine the true value. This requires quantitative measurement methods, which have now been available in molecular analytics for some time. The general problem with quantification is that we usually can neither know nor determine the true value! For this reason we make do with approximations to the true value, either by calculating the median or the (robust) mean from interlaboratory comparison studies, or by calculating an expected value based on how the sample material was prepared. In these attempts to approximate the true value, a deliberate normalisation of the analytics takes place, either according to the democratic principle that the majority decides, or through the provision of suitable certified reference material. We must therefore be aware that, although this procedure guarantees that the majority of analytical laboratories measure alike, we do not know whether they all measure equally well or, for that matter, equally poorly.

  9. Optimisation de la masse des pertiques elastiques:Calcul et ...

    African Journals Online (AJOL)

    The present study addresses the analysis of elastic frames under repeated moving loads, and their optimal design within the elastic stage of the mechanical behaviour of the material. A method for determining the stiffness ratios of the different elements of frames under repeated moving loads has been developed.

  10. Caracterisation mecanique dynamique de materiaux poro-visco-elastiques

    Science.gov (United States)

    Renault, Amelie

    Poro-viscoelastic materials are well modelled with Biot-Allard equations. This model needs a number of geometrical parameters in order to describe the macroscopic geometry of the material and elastic parameters in order to describe the elastic properties of the material skeleton. Several characterisation methods of viscoelastic parameters of porous materials are studied in this thesis. Firstly, quasistatic and resonant characterization methods are described and analyzed. Secondly, a new inverse dynamic characterization of the same modulus is developed. The latter involves a two layers metal-porous beam, which is excited at the center. The input mobility is measured. The set-up is simplified compared to previous methods. The parameters are obtained via an inversion procedure based on the minimisation of the cost function comparing the measured and calculated frequency response functions (FRF). The calculation is done with a general laminate model. A parametric study identifies the optimal beam dimensions for maximum sensitivity of the inversion model. The advantage of using a code which is not taking into account fluid-structure interactions is the low computation time. For most materials, the effect of this interaction on the elastic properties is negligible. Several materials are tested to demonstrate the performance of the method compared to the classical quasi-static approaches, and set its limitations and range of validity. Finally, conclusions about their utilisation are given. Keywords. Elastic parameters, porous materials, anisotropy, vibration.

  11. The quantitative Morse theorem

    OpenAIRE

    Loi, Ta Le; Phien, Phan

    2013-01-01

    In this paper, we give a proof of the quantitative Morse theorem stated by Y. Yomdin in [Y1]. The proof is based on the quantitative Sard theorem, the quantitative inverse function theorem and the quantitative Morse lemma.

  12. Satellite- and pollen-based quantitative woody cover reconstructions for northern Asia: Verification and application to late-Quaternary pollen data

    Science.gov (United States)

    Tarasov, Pavel; Williams, John W.; Andreev, Andrei; Nakagawa, Takeshi; Bezrukova, Elena; Herzschuh, Ulrike; Igarashi, Yaeko; Müller, Stefanie; Werner, Kirstin; Zheng, Zhuo

    2007-12-01

    Accurate reconstruction of late-Quaternary vegetation cover is necessary for better understanding of past vegetation dynamics, the role of vegetation feedbacks in glacial-interglacial climate variations, and for validating vegetation and climate models. In this paper over 1700 surface-pollen spectra from the former Soviet Union, Mongolia, northern China, and northern Japan together with data from the Advanced Very High Resolution Radiometer (AVHRR) were used to calibrate modern-analogue method for quantitatively reconstructing past woody cover from fossil pollen data. The AVHRR-based estimates of woody cover percentages within a 21 × 21 km window around pollen sampling sites were attributed to the respective modern pollen spectra. Reconstructions of modern woody cover using the pollen data and best-modern-analogues (BMA) method matched well to the original AVHRR-based estimates, for both total woody cover ( r2 = 0.77) and its fractions, including broad-leaved ( r2 = 0.66), needle-leaved ( r2 = 0.79), deciduous ( r2 = 0.60) and evergreen ( r2 = 0.76) woody cover. Discrepancies in the pollen-AVHRR cross-validation may be caused by long-distance transport of arboreal pollen, patchy forest distributions, underrepresentation of Larix and Populus in pollen records, and errors in the AVHRR classification. The generally strong correlations encourage application of the modern-analogue approach for reconstructing late-Quaternary variations in vegetation cover from northern Asian fossil pollen records. At the last glacial maximum (LGM: ˜ 21,000 cal yr BP), areas presently occupied by boreal forest were much more open, suggesting a reduction in total woody cover to below 20% at most modern forest sites. Pollen records from northern and central Siberia suggest a rather quick spread of tree and shrub vegetation after 15,000 cal yr BP, presumably in response to increased summer insolation. Woody cover histories are spatially variable in the modern forest-steppe, where tree
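
    The best-modern-analogues step can be sketched generically as follows (illustrative code, not the authors' implementation): each fossil spectrum is compared with the modern spectra using the squared-chord distance, and the AVHRR-derived woody-cover values of the closest modern samples are averaged.

        import numpy as np

        def squared_chord(a, b):
            """Dissimilarity between two pollen spectra expressed as proportions."""
            return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

        def bma_woody_cover(fossil, modern_spectra, modern_cover, k=8):
            """Distance-weighted mean woody cover of the k best modern analogues."""
            d = np.array([squared_chord(fossil, m) for m in modern_spectra])
            best = np.argsort(d)[:k]
            w = 1.0 / np.maximum(d[best], 1e-6)
            return np.sum(w * modern_cover[best]) / np.sum(w)

        # Toy data standing in for ~1700 modern spectra with known woody cover.
        rng = np.random.default_rng(0)
        modern = rng.dirichlet(np.ones(20), size=1700)
        cover = rng.uniform(0, 100, size=1700)
        print(bma_woody_cover(modern[0], modern, cover))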

  13. A kinetic-based sigmoidal model for the polymerase chain reaction and its application to high-capacity absolute quantitative real-time PCR

    Directory of Open Access Journals (Sweden)

    Stewart Don

    2008-05-01

    Full Text Available Abstract Background Based upon defining a common reference point, current real-time quantitative PCR technologies compare relative differences in amplification profile position. As such, absolute quantification requires construction of target-specific standard curves that are highly resource intensive and prone to introducing quantitative errors. Sigmoidal modeling using nonlinear regression has previously demonstrated that absolute quantification can be accomplished without standard curves; however, quantitative errors caused by distortions within the plateau phase have impeded effective implementation of this alternative approach. Results Recognition that amplification rate is linearly correlated to amplicon quantity led to the derivation of two sigmoid functions that allow target quantification via linear regression analysis. In addition to circumventing quantitative errors produced by plateau distortions, this approach allows the amplification efficiency within individual amplification reactions to be determined. Absolute quantification is accomplished by first converting individual fluorescence readings into target quantity expressed in fluorescence units, followed by conversion into the number of target molecules via optical calibration. Founded upon expressing reaction fluorescence in relation to amplicon DNA mass, a seminal element of this study was to implement optical calibration using lambda gDNA as a universal quantitative standard. Not only does this eliminate the need to prepare target-specific quantitative standards, it relegates establishment of quantitative scale to a single, highly defined entity. The quantitative competency of this approach was assessed by exploiting "limiting dilution assay" for absolute quantification, which provided an independent gold standard from which to verify quantitative accuracy. This yielded substantive corroborating evidence that absolute accuracies of ± 25% can be routinely achieved. Comparison
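
    As a generic illustration of sigmoidal modelling of an amplification profile (not the specific functions derived in the paper), a four-parameter logistic can be fitted to the raw fluorescence readings by nonlinear regression:

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(c, f_max, c_half, k, f_bg):
            """Four-parameter logistic amplification profile (cycle -> fluorescence)."""
            return f_max / (1.0 + np.exp(-(c - c_half) / k)) + f_bg

        cycles = np.arange(1, 41)
        # Synthetic profile standing in for baseline-subtracted fluorescence readings.
        data = sigmoid(cycles, 1.0, 24.0, 1.6, 0.02)
        data = data + np.random.default_rng(1).normal(0, 0.005, cycles.size)

        popt, _ = curve_fit(sigmoid, cycles, data, p0=[1.0, 25.0, 2.0, 0.0])
        f_max, c_half, k, f_bg = popt
        print(f"plateau = {f_max:.3f}, half-rise cycle = {c_half:.1f}, slope parameter = {k:.2f}")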

  14. Deterministic quantitative risk assessment development

    Energy Technology Data Exchange (ETDEWEB)

    Dawson, Jane; Colquhoun, Iain [PII Pipeline Solutions Business of GE Oil and Gas, Cramlington Northumberland (United Kingdom)

    2009-07-01

    Current risk assessment practice in pipeline integrity management is to use a semi-quantitative index-based or model-based methodology. This approach has been found to be very flexible and to provide useful results for identifying high-risk areas and for prioritizing physical integrity assessments. However, as pipeline operators progressively adopt an operating strategy of continual risk reduction with a view to minimizing total expenditures within safety, environmental, and reliability constraints, the need for quantitative assessments of risk levels is becoming evident. Whereas reliability-based quantitative risk assessments can be and are routinely carried out on a site-specific basis, they require significant amounts of quantitative data for the results to be meaningful. This need for detailed and reliable data tends to make these methods unwieldy for system-wide risk assessment applications. This paper describes methods for estimating risk quantitatively through the calibration of semi-quantitative estimates to failure rates for peer pipeline systems. The methods involve the analysis of the failure rate distribution, and techniques for mapping the rate to the distribution of likelihoods produced by currently available semi-quantitative programs. By applying point value probabilities to the failure rates, deterministic quantitative risk assessment (QRA) provides greater rigor and objectivity than can usually be achieved through the implementation of semi-quantitative risk assessment results. The method permits a fully quantitative approach or a mixture of QRA and semi-QRA to suit the operator's data availability and quality, and analysis needs. For example, consequence analysis can be quantitative or can address qualitative ranges for consequence categories. Likewise, failure likelihoods can be output as classical probabilities or as expected failure frequencies as required. (author)

  15. Application of multivariable analysis methods to the quantitative detection of gas by tin dioxide micro-sensors; Application des methodes d'analyse multivariables a la detection quantitative de gaz par microcapteurs a base de dioxyde d'etain

    Energy Technology Data Exchange (ETDEWEB)

    Perdreau, N.

    2000-01-17

    The electric conductivity of tin dioxide depends on the temperature of the material and on the nature and environment of the surrounding gas. This work shows that treating the electric conductance signals of a single sensor with multivariable analysis methods makes it possible to determine the concentrations of binary or ternary mixtures of ethanol (0-80 ppm), carbon monoxide (0-300 ppm) and methane (0-1000 ppm). Part of this study consisted of the design and implementation of an automatic test bench to acquire the electric conductance of four sensors over thermal cycles and under gaseous cycles; this also revealed some disturbing effects (humidity, ...) on the measurement. Two sensor fabrication techniques were used (tin dioxide in the form of thin layers obtained by reactive evaporation, or in the form of sintered powder bars) to obtain conductances (as a function of temperature) that are distinct for each gas, reproducible between sensors and stable enough over time to allow the signals to be exploited by multivariable analysis methods. In the last part, it is shown that quantitative determination of gases by chemometric methods is possible even though the relation between the electric conductances on the one hand and the temperatures and concentrations on the other is nonlinear. Moreover, modelling with the 'Partial Least Squares' method combined with a pretreatment gives performance comparable to that obtained with neural networks. (O.M.)
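
    A minimal sketch of the 'Partial Least Squares' step described above, written with scikit-learn on synthetic data, is given below; the sensor responses, thermal cycle and gas concentrations are placeholders, not the data of the thesis.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # X: conductances sampled at 50 points of a thermal cycle; Y: concentrations
        # of ethanol, CO and CH4 in ppm.  Purely synthetic, nonlinear mixture data.
        n_samples, n_temps = 200, 50
        Y = rng.uniform([0, 0, 0], [80, 300, 1000], size=(n_samples, 3))
        mixing = np.abs(rng.normal(size=(3, n_temps)))
        X = np.log1p(Y @ mixing) + rng.normal(0, 0.05, (n_samples, n_temps))

        X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
        pls = PLSRegression(n_components=5).fit(X_tr, Y_tr)
        print("R^2 on held-out mixtures:", round(pls.score(X_te, Y_te), 3))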

  16. Laboratory controlled quantitative information about reduction in air pollution using the "Basa njengo Magogo" methodology and applicability to low-smoke fuels

    CSIR Research Space (South Africa)

    Le Roux, Lukas J

    2005-01-01

    Full Text Available an experiment under controlled laboratory conditions to gather quantitative data on the reduction in particulate emissions associated with the Basa njengo Magogo method of lighting coal fires. CSIR was further contracted to assess whether the Basa njengo Magogo...

  17. Quantitative model calculation of the time-dependent protoporphyrin IX concentration in normal human epidermis after delivery of ALA by passive topical application or lontophoresis

    NARCIS (Netherlands)

    Star, Willem M.; Aalders, Maurice C. G.; Sac, Arnoldo; Sterenborg, Henricus J. C. M.

    2002-01-01

    We present a mathematical layer model to quantitatively calculate the diffusion of 5-aminolevulinic acid (ALA) in the skin in vivo, its uptake into the cells and its conversion to protoporphyrin IX (PpIX) and subsequently to heme. The model is a modification and extension of a recently presented
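
    A very small sketch of a one-dimensional layer model of this type (explicit finite differences, with made-up diffusion and conversion constants rather than the parameters of the paper) is shown below.

        import numpy as np

        # ALA diffuses into a 100-um epidermis and is converted to PpIX with a
        # first-order rate; the surface concentration is held fixed (topical ALA).
        nz, dz, dt = 50, 2e-4, 0.05      # nodes, node spacing (cm), time step (s)
        D, k_conv = 1e-7, 1e-3           # diffusivity (cm^2/s) and conversion rate (1/s), illustrative
        ala = np.zeros(nz)
        ppix = np.zeros(nz)

        for _ in range(int(3600 / dt)):  # one hour of topical application
            ala[0] = 1.0                 # normalised surface concentration
            lap = np.zeros(nz)
            lap[1:-1] = (ala[2:] - 2 * ala[1:-1] + ala[:-2]) / dz ** 2
            ala += dt * (D * lap - k_conv * ala)
            ppix += dt * k_conv * ala

        print("PpIX (arbitrary units) at mid-epidermis:", round(float(ppix[nz // 2]), 4))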

  18. Quantitative microbiological risk assessment as a tool to obtain useful information for risk managers - specific application to Listeria monocytogenes and ready-to-eat meat products

    NARCIS (Netherlands)

    Mataragas, M.; Zwietering, M.H.; Skandamis, P.N.; Drosinos, E.H.

    2010-01-01

    The presence of Listeria monocytogenes in a sliced cooked, cured ham-like meat product was quantitatively assessed. Sliced cooked, cured meat products are considered as high risk products. These ready-to-eat, RTE, products (no special preparation, e.g. thermal treatment, before eating is required),

  19. The applicability of TaqMan-based quantitative real-time PCR assays for detecting and enumeratIng Cryptosporidium spp. oocysts in the environment

    Science.gov (United States)

    Molecular detection methods such as PCR have been extensively used to type Cryptosporidium oocysts detected in the environment. More recently, studies have developed quantitative real-time PCR assays for detection and quantification of microbial contaminants in water as well as ...

  20. Quantitative Microbial Risk Assessment Tutorial: HSPF Setup, Application, and Calibration of Flows and Microbial Fate and Transport on an Example Watershed

    Science.gov (United States)

    A Quantitative Microbial Risk Assessment (QMRA) infrastructure that automates the manual process of characterizing transport of pathogens and microorganisms, from the source of release to a point of exposure, has been developed by loosely configuring a set of modules and process-...

  1. Novel application of quantitative single-photon emission computed-tomography/computed tomography to predict early response to methimazole in Graves' disease

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyun Joo; Bang, Ji In; Kim, Ji Young; Moon, Jae Hoon [Seoul National University Bundang Hospital, Seoul National University College of Medicine, Seongnam (Korea, Republic of); So, Young [Dept. of Nuclear Medicine, Konkuk University Medical Center, Seoul (Korea, Republic of); Lee, Won Woo [Institute of Radiation Medicine, Medical Research Center, Seoul National University, Seoul (Korea, Republic of)

    2017-06-15

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of {sup 99m}Tc-pertechnetate (5 mCi). Association between the time to biochemical euthyroidism after MMI treatment and %uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables, were investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). A %uptake cutoff of 5.0% dichotomized the faster responding versus the slower responding GD patients (p = 0.006). A novel parameter of thyroid %uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.

  2. Novel Application of Quantitative Single-Photon Emission Computed Tomography/Computed Tomography to Predict Early Response to Methimazole in Graves' Disease.

    Science.gov (United States)

    Kim, Hyun Joo; Bang, Ji-In; Kim, Ji-Young; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2017-01-01

    Since Graves' disease (GD) is resistant to antithyroid drugs (ATDs), an accurate quantitative thyroid function measurement is required for the prediction of early responses to ATD. Quantitative parameters derived from the novel technology, single-photon emission computed tomography/computed tomography (SPECT/CT), were investigated for the prediction of achievement of euthyroidism after methimazole (MMI) treatment in GD. A total of 36 GD patients (10 males, 26 females; mean age, 45.3 ± 13.8 years) were enrolled for this study, from April 2015 to January 2016. They underwent quantitative thyroid SPECT/CT 20 minutes post-injection of 99mTc-pertechnetate (5 mCi). Association between the time to biochemical euthyroidism after MMI treatment and %uptake, standardized uptake value (SUV), functional thyroid mass (SUVmean × thyroid volume) from the SPECT/CT, and clinical/biochemical variables, were investigated. GD patients had a significantly greater %uptake (6.9 ± 6.4%) than historical control euthyroid patients (n = 20, 0.8 ± 0.5%, p < 0.001) from the same quantitative SPECT/CT protocol. Euthyroidism was achieved in 14 patients at 156 ± 62 days post-MMI treatment, but 22 patients had still not achieved euthyroidism by the last follow-up time-point (208 ± 80 days). In the univariate Cox regression analysis, the initial MMI dose (p = 0.014), %uptake (p = 0.015), and functional thyroid mass (p = 0.016) were significant predictors of euthyroidism in response to MMI treatment. However, only %uptake remained significant in a multivariate Cox regression analysis (p = 0.034). A %uptake cutoff of 5.0% dichotomized the faster responding versus the slower responding GD patients (p = 0.006). A novel parameter of thyroid %uptake from quantitative SPECT/CT is a predictive indicator of an early response to MMI in GD patients.
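
    The Cox regression step reported in both records above can be sketched with the lifelines package; the toy data frame and column names below are assumptions for illustration only, not the study data.

        import pandas as pd
        from lifelines import CoxPHFitter

        # Toy per-patient table: follow-up time (days), whether euthyroidism was
        # reached (1/0), and two candidate predictors from the study design.
        df = pd.DataFrame({
            "days_followup": [156, 208, 120, 190, 230, 160, 145, 205, 175, 199, 130, 220],
            "euthyroid":     [1,   0,   1,   1,   0,   1,   1,   0,   1,   0,   1,   1],
            "pct_uptake":    [3.2, 8.1, 2.5, 6.0, 9.4, 3.8, 7.2, 7.7, 4.4, 5.1, 2.9, 6.5],
            "mmi_dose_mg":   [10,  20,  10,  15,  20,  10,  20,  20,  15,  10,  10,  15],
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="days_followup", event_col="euthyroid")
        cph.print_summary()   # hazard ratios and p-values for each covariate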

  3. HCG blood test - quantitative

    Science.gov (United States)

    ... blood test - quantitative; Beta-HCG blood test - quantitative; Pregnancy test - blood - quantitative ... of a screening test for Down syndrome. This test is also done to diagnose abnormal conditions not related to pregnancy that can raise HCG level.

  4. A Novel HPLC Method for the Concurrent Analysis and Quantitation of Seven Water-Soluble Vitamins in Biological Fluids (Plasma and Urine: A Validation Study and Application

    Directory of Open Access Journals (Sweden)

    Margherita Grotzkyj Giorgi

    2012-01-01

    Full Text Available An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B1, B2, B5, B6, B9, B12) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%.

  5. A Novel HPLC Method for the Concurrent Analysis and Quantitation of Seven Water-Soluble Vitamins in Biological Fluids (Plasma and Urine): A Validation Study and Application

    Science.gov (United States)

    Grotzkyj Giorgi, Margherita; Howland, Kevin; Martin, Colin; Bonner, Adrian B.

    2012-01-01

    An HPLC method was developed and validated for the concurrent detection and quantitation of seven water-soluble vitamins (C, B1, B2, B5, B6, B9, B12) in biological matrices (plasma and urine). Separation was achieved at 30°C on a reversed-phase C18-A column using combined isocratic and linear gradient elution with a mobile phase consisting of 0.01% TFA aqueous and 100% methanol. Total run time was 35 minutes. Detection was performed with diode array set at 280 nm. Each vitamin was quantitatively determined at its maximum wavelength. Spectral comparison was used for peak identification in real samples (24 plasma and urine samples from abstinent alcohol-dependent males). Interday and intraday precision were <4% and <7%, respectively, for all vitamins. Recovery percentages ranged from 93% to 100%. PMID:22536136

  6. Application of a quantitative 1H-NMR method for the determination of amygdalin in Persicae semen, Armeniacae semen, and Mume fructus.

    Science.gov (United States)

    Tanaka, Rie; Nitta, Akane; Nagatsu, Akito

    2014-01-01

    A quantitative (1)H-NMR method (qHNMR) was used to measure the amygdalin content of Persicae semen, Armeniacae semen, and Mume fructus, in each of which amygdalin constitutes a major component. The purity of amygdalin was calculated from the ratio of the intensity of the amygdalin H-2 signal at δ 6.50 ppm in pyridine-d 5 to that of the hexamethyldisilane (HMD) signal at 0 ppm. The HMD concentration was corrected by the International System of Units (SI) traceability with certified reference material (CRM)-grade bisphenol A. qHNMR revealed the amygdalin contents to be 2.72 and 3.13% in 2 lots of Persicae semen, 3.62 and 5.19% in 2 lots of Armeniacae semen, and 0.23% in Mume fructus. Thus, we demonstrated the utility of this method for the quantitative analysis of crude drugs.
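
    For reference, purity determinations of this kind generally follow the standard qHNMR relation below (written here as a generic formula with generic symbols, not notation taken from the paper):

        P_a = \frac{I_a}{I_{std}} \cdot \frac{N_{std}}{N_a} \cdot \frac{M_a}{M_{std}} \cdot \frac{m_{std}}{m_a} \cdot P_{std}

    where I denotes the integrated signal areas, N the number of protons giving rise to each signal, M the molar masses, m the weighed masses, and P the purities of the analyte (a) and the internal standard (std); in the case described above, the amygdalin H-2 signal at δ 6.50 ppm and the HMD signal at 0 ppm supply I_a and I_std.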

  7. Development and validation of a high throughput LC–MS/MS method for simultaneous quantitation of pioglitazone and telmisartan in rat plasma and its application to a pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Pinaki Sengupta

    2017-12-01

    Full Text Available Management of cardiovascular risk factors in diabetes demands special attention due to their co-existence. A pioglitazone (PIO) and telmisartan (TLM) combination can be beneficial in the effective control of cardiovascular complications in diabetes. In this research, we developed and validated a high-throughput LC–MS/MS method for simultaneous quantitation of PIO and TLM in rat plasma. This developed method is more sensitive and can quantitate the analytes in a relatively shorter period of time compared to the previously reported methods for their individual quantification. Moreover, to date, there is no bioanalytical method available to simultaneously quantitate PIO and TLM in a single run. The method was validated according to the USFDA guidelines for bioanalytical method validation. A linear response of the analytes was observed over the range of 0.005–10 µg/mL with satisfactory precision and accuracy. Accuracy at four quality control levels was within 94.27%–106.10%. The intra- and inter-day precision ranged from 2.32% to 10.14% and from 5.02% to 8.12%, respectively. The method was reproducible and sensitive enough to quantitate PIO and TLM in rat plasma samples of a preclinical pharmacokinetic study. Due to the potential of the PIO-TLM combination to be therapeutically explored, this method is expected to have significant usefulness in the future. Keywords: LC–MS/MS, Rat plasma, Pharmacokinetic applicability, Telmisartan, Pioglitazone, Pharmacokinetic application

  8. Application of quantitative light-induced fluorescence to determine the depth of demineralization of dental fluorosis in enamel microabrasion: a case report

    OpenAIRE

    Park, Tae-Young; Choi, Han-Sol; Ku, Hee-Won; Kim, Hyun-Su; Lee, Yoo-Jin; Min, Jeong-Bum

    2016-01-01

    Enamel microabrasion has become accepted as a conservative, nonrestorative method of removing intrinsic and superficial dysmineralization defects from dental fluorosis, restoring esthetics with minimal loss of enamel. However, it can be difficult to determine if restoration is necessary in dental fluorosis, because the lesion depth is often not easily recognized. This case report presents a method for analysis of enamel hypoplasia that uses quantitative light-induced fluorescence (QLF) follow...

  9. Application of molecular connectivity and electro-topological indices in quantitative structure-activity analysis of pyrazole derivatives as inhibitors of factor Xa and thrombin.

    Science.gov (United States)

    Krishnasamy, Chandravel; Raghuraman, Arjun; Kier, Lemont B; Desai, Umesh R

    2008-12-01

    Factor Xa and thrombin, two critical pro-coagulant enzymes of the clotting cascade, are the primary target of current anticoagulation research that aims to develop potent, orally bioavailable, synthetic small-molecule inhibitors. To determine structural features that might play important roles in factor Xa and thrombin recognition and oral bioavailability, quantitative structure-activity and structure-property analyses were performed on the factor Xa and thrombin inhibition data and Caco-2 cell-permeability data of 3-substituted pyrazole-5-carboxamides reported by Pinto et al. (J. Med. Chem. 2001, 44, 566). The factor Xa and thrombin inhibition potencies, and Caco-2 cell permeability of the 3-substituted pyrazole-5-carboxamides could be quantitatively described through molecular connectivity and atom level E-state indices. Different quantitative structure-activity and structure-property models were derived for each of the three biological properties. The models are statistically relevant with correlation coefficients of at least 0.9, and contain only two or three molecular descriptor variables. The study demonstrates the use of molecular connectivity and E-state indices in understanding factor Xa and thrombin inhibition. In addition, the models may be useful for predictive purposes in generating molecules with better potency, specificity, and oral bioavailability.
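
    A generic sketch of a two-descriptor QSAR regression of the kind reported is shown here; the descriptor values and activities below are placeholders, not the published pyrazole data set.

        import numpy as np

        # Hypothetical series: a molecular connectivity index, an atom-level E-state
        # value, and measured inhibition potencies (pKi) for seven analogues.
        chi = np.array([4.2, 4.8, 5.1, 5.6, 6.0, 6.3, 6.9])
        estate = np.array([9.1, 8.4, 8.0, 7.2, 6.8, 6.1, 5.5])
        pki = np.array([5.9, 6.3, 6.6, 7.1, 7.4, 7.8, 8.2])

        X = np.column_stack([np.ones_like(chi), chi, estate])
        coef, *_ = np.linalg.lstsq(X, pki, rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((pki - pred) ** 2) / np.sum((pki - pki.mean()) ** 2)
        print("intercept and coefficients:", np.round(coef, 3), " r^2 =", round(r2, 3))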

  10. GPC and quantitative phase imaging

    DEFF Research Database (Denmark)

    Palima, Darwin; Banas, Andrew Rafael; Villangca, Mark Jayson

    2016-01-01

    shaper followed by the potential of GPC for biomedical and multispectral applications where we experimentally demonstrate the active light shaping of a supercontinuum laser over most of the visible wavelength range. Finally, we discuss how GPC can be advantageously applied for Quantitative Phase Imaging...

  11. Quantitative measurement of the chemical composition of geological standards with a miniature laser ablation/ionization mass spectrometer designed for in situ application in space research

    Science.gov (United States)

    Neuland, M. B.; Grimaudo, V.; Mezger, K.; Moreno-García, P.; Riedo, A.; Tulej, M.; Wurz, P.

    2016-03-01

    A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standard-less measurement technique for in situ quantitative chemical composition measurements on planetary surfaces.
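
    Relative sensitivity factors of the kind described are typically applied by scaling each element's measured ion intensity and renormalizing the result; a minimal sketch follows, in which the intensities and factors are invented for illustration and are not taken from the study:

    ```python
    import numpy as np

    elements    = ["Si", "Al", "Fe", "Ca", "Na", "K"]
    intensities = np.array([5.2e5, 1.6e5, 9.8e4, 7.5e4, 6.1e4, 5.0e4])  # hypothetical ion signals
    rsf         = np.array([1.00, 0.95, 1.10, 1.05, 0.90, 1.02])        # sensitivity factors near 1

    corrected = intensities / rsf            # correct each signal by its sensitivity factor
    atomic_fraction = corrected / corrected.sum()

    for el, frac in zip(elements, atomic_fraction):
        print(f"{el}: {100 * frac:.1f} at.%")
    ```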

  12. Improved simultaneous quantitation of candesartan and hydrochlorthiazide in human plasma by UPLC–MS/MS and its application in bioequivalence studies

    Directory of Open Access Journals (Sweden)

    Bhupinder Singh

    2014-04-01

    Full Text Available A validated ultra-performance liquid chromatography mass spectrometric method (UPLC–MS/MS) was used for the simultaneous quantitation of candesartan (CN) and hydrochlorothiazide (HCT) in human plasma. The analysis was performed on a UPLC–MS/MS system using a turbo ion spray interface. Negative ions were measured in multiple reaction monitoring (MRM) mode. The analytes were extracted using a liquid–liquid extraction (LLE) method with 0.1 mL of plasma. The lower limit of quantitation for CN and HCT was 1.00 ng/mL, whereas the upper limit of quantitation was 499.15 ng/mL and 601.61 ng/mL for CN and HCT, respectively. CN-d4 and HCT-13Cd2 were used as the internal standards for CN and HCT, respectively. The chromatography was achieved within a 2.0 min run time using a C18 Phenomenex Gemini NX (100 mm×4.6 mm, 5 µm) column with an organic mixture:buffer solution (80:20, v/v) at a flow rate of 0.800 mL/min. The method has been successfully applied to establish the bioequivalence of candesartan cilexetil (CNC) and HCT immediate release tablets with a reference product in human subjects. Keywords: Candesartan cilexetil, Hydrochlorothiazide, UPLC–MS/MS, Bioequivalence, Candesartan cilexetil-hydrochlorothiazide (ATACAND HCT)

  13. Application of a tri-axial accelerometry-based portable motion recorder for the quantitative assessment of hippotherapy in children and adolescents with cerebral palsy.

    Science.gov (United States)

    Mutoh, Tomoko; Mutoh, Tatsushi; Takada, Makoto; Doumura, Misato; Ihara, Masayo; Taki, Yasuyuki; Tsubone, Hirokazu; Ihara, Masahiro

    2016-10-01

    [Purpose] This case series aims to evaluate the effects of hippotherapy on the gait and balance ability of children and adolescents with cerebral palsy using quantitative parameters for physical activity. [Subjects and Methods] Three patients with gait disability as a sequela of cerebral palsy (one female and two males; aged 5, 12, and 25 years) were recruited. Participants received hippotherapy for 30 min once a week for 2 years. Gait parameters (step rate, step length, gait speed, mean acceleration, and horizontal/vertical displacement ratio) were measured using a portable motion recorder equipped with a tri-axial accelerometer attached to the waist before and after a 10-m walking test. [Results] There was a significant increase in step length from before to after a single hippotherapy session. Over the course of the 2-year intervention, there was a significant increase in step rate, gait speed, step length, and mean acceleration and a significant improvement in the horizontal/vertical displacement ratio. [Conclusion] The data suggest that quantitative parameters derived from a portable motion recorder can track both immediate and long-term changes in the walking ability of children and adolescents with cerebral palsy undergoing hippotherapy.

  14. Development and application of a quantitative real-time PCR assay for rapid detection of the multifaceted yeast Kazachstania servazzii in food.

    Science.gov (United States)

    Spanoghe, Martin; Godoy Jara, Mario; Rivière, John; Lanterbecq, Deborah; Gadenne, Martine; Marique, Thierry

    2017-04-01

    The beneficial contributions of Kazachstania servazzii are well-established in various food processes. This yeast also contributes to the spoilage of finished packaged food due to abundant gas production. In particular, an occurrence of K. servazzii was recently positively correlated with the formation of severe package swelling of some prepared fresh pizzas. To address this concern, a quantitative SYBR green real-time PCR assay based on a newly designed specific primer pair targeting the ribosomal ITS1-5.8S-ITS2 region of K. servazzii was developed. The quantification was enabled using a standard curve created from serially diluted plasmids containing the target sequence of the K. servazzii strain. A validation of the assay was achieved by enumeration of K. servazzii DNA copies from artificially infected culture broths containing non-contaminated pizza substrates. The newly developed method was then tested on total DNA extracted from packaged fresh pizzas, in which certain lots were swollen and thus suspected of containing K. servazzii. This study highlights that this newly developed quantitative assay is not only sufficiently sensitive, specific and reliable to be functionally used in food control as a routine method of detection, but also promising in specific studies that seek to further characterize the dynamics of this yeast in some increasingly popular food processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
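
    Quantification from a plasmid dilution series of this kind reduces to fitting a standard curve of Ct against log copy number and back-calculating unknowns; a minimal sketch is given below, in which the Ct values, efficiencies and sample are hypothetical, not the study's data:

    ```python
    import numpy as np

    # Hypothetical standard curve: serial 10-fold plasmid dilutions (copies/reaction)
    # and the Ct values they produced. Values are illustrative only.
    copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3, 1e2])
    ct     = np.array([12.1, 15.5, 18.9, 22.3, 25.8, 29.2])

    # Fit Ct = slope * log10(copies) + intercept
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency implied by the slope

    def copies_from_ct(ct_sample: float) -> float:
        """Back-calculate the copy number of an unknown sample from its Ct value."""
        return 10 ** ((ct_sample - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"sample at Ct 20.4 ~ {copies_from_ct(20.4):.2e} copies/reaction")
    ```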

  15. Quantitative analysis of Fe and Co in Co-substituted magnetite using XPS: The application of non-linear least squares fitting (NLLSF)

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hongmei, E-mail: hmliu@gig.ac.cn [CAS Key Laboratory of Mineralogy and Metallogeny/Guangdong Provincial Key Laboratory of Mineral Physics and Materials, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou, 510640 (China); Wei, Gaoling [Guangdong Key Laboratory of Agricultural Environment Pollution Integrated Control, Guangdong Institute of Eco-Environmental and Soil Sciences, Guangzhou, 510650 (China); Xu, Zhen [School of Materials Science and Engineering, Central South University, Changsha, 410012 (China); Liu, Peng; Li, Ying [CAS Key Laboratory of Mineralogy and Metallogeny/Guangdong Provincial Key Laboratory of Mineral Physics and Materials, Guangzhou Institute of Geochemistry, Chinese Academy of Sciences, Guangzhou, 510640 (China); University of Chinese Academy of Sciences, Beijing, 100049 (China)

    2016-12-15

    Highlights: • XPS and Auger peak overlapping complicates Co-substituted magnetite quantification. • Disturbance of Auger peaks was eliminated by non-linear least squares fitting. • Fitting greatly improved the accuracy of quantification for Co and Fe. • Catalytic activity of magnetite was enhanced with the increase of Co substitution. - Abstract: Quantitative analysis of Co and Fe using X-ray photoelectron spectroscopy (XPS) is important for the evaluation of the catalytic ability of Co-substituted magnetite. However, the overlap of XPS peaks and Auger peaks for Co and Fe complicates quantification. In this study, non-linear least squares fitting (NLLSF) was used to calculate the relative Co and Fe contents of a series of synthesized Co-substituted magnetite samples with different Co doping levels. NLLSF separated the XPS peaks of Co 2p and Fe 2p from the Auger peaks of Fe and Co, respectively. Compared with a control group without fitting, the accuracy of Co and Fe quantification was greatly improved once NLLSF removed the interference from the Auger peaks. A catalysis study confirmed that the catalytic activity of magnetite was enhanced with increasing Co substitution. This study confirms the effectiveness and accuracy of the NLLSF method in the XPS quantitative calculation of Fe and Co coexisting in a material.
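
    The general idea of separating an overlapping Auger contribution from a photoelectron peak by non-linear least squares can be sketched with a generic fit; the peak shapes, positions and synthetic spectrum below are illustrative assumptions, not the authors' actual fitting model:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss(x, amp, center, width):
        return amp * np.exp(-0.5 * ((x - center) / width) ** 2)

    def model(x, a1, c1, w1, a2, c2, w2, b0, b1):
        # photoelectron peak + overlapping Auger peak + linear background
        return gauss(x, a1, c1, w1) + gauss(x, a2, c2, w2) + b0 + b1 * x

    # Synthetic "measured" spectrum (binding-energy axis in eV, illustrative values)
    x = np.linspace(700, 740, 400)
    rng = np.random.default_rng(0)
    y = model(x, 900, 711, 2.5, 400, 719, 4.0, 50, 0.2) + rng.normal(0, 10, x.size)

    p0 = [800, 710, 2, 300, 720, 3, 40, 0]          # initial guesses for the fit
    popt, _ = curve_fit(model, x, y, p0=p0)

    area_photo = popt[0] * popt[2] * np.sqrt(2 * np.pi)   # area of the XPS component only
    print(f"fitted photoelectron peak area (Auger excluded): {area_photo:.0f}")
    ```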

  16. Development and application of a quantitative method based on LC-QqQ MS/MS for determination of steviol glycosides in Stevia leaves.

    Science.gov (United States)

    Molina-Calle, M; Sánchez de Medina, V; Delgado de la Torre, M P; Priego-Capote, F; Luque de Castro, M D

    2016-07-01

    Stevia is a currently well-known plant thanks to the presence of steviol glycosides, which are considered sweeteners obtained from a natural source. In this research, a method based on LC-MS/MS using a triple quadrupole detector was developed for the quantitation of 8 steviol glycosides in extracts from Stevia leaves. The ionization and fragmentation parameters for selected reaction monitoring were optimized. Detection and quantitation limits ranging from 0.1 to 0.5 ng/mL and from 0.5 to 1 ng/mL, respectively, were achieved: the lowest attained so far. The steviol glycosides were quantified in extracts from leaves of seven varieties of Stevia cultivated in the laboratory, greenhouse and field. Plants cultivated in the field presented higher concentrations of steviol glycosides than those cultivated in the greenhouse. Thus, the way of cultivation clearly influences the concentration of these compounds. The inclusion of branches together with leaves as raw material was also evaluated, showing that this inclusion modifies, either positively or negatively, the concentration of steviol glycosides. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Application of iTRAQ-Based Quantitative Proteomics Approach to Identify Deregulated Proteins Associated with Liver Toxicity Induced by Polygonum Multiflorum in Rats.

    Science.gov (United States)

    Lin, Longfei; Li, Hui; Lin, Hongmei; Zhang, Miao; Qu, Changhai; Yan, Lei; Yin, Xingbin; Ni, Jian

    2017-01-01

    Clinical reports on adverse reactions that result from Polygonum multiflorum (PM) and its preparations, especially regarding liver injury, have recently received widespread attention. This study aimed to investigate the mechanism of hepatotoxicity induced by different PM extracts through iTRAQ quantitative proteomics. The different PM extracts were orally administered for 90 days to rats, and the hepatotoxicity effect was evaluated through measurement of biochemical indexes, oxidative damage indexes and hematoxylin-eosin (HE) staining. Then, the hepatotoxicity mechanism was investigated by iTRAQ quantitative proteomics. The results of biochemical and histopathological analyses showed that liver injury occurred in all groups of rats given the various PM extracts, which proved that all of the PM extracts could induce hepatotoxicity. The results of the biochemical indicators suggest that the hepatotoxicity mechanism may differ between the total extract group and the other groups. The iTRAQ proteomics study showed that hepatotoxicity resulting from PM was mainly related to the abnormal activity of mitochondrion function-related oxidative phosphorylation pathways. This iTRAQ proteomics study revealed that the hepatotoxicity induced by PM is primarily related to the oxidative phosphorylation pathways. NADH dehydrogenase family proteins and Slc16a2 could be potential biomarkers of hepatotoxicity resulting from PM. © 2017 The Author(s). Published by S. Karger AG, Basel.

  18. Application of iTRAQ-Based Quantitative Proteomics Approach to Identify Deregulated Proteins Associated with Liver Toxicity Induced by Polygonum Multiflorum in Rats

    Directory of Open Access Journals (Sweden)

    Longfei Lin

    2017-10-01

    Full Text Available Background/Aims: Clinical reports on adverse reactions that result from Polygonum multiflorum (PM) and its preparations, especially regarding liver injury, have recently received widespread attention. This study aimed to investigate the mechanism of hepatotoxicity induced by different PM extracts through iTRAQ quantitative proteomics. Methods: The different PM extracts were orally administered for 90 days to rats, and the hepatotoxicity effect was evaluated through measurement of biochemical indexes, oxidative damage indexes and hematoxylin-eosin (HE) staining. Then, the hepatotoxicity mechanism was investigated by iTRAQ quantitative proteomics. Results: The results of biochemical and histopathological analyses showed that liver injury occurred in all groups of rats given the various PM extracts, which proved that all of the PM extracts could induce hepatotoxicity. The results of the biochemical indicators suggest that the hepatotoxicity mechanism may differ between the total extract group and the other groups. The iTRAQ proteomics study showed that hepatotoxicity resulting from PM was mainly related to the abnormal activity of mitochondrion function-related oxidative phosphorylation pathways. Conclusion: This iTRAQ proteomics study revealed that the hepatotoxicity induced by PM is primarily related to the oxidative phosphorylation pathways. NADH dehydrogenase family proteins and Slc16a2 could be potential biomarkers of hepatotoxicity resulting from PM.

  19. Quantitative Decision Support Requires Quantitative User Guidance

    Science.gov (United States)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US- or European-based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to the output of current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  20. Absolute quantitation of Met using mass spectrometry for clinical application: assay precision, stability, and correlation with MET gene amplification in FFPE tumor tissue.

    Directory of Open Access Journals (Sweden)

    Daniel V T Catenacci

    Full Text Available Overexpression of the Met tyrosine kinase receptor is associated with poor prognosis. Overexpression, and particularly MET amplification, are predictive of response to Met-specific therapy in preclinical models. Immunohistochemistry (IHC) of formalin-fixed paraffin-embedded (FFPE) tissues is currently used to select 'high Met' expressing tumors for Met inhibitor trials. IHC suffers from antibody non-specificity, lack of quantitative resolution, and, when quantifying multiple proteins, inefficient use of scarce tissue. After describing the development of the Liquid-Tissue-Selected Reaction Monitoring-mass spectrometry (LT-SRM-MS) Met assay, we evaluated the expression level of Met in 130 FFPE gastroesophageal cancer (GEC) tissues. We assessed the correlation of SRM Met expression to IHC and to the mean MET gene copy number (GCN)/nucleus or MET/CEP7 ratio by fluorescence in situ hybridization (FISH). Proteomic mapping of recombinant Met identified 418TEFTTALQR426 as the optimal SRM peptide. Limits of detection (LOD) and quantitation (LOQ) for this peptide were 150 and 200 amol/µg tumor protein, respectively. The assay demonstrated excellent precision and temporal stability of measurements in serial sections analyzed one year apart. Expression levels of the 130 GEC tissues ranged from <150 amol/µg to 4669.5 amol/µg. High correlation was observed between SRM Met expression and both MET GCN and MET/CEP7 ratio as determined by FISH (n = 30; R2 = 0.898). IHC did not correlate well with SRM (n = 44; R2 = 0.537) nor with FISH GCN (n = 31; R2 = 0.509). A Met SRM level of ≥1500 amol/µg was 100% sensitive (95% CI 0.69-1) and 100% specific (95% CI 0.92-1) for MET amplification. The Met SRM assay measured absolute Met levels in clinical tissues with high precision. Compared to IHC, SRM provided a quantitative and linear measurement of Met expression, reliably distinguishing between non-amplified and amplified MET tumors. These results demonstrate a novel

  1. Application of direct HPTLC-MALDI for the qualitative and quantitative profiling of neutral and acidic glycosphingolipids: the case of NEU3 overexpressing C2C12 murine myoblasts.

    Science.gov (United States)

    Torretta, Enrica; Vasso, Michele; Fania, Chiara; Capitanio, Daniele; Bergante, Sonia; Piccoli, Marco; Tettamanti, Guido; Anastasia, Luigi; Gelfi, Cecilia

    2014-05-01

    Glycosphingolipids (GSLs) are a class of ubiquitous lipids characterized by a wide structural repertoire and a variety of functional implications. Importantly, altered levels have been correlated with different diseases, suggesting their crucial role in health. Conventional methods for their characterization and quantification are based on high-performance TLC (HPTLC) separation and comparison with the migration distance of standard samples, or on MS. We set up and herein report the application of an ImagePrep method for the qualitative and quantitative profiling of glycosphingolipids through direct HPTLC-MALDI, with particular application to wild-type and NEU3 sialidase-overexpressing C2C12 myoblasts. Lipids were analyzed by HPTLC, coupled with MALDI-TOF, and the resulting GSL profiles were compared to the [³H]sphingolipid HPTLC patterns obtained after metabolic radiolabeling. GSL detection by HPTLC-MALDI was optimized by testing different methods for matrix delivery and by performing quantitative analyses using serial dilutions of GSL standards. Through this approach, an accurate analysis of each variant of neutral and acidic GSLs, including the detection of different fatty-acid chain variants for each GSL, was provided, and these results demonstrated that HPTLC-MALDI is an easy and high-throughput analytical method for GSL profiling, suggesting its use for the early detection of markers in different diseases, including cancer and heart ischemia. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Quantitative Methods for Molecular Diagnostic and Therapeutic Imaging

    OpenAIRE

    Li, Quanzheng

    2013-01-01

    This theme issue provides an overview on the basic quantitative methods, an in-depth discussion on the cutting-edge quantitative analysis approaches as well as their applications for both static and dynamic molecular diagnostic and therapeutic imaging.

  3. Application of quantitative light-induced fluorescence to determine the depth of demineralization of dental fluorosis in enamel microabrasion: a case report

    Directory of Open Access Journals (Sweden)

    Tae-Young Park

    2016-08-01

    Full Text Available Enamel microabrasion has become accepted as a conservative, nonrestorative method of removing intrinsic and superficial dysmineralization defects from dental fluorosis, restoring esthetics with minimal loss of enamel. However, it can be difficult to determine if restoration is necessary in dental fluorosis, because the lesion depth is often not easily recognized. This case report presents a method for analysis of enamel hypoplasia that uses quantitative light-induced fluorescence (QLF) followed by a combination of enamel microabrasion with carbamide peroxide home bleaching. We describe the utility of QLF when selecting a conservative treatment plan and confirming treatment efficacy. In this case, the treatment plan was based on QLF analysis, and the selected combination treatment of microabrasion and bleaching had good results.

  4. Application of quantitative light-induced fluorescence to determine the depth of demineralization of dental fluorosis in enamel microabrasion: a case report.

    Science.gov (United States)

    Park, Tae-Young; Choi, Han-Sol; Ku, Hee-Won; Kim, Hyun-Su; Lee, Yoo-Jin; Min, Jeong-Bum

    2016-08-01

    Enamel microabrasion has become accepted as a conservative, nonrestorative method of removing intrinsic and superficial dysmineralization defects from dental fluorosis, restoring esthetics with minimal loss of enamel. However, it can be difficult to determine if restoration is necessary in dental fluorosis, because the lesion depth is often not easily recognized. This case report presents a method for analysis of enamel hypoplasia that uses quantitative light-induced fluorescence (QLF) followed by a combination of enamel microabrasion with carbamide peroxide home bleaching. We describe the utility of QLF when selecting a conservative treatment plan and confirming treatment efficacy. In this case, the treatment plan was based on QLF analysis, and the selected combination treatment of microabrasion and bleaching had good results.

  5. Quantitation of fungal mRNAs in complex substrates by reverse transcription PCR and its application to Phanerochaete chrysosporium-colonized soil.

    Science.gov (United States)

    Lamar, R T; Schoenike, B; Vanden Wymelenberg, A; Stewart, P; Dietrich, D M; Cullen, D

    1995-06-01

    Thorough analysis of fungi in complex substrates has been hampered by inadequate experimental tools for assessing physiological activity and estimating biomass. We report a method for the quantitative assessment of specific fungal mRNAs in soil. The method was applied to complex gene families of Phanerochaete chrysosporium, a white-rot fungus widely used in studies of organopollutant degradation. Among the genes implicated in pollutant degradation, two closely related lignin peroxidase transcripts were detected in soil. The pattern of lignin peroxidase gene expression was unexpected; certain transcripts abundant in defined cultures were not detected in soil cultures. Transcripts encoding cellobiohydrolases and beta-tubulin were also detected. The method will aid in defining the roles of specific genes in complex biological processes such as organopollutant degradation, developing strategies for strain improvement, and identifying specific fungi in environmental samples.

  6. Development and Application of a Screening Method of Absolute Quantitative PCR To Detect the Abuse of Sex Steroid Hormone Administration in Male Bovines.

    Science.gov (United States)

    Starvaggi Cucuzza, Laura; Biolatti, Bartolomeo; Divari, Sara; Pregel, Paola; Scaglione, Frine E; Sereno, Alessandra; Cannizzo, Francesca T

    2017-06-14

    A methodology for the absolute quantification of the regucalcin gene through quantitative PCR was set up to confirm that the decrease of regucalcin gene expression in the testis is an effective biomarker for tracing sex steroid hormone treatment in bovine husbandry. On the basis of TaqMan technology, an external standard curve was generated. Using in vivo experiments, a ROC curve was developed to calculate the criterion value, specificity, and sensitivity for this potential biomarker. Then, regucalcin gene expression was assessed in veal calves and beef cattle intended for human consumption. In 11 of 54 calves and in 5 of 70 beef cattle, the regucalcin gene was expressed below the respective cutoff. Additionally, a mild decrease of regucalcin protein expression was revealed by immunohistochemistry in subjects that tested positive by qPCR. These preliminary results suggest that this transcriptomics test may be employed as a novel diagnostic screening tool, significantly improving the overall efficacy of food control.
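
    A ROC-based criterion value of the kind described can be derived as in the sketch below, which uses made-up expression values and treatment labels and the Youden index to pick the cutoff; none of the numbers come from the study:

    ```python
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    # Hypothetical regucalcin expression (copies/µg RNA) for untreated (0) and treated (1) animals
    expression = np.array([9.5, 8.7, 10.2, 7.9, 9.1, 3.2, 2.8, 4.1, 3.6, 2.5])
    treated    = np.array([0,   0,   0,    0,   0,   1,   1,   1,   1,   1])

    # Lower expression indicates treatment, so use the negated value as the score
    fpr, tpr, thresholds = roc_curve(treated, -expression)
    youden = tpr - fpr
    best = np.argmax(youden)

    print(f"AUC = {auc(fpr, tpr):.2f}")
    print(f"cutoff: expression <= {-thresholds[best]:.2f}  "
          f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
    ```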

  7. Quantitative Structure-Relative Volatility Relationship Model for Extractive Distillation of Ethylbenzene/p-Xylene Mixtures: Application to Binary and Ternary Mixtures as Extractive Agents

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Young-Mook; Oh, Kyunghwan; You, Hwan; No, Kyoung Tai [Bioinformatics and Molecular Design Research Center, Seoul (Korea, Republic of); Jeon, Yukwon; Shul, Yong-Gun; Hwang, Sung Bo; Shin, Hyun Kil; Kim, Min Sung; Kim, Namseok; Son, Hyoungjun [Yonsei University, Seoul (Korea, Republic of); Chu, Young Hwan [Sangji University, Wonju (Korea, Republic of); Cho, Kwang-Hwi [Soongsil University, Seoul (Korea, Republic of)

    2016-04-15

    Ethylbenzene (EB) and p-xylene (PX) are important chemicals for the production of industrial materials; accordingly, their efficient separation is desired, even though the difference in their boiling points is very small. This paper describes the efforts toward the identification of high-performance extractive agents for EB and PX separation by distillation. Most high-performance extractive agents contain halogen atoms, which present health hazards and are corrosive to distillation plates. To avoid this disadvantage of extractive agents, we developed a quantitative structure-relative volatility relationship (QSRVR) model for designing safe extractive agents. We have previously developed and reported QSRVR models for single extractive agents. In this study, we introduce extended QSRVR models for binary and ternary extractive agents. The QSRVR models accurately predict the relative volatilities of binary and ternary extractive agents. The service to predict the relative volatility for binary and ternary extractive agents is freely available on the Internet at http://qsrvr.opengsi.org/.

  8. A multiplex GC-MS/MS technique for the sensitive and quantitative single-run analysis of acidic phytohormones and related compounds, and its application to Arabidopsis thaliana.

    Science.gov (United States)

    Müller, Axel; Düchting, Petra; Weiler, Elmar W

    2002-11-01

    A highly sensitive and accurate multiplex gas chromatography-tandem mass spectrometry (GC-MS/MS) technique is reported for indole-3-acetic acid, abscisic acid, jasmonic acid, 12-oxo-phytodienoic acid and salicylic acid. The optimized setup allows the routine processing and analysis of up to 60 plant samples of between 20 and 200 mg of fresh weight per day. The protocol was designed and the equipment used was chosen to facilitate implementation of the method into other laboratories and to provide access to state-of-the-art analytical tools for the acidic phytohormones and related signalling molecules. Whole-plant organ-distribution maps for indole-3-acetic acid, abscisic acid, jasmonic acid, 12-oxo-phytodienoic acid and salicylic acid were generated for Arabidopsis thaliana (L.) Heynh. For leaves of A. thaliana, a spatial resolution of hormone quantitation down to approximately 2 mm² was achieved.

  9. [Method validation according to ISO 15189 and SH GTA 04: application for the extraction of DNA and its quantitative evaluation by a spectrophotometric assay].

    Science.gov (United States)

    Harlé, Alexandre; Lion, Maëva; Husson, Marie; Dubois, Cindy; Merlin, Jean-Louis

    2013-01-01

    According to the French legislation on medical biology (January 16th, 2010), all biological laboratories must be accredited according to ISO 15189 for at least 50% of their activities before the end of 2016. The extraction of DNA from a sample of interest, whether solid or liquid, is one of the critical steps in molecular biology and specifically in somatic or constitutional genetics. The extracted DNA must meet a number of quality criteria and must be present at a sufficient concentration to allow molecular biology assays such as the detection of somatic mutations. This paper describes the validation of the extraction and purification of DNA using chromatographic column extraction and its quantitative determination by spectrophotometric assay, according to ISO 15189 and the accreditation technical guide in Human Health SH-GTA-04.

  10. Socioeconomic influences on biodiversity, ecosystem services and human well-being: a quantitative application of the DPSIR model in Jiangsu, China.

    Science.gov (United States)

    Hou, Ying; Zhou, Shudong; Burkhard, Benjamin; Müller, Felix

    2014-08-15

    One focus of ecosystem service research is the connection between biodiversity, ecosystem services and human well-being as well as the socioeconomic influences on them. Despite existing investigations, exact impacts from the human system on the dynamics of biodiversity, ecosystem services and human well-being are still uncertain because of the insufficiency of the respective quantitative analyses. Our research aims to discern the socioeconomic influences on biodiversity, ecosystem services and human well-being and to demonstrate the mutual impacts between these components. We propose a DPSIR framework coupling ecological integrity, ecosystem services as well as human well-being and suggest DPSIR indicators for the case study area Jiangsu, China. Based on available statistical and survey data, we revealed the factors significantly impacting biodiversity, ecosystem services and human well-being in the research area through factor analysis and correlation analysis, using the 13 prefecture-level cities of Jiangsu as samples. The results show that urbanization and industrialization in the urban areas have predominant positive influences on regional biodiversity, agricultural productivity and tourism services as well as rural residents' living standards. Additionally, the knowledge, technology and finance inputs for agriculture also have generally positive impacts on these system components. Concerning regional carbon storage, non-cropland vegetation cover obviously plays a significant positive role. Contrarily, the expansion of farming land and the increase of total food production are two important negative influential factors of biodiversity, ecosystem's food provisioning service capacity, regional tourism income and the well-being of the rural population. Our study provides a promising approach based on the DPSIR model to quantitatively capture the socioeconomic influential factors of biodiversity, ecosystem services and human well-being for human-environmental systems

  11. Development of pharmacophore similarity-based quantitative activity hypothesis and its applicability domain: applied on a diverse data-set of HIV-1 integrase inhibitors.

    Science.gov (United States)

    Kumar, Sivakumar Prasanth; Jasrai, Yogesh T; Mehta, Vijay P; Pandya, Himanshu A

    2015-01-01

    A quantitative pharmacophore hypothesis combines the 3D spatial arrangement of pharmacophore features with the biological activities of a ligand data-set and predicts the activities of geometrically and/or pharmacophorically similar ligands. Most pharmacophore discovery programs face difficulties with conformational flexibility, molecular alignment, pharmacophore feature sampling, and feature selection to score models if the data-set comprises diverse ligands. With this focus, we describe a ligand-based computational procedure to introduce flexibility in aligning the small molecules and generating a pharmacophore hypothesis without geometrical constraints to define the pharmacophore space, enriched with the chemical features necessary to elucidate common pharmacophore hypotheses (CPHs). A maximal common substructure (MCS)-based alignment method was adopted to guide the alignment of the carbon molecules, the MCS atom connectivity was deciphered to cluster molecules into bins, and the pharmacophore similarity matrix with the bin-specific reference molecules was subsequently calculated. After alignment, the carbon molecules were enriched with the original atoms in their respective positions and conventional pharmacophore features were perceived. Distance-based pharmacophoric descriptors were enumerated by computing the distances between the perceived features and the MCS-aligned 'centroid' position. The descriptor set and biological activities were used to develop support vector machine models to predict the activities of the external test set. Finally, a fitness score was estimated based on the pharmacophore similarity to the bin-specific reference molecules, to recognize the best and poorest alignments, and to each reference molecule, to predict outliers of the quantitative hypothesis model. We applied this procedure to a diverse data-set of 40 HIV-1 integrase inhibitors and discussed its effectiveness with respect to the reported CPH model.
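
    The final modeling step (a descriptor matrix plus activities feeding a support vector machine regression) can be sketched with scikit-learn; the descriptor values and pIC50 activities below are placeholders, not the HIV-1 integrase data set:

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    # Rows = aligned ligands, columns = distance-based pharmacophore descriptors (illustrative)
    X_train = np.array([[4.2, 7.1, 3.3],
                        [5.0, 6.8, 2.9],
                        [3.8, 7.5, 3.6],
                        [4.6, 6.2, 3.1],
                        [5.3, 7.0, 2.7]])
    y_train = np.array([6.1, 6.9, 5.4, 6.5, 7.2])     # hypothetical pIC50 values

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
    model.fit(X_train, y_train)

    X_test = np.array([[4.4, 6.9, 3.0]])               # external test ligand (hypothetical)
    print(f"predicted activity: {model.predict(X_test)[0]:.2f}")
    ```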

  12. Application of ovine luteinizing hormone (LH) radioimmunoassay in the quantitation of LH in different mammalian species. [125I tracer technique]

    Energy Technology Data Exchange (ETDEWEB)

    Millar, R.P.; Aehnelt, C.

    1977-09-01

    A sensitive double antibody radioimmunoassay has been developed for measuring luteinizing hormone (LH) in various African mammalian species, using rabbit anti-ovine LH serum (GDN 15) and radioiodinated rat LH or ovine LH. Serum and pituitary homogenates from some African mammals (hyrax, reedbuck, sable, impala, tsessebe, thar, spring-hare, ground squirrel and cheetah, as well as the domestic sheep, cow and horse and laboratory rat and hamster) produced displacement curves parallel to that of the ovine LH standards. The specificity of the assay was examined in detail for one species, the rock hyrax. Radioimmunoassay and bioassay estimates of LH in hyrax pituitaries containing widely differing quantities of pituitary hormones were similar. In sexually active male hyrax mean plasma LH was 12.1 ng/ml and pituitary LH 194 µg/gland, but in sexually quiescent hyrax mean plasma LH was 2.4 ng/ml and mean pituitary LH 76 µg/gland. Intravenous injection of 10 µg of luteinizing hormone releasing hormone increased mean LH levels in hyrax from 0.9 ng/ml to 23.2 ng/ml by 30 min. Conversely, im injection of 250 µg testosterone induced a fall in LH levels in male hyrax from 1.7 ng/ml to 0.7 ng/ml 6 h after administration. Although the specificity of the assay for quantitating plasma LH in other species was not categorically established, there was a good correlation between plasma LH concentration and reproductive state in the bontebok, impala, spring-hare, thar, cheetah, domestic horse and laboratory rat, suggesting the potential use of the antiserum in quantitating LH in a variety of mammalian species.

  13. Stepwise sensitivity analysis from qualitative to quantitative: Application to the terrestrial hydrological modeling of a Conjunctive Surface-Subsurface Process (CSSP) land surface model

    Science.gov (United States)

    Gan, Yanjun; Liang, Xin-Zhong; Duan, Qingyun; Choi, Hyun Il; Dai, Yongjiu; Wu, Huan

    2015-06-01

    An uncertainty quantification framework was employed to examine the sensitivities of 24 model parameters from a newly developed Conjunctive Surface-Subsurface Process (CSSP) land surface model (LSM). The sensitivity analysis (SA) was performed over 18 representative watersheds in the contiguous United States to examine the influence of model parameters in the simulation of terrestrial hydrological processes. Two normalized metrics, relative bias (RB) and Nash-Sutcliffe efficiency (NSE), were adopted to assess the fit between simulated and observed streamflow discharge (SD) and evapotranspiration (ET) for a 14-year period. SA was conducted using a multiobjective two-stage approach, in which the first stage was a qualitative SA using the Latin Hypercube-based One-At-a-Time (LH-OAT) screening, and the second stage was a quantitative SA using the Multivariate Adaptive Regression Splines (MARS)-based Sobol' sensitivity indices. This approach combines the merits of qualitative and quantitative global SA methods, and is effective and efficient for understanding and simplifying large, complex system models. Ten of the 24 parameters were identified as important across different watersheds. The contribution of each parameter to the total response variance was then quantified by Sobol' sensitivity indices. Generally, parameter interactions contribute the most to the response variance of the CSSP, and only 5 out of 24 parameters dominate model behavior. Four photosynthetic and respiratory parameters are shown to be influential to ET, whereas reference depth for saturated hydraulic conductivity is the most influential parameter for SD in most watersheds. Parameter sensitivity patterns mainly depend on hydroclimatic regime, as well as vegetation type and soil texture.
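
    The variance-based, second-stage part of such an analysis can be illustrated with a plain-numpy Saltelli-style estimator of first-order Sobol' indices on a toy two-parameter model; the model and sample size below are illustrative assumptions, not the CSSP configuration:

    ```python
    import numpy as np

    def model(x):
        # Toy response standing in for a surrogate of the land surface model
        return np.sin(x[:, 0]) + 0.3 * x[:, 1] ** 2

    rng = np.random.default_rng(1)
    n, d = 10000, 2
    A = rng.uniform(-np.pi, np.pi, (n, d))
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        AB = A.copy()
        AB[:, i] = B[:, i]                       # resample only parameter i
        fAB = model(AB)
        S_i = np.mean(fB * (fAB - fA)) / var     # Saltelli (2010) first-order estimator
        print(f"first-order Sobol' index S_{i + 1} = {S_i:.2f}")
    ```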

  14. Application of protein A-modified capillary-channeled polymer polypropylene fibers to the quantitation of IgG in complex matrices.

    Science.gov (United States)

    Trang, Hung K; Marcus, R Kenneth

    2017-08-05

    Polypropylene (PP) capillary-channeled polymer (C-CP) fibers loaded with recombinant Staphylococcus aureus protein A (rSPA) were used as an affinity chromatography stationary phase for the quantitation of immunoglobulin G (IgG) in complex biological matrices. Optimization of the chromatographic method regarding mobile phase components and load/elution conditions was performed. The six-minute analysis, including a load step with 12 mM phosphate at pH 7.4, an elution step with 0.025% phosphoric acid and a re-equilibration step, was employed for quantitation of IgG1 from 0.075 to 3.00 mg mL⁻¹ in an IgG-free CHO cell supernatant matrix. Quantification of IgG1 content in a different CHO cell line was accomplished using the external calibration curve as well as using a standard addition approach. The high level of agreement between the two approaches suggests that the protein A-modified C-CP fiber phase is immune from matrix effects due to concomitant species such as host cell proteins (HCPs), host cell DNA, media components and other leachables and extractables. The inter-day and intra-day precision of the method were 3.1% and 3.5% RSD, respectively, for a single column. Column-to-column variability was 1.31% and 6.62% RSD for elution time and peak area, respectively, across columns prepared in different batches. The method reported here is well-suited for IgG analysis in complex harvest cell culture media in both the development and production environments. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Information Theory and Security: Quantitative Information Flow

    Science.gov (United States)

    Malacaria, Pasquale; Heusser, Jonathan

    We present the information theoretical basis of Quantitative Information Flow. We show the relationship between lattices, partitions and information theoretical concepts and their applicability to quantify leakage of confidential information in programs, including looping programs.

  16. Quantitative Evaluation of MDP-Ca Salt and DCPD after Application of an MDP-based One-step Self-etching Adhesive on Enamel and Dentin.

    Science.gov (United States)

    Yokota, Yoko; Fujita, Kou Nakajima; Uchida, Ryoichiro; Aida, Etsuko; Aoki, Naoko Tabei; Aida, Masahiro; Nishiyama, Norihiro

    To investigate the effects of an experimental 10-methacryloyloxydecyl dihydrogen phosphate (MDP)-based one-step self-etching adhesive (EX adhesive) applied to enamel and dentin on the production of the calcium salt of MDP (MDP-Ca salt) and dicalcium phosphate dihydrate (DCPD) at various periods. The EX adhesive was prepared. Bovine enamel and dentin reactants were prepared by varying the application period of the EX adhesive: 0.5, 1, 5, 30, 60 and 1440 min. Enamel and dentin reactants were analyzed using X-ray diffraction and solid-state phosphorus-31 nuclear magnetic resonance (31P NMR). Curve-fitting analyses of the corresponding 31P NMR spectra were performed. Enamel and dentin developed several types of MDP-Ca salts and DCPDs with amorphous and crystalline phases throughout the application period. The predominant molecular species of MDP-Ca salt was determined as the monocalcium salt of the MDP monomer. Dentin showed a faster production rate and greater produced amounts of MDP-Ca salt than did enamel, whereas enamel showed a knee-point in the production rate of the MDP-Ca salt at an application period of 5 min. In contrast, enamel developed greater amounts of DCPD than did dentin, and two types of DCPDs with different crystalline phases at application periods > 30 min. The amounts of MDP-Ca salt developed during the 30-s application of the EX adhesive on enamel and dentin were 7.3 times and 21.2 times greater than DCPD, respectively. The MDP-based one-step adhesive yielded several types of MDP-Ca salts and DCPD with an amorphous phase during the 30-s application period on enamel and dentin.

  17. Contrasting P–T–t paths from the basement of the Tisia Unit (Slavonian Mts., NE Croatia): Application of quantitative phase diagrams and monazite age dating

    Science.gov (United States)

    Horváth, Péter; Balen, Dražen; Finger, Fritz; Tomljenović, Bruno; Krenn, Erwin

    2010-06-01

    Medium-grade mica schists and intercalated paragneisses and amphibolites from the basement of the Tisia Unit, Slavonian Mountains, northeastern Croatia, contain complexly zoned garnets. At the Kutjevo locality, mica schists are characterised by garnets with Mn-rich cores and Ca-rich rims. Mn decreases steadily from core to rim and Ca increases abruptly. This is in contrast to the paragneisses and amphibolites, which contain garnets with smoothly decreasing Ca from core to rim. Quantitative phase diagrams and garnet composition isopleths calculated from bulk rock analyses reveal that the Ca-poor garnet cores in the mica schists formed during an earlier event at 584-592 °C and 6.4-7.8 kbar. Ca-rich rims formed at conditions of 600-660 °C and 11-12 kbar, calculated using garnet isopleths and mineral thermobarometry. The paragneiss and amphibolite provide similar P–T information for the later peak event (ca. 650 °C, 10-12 kbar) but do not preserve a record of the earlier, lower P–T event, and modelling shows that garnet was not stable at these conditions. Contrary to previous studies on this outcrop and rock type, no staurolite was observed, and quantitative phase diagrams contoured for H2O mode isopleths indicate that the rock did not cross staurolite-bearing fields during the retrograde P–T path. Mica schists from the Krndija locality contain zoned polyphase garnets. Phase diagram calculations reveal that Ca-rich garnet cores formed between 520 and 630 °C and 7-8 kbar. Rims have a lower Ca content and formed at considerably reduced pressures together with andalusite and staurolite at ca. 530-570 °C and 3-4 kbar. Since both localities were traditionally considered to be part of the same tectono-metamorphic unit, the evidence presented here clearly shows that this cannot be the case. EMP monazite ages are Variscan (350 Ma) in the Krndija mica schists and pre- or early Variscan (ca. 440 Ma) in the Kutjevo mica schists. We therefore propose a more complex

  18. Quantitative X-ray Elemental Imaging in Plant Materials at the Subcellular Level with a Transmission Electron Microscope: Applications and Limitations

    Directory of Open Access Journals (Sweden)

    Shaoliang Chen

    2014-04-01

    Full Text Available Energy-dispersive X-ray microanalysis (EDX) is a technique for determining the distribution of elements in various materials. Here, we report a protocol for high-spatial-resolution X-ray elemental imaging and quantification in plant tissues at subcellular levels with a scanning transmission electron microscope (STEM). Calibration standards were established by producing agar blocks loaded with increasing KCl or NaCl concentrations. TEM-EDX images showed that the salts were evenly distributed in the agar matrix, but tended to aggregate at high concentrations. The mean intensities of K+, Cl−, and Na+ derived from elemental images were linearly correlated to the concentrations of these elements in the agar, over the entire concentration range tested (R > 0.916). We applied this method to plant root tissues. X-ray images were acquired at an actual resolution of 50 nm × 50 nm to 100 nm × 100 nm. We found that cell walls exhibited higher elemental concentrations than vacuoles. Plants exposed to salt stress showed dramatic accumulation of Na+ and Cl− in the transport tissues, and reached levels similar to those applied in the external solution (300 mM). The advantage of TEM-EDX mapping was the high spatial resolution achieved for imaging elemental distributions in a particular area with simultaneous quantitative analyses of multiple target elements.

  19. Application of Fuzzy Set Theory to Quantitative Analysis of Correctness of the Mathematical Model Based on the ADI Method during Solidification

    Directory of Open Access Journals (Sweden)

    Xiaofeng Niu

    2013-01-01

    Full Text Available The explicit finite difference (EFD) method is used to calculate the casting temperature field during the solidification process. Because of its limited time step, the computational efficiency of the EFD method is lower than that of the alternating direction implicit (ADI) method. A model based on the equivalent specific heat method and the ADI method that improves computational efficiency is established. The error of the temperature field simulation comes from model simplification, the acceptable hypotheses and calculation errors caused by different time steps, and the different mesh numbers involved in the process of numerical simulation. This paper quantitatively analyzes the degree of similarity between simulated and experimental results by the Hamming distance (HD). For a thick-walled position, the time step influences the simulation results of the temperature field, and the number of casting meshes has little influence on them. For a thin-walled position, the time step has minimal influence on the simulation results of the temperature field, and the number of casting meshes has a larger influence on them.
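
    A degree of similarity via Hamming distance between simulated and measured cooling curves can be computed as in the sketch below, which uses the common normalized fuzzy Hamming distance form; the temperature values are invented for illustration and are not the paper's data:

    ```python
    import numpy as np

    # Hypothetical cooling curves at one measuring point (°C), sampled at the same times
    simulated = np.array([1350.0, 1280.0, 1210.0, 1150.0, 1100.0, 1060.0])
    measured  = np.array([1342.0, 1291.0, 1205.0, 1158.0, 1093.0, 1066.0])

    # Map both curves onto [0, 1] membership values over a common temperature span,
    # then take the normalized Hamming distance between the two fuzzy sets.
    t_min, t_max = 1000.0, 1400.0
    mu_sim = (simulated - t_min) / (t_max - t_min)
    mu_exp = (measured - t_min) / (t_max - t_min)

    hamming = np.mean(np.abs(mu_sim - mu_exp))   # normalized Hamming distance
    similarity = 1.0 - hamming                   # degree of similarity in [0, 1]
    print(f"Hamming distance = {hamming:.3f}, similarity = {similarity:.3f}")
    ```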

  20. Application of Hansch's model to capsaicinoids and capsinoids: a study using the quantitative structure-activity relationship. A novel method for the synthesis of capsinoids.

    Science.gov (United States)

    Barbero, Gerardo F; Molinillo, José M G; Varela, Rosa M; Palma, Miguel; Macías, Francisco A; Barroso, Carmelo G

    2010-03-24

    We describe a synthetic approach for two families of compounds, the capsaicinoids and capsinoids, as part of a study of the quantitative relationship between structure and activity. A total of 14 capsaicinoids of increasing lateral chain lengths, from 2 to 16 carbon atoms, were synthesized. In addition, 14 capsinoids with identical lateral chains, as well as capsiate and dihydrocapsiate, have been synthesized, and a new method for the synthesis of these compounds has been developed. The yields range from 48.35 to 98.98%. It has been found that the synthetic capsaicinoids and capsinoids exhibit lipophilicity similar to that of the natural compounds and present similar biological activity. The bioactivity of the synthetic capsaicinoids and capsinoids decreases in proportion to the difference in lipophilicity (higher or lower) relative to the natural compounds. Biological activity was determined using the etiolated wheat (Triticum aestivum L.) coleoptiles bioassay and by comparing the results for the synthesized compounds with those of their natural counterparts. The bioactivities found correlated directly with the lipophilic properties of the synthesized compounds.

  1. The optokinetic reflex as a tool for quantitative analyses of nervous system function in mice: application to genetic and drug-induced variation.

    Directory of Open Access Journals (Sweden)

    Hugh Cahill

    2008-04-01

    Full Text Available The optokinetic reflex (OKR), which serves to stabilize a moving image on the retina, is a behavioral response that has many favorable attributes as a test of CNS function. The OKR requires no training, assesses the function of diverse CNS circuits, can be induced repeatedly with minimal fatigue or adaptation, and produces an electronic record that is readily and objectively quantifiable. We describe a new type of OKR test apparatus in which computer-controlled visual stimuli and streamlined data analysis facilitate a relatively high throughput behavioral assay. We used this apparatus, in conjunction with infrared imaging, to quantify basic OKR stimulus-response characteristics for C57BL/6J and 129/SvEv mouse strains and for genetically engineered lines lacking one or more photoreceptor systems or with an alteration in cone spectral sensitivity. A second generation (F2) cross shows that the characteristic difference in OKR frequency between C57BL/6J and 129/SvEv is inherited as a polygenic trait. Finally, we demonstrate the sensitivity and high temporal resolution of the OKR for quantitative analysis of CNS drug action. These experiments show that the mouse OKR is well suited for neurologic testing in the context of drug discovery and large-scale phenotyping programs.

  2. Quantitative characterization of chitosan in the skin by Fourier-transform infrared spectroscopic imaging and ninhydrin assay: application in transdermal sciences.

    Science.gov (United States)

    Nawaz, A; Wong, T W

    2016-07-01

    Chitosan has been used as the primary excipient in transdermal particulate dosage form design. Its distribution pattern across the epidermis and dermis is not easily accessible through chemical assay and is limited to radiolabelled molecules via quantitative autoradiography. This study explored a Fourier-transform infrared spectroscopic imaging technique with a built-in microscope as a means to examine the molecular distribution of chitosan over the epidermis and dermis with the aid of histological sectioning. Fourier-transform infrared spectroscopic skin imaging was conducted using chitosan of varying molecular weights, deacetylation degrees, particle sizes and zeta potentials, obtained via microwave ligation of polymer chains in the solution state. Both the skin permeation and retention characteristics of chitosan increased with the use of smaller chitosan molecules with reduced acetyl content and size, and increased positive charge density. The ratio of epidermal to dermal chitosan content decreased with the use of these chitosan molecules, as their accumulation in the dermis (3.90% to 18.22%) was raised to a greater extent than in the epidermis (0.62% to 1.92%). A larger dermal chitosan accumulation nonetheless did not promote transdermal polymer passage more than the epidermal chitosan did. A small increase in epidermal chitosan content apparently could fluidize the stratum corneum and was more important in dictating molecular permeation into the dermis and systemic circulation. The histology-aided Fourier-transform infrared spectroscopic imaging approach introduces a new dimension to the mechanistic understanding of chitosan in transdermal delivery. © 2015 The Authors Journal of Microscopy © 2015 Royal Microscopical Society.

  3. Poly(acrylic acid)-coated iron oxide nanoparticles: quantitative evaluation of the coating properties and applications for the removal of a pollutant dye.

    Science.gov (United States)

    Fresnais, J; Yan, M; Courtois, J; Bostelmann, T; Bée, A; Berret, J-F

    2013-04-01

    In this work, 6-12 nm iron oxide nanoparticles were synthesized and coated with poly(acrylic acid) chains of molecular weight 2100 g mol⁻¹. Based on a quantitative evaluation of the dispersions, the bare and coated particles were thoroughly characterized. The number densities of polymers adsorbed at the particle surface and of available chargeable groups were found to be 1.9±0.3 nm⁻² and 26±4 nm⁻², respectively. Occurring via a multi-site binding mechanism, the electrostatic coupling leads to a solid and resilient anchoring of the chains. To assess the efficacy of the particles for pollutant remediation, the adsorption isotherm of methylene blue, a model pollutant, was determined. The excellent agreement between the predicted and measured amounts of adsorbed dye suggests that most carboxylates participate in the complexation and adsorption mechanisms. An adsorption capacity of 830 mg g⁻¹ was obtained. This quantity compares well with the highest values available for this dye. Copyright © 2012 Elsevier Inc. All rights reserved.
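
    Reducing such an adsorption isotherm to a maximum capacity is commonly done with a Langmuir fit; a minimal sketch follows, in which the equilibrium concentrations and uptakes are invented for illustration (only the roughly 830 mg/g capacity above is the paper's own result):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def langmuir(c_eq, q_max, k_l):
        """Langmuir isotherm: adsorbed amount vs. equilibrium dye concentration."""
        return q_max * k_l * c_eq / (1.0 + k_l * c_eq)

    # Hypothetical methylene blue data: equilibrium concentration (mg/L), uptake (mg/g)
    c_eq = np.array([5.0, 20.0, 60.0, 150.0, 300.0, 600.0])
    q    = np.array([150.0, 390.0, 610.0, 720.0, 780.0, 810.0])

    (q_max, k_l), _ = curve_fit(langmuir, c_eq, q, p0=[800.0, 0.05])
    print(f"fitted q_max = {q_max:.0f} mg/g, K_L = {k_l:.3f} L/mg")
    ```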

  4. An overview on development and application of an experimental platform for quantitative cardiac imaging research in rabbit models of myocardial infarction.

    Science.gov (United States)

    Feng, Yuanbo; Bogaert, Jan; Oyen, Raymond; Ni, Yicheng

    2014-10-01

    To exploit the advantages of using rabbits for cardiac imaging research and to tackle the technical obstacles, efforts have been made under the framework of a doctoral research program. In this overview article, by cross-referencing the current literature, we summarize how we have developed a preclinical cardiac research platform based on modified models of reperfused myocardial infarction (MI) in rabbits; how the in vivo manifestations of cardiac imaging could be closely matched with the corresponding ex vivo macro- and microscopic findings; how these imaging outcomes could be quantitatively analyzed, validated and demonstrated; and how we could apply this cardiac imaging platform to provide possible solutions to certain lingering diagnostic and therapeutic problems in experimental cardiology. In particular, tissue components in acute cardiac ischemia have been stratified and characterized, post-infarct lipomatous metaplasia (LM), a common but poorly characterized clinical pathology, has been identified in rabbit models, and a necrosis-avid tracer as well as an anti-ischemic drug have been successfully assessed for their potential utility in clinical cardiology. These outcomes may interest researchers in the related fields and help strengthen translational research in cardiovascular diseases.

  5. Development of an LC–MS/MS method for the quantitation of deoxyglycychloxazol in rat plasma and its application in pharmacokinetic study

    Directory of Open Access Journals (Sweden)

    Rongshan Li

    2016-06-01

    Full Text Available Deoxyglycychloxazol (TY501) is a glycyrrhetinic acid derivative which exhibits high anti-inflammatory activity and reduced pseudoaldosteronism compared to glycyrrhetinic acid. In this study, a sensitive and rapid liquid chromatography–tandem mass spectrometry (LC–MS/MS) method was established for the quantitation of TY501 in rat plasma. Plasma samples were treated by precipitating protein with methanol and supernatants were separated by a Symmetry C8 column with the mobile phase consisting of methanol and 10 mM ammonium formate (containing 0.1% of formic acid) (90:10, v/v). The selected reaction monitoring (SRM) transitions were performed at m/z 647.4→191.2 for TY501 and m/z 473.3→143.3 for astragaloside aglycone (IS) in the positive ion mode with atmospheric pressure chemical ionization (APCI) source. Calibration curve was linear over the concentration range of 5–5000 ng/mL. The lower limit of quantification was 5 ng/mL. The mean recovery was over 88%. The intra- and inter-day precisions were lower than 6.0% and 12.8%, respectively, and the accuracy was within ±1.3%. TY501 was stable under usual storage conditions and handling procedure. The validated method has been successfully applied to a pharmacokinetic study after oral administration of TY501 to rats at a dosage of 10 mg/kg.
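
    The calibration step described above amounts to a linear fit of the analyte/internal-standard peak-area ratio against nominal concentration, followed by back-calculation of unknowns. A minimal Python sketch of that arithmetic, with hypothetical numbers, is given below.

      # Minimal sketch of a bioanalytical calibration curve (5-5000 ng/mL) and
      # back-calculation of an unknown. All numerical values are hypothetical.
      import numpy as np

      conc  = np.array([5, 20, 100, 500, 1000, 2500, 5000], dtype=float)  # ng/mL
      ratio = np.array([0.011, 0.042, 0.21, 1.05, 2.08, 5.3, 10.4])       # analyte/IS area ratio

      slope, intercept = np.polyfit(conc, ratio, 1)    # unweighted linear fit
      r = np.corrcoef(conc, ratio)[0, 1]               # correlation coefficient

      unknown_ratio = 3.2
      unknown_conc = (unknown_ratio - intercept) / slope  # back-calculated ng/mL
      print(f"r = {r:.4f}, unknown ~= {unknown_conc:.0f} ng/mL")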

  6. Genetic Map Construction and Quantitative Trait Locus (QTL) Detection of Growth-Related Traits in Litopenaeus vannamei for Selective Breeding Applications

    Science.gov (United States)

    Andriantahina, Farafidy; Liu, Xiaolin; Huang, Hao

    2013-01-01

    Growth is a priority trait from the point of view of genetic improvement. Molecular markers linked to quantitative trait loci (QTL) have been regarded as useful for marker-assisted selection (MAS) in complex traits such as growth. Using an intermediate F2 cross of slow and fast growth parents, a genetic linkage map of Pacific whiteleg shrimp, Litopenaeus vannamei, based on amplified fragment length polymorphisms (AFLP) and simple sequence repeats (SSR) markers was constructed. Meanwhile, QTL analysis was performed for growth-related traits. The linkage map consisted of 451 marker loci (429 AFLPs and 22 SSRs) which formed 49 linkage groups with an average marker space of 7.6 cM; they spanned a total length of 3627.6 cM, covering 79.50% of estimated genome size. Fourteen QTLs were identified for growth-related traits, including three QTLs for body weight (BW), total length (TL) and partial carapace length (PCL), two QTLs for body length (BL), one QTL for first abdominal segment depth (FASD), third abdominal segment depth (TASD) and first abdominal segment width (FASW), which explained 2.62 to 61.42% of phenotypic variation. Moreover, comparison of linkage maps between L. vannamei and Penaeus japonicus was applied, providing a new insight into the genetic base of QTL affecting the growth-related traits. The new results will be useful for conducting MAS breeding schemes in L. vannamei. PMID:24086466

  7. Genetic map construction and quantitative trait locus (QTL) detection of growth-related traits in Litopenaeus vannamei for selective breeding applications.

    Directory of Open Access Journals (Sweden)

    Farafidy Andriantahina

    Full Text Available Growth is a priority trait from the point of view of genetic improvement. Molecular markers linked to quantitative trait loci (QTL) have been regarded as useful for marker-assisted selection (MAS) in complex traits such as growth. Using an intermediate F2 cross of slow and fast growth parents, a genetic linkage map of Pacific whiteleg shrimp, Litopenaeus vannamei, based on amplified fragment length polymorphisms (AFLP) and simple sequence repeats (SSR) markers was constructed. Meanwhile, QTL analysis was performed for growth-related traits. The linkage map consisted of 451 marker loci (429 AFLPs and 22 SSRs) which formed 49 linkage groups with an average marker space of 7.6 cM; they spanned a total length of 3627.6 cM, covering 79.50% of estimated genome size. Fourteen QTLs were identified for growth-related traits, including three QTLs for body weight (BW), total length (TL) and partial carapace length (PCL), two QTLs for body length (BL), one QTL for first abdominal segment depth (FASD), third abdominal segment depth (TASD) and first abdominal segment width (FASW), which explained 2.62 to 61.42% of phenotypic variation. Moreover, comparison of linkage maps between L. vannamei and Penaeus japonicus was applied, providing a new insight into the genetic base of QTL affecting the growth-related traits. The new results will be useful for conducting MAS breeding schemes in L. vannamei.

  8. Application of SSH and quantitative real time PCR to construction of gene expression profiles from scallop Chlamys farreri in response to exposure to tetrabromobisphenol A.

    Science.gov (United States)

    Gong, Xiaoli; Pan, Luqing; Miao, Jingjing; Liu, Na

    2012-11-01

    TBBPA-induced genes were identified in Chlamys farreri using suppression subtractive hybridization (SSH). A total of 203 and 44 clones were obtained from the SSH forward and reverse libraries, respectively, covering functional categories including cellular process, immune system process, response to stimulus, metabolic process and signaling. Differential gene expression was compared between scallops from control and TBBPA treatment groups (400 μg/L, 15 days) using quantitative real time RT-PCR. For further analysis, the expression of eight significant genes from scallops exposed to TBBPA (0, 100, 200 and 400 μg/L) and sampled at 0, 1, 3, 6 and 15 days was measured by Q-RT-PCR. The results revealed that the expression levels of most selected cDNAs were markedly up-regulated or down-regulated in the TBBPA-exposed scallops. These findings provide basic genomic information on the bivalve, and the selected genes may be potential molecular biomarkers for TBBPA pollution in the aquatic environment. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  9. Development and application of a reversed-phase high-performance liquid chromatographic method for quantitation and characterization of a Chikungunya virus-like particle vaccine.

    Science.gov (United States)

    Shytuhina, Anastasija; Pristatsky, Pavlo; He, Jian; Casimiro, Danilo R; Schwartz, Richard M; Hoang, Van M; Ha, Sha

    2014-10-17

    To effectively support the development of a Chikungunya (CHIKV) virus-like particle (VLP) vaccine, a sensitive and robust high-performance liquid chromatography (HPLC) method that can quantitate CHIKV VLPs and monitor product purity throughout the manufacturing process is needed. We developed a sensitive reversed-phase HPLC (RP-HPLC) method that separates capsid, E1, and E2 proteins in CHIKV VLP vaccine with good resolution. Each protein component was verified by sodium dodecyl sulfate polyacrylamide gel electrophoresis (SDS-PAGE) and matrix-assisted laser desorption/ionization time-of-flight (MALDI-ToF) mass spectrometry (MS). The post-translational modifications on the viral glycoproteins E1 and E2 were further identified by intact protein mass measurements with liquid chromatography-mass spectrometry (LC-MS). The RP-HPLC method has a linear range of 0.51-12 μg protein, an accuracy of 96-106% and a precision of 12% RSD, suitable for vaccine product release testing. In addition, we demonstrated that the RP-HPLC method is useful for characterizing viral glycoprotein post-translational modifications, monitoring product purity during process development and assessing product stability during formulation development. Published by Elsevier B.V.

  10. Rigour in quantitative research.

    Science.gov (United States)

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  11. Effects of Single and Combined Application of Organic, Biological and Chemical Fertilizers on Quantitative and Qualitative Yield of Coriander (Coriandrum sativum)

    OpenAIRE

    M. Aghhavani Shajari; P. Rezvani Moghaddam; R. Ghorbani; M. Nasiri Mahallati

    2016-01-01

    Introduction: Medicinal plants have been one of the main natural resources of Iran since ancient times. Coriander (Coriandrum sativum L.) is a member of the Apiaceae family that has been cultivated extensively throughout the world. Management and environmental factors such as nutritional management have a significant impact on the quantity and quality of plants. Application of organic fertilizers in conventional farming systems is not common, and most of the nutritional needs of plants are supplied through chemical fertilizers...

  12. An integrated approach coupling physically based models and probabilistic methods to quantitatively assess landslide susceptibility at different scales: application to different geomorphological environments

    Science.gov (United States)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climatic change. Conversely, major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (> 1:5,000). Thus, to take into account (i) materials' heterogeneity, (ii) spatial variation of physical parameters, (iii) different landslide types, the French Geological Survey (i.e. BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®) including a transient unsaturated/saturated hydrological component with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
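
    The Monte Carlo treatment of parameter variability mentioned above can be illustrated with a much simpler stability model than the ALICE® Morgenstern-Price implementation. The Python sketch below propagates hypothetical distributions of cohesion, friction angle and saturation through an infinite-slope factor of safety to obtain a probability of failure; it illustrates the principle only, and every parameter value is an assumption.

      # Minimal sketch: Monte Carlo probability of failure with an infinite-slope
      # factor of safety (NOT the ALICE(R)/Morgenstern-Price model). Hypothetical inputs.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      phi = np.radians(rng.normal(32.0, 4.0, n))       # friction angle (deg -> rad)
      c   = rng.lognormal(np.log(8000.0), 0.4, n)      # effective cohesion (Pa)
      m   = rng.uniform(0.0, 1.0, n)                   # saturation ratio of the slab
      gamma, gamma_w, z, beta = 19_000.0, 9_810.0, 2.0, np.radians(30.0)

      # FS = [c + (gamma*z - gamma_w*m*z) * cos^2(beta) * tan(phi)] / [gamma*z*sin(beta)*cos(beta)]
      fos = (c + (gamma * z - gamma_w * m * z) * np.cos(beta)**2 * np.tan(phi)) / (
            gamma * z * np.sin(beta) * np.cos(beta))
      print("P(FoS < 1) =", np.mean(fos < 1.0))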

  13. The application of continuous wavelet transform and least squares support vector machine for the simultaneous quantitative spectrophotometric determination of Myricetin, Kaempferol and Quercetin as flavonoids in pharmaceutical plants

    Science.gov (United States)

    Sohrabi, Mahmoud Reza; Darabi, Golnaz

    2016-01-01

    Flavonoids are γ-benzopyrone derivatives that are highly regarded by researchers for their antioxidant properties. In this study, two new signal processing methods were coupled with UV spectroscopy for spectral resolution and simultaneous quantitative determination of Myricetin, Kaempferol and Quercetin as flavonoids in Laurel, St. John's Wort and Green Tea without the need for any previous separation procedure. The developed methods are continuous wavelet transform (CWT) and least squares support vector machine (LS-SVM), each integrated individually with UV spectroscopy. Different wavelet families were tested in the CWT method, and finally the Daubechies wavelet family (Db4) for Myricetin and the Gaussian wavelet families for Kaempferol (Gaus3) and Quercetin (Gaus7) were selected and applied for simultaneous analysis under the optimal conditions. The LS-SVM was applied to build the flavonoids prediction model based on absorption spectra. The root mean square errors for prediction (RMSEP) of Myricetin, Kaempferol and Quercetin were 0.0552, 0.0275 and 0.0374, respectively. The developed methods were validated by the analysis of various synthetic mixtures with known flavonoid contents. Mean recovery values of Myricetin, Kaempferol and Quercetin were 100.123%, 100.253% and 100.439% with the CWT method and 99.94%, 99.81% and 99.682% with the LS-SVM method, respectively. The results obtained by analyzing real samples with the CWT and LS-SVM methods were compared with the HPLC reference method and were very close to the reference values. Meanwhile, one-way ANOVA (analysis of variance) revealed no significant difference between the suggested methods.
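
    The CWT step can be illustrated with a short Python sketch using PyWavelets: a synthetic absorbance spectrum stands in for the real extract spectra, and the transformed signal at a chosen scale would replace raw absorbance as the analytical signal. The wavelength grid, band shapes and scale below are assumptions for illustration, not the paper's data.

      # Minimal sketch: continuous wavelet transform of a UV spectrum with a
      # Gaussian wavelet ('gaus3') via PyWavelets. Spectrum is synthetic.
      import numpy as np
      import pywt

      wavelength = np.linspace(240.0, 450.0, 421)                  # nm, 0.5 nm steps
      spectrum = (0.8 * np.exp(-((wavelength - 370) / 18.0)**2)    # flavonoid-like band
                + 0.5 * np.exp(-((wavelength - 365) / 22.0)**2))   # overlapping band

      scales = np.arange(1, 64)
      coeffs, _ = pywt.cwt(spectrum, scales, 'gaus3')              # shape: (63, 421)

      # The amplitude of the transformed signal at one selected scale would serve
      # as the analytical signal for calibration instead of raw absorbance.
      print(coeffs[20].max())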

  14. Simultaneous determination of linagliptin and metformin by reverse phase-high performance liquid chromatography method: An application in quantitative analysis of pharmaceutical dosage forms

    Directory of Open Access Journals (Sweden)

    Prathyusha Vemula

    2015-01-01

    Full Text Available To enhance patient compliance toward treatment in diseases like diabetes, usually a combination of drugs is prescribed. Therefore, an anti-diabetic fixed-dose combination of 2.5 mg of linagliptin and 500 mg of metformin was taken for simultaneous estimation of both the drugs by a reverse phase-high performance liquid chromatography (RP-HPLC) method. The present study aimed to develop a simple and sensitive RP-HPLC method for the simultaneous determination of linagliptin and metformin in pharmaceutical dosage forms. The chromatographic separation was designed and evaluated by using linagliptin and metformin working standard and sample solutions in the linearity range. Chromatographic separation was performed on a C 18 column using a mobile phase of a 70:30 (v/v) mixture of methanol and 0.05 M potassium dihydrogen orthophosphate (pH adjusted to 4.6 with orthophosphoric acid) delivered at a flow rate of 0.6 mL/min and UV detection at 267 nm. Linagliptin and metformin showed linearity in the ranges of 2-12 μg/mL and 400-2400 μg/mL, respectively, with correlation coefficients of 0.9996 and 0.9989. The resultant findings were analyzed for standard deviation (SD) and relative standard deviation to validate the developed method. The retention times of linagliptin and metformin were found to be 6.3 and 4.6 min, and separation was complete in <10 min. The method was validated for linearity, accuracy and precision, which were found to be acceptable over the linearity ranges of linagliptin and metformin. The method was found suitable for the routine quantitative analysis of linagliptin and metformin in pharmaceutical dosage forms.

  15. An orientation sensitive approach in biomolecule interaction quantitative structure-activity relationship modeling and its application in ion-exchange chromatography.

    Science.gov (United States)

    Kittelmann, Jörg; Lang, Katharina M H; Ottens, Marcel; Hubbuch, Jürgen

    2017-01-27

    Quantitative structure-activity relationship (QSAR) modeling for prediction of biomolecule parameters has become an established technique in chromatographic purification process design. Unfortunately, available descriptor sets fail to describe the orientation of biomolecules and the effects of ionic strength in the mobile phase on the interaction with the stationary phase. The literature describes several special descriptors for chromatographic retention modeling, but none of these describe the screening of electrostatic potential by the mobile phase in use. In this work we introduce two new approaches to descriptor calculation, namely surface patches and plane projection, which capture oriented binding to charged surfaces and steric hindrance of the interaction with chromatographic ligands, with regard to electrostatic potential screening by mobile phase ions. We present the use of the developed descriptor sets for predictive modeling of Langmuir isotherms for proteins at different pH values between pH 5 and 10 and varying ionic strength in the range of 10-100 mM. The resulting model has a high correlation of calculated descriptors and experimental results, with a coefficient of determination of 0.82 and a predictive coefficient of determination of 0.92 for unknown molecular structures and conditions. The agreement of calculated molecular interaction orientations with both experimental results and molecular dynamics simulations from the literature is shown. The developed descriptors provide the means for improved QSAR models of chromatographic processes, as they reflect the complex interactions of biomolecules with chromatographic phases. Copyright © 2016 Elsevier B.V. All rights reserved.
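
    The coefficient of determination of 0.82 and the predictive coefficient of 0.92 quoted above correspond to the usual R²/Q² statistics of QSAR modeling. The generic Python sketch below (not the authors' descriptor pipeline) shows how these two figures are typically computed, using a random stand-in descriptor matrix.

      # Generic sketch: fitted R^2 and cross-validated "predictive" Q^2 for a
      # QSAR-style regression. The descriptor matrix X and responses y are random
      # stand-ins, not the paper's data.
      import numpy as np
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import cross_val_predict
      from sklearn.metrics import r2_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(60, 12))                   # 60 molecules x 12 descriptors
      y = X @ rng.normal(size=12) + rng.normal(scale=0.3, size=60)

      model = Ridge(alpha=1.0).fit(X, y)
      r2_fit = r2_score(y, model.predict(X))          # goodness of fit
      y_cv = cross_val_predict(Ridge(alpha=1.0), X, y, cv=5)
      q2 = r2_score(y, y_cv)                          # predictive ability
      print(f"R^2 = {r2_fit:.2f}, Q^2 = {q2:.2f}")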

  16. Qualitative and quantitative analysis of a group of volatile organic compounds in biological samples by HS-GC/FID: application in practical cases.

    Science.gov (United States)

    Monteiro, C; Franco, J M; Proença, P; Castañera, A; Claro, A; Vieira, D N; Corte-Real, F

    2014-10-01

    A simple and sensitive procedure, using n-propanol as internal standard (IS), was developed and validated for the qualitative and quantitative analysis of a group of 11 volatile organic substances with different physicochemical properties (1-butanol, 2-propanol, acetaldehyde, ethyl acetate, acetone, acetonitrile, chloroform, diethyl ether, methanol, toluene and p-xylene) in whole blood, urine and vitreous humor. Samples were prepared by dilution with an aqueous solution of internal standard followed by Headspace Gas Chromatography with Flame-ionization Detection (HS GC-FID) analysis. Chromatographic separation was performed using two capillary columns with different polarities (DB-ALC2: 30 m × 0.320 mm × 1.2 μm and DB-ALC1: 30 m × 0.320 mm × 1.8 μm), thus providing a change in the retention and elution order of volatiles. This dual-column confirmation increases the specificity, since the risk of another substance co-eluting at the same time in both columns is very small. The method was linear from 5 to 1000 mg/L for toluene and p-xylene, 50-1000 mg/L for chloroform, and 50-2000 mg/L for the remaining substances, with correlation coefficients of over 0.99 for all compounds. The limits of detection (LOD) ranged from 1 to 10 mg/L, while the limits of quantification (LOQ) ranged from 2 to 31 mg/L. The intra-day precision (CV<6.4%), intermediate precision (CV<7.0%) and accuracy (relative error ±10%) of the method were in conformity with the criteria normally accepted in bioanalytical method validation. The method developed has been applied to forensic cases, with the advantages that it uses a small sample volume and does not require any extraction procedure as it makes use of a headspace injection technique. Published by Elsevier Ireland Ltd.

  17. Quantitation of itopride in human serum by high-performance liquid chromatography with fluorescence detection and its application to a bioequivalence study.

    Science.gov (United States)

    Singh, Sonu Sundd; Jain, Manish; Sharma, Kuldeep; Shah, Bhavin; Vyas, Meghna; Thakkar, Purav; Shah, Ruchy; Singh, Shriprakash; Lohray, Brajbhushan

    2005-04-25

    A new method was developed for determination of itopride in human serum by reversed phase high-performance liquid chromatography (HPLC) with fluorescence detection (excitation at 291 nm and emission at 342 nm). The method employed one-step extraction of itopride from the serum matrix with a mixture of tert-butyl methyl ether and dichloromethane (70:30, v/v) using etoricoxib as an internal standard. Chromatographic separation was obtained within 12.0 min using a reverse phase YMC-Pack AM ODS column (250 mm x 4.6 mm, 5 microm) and an isocratic mobile phase consisting of a mixture of 0.05% trifluoroacetic acid in water and acetonitrile (75:25, v/v) at a flow rate of 1.0 ml/min. The method was linear in the range of 14.0 ng/ml to 1000.0 ng/ml. The lower limit of quantitation (LLOQ) was 14.0 ng/ml. Average recoveries of itopride and the internal standard from the biological matrix were more than 66.04 and 64.57%, respectively. The inter-day accuracy of the drug-containing serum samples was more than 97.81% with a precision of 2.31-3.68%. The intra-day accuracy was 96.91% or more with a precision of 5.17-9.50%. Serum samples containing itopride were stable for 180.0 days at -70 ± 5 °C and for 24.0 h at ambient temperature (25 ± 5 °C). The method was successfully applied to the bioequivalence study of itopride in healthy, male human subjects.

  18. Nested quantitative PCR approach for urinary cell-free EZH2 mRNA and its potential clinical application in bladder cancer.

    Science.gov (United States)

    Zhang, Xin; Zhang, Yanli; Liu, Xinfeng; Liu, Tong; Li, Peilong; Du, Lutao; Yang, Yongmei; Wang, Lili; Wang, Chuanxin

    2016-10-15

    EZH2 is overexpressed in bladder cancer (BC) and plays important roles in tumor development and progression. Recent studies show that cell-free (cf) RNAs released from cancer cells can reflect tissue changes and are stable and detectable in urine. Although conventional quantitative real-time PCR (qPCR) is highly sensitive, low abundances of urinary cf-RNAs usually result in false-negatives. Thus, this study develops a nested qPCR (nqPCR) approach to quantify cf-EZH2 mRNA in urine and further assess its clinical significance for BC. Forty urine samples were first selected to evaluate the feasibility of nqPCR. Then, levels of urinary cf-EZH2 mRNA were detected using the developed method in an independent cohort comprising 91 healthy subjects, 81 cystitis patients, 169 non-muscle-invasive BC (NMIBC) and 103 muscle-invasive BC (MIBC) patients. In cf-EZH2 mRNA detection, the nqPCR method was significantly associated with qPCR, but it could detect more urine samples and increased the detection limit by three orders of magnitude. Based on the nqPCR method, cf-EZH2 mRNA levels have been found to be increased in urine of NMIBC and MIBC patients (p EZH2 mRNA showed higher diagnostic ability for MIBC (p  0.05). Moreover, it could also distinguish MIBC from NMIBC, with an AUC of 0.787. For MIBC patients, high expression of cf-EZH2 mRNA was associated with advanced stage and was an independent predictor of reduced disease-free survival or overall survival. In conclusion, detection of cf-EZH2 mRNA in urine by nqPCR is a sensitive and noninvasive approach and may be used for diagnosis and prognosis prediction of MIBC. © 2016 UICC.

  19. Application of quantitative structure-toxicity relationships for the comparison of the cytotoxicity of 14 p-benzoquinone congeners in primary cultured rat hepatocytes versus PC12 cells.

    Science.gov (United States)

    Siraki, Arno G; Chan, Tom S; O'Brien, Peter J

    2004-09-01

    Quinones are believed to induce their toxicity by two main mechanisms: oxygen activation by redox cycling and alkylation of essential macromolecules. The physicochemical parameters that underlie this activity have not been elucidated, although redox potential is believed to play a significant role. In this study, we have evaluated the cytotoxicity, formation of reactive oxygen species (ROS), and the glutathione (GSH) depleting ability of 14 p-benzoquinone congeners in primary rat hepatocyte and PC12 cell cultures. All experiments were performed under identical conditions (37 degrees C, 5% CO2/air) in 96-well plates. The most cytotoxic quinone was found to be tetrachloro-p-benzoquinone (chloranil), and the least toxic was duroquinone or 2,6-di-tert-butyl-p-benzoquinone. The cytotoxic order varied between the cell types; in particular, the di-substituted methoxy or methyl p-benzoquinones were notably more cytotoxic towards PC12 cells. We have derived one- and two-parameter quantitative structure-toxicity relationships (QSTRs) which revealed that the most cytotoxic quinones had the highest electron affinity and the smallest volume. Cytotoxicity did not correlate with the lipophilicity of the quinone. Furthermore, we found that p-benzoquinone cytotoxicity correlated well with hepatocyte ROS formation and GSH depletion, whereas in PC12 cells, cytotoxicity did not correlate with ROS formation and somewhat correlated with GSH depletion. Hepatocytes had far greater hydrogen peroxide detoxifying capacity than PC12 cells, but PC12 cells contained more GSH/mg protein. Thus, p-benzoquinone-induced ROS formation was greater towards PC12 cells than with hepatocytes. To our knowledge, this is the first QSTR derived for p-benzoquinone cytotoxicity in these cell types and could form the basis for distinguishing certain cell-specific cytotoxic mechanisms.

  20. Establishment of real time allele specific locked nucleic acid quantitative PCR for detection of HBV YIDD (ATT) mutation and evaluation of its application.

    Directory of Open Access Journals (Sweden)

    Yongbin Zeng

    Full Text Available BACKGROUND: Long-term use of nucleos(t)ide analogues can increase the risk of HBV drug-resistance mutations. The rtM204I (ATT, coding for isoleucine) mutation is one of the most important resistance mutation sites. Establishing a simple, rapid, reliable and highly sensitive assay to detect the resistant mutants as early as possible is of great clinical significance. METHODS: Recombinant plasmids for HBV YMDD (tyrosine-methionine-aspartate-aspartate) and YIDD (tyrosine-isoleucine-aspartate-aspartate) were constructed by TA cloning. Real time allele specific locked nucleic acid quantitative PCR (RT-AS-LNA-qPCR) with SYBR Green I was established with LNA-modified primers and evaluated with standard recombinant plasmids, clinical templates (the clinical wild type and mutant HBV DNA mixture) and 102 serum samples from nucleos(t)ide analogue-experienced patients. The serum samples from a chronic hepatitis B (CHB) patient who first received LMV monotherapy and then switched to LMV + ADV combined therapy were also dynamically analyzed 10 times. RESULTS: The linear range of the assay was between 1 × 10(9) copies/μl and 1 × 10(2) copies/μl. The low detection limit was 1 × 10(1) copies/μl. Sensitivities of the assay were 10(-6), 10(-4) and 10(-2) in wild-type backgrounds of 1 × 10(9) copies/μl, 1 × 10(7) copies/μl and 1 × 10(5) copies/μl, respectively. The sensitivity of the assay in the detection of clinical samples was 0.03%. The complete coincidence rate between RT-AS-LNA-qPCR and direct sequencing was 91.2% (93/102), the partial coincidence rate was 8.8% (9/102), and no complete discordance was observed. The two assays showed a high concordance (Kappa = 0.676, P = 0.000). Minor variants could be detected 18 weeks earlier than the rebound of HBV DNA load and alanine aminotransferase level. CONCLUSIONS: A rapid, cost-effective, highly sensitive, specific and reliable method of RT-AS-LNA-qPCR with SYBR Green I for early and absolute quantification of HBV YIDD (ATT) coding for isoleucine

  1. Development and application of a quantitative PCR assay to study equine herpesvirus 5 invasion and replication in equine tissues in vitro and in vivo.

    Science.gov (United States)

    Zarski, Lila M; High, Emily A; Nelli, Rahul K; Bolin, Steven R; Williams, Kurt J; Hussey, Gisela

    2017-10-01

    Equine herpesvirus 5 (EHV-5) infection is associated with pulmonary fibrosis in horses, but further studies on EHV-5 persistence in equine cells are needed to fully understand viral and host contributions to disease pathogenesis. Our aim was to develop a quantitative PCR (qPCR) assay to measure EHV-5 viral copy number in equine cell cultures, blood lymphocytes, and nasal swabs of horses. Furthermore, we used a recently developed equine primary respiratory cell culture system to study EHV-5 pathogenesis at the respiratory tract. PCR primers and a probe were designed to target gene E11 of the EHV-5 genome. Sensitivity and repeatability were established, and specificity was verified by testing multiple isolates of EHV-5, as well as DNA from other equine herpesviruses. Four-week-old, fully differentiated (mature) and newly seeded (immature) primary equine respiratory epithelial cell (EREC) cultures, as well as equine dermal cell cultures, were inoculated with EHV-5, and the cells and supernatants were collected daily for 14 days. Blood lymphocytes and nasal swabs were collected from horses experimentally infected with equine herpesvirus 1 (EHV-1). The qPCR assay detected EHV-5 at stable concentrations throughout 14 days in inoculated mature EREC and equine dermal cell cultures (peaking at 202 and 5861 viral genomes per 10(6) copies of cellular β-actin, respectively). EHV-5 copies detected in the immature EREC cultures increased over 14 days and reached levels greater than 10,000 viral genomes per 10(6) copies of cellular β-actin. Moreover, EHV-5 was detected pre-inoculation in the lymphocytes of 76% and in the nasal swabs of 84% of horses experimentally infected with EHV-1. Post-inoculation with EHV-1, EHV-5 was detected in lymphocytes of 52% of horses while EHV-5 levels in nasal swabs were not significantly different from pre-inoculation levels. In conclusion, qPCR was a reliable technique to investigate viral load in in vivo and in vitro samples, and EHV-5 replication in equine epithelial
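
    Reporting copy numbers per 10(6) copies of cellular β-actin, as above, implies an absolute standard curve and a normalization step. The Python sketch below illustrates that arithmetic with hypothetical Ct values; it is not the authors' assay and the numbers are invented.

      # Minimal sketch: absolute qPCR quantification from a plasmid standard curve
      # (Ct vs log10 copies) and normalization to 1e6 copies of beta-actin.
      import numpy as np

      log10_copies = np.array([7, 6, 5, 4, 3, 2], dtype=float)       # plasmid standards
      ct_standard  = np.array([14.1, 17.5, 20.9, 24.3, 27.8, 31.2])  # hypothetical Cts

      slope, intercept = np.polyfit(log10_copies, ct_standard, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0                        # ~1.0 means 100%

      def copies_from_ct(ct):
          # Invert the standard curve: Ct = slope * log10(copies) + intercept
          return 10 ** ((ct - intercept) / slope)

      ehv5  = copies_from_ct(26.4)   # EHV-5 target Ct in a sample (hypothetical)
      actin = copies_from_ct(18.0)   # beta-actin Ct in the same sample (hypothetical)
      print(f"efficiency = {efficiency:.2f}, EHV-5 per 1e6 actin = {1e6 * ehv5 / actin:.0f}")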

  2. Multi-component quantitation of meso/nanostructural surfaces and its application to local chemical compositions of copper meso/nanostructures self-organized on silica

    Science.gov (United States)

    Huang, Chun-Yi; Chang, Hsin-Wei; Chang, Che-Chen

    2018-03-01

    Knowledge about the chemical compositions of meso/nanomaterials is fundamental to the development of their applications in advanced technologies. Auger electron spectroscopy (AES) is an effective analysis method for the characterization of meso/nanomaterial structures. Although a few studies have reported the use of AES for the analysis of the local composition of these structures, none have explored in detail the validity of the meso/nanoanalysis results generated by the AES instrument. This paper addresses the limitations of AES and the corrections necessary to offset them for this otherwise powerful meso/nanoanalysis tool. The results of corrections made to the AES multi-point analysis of high-density copper-based meso/nanostructures provide major insights into their local chemical compositions and technological prospects, which the primitive composition output of the AES instrument failed to provide.

  3. Separation and quantitation of oxypurines by isocratic high-pressure liquid chromatography: application to xanthinuria and the Lesch-Nyhan syndrome.

    Science.gov (United States)

    Crawhall, J C; Itiaba, K; Katz, S

    1983-10-01

    An isocratic HPLC technique has been developed for the separation and measurement of urine and plasma oxypurines in a patient with xanthinuria. The case history and laboratory data are presented. Xanthine excretion was 172 mg/g creatinine and hypoxanthine was 45 mg/g creatinine. Urinary uric acid was too low to be measured, but uricase determination showed only 3 mg/24 hr. Serum oxypurine analysis showed hypoxanthine 0.87 mg/dl and xanthine 0.35 mg/dl. Uric acid was not seen in this patient's serum but could be readily measured in normal control subjects. The technique can also be used to separate nucleotides from purine bases, and we have demonstrated its application to the measurement of erythrocyte hypoxanthine guanine phosphoribosyl transferase and adenine phosphoribosyl transferase in a kindred associated with the Lesch-Nyhan syndrome.

  4. Quantitative computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Judith E. [Royal Infirmary and University, Manchester (United Kingdom)], E-mail: judith.adams@manchester.ac.uk

    2009-09-15

    Quantitative computed tomography (QCT) was introduced in the mid 1970s. The technique is most commonly applied to 2D slices in the lumbar spine to measure trabecular bone mineral density (BMD; mg/cm³). Although not as widely utilized as dual-energy X-ray absorptiometry (DXA), QCT has some advantages when studying the skeleton (separate measures of cortical and trabecular BMD; measurement of volumetric, as opposed to 'areal' DXA-BMDa, so not size dependent; geometric and structural parameters obtained which contribute to bone strength). A limitation is that the World Health Organisation (WHO) definition of osteoporosis in terms of bone densitometry (T score -2.5 or below using DXA) is not applicable. QCT can be performed on conventional body CT scanners, or at peripheral sites (radius, tibia) using smaller, less expensive dedicated peripheral CT scanners (pQCT). Although the ionising radiation dose of spinal QCT is higher than for DXA, the dose compares favorably with those of other radiographic procedures (spinal radiographs) performed in patients suspected of having osteoporosis. The radiation dose from peripheral QCT scanners is negligible. Technical developments in CT (spiral multi-detector CT; improved spatial resolution) allow rapid acquisition of 3D volume images which enable QCT to be applied to the clinically important site of the proximal femur, more sophisticated analysis of cortical and trabecular bone, the imaging of trabecular structure and the application of finite element analysis (FEA). Such research studies contribute importantly to the understanding of bone growth and development, the effect of disease and treatment on the skeleton and the biomechanics of bone strength and fracture.

  5. Added value of experts' knowledge to improve a quantitative microbial exposure assessment model--Application to aseptic-UHT food products.

    Science.gov (United States)

    Pujol, Laure; Johnson, Nicholas Brian; Magras, Catherine; Albert, Isabelle; Membré, Jeanne-Marie

    2015-10-15

    In a previous study, a quantitative microbial exposure assessment (QMEA) model applied to an aseptic-UHT food process was developed [Pujol, L., Albert, I., Magras, C., Johnson, N. B., Membré, J. M. Probabilistic exposure assessment model to estimate aseptic-UHT product failure rate. International Journal of Food Microbiology 2015; 192, 124-141]. It quantified the Sterility Failure Rate (SFR) associated with Bacillus cereus and Geobacillus stearothermophilus per process module (nine modules in total, from raw material reception to end-product storage). Previously, the probabilistic model inputs were set by experts (using knowledge and in-house data). However, only the variability dimension was taken into account. The model was then improved using expert elicitation knowledge in two ways. First, the model was refined by adding the uncertainty dimension to the probabilistic inputs, enabling a second-order Monte Carlo analysis to be set up. The following eight inputs, and their impact on SFR, are presented in detail in the present study: D-value for each bacterium of interest (B. cereus and G. stearothermophilus) associated with the inactivation model for the UHT treatment step, i.e., two inputs; log reduction (decimal reduction) number associated with the inactivation model for the packaging sterilization step for each bacterium and each part of the packaging (product container and sealing component), i.e., four inputs; and bacterial spore air load of the aseptic tank and the filler cabinet rooms, i.e., two inputs. Second, the model was improved by leveraging expert knowledge to further develop the existing model. The proportion of bacteria in the product which settled on the surface of pipes (between the UHT treatment and the aseptic tank on one hand, and between the aseptic tank and the filler cabinet on the other hand), leading to possible biofilm formation for each bacterium, was better characterized. It was modeled as a function of the hygienic design level of the aseptic
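
    A second-order Monte Carlo analysis, as mentioned above, separates uncertainty about parameters from unit-to-unit variability by nesting two sampling loops. The Python sketch below illustrates only that structure; the distributions and parameter values are invented stand-ins, not the calibrated QMEA inputs.

      # Minimal sketch of a second-order (nested) Monte Carlo: the outer loop
      # samples *uncertain* parameters, the inner loop samples *variability*.
      # All distributions below are hypothetical.
      import numpy as np

      rng = np.random.default_rng(2)
      n_uncertainty, n_variability = 200, 5_000
      failure_rates = np.empty(n_uncertainty)

      for i in range(n_uncertainty):
          mean_log_reduction = rng.normal(8.0, 0.5)                    # uncertainty on the mean
          log_reduction = rng.normal(mean_log_reduction, 1.0, n_variability)   # batch variability
          initial_load = rng.lognormal(2.0, 1.0, n_variability)        # spores per unit
          surviving = initial_load * 10.0 ** (-log_reduction)
          # probability that at least one spore survives in a unit (Poisson assumption)
          failure_rates[i] = np.mean(1.0 - np.exp(-surviving))

      print("median SFR:", np.median(failure_rates),
            "95% interval:", np.percentile(failure_rates, [2.5, 97.5]))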

  6. Assessment of vulnerability in karst aquifers using a quantitative integrated numerical model: catchment characterization and high resolution monitoring - Application to semi-arid regions- Lebanon.

    Science.gov (United States)

    Doummar, Joanna; Aoun, Michel; Andari, Fouad

    2016-04-01

    Karst aquifers are highly heterogeneous and characterized by a duality of recharge (concentrated and fast versus diffuse and slow) and a duality of flow which directly influences groundwater flow and spring responses. Given this heterogeneity in flow and infiltration, karst aquifers do not always obey standard hydraulic laws. Therefore, the assessment of their vulnerability proves challenging. Studies have shown that the vulnerability of aquifers is highly governed by recharge to groundwater. On the other hand, specific parameters appear to play a major role in the spatial and temporal distribution of infiltration on a karst system, thus greatly influencing the discharge rates observed at a karst spring, and consequently the vulnerability of a spring. This heterogeneity can only be depicted using an integrated numerical model to quantify recharge spatially and assess the spatial and temporal vulnerability of a catchment to contamination. In the framework of a three-year PEER NSF/USAID funded project, the vulnerability of a karst catchment in Lebanon is assessed quantitatively using a numerical approach. The aim of the project is also to refine actual evapotranspiration rates and the spatial recharge distribution in a semi-arid environment. For this purpose, a monitoring network has been installed since July 2014 on two different pilot karst catchments (drained by Qachqouch Spring and Assal Spring) to collect high-resolution data to be used in an integrated catchment numerical model (MIKE SHE, DHI) including climate, the unsaturated zone, and the saturated zone. Catchment characterization essential for the model included geological mapping and a survey of karst features (e.g., dolines), as they contribute to fast flow. Tracer experiments were performed under different flow conditions (snowmelt and low flow) to delineate the catchment area and reveal groundwater velocities and the response to snowmelt events. An assessment of spring response after precipitation events allowed the estimation of the

  7. Quantitative DNA Fiber Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.

  8. Development and validation of sensitive and rapid UPLC-MS/MS method for quantitative determination of daclatasvir in human plasma: Application to a bioequivalence study.

    Science.gov (United States)

    Rezk, Mamdouh R; Bendas, Ehab R; Basalious, Emad B; Karim, Iman A

    2016-09-05

    A rapid and sensitive UPLC-MS/MS method was developed and validated for determination of daclatasvir (DAC) in human plasma using sofosbuvir (SOF) as an internal standard (IS). The Xevo TQD LC-MS/MS was operated under the multiple-reaction monitoring mode using electrospray ionization. Precipitation with acetonitrile was used in sample preparation. The prepared samples were chromatographed on an Acquity UPLC HSS C18 (50 × 2.1 mm, 1.8 μm) column by pumping 10 mM ammonium formate (pH 3.5) and acetonitrile in an isocratic mode at a flow rate of 0.30 ml/min. Method validation was performed as per the FDA guidelines and the standard curves were found to be linear in the range of 5-4000 ng/ml for DAC. The intra-day and inter-day precision and accuracy results were within the acceptable limits. A very short run time of 1.2 min made it possible to analyze more than 500 human plasma samples per day. The wider range of quantification of DAC allowed the applicability of the developed method for its determination in a bioequivalence study in human volunteers. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Quantitative determination of trigonelline in mouse serum by means of hydrophilic interaction liquid chromatography-MS/MS analysis: Application to a pharmacokinetic study.

    Science.gov (United States)

    Szczesny, Damian; Bartosińska, Ewa; Jacyna, Julia; Patejko, Małgorzata; Siluk, Danuta; Kaliszan, Roman

    2018-02-01

    Trigonelline is a pyridine alkaloid found in fenugreek seeds and coffee beans. Most of the previous studies are concerned with the quantification of trigonelline along with other constituents in coffee herbs or beverages. Only a few have focused on its determination in animal or human tissues by applying different modes of HPLC with UV or MS detection. The aim of the study was to develop and validate a fast and simple method for trigonelline determination in serum by the use of hydrophilic interaction liquid chromatography (HILIC) with ESI-MS/MS detection. Separation of trigonelline was achieved on a Kinetex HILIC column operated at 35°C with an acetonitrile-ammonium formate (10 mM, pH = 3) buffer mixture (55:45, v/v) as the mobile phase. The developed method was successfully applied to determine trigonelline concentration in mouse serum after intravenous administration of 10 mg/kg. The developed assay is sensitive (limit of detection = 1.5 ng/mL, limit of quantification = 5.0 ng/mL) and linear in a concentration range from 5.0 to 250.0 ng/mL. Sample preparation is limited to deproteinization, centrifugation and filtration. The application of the HILIC mode of chromatography with MS detection and selection of deuterated trigonelline as internal standard allowed a rapid and precise method of trigonelline quantification to be developed. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Quantitative evaluation of 2x2 arrays of Lucite cone applicators in flat layered phantoms using Gaussian-beam-predicted and thermographically measured SAR distributions

    Energy Technology Data Exchange (ETDEWEB)

    Rietveld, P.J.M.; Lumori, M.L.D.; Zee, J. van der; Rhoon, G.C. van [University Hospital Rotterdam - Daniel den Hoed Cancer Center, Department of Radiation Oncology, Subdivision of Hyperthermia, Rotterdam (Netherlands); Lumori, M.L.D. [Vesalius College and Department of Electrical Engineering, Vrije Universiteit Brussel, Pleinlaan 2, B-1050 Brussels (Belgium)

    1998-08-01

    SAR distributions from four different E-field-orientated 2x2 arrays of incoherently driven Lucite cone applicators (LCAs) were investigated. The LCAs operated at 433 MHz with an aperture of 10.5 cm x 10.5 cm each. Two techniques were used to obtain SAR distributions in flat layered phantoms: Gaussian beam (GB) predictions and thermographical (TG) imaging. The GB predictions showed that the effective field size of the different array configurations varied by up to 3%. The TG-measured SAR distribution showed significant deviations from the GB-predicted SAR distributions (maximum 34.6%). The difference between GB-predicted and TG-measured SAR levels (averaged per 10% GB-predicted SAR intervals) equalled less than 11.3% for the parallel E-field-orientated array and 15.1% for the clockwise-orientated array, respectively. When antennae in the clockwise-orientated array were more widely spread (array aperture 23 cm x 23 cm) in order to diminish their mutual interactions, these differences decreased to 12.4%. However, the overall difference within the 50% SAR or higher range decreased from 14% to 9%. The results lead us to conclude that LCAs can be used clinically and their antenna interactions are not considered to be a problem under clinical conditions. (author)

  11. Application of smart spectrophotometric methods and artificial neural network for the simultaneous quantitation of olmesartan medoxamil, amlodipine besylate and hydrochlorothiazide in their combined pharmaceutical dosage form

    Science.gov (United States)

    2013-01-01

    Background New, simple and specific spectrophotometric methods and artificial neural network (ANN) were developed and validated in accordance with ICH guidelines for the simultaneous estimation of Olmesartan (OLM), Amlodipine (AML), and Hydrochlorothiazide (HCT) in commercial tablets. Results For spectrophotometric methods: First, Amlodipine (AML) was determined by direct spectrophotometry at 359 nm and by application of the ratio subtraction, the AML spectrum was removed from the mixture spectra. Then Hydrochlorothiazide (HCT) was determined directly at 315 nm without interference from Olmesartan medoxamil (OLM) which could be determined using the isoabsorptive method. The calibration curve is linear over the concentration range of 5–40, 2.5-40 and 2–40 μg mL-1 for AML, OLM and HCT, respectively. ANN (as a multivariate calibration method) was also applied for the simultaneous determination of the three analytes in their combined pharmaceutical dosage form using spectral region from 230–340 nm. Conclusions The proposed methods were successfully applied for the assay of the three analytes in laboratory prepared mixtures and combined pharmaceutical tablets with excellent recoveries. No interference was observed from common pharmaceutical additives. The results were favorably compared with those obtained by a reference spectrophotometric method. The methods are validated according to the ICH guidelines and accuracy, precision and repeatability are found to be within the acceptable limit. PMID:23374392
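
    For comparison, a multicomponent mixture spectrum can also be resolved by ordinary least squares under Beer-Lambert additivity. The Python sketch below is not the ratio-subtraction/isoabsorptive procedure described above; it uses synthetic pure-component spectra as stand-ins for OLM, AML and HCT, and all band positions and concentrations are assumptions.

      # Not the paper's method -- a minimal classical-least-squares sketch:
      # mixture spectrum = pure-component spectra matrix K times concentrations.
      import numpy as np

      wl = np.linspace(230.0, 340.0, 221)              # nm (the region 230-340 nm)
      def band(center, width):                         # synthetic pure-component spectrum
          return np.exp(-((wl - center) / width) ** 2)

      K = np.column_stack([band(257, 15), band(300, 20), band(315, 12)])  # three components
      true_conc = np.array([10.0, 4.0, 8.0])           # ug/mL (hypothetical)
      mixture = K @ true_conc + np.random.default_rng(3).normal(0, 0.002, wl.size)

      est, *_ = np.linalg.lstsq(K, mixture, rcond=None)
      print("estimated concentrations:", np.round(est, 2))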

  12. Quantitative dispersion microscopy

    OpenAIRE

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Ramachandra R Dasari; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live...

  13. Quenching of the electrochemiluminescence of RU-complex tagged shared-stem hairpin probes by graphene oxide and its application to quantitative turn-on detection of DNA.

    Science.gov (United States)

    Huang, Xiang; Huang, Xiaopeng; Zhang, An; Zhuo, Bangrong; Lu, Fushen; Chen, Yaowen; Gao, Wenhua

    2015-08-15

    Efficient and stable quenching of the electrochemiluminescence (ECL) of the tris(2,2'-bipyridine)-ruthenium(II) (Ru(bpy)3(2+))/tri-n-propylamine (TPrA) system by graphene oxide (GO) at the glassy carbon electrode (GCE) was reported. To elucidate the possible quenching mechanism, the electrochemical and ECL performance of GO, reduced graphene oxides (RGOs) of different reduction degrees and polymer-wrapped GO modified GCEs were systematically investigated. The results demonstrated that the oxygen-containing groups and poor electrical conductivity of GO, along with the distance between GO and Ru(bpy)3(2+), were suggested as the reasons for quenching ECL. On the basis of this essential quenching mechanism, a novel "signal on" ECL DNA biosensor for ultrasensitive detection of a specific DNA sequence was constructed by self-assembling the ECL probe of thiolated shared-stem hairpin DNA (SH-DNA) tagged with Ru complex (Ru(bpy)3(2+) derivatives) on the surface of GO/gold nanoparticles (AuNPs) modified GCE. The ECL probe sequences have their ECL signal efficiently quenched when they are self-assembled on the surface of GO unless they hybridize with their target DNA (t-DNA) sequence. The designed ECL biosensor exhibited excellent stability and reproducibility, outstanding selectivity, and an extremely sensitive response to t-DNA in a wide linear range of 100 aM-10 pM with a low detection limit of 65 aM. Our findings and the design of the biosensing switch would open a new avenue in the application of GO based ECL quenching strategies for ultrasensitive bioassays. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Quantitative Algebraic Reasoning

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Panangaden, Prakash; Plotkin, Gordon

    2016-01-01

    We develop a quantitative analogue of equational reasoning which we call quantitative algebra. We define an equality relation indexed by rationals: a =ε b which we think of as saying that “a is approximately equal to b up to an error of ε”. We have four interesting examples where we have a quantitative equational theory whose free algebras correspond to well-known structures. In each case we have finitary and continuous versions. The four cases are: Hausdorff metrics from quantitative semilattices; p-Wasserstein metrics (hence also the Kantorovich metric) from barycentric algebras and also from pointed...

  15. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship, which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove less problematic if research projects assert strategic or political feminist aims. Still, a feminist deconstructive argument can be formed against quantitative studies in which socially constructed categories are considered independently determined. However, by application of Williams’ ideas of treating the categories in question as dependently rather than independently determined, social categories can be deconstructed quantitatively, enriching both the theoretical and empirical understandings of population-level social constructions of genders, ethnicities etc. Quantitative deconstruction has...

  16. Challenges and perspectives in quantitative NMR.

    Science.gov (United States)

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Optofluidic time-stretch quantitative phase microscopy.

    Science.gov (United States)

    Guo, Baoshan; Lei, Cheng; Wu, Yi; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Lee, Sangwook; Isozaki, Akihiro; Li, Ming; Jiang, Yiyue; Yasumoto, Atsushi; Di Carlo, Dino; Tanaka, Yo; Yatomi, Yutaka; Ozeki, Yasuyuki; Goda, Keisuke

    2017-10-12

    Innovations in optical microscopy have opened new windows onto scientific research, industrial quality control, and medical practice over the last few decades. One such innovation is optofluidic time-stretch quantitative phase microscopy - an emerging method for high-throughput quantitative phase imaging that builds on the interference between temporally stretched signal and reference pulses by using dispersive properties of light in both spatial and temporal domains in an interferometric configuration on a microfluidic platform. It achieves the continuous acquisition of both intensity and phase images with a high throughput of more than 10,000 particles or cells per second by overcoming speed limitations that exist in conventional quantitative phase imaging methods. Applications enabled by such capabilities are versatile and include characterization of cancer cells and microalgal cultures. In this paper, we review the principles and applications of optofluidic time-stretch quantitative phase microscopy and discuss its future perspective. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. The effects of super absorbent polymer application into soil and humic acid foliar application on some agrophysiological criteria and quantitative and qualitative yield of sugar beet (Beta vulgaris L.) under Mashhad conditions

    Directory of Open Access Journals (Sweden)

    M Jahan

    2016-05-01

    Full Text Available Drought stress is the most limiting factor of agricultural production throughout the world. To evaluate the effect of super absorbent polymer and humic acid in reducing drought stress in sugar beet production, a strip split plot experiment based on a randomized complete block design with three replications was conducted at the Research Field of the Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, during the 2010-2011 growing season. The main plot factor was application and no application of super absorbent polymer and the sub plot factor was foliar application and no application of humic acid. Two irrigation intervals (7 and 10 days) were assigned to strip plots. The results showed that super absorbent application, compared with no super absorbent, significantly affected leaf area index (LAI), sugar gross yield (SGY) and SPAD readings, with the highest values for these traits being 3.4, 4.7 t ha-1 and 46.2, respectively. Humic acid foliar application resulted in the highest LAI (3.4) and SPAD reading (45.1), which were significantly different from the other treatments. The 7-day irrigation interval resulted in the highest LAI (3.8) and root yield (24.9 t ha-1). The highest SPAD reading (49.9) resulted from the interaction of super absorbent and humic acid application with the 7-day irrigation interval. Dry matter yield (DM) and leaf number per plant showed positive and significant correlations (p≤0.01) with tuber yield (TY), SGY and SPAD readings. The strongest correlation coefficients were obtained for DM and LAI, and between DM and SGY. This positive and significant correlation emphasizes that any factor increasing LAI will increase DM and thereby SGY. Positive and significant correlations were also observed between DM and SPAD readings, and between SPAD readings and TY. The SGY estimation model predicted that SGY was determined by variables such as TY, SP and SPAD reading. In general, these results indicate that super absorbent application could increase soil water-holding capacity and

  19. QUANTITATIVE CONFOCAL LASER SCANNING MICROSCOPY

    Directory of Open Access Journals (Sweden)

    Merete Krog Raarup

    2011-05-01

    Full Text Available This paper discusses recent advances in confocal laser scanning microscopy (CLSM) for imaging of 3D structure as well as quantitative characterization of biomolecular interactions and diffusion behaviour by means of one- and two-photon excitation. The use of CLSM for improved stereological length estimation in thick (up to 0.5 mm) tissue is proposed. The techniques of FRET (Fluorescence Resonance Energy Transfer), FLIM (Fluorescence Lifetime Imaging Microscopy), FCS (Fluorescence Correlation Spectroscopy) and FRAP (Fluorescence Recovery After Photobleaching) are introduced and their applicability for quantitative imaging of biomolecular (co-)localization and trafficking in live cells described. The advantage of two-photon versus one-photon excitation in relation to these techniques is discussed.
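
    The quantitative step behind a FRAP measurement is typically a single-exponential fit of the recovery curve, from which a mobile fraction and a recovery half-time are reported. The Python sketch below illustrates that fit with synthetic data; the model form and parameter values are assumptions for illustration, not taken from the paper.

      # Minimal sketch: fitting a FRAP recovery F(t) = F_inf - (F_inf - F_0)*exp(-t/tau)
      # and reporting mobile fraction and half-time. Data are synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def frap(t, f_inf, f0, tau):
          return f_inf - (f_inf - f0) * np.exp(-t / tau)

      t = np.linspace(0.0, 60.0, 121)                                # s after bleach
      data = frap(t, 0.85, 0.20, 8.0) + np.random.default_rng(4).normal(0, 0.01, t.size)

      (f_inf, f0, tau), _ = curve_fit(frap, t, data, p0=[0.8, 0.2, 5.0])
      f_pre = 1.0                                                    # pre-bleach intensity (normalized)
      mobile_fraction = (f_inf - f0) / (f_pre - f0)
      print(f"mobile fraction = {mobile_fraction:.2f}, t_half = {np.log(2) * tau:.1f} s")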

  20. Adubação do milho: III - Adubação mineral quantitativa Fertilizer experiments with corn: III - Quantitative applications of mineral fertilizers

    Directory of Open Access Journals (Sweden)

    G. P. Viégas

    1955-01-01

    , and K2O, applied respectively as Chilean nitrate, superphosphate, and potassium chloride. The fertilizers were placed in the furrow at planting time and mixed with the soil directly under the seed. Injury to the germinating seed resulting from salt concentration was noted only in 1949, when germination was substantially affected, especially in plots that received high nitrogen and potash. The number of plants per plot after thinning was, however, comparable for all treatments. Phosphorus promoted a considerable increase in yield. A study of the adjusted means showed that a single dose of phosphorus increased the yield by 521 kg/ha when compared with plots receiving only NK. Double rates of phosphorus increased the yield by 806 kg/ha (44%), but higher rates of this element did not promote any further increase. No gain in yield due to the application of nitrogen or potash was noticed in these experiments.

  1. Variance in total levels of phospholipase C zeta (PLC-ζ) in human sperm may limit the applicability of quantitative immunofluorescent analysis as a diagnostic indicator of oocyte activation capability.

    Science.gov (United States)

    Kashir, Junaid; Jones, Celine; Mounce, Ginny; Ramadan, Walaa M; Lemmon, Bernadette; Heindryckx, Bjorn; de Sutter, Petra; Parrington, John; Turner, Karen; Child, Tim; McVeigh, Enda; Coward, Kevin

    2013-01-01

    To examine whether similar levels of phospholipase C zeta (PLC-ζ) protein are present in sperm from men whose ejaculates resulted in normal oocyte activation, and to examine whether a predominant pattern of PLC-ζ localization is linked to normal oocyte activation ability. Laboratory study. University laboratory. Control subjects (men with proven oocyte activation capacity; n = 16) and men whose sperm resulted in recurrent intracytoplasmic sperm injection failure (oocyte activation deficient [OAD]; n = 5). Quantitative immunofluorescent analysis of PLC-ζ protein in human sperm. Total levels of PLC-ζ fluorescence, proportions of sperm exhibiting PLC-ζ immunoreactivity, and proportions of PLC-ζ localization patterns in sperm from control and OAD men. Sperm from control subjects presented a significantly higher proportion of sperm exhibiting PLC-ζ immunofluorescence compared with infertile men diagnosed with OAD (82.6% and 27.4%, respectively). Total levels of PLC-ζ in sperm from individual control and OAD patients exhibited significant variance, with sperm from 10 out of 16 (62.5%) exhibiting levels similar to OAD samples. Predominant PLC-ζ localization patterns varied between control and OAD samples with no predictable or consistent pattern. The results indicate that sperm from control men exhibited significant variance in total levels of PLC-ζ protein, as well as significant variance in the predominant localization pattern. Such variance may hinder the diagnostic application of quantitative PLC-ζ immunofluorescent analysis. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Effect of simultaneous application of mycorrhiza with compost, vermicompost and sulfural geranole on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system

    Directory of Open Access Journals (Sweden)

    P rezvani moghaddam

    2016-03-01

    quantitative and qualitative characteristics of sesame (Sesamum indicum L.) in a low input cropping system was investigated. Materials and methods In order to evaluate the effects of simultaneous application of mycorrhiza and organic fertilizers on some quantitative and qualitative characteristics of sesame (Sesamum indicum L.), an experiment was conducted based on a randomized complete block design with three replications at the Agricultural Research Farm, Ferdowsi University of Mashhad, Iran, during the 2009-2010 growing season. Treatments were mycorrhiza (Glomus mosseae), mycorrhiza+compost, mycorrhiza+vermicompost, mycorrhiza+organic sulfural geranole, compost, vermicompost, organic sulfural geranole, and control (no fertilizer). Data analysis was done using SAS 9.1 and means were compared by Duncan's multiple range test at the 5% level of probability. Results and discussion The results showed that the effects of the different organic and biological fertilizers on seed yield were significant. Seed yield increased significantly with mycorrhiza, both alone and combined with organic sulfural geranole or vermicompost, compared to the control treatment. Biological yield increased significantly under simultaneous application of vermicompost and organic sulfural geranole with mycorrhiza compared to separate use of these fertilizers. All studied organic fertilizers combined with mycorrhiza significantly increased the oil content of sesame. Seed oil content increased by 12, 13 and 10 percent, respectively, under simultaneous application of mycorrhiza with each of compost, vermicompost and organic sulfural geranole compared to separate application of mycorrhiza. It seems that mycorrhiza and organic fertilizers improved the quantitative and qualitative characteristics of sesame by providing better conditions for nutrient absorption and transport to the plant (Hawkes et al., 2008). Conclusion In general, the results showed that the simultaneous use of ecological inputs can improve

  3. Quantitative Luminescence Imaging System

    Energy Technology Data Exchange (ETDEWEB)

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  4. Applications

    Science.gov (United States)

    Stern, Arthur M.

    1986-07-01

    Economic incentives have spurred numerous applications of genetically engineered organisms in manufacture of pharmaceuticals and industrial chemicals. These successes, involving a variety of methods of genetic manipulation, have dispelled early fears that genetic engineering could not be handled safely, even in the laboratory. Consequently, the potential for applications in the wider environment without physical containment is being considered for agriculture, mining, pollution control, and pest control. These proposed applications range from modest extensions of current plant breeding techniques for new disease-resistant species to radical combinations of organisms (for example, nitrogen-fixing corn plants). These applications raise concerns about potential ecological impacts (see chapter 5), largely because of adverse experiences with both deliberate and inadvertent introductions of nonindigenous species.

  5. Synapse proteomics: current status and quantitative applications

    NARCIS (Netherlands)

    Li, K.W.; Jimenez, C.R.

    2008-01-01

    Chemical synapses are key organelles for neurotransmission. The coordinated actions of protein networks in diverse synaptic subdomains drive the sequential molecular events of transmitter release from the presynaptic bouton, activation of transmitter receptors located in the postsynaptic density and

  6. Quantitative cardiac ultrasound

    NARCIS (Netherlands)

    H. Rijsterborgh (Hans)

    1990-01-01

    This thesis is about the various aspects of quantitative cardiac ultrasound. The first four chapters are mainly devoted to the reproducibility of echocardiographic measurements. These are focused on the variation of echocardiographic measurements within patients. An important

  7. On Quantitative Rorschach Scales.

    Science.gov (United States)

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  8. Quantitative physics tasks

    OpenAIRE

    Snětinová, Marie

    2015-01-01

    Title: Quantitative Physics Tasks Author: Mgr. Marie Snětinová Department: Department of Physics Education Supervisor of the doctoral thesis: doc. RNDr. Leoš Dvořák, CSc., Department of Physics Education Abstract: The doctoral thesis concerns problem solving in physics, especially students' attitudes toward solving quantitative physics tasks and various methods of developing students' problem solving skills in physics. It contains a brief overview of the theoretical framework of proble...

  9. Whole cell, label free protein quantitation with data independent acquisition: quantitation at the MS2 level.

    Science.gov (United States)

    McQueen, Peter; Spicer, Vic; Schellenberg, John; Krokhin, Oleg; Sparling, Richard; Levin, David; Wilkins, John A

    2015-01-01

    Label free quantitation by measurement of peptide fragment signal intensity (MS2 quantitation) is a technique that has seen limited use due to the stochastic nature of data dependent acquisition (DDA). However, data independent acquisition has the potential to make large scale MS2 quantitation a more viable technique. In this study we used an implementation of data independent acquisition--SWATH--to perform label free protein quantitation in a model bacterium Clostridium stercorarium. Four tryptic digests analyzed by SWATH were probed by an ion library containing information on peptide mass and retention time obtained from DDA experiments. Application of this ion library to SWATH data quantified 1030 proteins with at least two peptides quantified (∼ 40% of predicted proteins in the C. stercorarium genome) in each replicate. Quantitative results obtained were very consistent between biological replicates (R(2) ∼ 0.960). Protein quantitation by summation of peptide fragment signal intensities was also highly consistent between biological replicates (R(2) ∼ 0.930), indicating that this approach may have increased viability compared to recent applications in label free protein quantitation. SWATH based quantitation was able to consistently detect differences in relative protein quantity and it provided coverage for a number of proteins that were missed in some samples by DDA analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
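
    As an illustration of the MS2-level label-free quantitation described above, the sketch below sums peptide fragment intensities into protein-level quantities (keeping proteins with at least two quantified peptides) and compares two biological replicates by their coefficient of determination. It is a minimal sketch assuming a hypothetical long-format fragment table with columns protein, peptide and intensity; it is not tied to any particular SWATH software output.

        # Minimal sketch of protein-level label-free quantitation from fragment intensities.
        # The input table is hypothetical; the column names are assumptions, not a real format.
        import numpy as np
        import pandas as pd

        def protein_quant(fragments: pd.DataFrame, min_peptides: int = 2) -> pd.Series:
            """Sum fragment intensities per protein, keeping proteins with >= min_peptides peptides."""
            peptide_counts = fragments.groupby("protein")["peptide"].nunique()
            totals = fragments.groupby("protein")["intensity"].sum()
            return totals[peptide_counts >= min_peptides]

        def replicate_r2(rep1: pd.Series, rep2: pd.Series) -> float:
            """Coefficient of determination between log-transformed protein quantities."""
            common = rep1.index.intersection(rep2.index)
            x, y = np.log10(rep1[common]), np.log10(rep2[common])
            return np.corrcoef(x, y)[0, 1] ** 2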

  10. Quantitative NIR chemical imaging in heritage science.

    Science.gov (United States)

    Cséfalvayová, Linda; Strlič, Matija; Karjalainen, Harri

    2011-07-01

    Until recently, applications of spectral imaging in heritage science mostly focused on qualitative examination of artworks. This is partly due to the complexity of artworks and partly due to the lack of appropriate standard materials. With the recent advance of NIR imaging spectrometers, the interval 1000-2500 nm became available for exploration, enabling us to extract quantitative chemical information from artworks. In this contribution, the development of 2D NIR quantitative chemical maps of heritage objects is discussed along with presentation of the first quantitative image. Further case studies include semiquantitative mapping of plasticiser distribution in a plastic object and identification of historic plastic materials. In the NIR imaging studies discussed, sets of 256 spatially registered images were collected at different wavelengths in the NIR region of 1000-2500 nm. The data was analyzed as a spectral cube, both as a stack of wavelength-resolved images and as a series of spectra, one per each sample pixel, using multivariate analysis. This approach is only possible using well-characterized reference sample collections, as quantitative imaging applications need to be developed, thus enabling spatial maps of damaged and degraded areas to be visualized to a level of chemical detail previously not possible. Such quantitative chemical mapping of vulnerable areas of heritage objects is invaluable, as it enables damage to historic objects to be quantitatively visualized.
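
    The spectral-cube analysis described above can be prototyped in a few lines: each pixel's NIR spectrum becomes one row of a matrix, which is then decomposed with a multivariate method such as principal component analysis, and the component scores are folded back into spatial maps. This is a generic sketch with an invented cube shape, not the authors' processing chain.

        # Generic sketch: treat a hyperspectral cube as one spectrum per pixel and apply PCA.
        import numpy as np
        from sklearn.decomposition import PCA

        cube = np.random.rand(64, 64, 256)        # hypothetical cube: 64 x 64 pixels, 256 NIR bands
        ny, nx, nbands = cube.shape
        spectra = cube.reshape(ny * nx, nbands)   # one row per pixel spectrum

        pca = PCA(n_components=3)
        scores = pca.fit_transform(spectra)       # multivariate decomposition
        score_maps = scores.reshape(ny, nx, 3)    # spatial maps of the leading components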

  11. Targeted quantitation of proteins by mass spectrometry.

    Science.gov (United States)

    Liebler, Daniel C; Zimmerman, Lisa J

    2013-06-04

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.

  12. Analyse quantitative détaillée des distillats moyens par couplage CG/SM. Application à l'étude des schémas réactionnels du procédé d'hydrotraitement Quantitative Analysis of Middle Distillats by Gc/Ms Coupling. Application to Hydrotreatment Process Mechanisms

    Directory of Open Access Journals (Sweden)

    Fafet A.

    2006-11-01

    Full Text Available A detailed analysis of middle distillates is an essential step in understanding the reaction mechanisms and the kinetics of certain refining processes such as hydrotreatment. A new method has been developed that combines gas chromatography/mass spectrometry (GC/MS) coupling with quantitative analysis by chemical family using mass spectrometry. The gas chromatography, performed on a non-polar column, separates the compounds present in the gas oil by boiling point, while the mass spectrometry quantifies the chemical families per interval of carbon-atom number or boiling point. The method thereby gives access to the carbon-number distribution of each chemical family (alkanes, cycloalkanes, mono- and polynuclear aromatic hydrocarbons, sulfur-containing aromatic hydrocarbons). It has been validated and applied to a hydrotreatment feed and product. In fact, when we see the complexity of the saturated and aromatic hydrocarbon mixtures appearing in gas oil, we realize that a very detailed analysis of those cuts is necessary to understand the mechanisms involved in refining processes and to be able to describe their kinetics. Each gas oil has a very different composition and therefore a specific reactivity. That is why we have tried to develop predictive kinetic models to avoid experimenting in pilot plants, which is very expensive. But, even if all the compounds of a gasoline (PI-200°C) have now been identified and quantified using gas chromatography (1), such is not the case for heavier cuts. Only an overall characterization can be made, by chemical family. The techniques employed are, for example, HPLC (3,4 or

  13. Extending Quantitative Easing

    DEFF Research Database (Denmark)

    Hallett, Andrew Hughes; Fiedler, Salomon; Kooths, Stefan

    The notes in this compilation address the pros and cons associated with the extension of the ECB's quantitative easing programme of asset purchases. The notes were requested by the Committee on Economic and Monetary Affairs as an input for the February 2017 session of the Monetary Dialogue....

  14. Quantitative Decision Making.

    Science.gov (United States)

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  15. Quantitative Management in Libraries

    Science.gov (United States)

    Heinritz, Fred J.

    1970-01-01

    Based on a position paper originally presented at the Institute on Quantitative Methods in Librarianship at Ohio State University Libraries in August, 1969, this discusses some of the elements of management: motion, time and cost studies, operations research and other mathematical techniques, and data processing equipment. (Author)

  16. Critical Quantitative Inquiry in Context

    Science.gov (United States)

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  17. Quantitative Characterization of Nanostructured Materials

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Frank (Bud) Bridges, University of California-Santa Cruz

    2010-08-05

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structure measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.

  18. Magnetoresistive biosensors for quantitative proteomics

    Science.gov (United States)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and the identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, enabling low cost and small size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  19. Designing Quantitative Experiments Prediction Analysis

    CERN Document Server

    Wolberg, John

    2010-01-01

    The method of Prediction Analysis is applicable for anyone interested in designing a quantitative experiment. The design phase of an experiment can be broken down into problem dependent design questions (like the type of equipment to use and the experimental setup) and generic questions (like the number of data points required, range of values for the independent variables and measurement accuracy). This book is directed towards the generic design phase of the process. The methodology for this phase of the design process is problem independent and can be applied to experiments performed in most branches of science and technology. The purpose of the prediction analysis is to predict the accuracy of the results that one can expect from a proposed experiment. Prediction analyses can be performed using the REGRESS program which was developed by the author and can be obtained free-of-charge through the author's website. Many examples of prediction analyses are included in the book ranging from very simple experime...

  20. Innovations in Quantitative Risk Management

    CERN Document Server

    Scherer, Matthias; Zagst, Rudi

    2015-01-01

    Quantitative models are omnipresent, but often controversially discussed, in today's risk management practice. New regulations, innovative financial products, and advances in valuation techniques provide a continuous flow of challenging problems for financial engineers and risk managers alike. Designing a sound stochastic model requires finding a careful balance between parsimonious model assumptions, mathematical viability, and interpretability of the output. Moreover, data requirements and end-user training are to be considered as well. The KPMG Center of Excellence in Risk Management conference Risk Management Reloaded and this proceedings volume contribute to bridging the gap between academia, which provides methodological advances, and practice, which has a firm understanding of the economic conditions in which a given model is used. Discussed fields of application range from asset management, credit risk, and energy to risk management issues in insurance. Methodologically, dependence modeling...

  1. Quantitative genetics of disease traits.

    Science.gov (United States)

    Wray, N R; Visscher, P M

    2015-04-01

    John James authored two key papers on the theory of risk to relatives for binary disease traits and the relationship between parameters on the observed binary scale and an unobserved scale of liability (James Annals of Human Genetics, 1971; 35: 47; Reich, James and Morris Annals of Human Genetics, 1972; 36: 163). These two papers are John James' most cited papers (198 and 328 citations, November 2014). They have been influential in human genetics and have recently gained renewed popularity because of their relevance to the estimation of quantitative genetics parameters for disease traits using SNP data. In this review, we summarize the two early papers and put them into context. We show recent extensions of the theory for ascertained case-control data and review recent applications in human genetics. © 2015 Blackwell Verlag GmbH.
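
    A frequently quoted result from this body of theory is the transformation between heritability estimated on the observed 0/1 disease scale and heritability on the unobserved liability scale. One commonly cited form (given here as an illustration, not a quotation from the papers reviewed) is:

        % Observed-scale to liability-scale transformation, commonly cited form.
        % K : population prevalence of the disease
        % z : height of the standard normal density at the threshold t = Phi^{-1}(1 - K)
        h^2_{\mathrm{liability}} \;=\; h^2_{\mathrm{observed}}\,\frac{K\,(1-K)}{z^{2}}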

  2. Workshop on quantitative dynamic stratigraphy. Final conference report

    Energy Technology Data Exchange (ETDEWEB)

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  3. Energy & Climate: Getting Quantitative

    Science.gov (United States)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350". Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.

  4. Applied quantitative finance

    CERN Document Server

    Chen, Cathy; Overbeck, Ludger

    2017-01-01

    This volume provides practical solutions and introduces recent theoretical developments in risk management, pricing of credit derivatives, quantification of volatility and copula modeling. This third edition is devoted to modern risk analysis based on quantitative methods and textual analytics to meet the current challenges in banking and finance. It includes 14 new contributions and presents a comprehensive, state-of-the-art treatment of cutting-edge methods and topics, such as collateralized debt obligations, the high-frequency analysis of market liquidity, and realized volatility. The book is divided into three parts: Part 1 revisits important market risk issues, while Part 2 introduces novel concepts in credit risk and its management along with updated quantitative methods. The third part discusses the dynamics of risk management and includes risk analysis of energy markets and for cryptocurrencies. Digital assets, such as blockchain-based currencies, have become popular but are theoretically challenging...

  5. Semi-automatic quantitative measurements of intracranial internal carotid artery stenosis and calcification using CT angiography

    NARCIS (Netherlands)

    Bleeker, Leslie; Marquering, Henk A.; van den Berg, René; Nederkoorn, Paul J.; Majoie, Charles B.

    2012-01-01

    Intracranial carotid artery atherosclerotic disease is an independent predictor for recurrent stroke. However, its quantitative assessment is not routinely performed in clinical practice. In this diagnostic study, we present and evaluate a novel semi-automatic application to quantitatively measure

  6. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  7. Stable Isotope Dilution Mass Spectrometry for Membrane Transporter Quantitation

    OpenAIRE

    Farrokhi, Vahid; McShane, Adam J.; Nemati, Reza; Yao, Xudong

    2013-01-01

    This review provides an introduction to stable isotope dilution mass spectrometry (MS) and its emerging applications in the analysis of membrane transporter proteins. Various approaches and application examples, for the generation and use of quantitation reference standards—either stable isotope-labeled peptides or proteins—are discussed as they apply to the MS quantitation of membrane proteins. Technological considerations for the sample preparation of membrane transporter proteins are also ...

  8. F# for quantitative finance

    CERN Document Server

    Astborg, Johan

    2013-01-01

    To develop your confidence in F#, this tutorial will first introduce you to simpler tasks such as curve fitting. You will then advance to more complex tasks such as implementing algorithms for trading semi-automation in a practical scenario-based format. If you are a data analyst or a practitioner in quantitative finance, economics, or mathematics and wish to learn how to use F# as a functional programming language, this book is for you. You should have a basic conceptual understanding of financial concepts and models. Elementary knowledge of the .NET framework would also be helpful.

  9. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities worldwide in the following areas: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  10. Macropinosome quantitation assay

    Directory of Open Access Journals (Sweden)

    Jack T.H. Wang

    2014-01-01

    can also be applied to non-homogeneous cell populations, including transiently transfected cell monolayers. We present the background that needs to be considered when customising this protocol for application to new cell types or experimental variations.

  11. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.
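
    To make the idea of quantifying resilience concrete, one simple and widely used style of metric integrates the gap between targeted and actual system performance over the recovery period following a disruption; a larger integrated gap indicates lower resilience. The snippet below is only a generic illustration of that idea with invented recovery data, not the control-design methodology developed in the report.

        # Generic resilience-style metric: integrated performance deficit after a disruption.
        import numpy as np

        def performance_deficit(t, targeted, actual):
            """Integrate max(targeted - actual, 0) over time with the trapezoidal rule."""
            deficit = np.maximum(np.asarray(targeted) - np.asarray(actual), 0.0)
            return np.trapz(deficit, t)

        # Hypothetical trajectory: performance drops to 40% and recovers over ten days.
        t = np.linspace(0.0, 10.0, 101)
        targeted = np.ones_like(t)
        actual = 1.0 - 0.6 * np.exp(-0.5 * t)
        print(performance_deficit(t, targeted, actual))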

  12. Quantitative imaging as cancer biomarker

    Science.gov (United States)

    Mankoff, David A.

    2015-03-01

    The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine

  13. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.

  14. Quantitative Hyperspectral Reflectance Imaging

    Directory of Open Access Journals (Sweden)

    Ted A.G. Steemers

    2008-09-01

    Full Text Available Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared. By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms.

  15. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    Energy Technology Data Exchange (ETDEWEB)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to approximately a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
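
    In quantitation schemes of this kind, the measured DNA length distributions are usually converted to a mean lesion frequency by assuming that lesions are randomly (Poisson) distributed along the molecules, so that the frequency follows from the change in number-average length between treated and control samples. A commonly used relation of that form (given as an illustration, not a quotation from this report) is:

        % Mean lesion frequency per unit length from number-average lengths,
        % assuming randomly (Poisson) distributed lesions:
        \phi \;=\; \frac{1}{L_{n,\mathrm{treated}}} \;-\; \frac{1}{L_{n,\mathrm{control}}}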

  16. A comparison of three quantitative schlieren techniques

    Science.gov (United States)

    Hargather, Michael J.; Settles, Gary S.

    2012-01-01

    We compare the results of three quantitative schlieren techniques applied to the measurement and visualization of a two-dimensional laminar free-convection boundary layer. The techniques applied are Schardin's "calibrated" schlieren technique, in which a weak lens in the field-of-view provides a calibration of light deflection angle to facilitate quantitative measurements, "rainbow schlieren", in which the magnitude of schlieren deflection is coded by hue in the image, and "background-oriented schlieren" (BOS), in which quantitative schlieren-like results are had from measuring the distortion of a background pattern using digital-image-correlation software. In each case computers and software are applied to process the data, thus streamlining and modernizing the quantitative application of schlieren optics. (BOS, in particular, is only possible with digital-image-correlation software.) Very good results are had with the lens-calibrated standard schlieren method in the flow tested here. BOS likewise produces good results and requires less expensive apparatus than the other methods, but lacks the simplification of parallel light that they feature. Rainbow schlieren suffers some unique drawbacks, including the production of the required rainbow cutoff filter, and provides little significant benefit over the calibrated schlieren technique.
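
    For Schardin's calibrated method in particular, a weak lens of focal length f placed in the field of view deflects light passing at radius r from its centre by approximately epsilon = r/f, so the grayscale profile recorded across the lens image provides an intensity-to-deflection calibration that can then be applied pixel by pixel to the flow image. A rough sketch of that bookkeeping, with hypothetical array inputs, is:

        # Sketch of lens-calibrated schlieren: grayscale -> deflection angle mapping.
        import numpy as np

        def calibration_curve(lens_profile, pixel_size, focal_length):
            """Map grayscale values across the calibration lens to deflections eps = r / f."""
            lens_profile = np.asarray(lens_profile, dtype=float)
            r = np.arange(lens_profile.size) * pixel_size    # radial position in the lens image
            eps = r / focal_length                           # small-angle deflection (radians)
            order = np.argsort(lens_profile)                 # interpolation needs ascending x
            return lens_profile[order], eps[order]

        def image_to_deflection(image, gray_cal, eps_cal):
            """Convert a schlieren image to a deflection-angle map via the calibration curve."""
            return np.interp(np.asarray(image, dtype=float), gray_cal, eps_cal)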

  17. Quantitative proteomics in the field of microbiology.

    Science.gov (United States)

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Quantitative Computertomographie (QCT)

    Directory of Open Access Journals (Sweden)

    Krestan C

    2013-01-01

    Full Text Available Central quantitative computed tomography (QCT) is an established method for measuring bone mineral density. QCT can be performed at central and peripheral measurement sites, the most important central measurement region being the lumbar spine. QCT differs from DXA in that it provides a 3-dimensional measurement, compared with the 2-dimensional DXA examination. The T-score definition of osteoporosis should not be applied to QCT examinations, since a threshold of -2.5 would lead to a considerably higher prevalence of osteoporotic individuals. Instead, absolute bone mineral density values have been proposed for QCT. Determining bone mineral density from routine CT examinations represents a new trend in osteoporosis diagnostics. Beyond pure bone mineral density, peripheral QCT, and in particular HR ("high-resolution") pQCT, is able to determine parameters of trabecular and cortical bone quality. Measurement precision is higher for peripheral QCT procedures than at central measurement sites, which is relevant for follow-up monitoring.

  19. Quantitative Electron Nanodiffraction.

    Energy Technology Data Exchange (ETDEWEB)

    Spence, John [Arizona State Univ., Mesa, AZ (United States)

    2015-01-30

    This final report summarizes progress under this award for the final reporting period 2002-2013 in our development of quantitative electron nanodiffraction applied to materials problems, especially devoted to atomistic processes in semiconductors and electronic oxides such as the new artificial oxide multilayers, where our microdiffraction is complemented with energy-loss spectroscopy (ELNES) and aberration-corrected STEM imaging (9). The method has also been used to map out the chemical bonds in the important GaN semiconductor (1) used for solid state lighting, and to understand the effects of stacking sequence variations and interfaces in digital oxide superlattices (8). Other projects include the development of a laser-beam Zernike phase plate for cryo-electron microscopy (5) (based on the Kapitza-Dirac effect), work on reconstruction of molecular images using the scattering from many identical molecules lying in random orientations (4), a review article on space-group determination for the International Tables on Crystallography (10), the observation of energy-loss spectra with millivolt energy resolution and sub-nanometer spatial resolution from individual point defects in an alkali halide, a review article for the Centenary of X-ray Diffraction (17) and the development of a new method of electron-beam lithography (12). We briefly summarize here the work on GaN, on oxide superlattice ELNES, and on lithography by STEM.

  20. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
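
    One of the simplest strategies in this family is total-intensity (sum) normalization, in which each sample's metabolite intensities are rescaled so that all samples share the same total signal before individual metabolites are compared. The snippet below sketches this for a hypothetical feature-by-sample intensity matrix and is not specific to any platform or to the particular methods reviewed.

        # Sketch of total-sum normalization for a metabolite feature table.
        import numpy as np

        def total_sum_normalize(intensities: np.ndarray) -> np.ndarray:
            """Scale each sample (column) so its summed intensity equals the mean total."""
            totals = intensities.sum(axis=0)      # one total per sample
            target = totals.mean()                # common target total
            return intensities * (target / totals)

        # Hypothetical matrix: 200 metabolite features (rows) x 6 samples (columns).
        data = np.random.lognormal(mean=2.0, sigma=1.0, size=(200, 6))
        normalized = total_sum_normalize(data)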

  1. Quantitation of 4-Methyl-4-sulfanylpentan-2-one (4MSP) in Hops by a Stable Isotope Dilution Assay in Combination with GC×GC-TOFMS: Method Development and Application To Study the Influence of Variety, Provenance, Harvest Year, and Processing on 4MSP Concentrations.

    Science.gov (United States)

    Reglitz, Klaas; Steinhaus, Martin

    2017-03-22

    A stable isotope dilution assay was developed for quantitation of 4-methyl-4-sulfanylpentan-2-one (4MSP) in hops. The approach included the use of 4-(13C)methyl-4-sulfanyl(1,3,5-13C3)pentan-2-one as internal standard, selective isolation of hop thiols by mercurated agarose, and GC×GC-TOFMS analysis. Application of the method to 53 different hop samples revealed 4MSP concentrations between Hop processing such as drying and pelletizing had only a minor impact on 4MSP concentrations. Like the majority of other hop volatiles, 4MSP is predominantly located in the lupulin glands.
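
    In a stable isotope dilution assay of this kind, the analyte concentration is typically obtained from the peak-area ratio of analyte to the isotopically labelled internal standard, scaled by the amount of standard spiked into the sample and by a response factor determined from calibration with known mixtures. In generic form (an assumed textbook expression, not the authors' exact calibration function):

        % Generic stable isotope dilution quantitation:
        % A   : peak areas of analyte and internal standard (IS)
        % n_IS: amount of labelled standard spiked into the sample
        % R_f : response factor from calibration, m: sample mass
        c_{\mathrm{analyte}} \;=\; \frac{A_{\mathrm{analyte}}}{A_{\mathrm{IS}}}\cdot\frac{n_{\mathrm{IS}}}{R_{f}\,m_{\mathrm{sample}}}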

  2. Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.

    Science.gov (United States)

    Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W

    2017-01-01

    Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.

  3. Analysis of Two Quantitative Ultrasound Approaches.

    Science.gov (United States)

    Muleki-Seya, Pauline; Han, Aiguo; Andre, Michael P; Erdman, John W; O'Brien, William D

    2017-09-01

    There are two well-known ultrasonic approaches to extract sets of quantitative parameters: Lizzi-Feleppa (LF) parameters: slope, intercept, and midband; and quantitative ultrasound (QUS)-derived parameters: effective scatterer diameter (ESD) and effective acoustic concentration (EAC). In this study, the relation between the LF and QUS-derived parameters is studied theoretically and experimentally on ex vivo mouse livers. As expected from the theory, LF slope is correlated to ESD ([Formula: see text]), and from experimental data, LF midband is correlated to EAC ([Formula: see text]). However, LF intercept is not correlated to ESD ([Formula: see text]) nor EAC ([Formula: see text]). The unexpected correlation observed between LF slope and EAC ([Formula: see text]) results likely from the high correlation between ESD and EAC due to the inversion process. For a liver fat percentage estimation, an important potential medical application, the parameters presenting the better correlation are EAC ([Formula: see text]) and LF midband ([Formula: see text]).
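
    The Lizzi-Feleppa parameters referred to here come from a straight-line fit to the calibrated backscatter power spectrum (in dB) over the analysis bandwidth: the slope and intercept of that line, plus the fitted value at the centre frequency (the midband fit). A minimal sketch, assuming the calibrated spectrum is already available as arrays, is:

        # Sketch of Lizzi-Feleppa parameters from a calibrated spectrum (dB vs. MHz).
        import numpy as np

        def lizzi_feleppa(freq_mhz, spectrum_db):
            """Linear fit over the analysis bandwidth: slope (dB/MHz), intercept (dB), midband (dB)."""
            freq_mhz = np.asarray(freq_mhz, dtype=float)
            slope, intercept = np.polyfit(freq_mhz, spectrum_db, deg=1)
            center = 0.5 * (freq_mhz.min() + freq_mhz.max())
            midband = slope * center + intercept   # fitted value at the centre frequency
            return slope, intercept, midband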

  4. Using Local Data To Advance Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Stephen Sweet

    2008-07-01

    Full Text Available In this article we consider the application of local data as a means of advancing quantitative literacy. We illustrate the use of three different sources of local data: institutional data, Census data, and the National College Health Assessment survey. Our learning modules are applied in courses in sociology and communication, but the strategy of using local data can be integrated beyond these disciplinary boundaries. We demonstrate how these data can be used to stimulate student interests in class discussion, advance analytic skills, as well as develop capacities in written and verbal communication. We conclude by considering concerns that may influence the types of local data used and the challenges of integrating these data in a course in which quantitative analysis is not typically part of the curriculum.

  5. Portable smartphone based quantitative phase microscope

    Science.gov (United States)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The whole system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source. A 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; red blood cell smears, Pap smears, broad bean epidermis sections and monocot roots were then also measured to show its performance. Owing to its advantages of accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool which may in future be adopted in remote healthcare and medical diagnosis.
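
    The transport of intensity equation underlying this kind of phase retrieval relates the through-focus derivative of the measured intensity to the transverse gradient of the sample phase; in its usual paraxial form (with k = 2*pi/lambda the wavenumber and the gradient taken in the transverse plane):

        % Transport of intensity equation (TIE), usual paraxial form:
        -k\,\frac{\partial I(x,y,z)}{\partial z}
          \;=\; \nabla_{\!\perp}\cdot\bigl(I(x,y,z)\,\nabla_{\!\perp}\varphi(x,y,z)\bigr),
        \qquad k=\frac{2\pi}{\lambda}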

  6. Using quantitative phase petrology to understand metamorphism

    Science.gov (United States)

    White, Richard

    2015-04-01

    Quantitative phase petrology has become one of the mainstay methods for interpreting metamorphic rocks and processes. Its increased utility has been driven by improvements to end-member thermodynamics, activity-composition relationships and computer programs to undertake calculations. Such improvements now allow us to undertake calculations in increasingly complex chemical systems that more closely reflect those of rocks. Recent progress in activity-composition (a-x) relationships is aimed at developing suites of a-x relationships in large chemical systems that are calibrated together, which will allow a more direct application of the method to metamorphic rocks. In addition, considerable progress has been made in how quantitative phase diagrams can be used to understand features, including chemical potential diagrams for reaction textures, methods for fractionating bulk compositions and methods for modelling open system processes. One feature of calculated phase diagrams is that they present us with a great amount of information, such as mineral assemblages, mineral proportions, phase compositions, volume or density etc. An important aspect to using this information is to understand the potential uncertainties associated with these, which are significant. These uncertainties require that calculated phase diagrams be used with caution to interpret observed features in rocks. Features such as mineral zoning and reaction textures should still be interpreted in a semi-quantitative way, even if based on a fully quantitative diagram. Exercises such as the interpretation of reaction overstepping based on relating phase diagrams to observed mineral core compositions are likely to give spurious results given the infelicities in existing a-x models. Despite these limitations, quantitative phase petrology remains the most useful approach to interpreting the metamorphic history of rocks in that it provides a theoretical framework in which to interpret observed features rather

  7. Mastering R for quantitative finance

    CERN Document Server

    Berlinger, Edina; Badics, Milán; Banai, Ádám; Daróczi, Gergely; Dömötör, Barbara; Gabler, Gergely; Havran, Dániel; Juhász, Péter; Margitai, István; Márkus, Balázs; Medvegyev, Péter; Molnár, Julia; Szucs, Balázs Árpád; Tuza, Ágnes; Vadász, Tamás; Váradi, Kata; Vidovics-Dancs, Ágnes

    2015-01-01

    This book is intended for those who want to learn how to use R's capabilities to build models in quantitative finance at a more advanced level. If you wish to perfectly take up the rhythm of the chapters, you need to be at an intermediate level in quantitative finance and you also need to have a reasonable knowledge of R.

  8. A quantitative benefit-risk assessment approach to improve decision making in drug development: Application of a multicriteria decision analysis model in the development of combination therapy for overactive bladder.

    Science.gov (United States)

    de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R

    2016-04-01

    A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or mirabegron 50 mg (5+25 and 5+50) scored the highest clinical utility and supported phase III development of solifenacin-mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
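
    As context for the MCDA approach described above, the core calculation is a weighted sum of normalized attribute scores. The weights and scores below are invented for illustration and are not the study's elicited values.

```python
# Hypothetical swing weights (sum to 1) and attribute scores on a 0-1 scale
# (higher = better); both are illustrative, not the study's values.
WEIGHTS = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}

def clinical_utility(scores, weights=WEIGHTS):
    """Overall clinical utility = sum of weight * normalized attribute score."""
    return sum(weights[attr] * scores[attr] for attr in weights)

monotherapy = {"efficacy": 0.55, "safety": 0.90, "tolerability": 0.85}
combination = {"efficacy": 0.75, "safety": 0.85, "tolerability": 0.80}
print(round(clinical_utility(monotherapy), 3))  # 0.715
print(round(clinical_utility(combination), 3))  # 0.790
```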

  9. Analysis of specific RNA in cultured cells through quantitative integration of q-PCR and N-SIM single cell FISH images: Application to hormonal stimulation of StAR transcription.

    Science.gov (United States)

    Lee, Jinwoo; Foong, Yee Hoon; Musaitif, Ibrahim; Tong, Tiegang; Jefcoate, Colin

    2016-07-05

    The steroidogenic acute regulatory protein (StAR) has been proposed to serve as the switch that can turn on/off steroidogenesis. We investigated the events that facilitate dynamic StAR transcription in response to cAMP stimulation in MA-10 Leydig cells, focusing on splicing anomalies at StAR gene loci. We used 3' reverse primers in a single reaction to respectively quantify StAR primary (p-RNA), spliced (sp-RNA/mRNA), and extended 3' untranslated region (UTR) transcripts, which were quantitatively imaged by high-resolution fluorescence in situ hybridization (FISH). This approach delivers spatio-temporal resolution of initiation and splicing at single StAR loci, and of the transfer of individual mRNA molecules to cytoplasmic sites. Gene expression was biphasic, initially showing slow splicing, transitioning to concerted splicing. The alternative 3.5-kb mRNAs were distinguished through the use of extended 3'UTR probes, which exhibited distinctive mitochondrial distribution. Combining quantitative PCR and FISH enables imaging of localization of RNA expression and analysis of RNA processing rates. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. ETUDE VARIATIONNELLE DU CONTACT SANS FROTTEMENT ENTRE UN CORPS ELASTIQUE ET UNE FONDATION RIGIDE

    Directory of Open Access Journals (Sweden)

    B TENIOU

    2002-06-01

    Full Text Available The aim of this work is the variational study of frictionless contact between an elastic body and a rigid foundation. The constitutive law of this body is nonlinear and the contact is modelled by the Signorini conditions. The work is divided into three parts. The first defines some tools of functional analysis. The second is devoted to existence and uniqueness results for the solution. The third addresses some properties of the solution. This work differs from that of Drabla, Sofonea and Teniou [1] in its study of new properties of the solution, which arise from the introduction of a parameter into the constitutive law and from the change of one boundary condition.

  11. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    Science.gov (United States)

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data, using both controlled experiments with known quantitative differences for specific proteins used as standards and "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and is straightforward in its application.

  12. Quantitative analysis of 'calanchi'

    Science.gov (United States)

    Agnesi, Valerio; Cappadonia, Chiara; Conoscenti, Christian; Costanzo, Dario; Rotigliano, Edoardo

    2010-05-01

    Three years (2006 - 2009) of monitoring data from two calanchi sites located in the western Sicilian Appennines are analyzed and discussed: the data come from two networks of erosion pins and a rainfall gauge station. The aim of the present research is to quantitatively analyze the effects of erosion by water and to investigate their relationships with rainfall trends and specific properties of the two calanchi fronts. Each of the sites was equipped with a grid of randomly distributed erosion pins, made of 41 nodes for the "Catalfimo" site, and 13 nodes for the "Ottosalme" site (in light of the general homogeneity of its geomorphologic conditions); the erosion pins consist of 2 cm graded iron stakes, 100 cm long, with a section having a diameter of 1.6 cm. Repeated readings at the erosion pins allowed estimation of point topographic height variations; a total number of 21 surveys have been made remotely by acquiring high resolution photographs from a fixed view point. Since the two calanchi sites are very close to each other (a few hundred meters), a single rainfall gauge station was installed, assuming a strict climatic homogeneity of the investigated area. Rainfall data have been processed to derive the rain erosivity index signal, detecting a total number of 27 erosive events. Despite the close distance between the two sites, because of a different geologic setting, the calanchi fronts are characterized by the outcropping of different levels of the same formation (Terravecchia fm., Middle-Late Miocene); as a consequence, both mineralogical, textural and geotechnical (index) properties, as well as the topographic and geomorphologic characteristics, change. Therefore, in order to define the "framework" in which the two erosion pin grids have been installed, 40 samples of rock have been analyzed, and a detailed geomorphologic survey has been carried out; in particular, plasticity index, liquid limit, carbonate, pH, granulometric fractions and their mineralogic

  13. Quantitative historical hydrology in Europe

    Science.gov (United States)

    Benito, G.; Brázdil, R.; Herget, J.; Machado, M. J.

    2015-08-01

    In recent decades, the quantification of flood hydrological characteristics (peak discharge, hydrograph shape, and runoff volume) from documentary evidence has gained scientific recognition as a method to lengthen flood records of rare and extreme events. This paper describes the methodological evolution of quantitative historical hydrology under the influence of developments in hydraulics and statistics. In the 19th century, discharge calculations based on flood marks were the only source of hydrological data for engineering design, but were later left aside in favour of systematic gauge records and conventional hydrological procedures. In the last two decades, there has been growing scientific and public interest in understanding long-term patterns of rare floods, in maintaining the flood heritage and memory of extremes, and developing methods for deterministic and statistical application to different scientific and engineering problems. A compilation of 46 case studies across Europe with reconstructed discharges demonstrates that (1) in most cases present flood magnitudes are not unusual within the context of the last millennium, although recent floods may exceed past floods in some temperate European rivers (e.g. the Vltava and Po rivers); (2) the frequency of extreme floods has decreased since the 1950s, although some rivers (e.g. the Gardon and Ouse rivers) show a reactivation of rare events over the last two decades. There is a great potential for gaining understanding of individual extreme events based on a combined multiproxy approach (palaeoflood and documentary records) providing high-resolution time flood series and their environmental and climatic changes; and for developing non-systematic and non-stationary statistical models based on relations of past floods with external and internal covariates under natural low-frequency climate variability.
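
    Discharge reconstruction from flood marks, as mentioned above, is typically a hydraulic calculation. A simplified single-section example using Manning's equation is sketched below, with an invented rectangular cross-section and roughness purely for illustration.

```python
def manning_peak_discharge(width_m, stage_m, slope, n):
    """Peak discharge Q (m^3/s) for a rectangular cross-section from
    Manning's equation: Q = (1/n) * A * R**(2/3) * S**(1/2)."""
    area = width_m * stage_m                     # flow area A
    wetted_perimeter = width_m + 2.0 * stage_m
    hydraulic_radius = area / wetted_perimeter   # R
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Illustrative values: 40 m wide reach, 5 m flood-mark stage, 0.1% slope,
# Manning roughness 0.035 -> roughly 455 m^3/s.
print(round(manning_peak_discharge(40.0, 5.0, 0.001, 0.035), 0))
```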

  14. Quantitative imaging with a mobile phone microscope.

    Directory of Open Access Journals (Sweden)

    Arunan Skandarajah

    Full Text Available Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.
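
    The corrections discussed in the record above (flat-field uniformity, color) have standard, simple forms. The sketch below shows generic flat-field and gray-world corrections; it is not the authors' specific calibration procedure, and all names are illustrative.

```python
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Flat-field correction to remove vignetting and fixed-pattern
    non-uniformity: (raw - dark) / (flat - dark), rescaled to the mean
    level of the flat frame."""
    numerator = raw.astype(float) - dark
    denominator = np.clip(flat.astype(float) - dark, 1e-6, None)
    return numerator / denominator * denominator.mean()

def gray_world_white_balance(rgb):
    """Gray-world color correction: scale each channel so the three channel
    means become equal."""
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    return np.clip(rgb * (channel_means.mean() / channel_means), 0, 255)
```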

  15. Quantitative Imaging with a Mobile Phone Microscope

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  16. Quantitative imaging with a mobile phone microscope.

    Science.gov (United States)

    Skandarajah, Arunan; Reber, Clay D; Switz, Neil A; Fletcher, Daniel A

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.

  17. Understanding quantitative research: part 1.

    Science.gov (United States)

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  18. Quantitative EPR A Practitioners Guide

    CERN Document Server

    Eaton, Gareth R; Barr, David P; Weber, Ralph T

    2010-01-01

    This is the first comprehensive yet practical guide for people who perform quantitative EPR measurements. No existing book provides this level of practical guidance to ensure the successful use of EPR. There is a growing need in both industrial and academic research to provide meaningful and accurate quantitative EPR results. This text discusses the various sample, instrument and software related aspects required for EPR quantitation. Specific topics include: choosing a reference standard, resonator considerations (Q, B1, Bm), power saturation characteristics, sample positioning, and finally, putting all the factors together to obtain an accurate spin concentration of a sample.
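
    The core of the quantitation described above is comparison of the double-integrated EPR signal of the sample with that of a reference standard after normalizing the acquisition conditions. A bare-bones version is sketched below with invented numbers; only resonator Q and receiver gain are corrected here, whereas the book covers the full set of factors.

```python
def spin_concentration(di_sample, di_reference, conc_reference,
                       q_sample, q_reference, gain_sample, gain_reference):
    """Spin concentration by comparison with a reference standard.

    The double integral (DI) of the first-derivative EPR spectrum scales with
    the number of spins, so after normalizing resonator Q and receiver gain
    (a full treatment also corrects B1, modulation amplitude, power and
    temperature): c_sample = c_ref * DI_sample_norm / DI_ref_norm."""
    di_s = di_sample / (q_sample * gain_sample)
    di_r = di_reference / (q_reference * gain_reference)
    return conc_reference * di_s / di_r

# Illustrative numbers only (double integrals, 50 uM reference, Q, gains).
print(spin_concentration(3.2e5, 1.0e5, 50e-6, 3000, 3500, 60, 60))
```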

  19. Mixing quantitative with qualitative methods:

    DEFF Research Database (Denmark)

    Morrison, Ann; Viller, Stephen; Heck, Tamara

    2017-01-01

    with or are considering, researching, or working with both quantitative and qualitative evaluation methods (in academia or industry), join us in this workshop. In particular, we look at adding quantitative to qualitative methods to build a whole picture of user experience. We see a need to discuss both quantitative...... and qualitative research because there is often a perceived lack of understanding of the rigor involved in each. The workshop will result in a White Paper on the latest developments in this field, within Australia and comparative with international work. We anticipate sharing submissions and workshop outcomes...

  20. Current trends in quantitative proteomics - an update.

    Science.gov (United States)

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and for monitoring treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Quantitative petrostructure analysis. Technical summary report

    Energy Technology Data Exchange (ETDEWEB)

    Warren, N.

    1980-09-01

    The establishment of quantitative techniques would lead to the development of predictive tools which would be of obvious importance in applied geophysics and engineering. In rock physics, it would help establish laws for averaging the effects of finite densities of real cracks and pores. It would also help in elucidating the relation between observed complex crack structures and various models for the mechanical properties of single cracks. The petrostructure study is addressed to this problem. The purpose of the effort is to quantitatively characterize the mineral and crack texture of granitic rock samples. The rock structures are to be characterized in such a way that the results can be used (1) to constrain the modelling of the effect of cracks on the physical properties of rocks, and (2) to test the possibility of establishing quantitative and predictive relations between petrographic observables and whole rock properties. Statistical techniques are being developed and being applied to the problem of parameterizing complex texture and crack patterns of rock, and of measuring correlation of these parameters to other measurable variables. The study is an application in factor analysis.

  2. Next generation quantitative genetics in plants

    Directory of Open Access Journals (Sweden)

    José M Jiménez-Gómez

    2011-11-01

    Full Text Available The analysis of continuous phenotypic traits through quantitative trait loci analysis, or QTL analysis, allows identification of the loci responsible for the variation observed in nature. QTL analyses involve establishing associations between genetic markers and the phenotypic variation of a quantitative trait in a segregating population. The laborious task of acquiring genetic markers and phenotypes in segregating populations has continuously benefited from technical advances. The new high-throughput sequencing technologies, or HTS, are radically transforming the way QTL analyses are performed. These technologies can rapidly and inexpensively sequence billions of bases without prior knowledge of the genomes analyzed. The development of HTS has been accompanied by rapid progress in experimental protocols, computational pipelines and statistical frameworks to fit researchers' needs. Some of these advances allow detection of molecular markers and phenotypes with a resolution never achieved before. In this review I discuss the application of HTS in quantitative genetics, focusing on molecular marker discovery, population genotyping and expression profiling for eQTL analysis.
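
    As a minimal illustration of the association step behind the QTL analyses mentioned above, the sketch below regresses a simulated quantitative trait on allele dosage at each marker; real pipelines add population-structure corrections, interval mapping and permutation-based thresholds, and all data here are synthetic.

```python
import numpy as np
from scipy import stats

def single_marker_scan(genotypes, phenotype):
    """Return -log10 p-values from a naive single-marker regression of the
    trait on allele dosage (0/1/2) at each marker."""
    scores = []
    for dosage in genotypes.T:                       # one column per marker
        result = stats.linregress(dosage, phenotype)
        scores.append(-np.log10(result.pvalue))
    return np.array(scores)

rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=(200, 50))            # 200 individuals x 50 markers
pheno = 0.8 * geno[:, 10] + rng.normal(size=200)     # marker 10 carries the QTL
print(int(single_marker_scan(geno, pheno).argmax())) # expected: 10
```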

  3. Plant and animal communities along the Swedish Baltic Sea coast - the building of a database of quantitative data collected by SCUBA divers, its use and some GIS applications in the Graesoe area

    Energy Technology Data Exchange (ETDEWEB)

    Sandman, Antonia; Kautsky, Hans [Stockholm Univ. (Sweden). Dept. of Systems Ecology

    2004-06-01

    The aim of the project was to compile a single database with quantitative data collected by SCUBA divers from the whole Swedish Baltic Sea coast. Data of plant and animal biomass, together with position, depth and type of substrate from 19 areas along the Swedish coast from the county of Blekinge to Kalix in the Bothnian Bay were compiled in a single database. In all, the database contains 2,170 records (samples) from 179 different stations where in total 161 plant and 145 animal species have been found. The data were then illustrated by the geographical distribution of plant and animal biomass and by constructing a model to estimate future changes of the plant and animal communities in the Graesoe area in the Aaland Sea applying GIS-techniques. To illustrate the opportunities of the database the change of the composition of benthic plant and animal biomass with salinity was calculated. The proportion of marine species increased with increasing salinity and the benthic biomass was at its highest in the southern Baltic proper. Quantitative data from Grepen and the Graesoe-Singoe area were used to calculate present biomass in the Graesoe area. A scenario of the change in biomass distribution and total biomass caused by shore displacement was created using data from Raaneaa and Kalix in the Bothnian Bay. To map the biomass distribution the material was divided into different depth intervals. The change of biomass with time was calculated as a function of salinity change and reduction of the available area, caused by shore displacement. The total biomass for all plants and animals in the investigated area was 50,500 tonnes at present. In 2,000 years the total biomass will be 25,000 tonnes and in 4,000 years 3,600 tonnes due to shore displacement causing a decrease in both salinity and available substrate. To make an estimate of the species distribution and a rough estimate of their biomass in an unknown geographic area, the type of substrate, the depth and the wave

  4. Plant and animal communities along the Swedish Baltic Sea coast - the building of a database of quantitative data collected by SCUBA divers, its use and some GIS applications in the Graesoe area

    Energy Technology Data Exchange (ETDEWEB)

    Sandman, Antonia; Kautsky, Hans [Stockholm Univ. (Sweden). Dept. of Systems Ecology

    2005-03-01

    The aim of the project was to compile a single database with quantitative data collected by SCUBA divers from the whole Swedish Baltic Sea coast. Data of plant and animal biomass, together with position, depth and type of substrate from 19 areas along the Swedish coast from the county of Blekinge to Kalix in the Bothnian Bay were compiled in a single database. In all, the database contains 2,170 records (samples) from 179 different stations where in total 161 plant and 145 animal species have been found. The data were then illustrated by the geographical distribution of plant and animal biomass and by constructing a model to estimate future changes of the plant and animal communities in the Graesoe area in the Aaland Sea applying GIS-techniques. To illustrate the opportunities of the database the change of the composition of benthic plant and animal biomass with salinity was calculated. The proportion of marine species increased with increasing salinity and the benthic biomass was at its highest in the southern Baltic proper. Quantitative data from Grepen and the Graesoe-Singoe area were used to calculate present biomass in the Graesoe area. A scenario of the change in biomass distribution and total biomass caused by shore displacement was created using data from Raaneaa and Kalix in the Bothnian Bay. To map the biomass distribution the material was divided into different depth intervals. The change of biomass with time was calculated as a function of salinity change and reduction of the available area, caused by shore displacement. The total biomass for all plants and animals in the investigated area was 50,500 tonnes at present. In 2,000 years the total biomass will be 25,000 tonnes and in 4,000 years 3,600 tonnes due to shore displacement causing a decrease in both salinity and available substrate. To make an estimate of the species distribution and a rough estimate of their biomass in an unknown geographic area, the type of substrate, the depth and the wave

  5. Simultaneous quantitative determination of 20 active components in the traditional Chinese medicine formula Zhi-Zi-Da-Huang decoction by liquid chromatography coupled with mass spectrometry: application to study the chemical composition variations in different combinations.

    Science.gov (United States)

    Tang, Zheng; Yin, Ran; Bi, Kaishun; Zhu, Heyun; Han, Fei; Chen, Kelin; Wang, Fenrong

    2015-09-01

    Zhi-Zi-Da-Huang decoction (ZZDHD), a classical traditional Chinese medicine (TCM) prescription composed of four herbal medicines, has been widely used in treating various hepatobiliary disorders for a long time. The objective of this study was to develop a sensitive and efficient liquid chromatography coupled with mass spectrometry (LC-MS) method for quantitative determination of 20 active constituents, including three iridoid glycosides, 11 flavonoids, three anthraquinones and three tannins in ZZDHD. Separation was achieved on a Phenomenex Kinetex C18 column (150 × 4.6 mm, 2.6 µm) using gradient elution with a mobile phase consisting of acetonitrile and 0.1% formic acid in water. Detection was performed with an electrospray ionization source in negative ionization and selected ion monitoring mode. The established method was validated by determining the linearity (r(2) ≥ 0.9983), limit of quantification (0.16-300 ng/mL), precision (RSD ≤ 4.6%), average recovery (96.0-105.6%), repeatability (RSD ≤ 3.2%) and stability (RSD ≤ 4.5%). Then, the method was successfully applied to investigate the chemical composition variations owing to the interaction between the four component herbs of ZZDHD during the extraction process. It was found that different combinations of the herbs affect the extraction efficiency of chemical constituents in different ways. The validated LC-MS method provides a meaningful basis for quality control and further research on ZZDHD. Copyright © 2015 John Wiley & Sons, Ltd.

  6. Quantitative assessment of betamethasone dual-acting formulation in urine of patients with rheumatoid arthritis and ankylosing spondylitis after single-dose intramuscular administration and its application to long-term pharmacokinetic study.

    Science.gov (United States)

    Kopylov, Arthur T; Novikov, Alexander A; Kaysheva, Anna L; Vykhodets, Igor T; Karateev, Dmitry E; Zgoda, Victor G; Lisitsa, Andrey V

    2018-02-05

    Quantitative evaluation and assessment of pharmacokinetic parameters of Diprospan® (suspension for injection, 7 mg/mL (2 mg + 5 mg/mL) of betamethasone) were performed in urine samples taken from patients with rheumatoid arthritis or ankylosing spondylitis for 28 days after systemic intramuscular administration in routine clinical practice in an open-comparative prospective cohort study. The maximum betamethasone concentration was reached at day 4 of the follow-up; in some cases, the β-phase of elimination of the drug appeared at day 14 or at day 21 of the follow-up. The deferred β-phase elimination was likely a consequence of the physiological characteristics of the patients or of the influence of non-steroidal agents. The half-life of betamethasone was 8.5 days. The elimination rate constant was 2.49 h-1; the mean clearance was 4.72 L/d. The recommended interval between drug administrations, allowing complete elimination, was estimated at up to 48 days. The Mann-Whitney test showed no significant differences in pharmacokinetic characteristics between male and female subjects. A prolonged elimination phase was observed in patients with deviations in their body mass index, continual treatment with diclofenac and nimesulide or, possibly, after consuming alcohol. The study was registered at ClinicalTrials.gov with identifier NCT03119454. Copyright © 2017 Elsevier B.V. All rights reserved.
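
    For orientation, a terminal half-life such as the one reported above is conventionally obtained from the slope of the log-linear terminal phase. The sketch below uses invented concentration-time points, not the study's urine data.

```python
import numpy as np

def terminal_half_life(times_h, conc):
    """Half-life from log-linear regression of the terminal phase:
    kz = -slope of ln(C) versus t, and t1/2 = ln(2) / kz."""
    slope, _intercept = np.polyfit(times_h, np.log(conc), 1)
    kz = -slope
    return np.log(2) / kz, kz

# Illustrative terminal-phase points (sampling days converted to hours).
t_h = np.array([7.0, 14.0, 21.0, 28.0]) * 24.0
c = np.array([8.0, 4.6, 2.6, 1.5])                 # arbitrary units
t_half_h, kz = terminal_half_life(t_h, c)
print(round(t_half_h / 24.0, 1), "days")           # ~8.7 days for this profile
```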

  7. Development and Validation of A Bioanalytical Method to Quantitate Enzalutamide and its Active Metabolite N-Desmethylenzalutamide in Human Plasma: Application to Clinical Management of Metastatic Castration-Resistant Prostate Cancer Patients.

    Science.gov (United States)

    Benoist, Guillemette E; van der Meulen, Eric; van Oort, Inge M; Beumer, Jan Hendrik; Somford, Diederik M; Schalken, Jack A; Burger, David M; van Erp, Nielka P

    2018-02-05

    Enzalutamide is a potent androgen-signaling receptor inhibitor and is licensed for the treatment of metastatic castration-resistant prostate cancer. N-desmethylenzalutamide is the active metabolite of enzalutamide. A method to quantitate enzalutamide and its active metabolite was developed and validated according to the European Medicines Agency (EMA) guidelines. Enzalutamide and N-desmethylenzalutamide were extracted by protein precipitation, separated on a C18 column with gradient elution and analyzed with tandem quadrupole mass spectrometry in positive ion mode. A stable deuterated isotope (D6-enzalutamide) was used as an internal standard. The method was tested, and stability was studied, in real-life patients with metastatic castration-resistant prostate cancer treated with enzalutamide. The calibration curve covered the range of 500-50000 ng/mL. Within- and between-day precisions were <8% and accuracies were within 108% for both enzalutamide and N-desmethylenzalutamide. Precisions at the lower limit of quantification were <10% and accuracies within 116% for enzalutamide and N-desmethylenzalutamide. Enzalutamide and N-desmethylenzalutamide stability was proven for 24 hours for whole blood at ambient temperature, and 23 days for plasma at both ambient temperature and 2-8 °C. Long-term patient plasma stability was shown for 14 months at -40 °C. This bioanalytical method was successfully validated and applied to determine plasma concentrations of enzalutamide and N-desmethylenzalutamide in clinical studies and in routine patient care.

  8. A new method of separation and quantitation of mucus glycoprotein in rat gastric mucus gel layer and its application to mucus secretion induced by 16,16-dimethyl PGE2.

    Science.gov (United States)

    Komuro, Y; Ishihara, K; Ohara, S; Saigenji, K; Hotta, K

    1991-10-01

    A method was established for recovering the mucus gel layer of rat gastric mucosa without damage to the underlying surface epithelium. The mucus gel was solubilized by stirring the gastric mucosa in a solution of N-acetylcysteine (NAC), a mucolytic agent. Optimal mucus gel solubilization was possible by treatment with 2% NAC for 5 minutes at room temperature. Mucus glycoprotein was quantitatively extracted and measured from the mucus gel sample obtained by the NAC treatment. This treatment caused no damage to surface epithelial cells, as observed by light microscopy. Besides NAC, pronase solution was also adequate for solubilizing the mucus gel layer without any damage to the surface epithelium. However, extraction and measurement of mucus glycoprotein from the pronase-treated mucus gel sample was not possible due to contamination by high-molecular hexose-containing substances which were eluted along with the mucus glycoprotein from the column of Bio-Gel A-1.5m. This NAC method was used to examine changes in mucus glycoprotein content in the mucus gel at one hour following the oral administration of 16,16-dimethyl prostaglandin E2. A significant increase in mucus glycoprotein of the gel was brought about by the prostaglandin treatment. Thus, the present method was suitable for estimating the amount of mucus secreted into the mucus gel layer.

  9. Quantitative cell biology: the essential role of theory.

    Science.gov (United States)

    Howard, Jonathon

    2014-11-05

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms, when the theory works, or to new discoveries, when it does not. © 2014 Howard. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  10. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials.

    Science.gov (United States)

    Dalecki, Diane; Mercado, Karla P; Hocking, Denise C

    2016-03-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering.

  11. Development and validation of a highly sensitive and robust LC-MS/MS with electrospray ionization method for simultaneous quantitation of itraconazole and hydroxyitraconazole in human plasma: application to a bioequivalence study.

    Science.gov (United States)

    Bharathi, D Vijaya; Hotha, Kishore Kumar; Sagar, P V Vidya; Kumar, Sanagapati Sirish; Reddy, Pandu Ranga; Naidu, A; Mullangi, Ramesh

    2008-06-01

    A highly sensitive and specific LC-MS/MS method has been developed for simultaneous estimation of itraconazole (ITZ) and hydroxyitraconazole (OH-ITZ) from 500 microL of human plasma using fluconazole as an internal standard (IS). The API-4000 LC-MS/MS was operated in multiple reaction monitoring (MRM) mode using the electrospray ionization technique. A solid-phase extraction process was used to extract ITZ, OH-ITZ and IS from human plasma. The total run time was 3.0 min and the elution of ITZ, OH-ITZ and IS occurred at 2.08 min, 1.85 min and 1.29 min, respectively; this was achieved with a mobile phase consisting of 0.2% (v/v) ammonia solution:acetonitrile (20:80, v/v) at a flow rate of 0.50 mL/min on a HyPurity C(18) (50 mm x 4.6 mm, 5 microm) column. The developed method was validated in human plasma with a lower limit of quantitation of 0.50 ng/mL for both ITZ and OH-ITZ. A linear response function was established for the range of concentrations 0.5-263 ng/mL (r>0.998) for both ITZ and OH-ITZ. The intra- and inter-day precision values for ITZ and OH-ITZ met the acceptance criteria per FDA guidelines. ITZ and OH-ITZ were stable in the battery of stability studies, viz., bench-top, auto-sampler, dry extract and freeze/thaw cycles. The developed assay method was applied to an oral bioequivalence study in humans.
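
    A linear calibration of the kind reported above (0.5-263 ng/mL, r > 0.998) is typically a weighted least-squares fit of the analyte/IS response ratio. The sketch below uses invented response values spanning a similar range, purely to show the fit and the back-calculation step; it is not the study's data or software.

```python
import numpy as np

def fit_calibration(conc, response):
    """Weighted (1/x^2) least-squares calibration line response = a*conc + b;
    np.polyfit expects 1/sigma weights, hence w = 1/conc."""
    a, b = np.polyfit(conc, response, 1, w=1.0 / conc)
    return a, b

def back_calculate(response, a, b):
    """Back-calculated concentration from a measured response ratio."""
    return (response - b) / a

conc = np.array([0.5, 1.0, 5.0, 25.0, 100.0, 263.0])        # ng/mL (illustrative)
ratio = np.array([0.012, 0.024, 0.118, 0.60, 2.41, 6.33])   # analyte/IS area ratio
a, b = fit_calibration(conc, ratio)
print(np.round(back_calculate(ratio, a, b) / conc * 100, 1))  # % of nominal per level
```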

  12. Correlation of Quantitative PCR for a Poultry-Specific Brevibacterium Marker Gene with Bacterial and Chemical Indicators of Water Pollution in a Watershed Impacted by Land Application of Poultry Litter

    Science.gov (United States)

    Weidhaas, Jennifer L.; Macbeth, Tamzen W.; Olsen, Roger L.; Harwood, Valerie J.

    2011-01-01

    The impact of fecal contamination from human and agricultural animal waste on water quality is a major public health concern. Identification of the dominant source(s) of fecal pollution in a watershed is necessary for assessing the safety of recreational water and protecting water resources. A field study was conducted using quantitative PCR (qPCR) for the 16S rRNA gene of Brevibacterium sp. LA35 to track feces-contaminated poultry litter in environmental samples. Based on sensitivity and specificity characteristics of the qPCR method, the Bayesian conditional probability that detection of the LA35 marker gene in a water sample represented a true-positive result was 93%. The marker's covariance with fecal indicator bacteria (FIB) and metals associated with poultry litter was also assessed in litter, runoff, surface water, and groundwater samples. LA35 was detected in water and soil samples collected throughout the watershed, and its concentration covaried with concentrations of Escherichia coli, enterococci, As, Cu, P, and Zn. Significantly greater concentrations of FIB, As, Cu, P, and Zn were observed in edge-of-field runoff samples in which LA35 was detected, compared to samples in which it was not detected. Furthermore, As, Cu, P, and Zn concentrations covaried in environmental samples in which LA35 was detected and typically did not in samples in which the marker gene was not detected. The covariance of the poultry-specific LA35 marker gene with these known contaminants from poultry feces provides further evidence that it is a useful tool for assessing the impact of poultry-derived fecal pollution in environmental waters. PMID:21278274
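
    The 93% figure quoted above is a Bayesian conditional (posterior) probability. A calculation of this form combines assay sensitivity, specificity and a prior probability of contamination; the generic version below uses invented inputs and only reproduces the form of the calculation, not the study's estimates.

```python
def prob_true_positive(sensitivity, specificity, prior):
    """P(source present | positive qPCR) by Bayes' rule:
    se*p / (se*p + (1 - sp)*(1 - p))."""
    return (sensitivity * prior /
            (sensitivity * prior + (1.0 - specificity) * (1.0 - prior)))

# Illustrative assay characteristics and prior, not the study's values.
print(round(prob_true_positive(0.90, 0.95, 0.40), 2))   # 0.92
```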

  13. An LC-MS/MS method for quantitation of cyanidin-3-O-glucoside in rat plasma: Application to a comparative pharmacokinetic study in normal and streptozotocin-induced diabetic rats.

    Science.gov (United States)

    Yang, Chunxia; Wang, Qiuhua; Yang, Shenbao; Yang, Qiong; Wei, Ying

    2018-02-01

    A sensitive and reliable liquid chromatography tandem mass spectrometry (LC-MS/MS) method was developed to determine cyanidin-3-O-glucoside (Cy-3G) in normal and streptozotocin-induced diabetic rat plasma. Chromatographic separation was carried out on a Zorbax SB-C18 (50 × 4.6 mm, 5 μm) column and mass spectrometric analysis was performed using a Thermo Finnigan TSQ Quantum Ultra triple-quadrupole mass spectrometer coupled with an ESI source in the negative ion mode. Selected reaction monitoring mode was applied for quantification using target fragment ions m/z 447.3 → 285.2 for Cy-3G and m/z 463.0 → 300.1 for quercetin-3-O-glucoside (internal standard). The calibration curve was linear over the range 3.00-2700 ng/mL (r2  ≥ 0.99) with the lower limit of quantitation at 3.00 ng/mL. Intra- and inter-day precision was <14.5% and mean accuracy was from -11.5 to 13.6%. Stability testing showed that Cy-3G remained stable during the whole analytical procedure. After validation, the assay was successfully used to support a preclinical pharmacokinetic comparison of Cy-3G between normal and diabetic rats. Results indicated that diabetes mellitus significantly altered the in vivo pharmacokinetic characteristics of Cy-3G after oral administration in rats. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Adaptation of a quantitative programme for the X-ray analysis of solubilized tissue as microdroplets in the transmission electron microscope: application to the moult cycle of the shrimp Crangon crangon (L).

    Science.gov (United States)

    Nott, J A; Mavin, L J

    1986-09-01

    A quantitative programme for X-ray microanalysis is used in a non-standard manner on solubilized tissue which has been spiked with cobalt and sprayed as microdroplets on electron microscope grids. During the procedure the count time and the concentration of cobalt are related to the peak integral and, from the relative efficiencies, the concentrations of other elements are computed from the peak integrals. Absorption is taken into account but the X-ray background is not used to estimate the total mass and the beam current is not measured. The method is applied to the hepatopancreas and blood from individual shrimps, Crangon crangon, to give the concentrations of sodium, magnesium, silicon, phosphorus, sulphur, potassium and calcium at different stages of the moult cycle. In the hepatopancreas the absolute and relative quantities of phosphorus, sulphur and other elements change in phase with the moult cycle. This situation must be linked with fluctuations in levels of metabolic activity and may affect the metal-binding capacity of the tissue, which is known to fluctuate. The hepatopancreas accumulates lipid and phosphorus during the intermoult period, but the level of phospholipid phosphorus remains a constant proportion of the tissue wet weight. The gland does not store calcium for hardening the new exoskeleton after ecdysis. Magnesium is a more important and variable component and could be linked with metabolic activity. The blood composition remains more stable. However, sulphur concentration is high and variable and this may, to some extent, reflect changes in the concentration of taurine. The concentration of copper increases towards the end of the moult cycle and decreases during moulting; opposite changes occur in the hepatopancreas.
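
    The quantitation scheme described above ties element peak integrals to the known cobalt spike through relative detection efficiencies. A stripped-down version of that ratio calculation is sketched below with invented numbers; it is not the programme used in the study.

```python
def element_concentration(peak_element, peak_cobalt, conc_cobalt, rel_efficiency):
    """Concentration of an element in a Co-spiked microdroplet from peak
    integrals: c_x = c_Co * (I_x / I_Co) / eff_x, where eff_x is the detection
    efficiency of the element's line relative to the cobalt line."""
    return conc_cobalt * (peak_element / peak_cobalt) / rel_efficiency

# Illustrative: 5 mmol/L Co spike, K peak 2.4x the Co peak, K detected 1.1x
# as efficiently as Co -> about 10.9 mmol/L potassium.
print(round(element_concentration(2400, 1000, 5.0, 1.1), 1))
```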

  15. Correlation of quantitative PCR for a poultry-specific brevibacterium marker gene with bacterial and chemical indicators of water pollution in a watershed impacted by land application of poultry litter.

    Science.gov (United States)

    Weidhaas, Jennifer L; Macbeth, Tamzen W; Olsen, Roger L; Harwood, Valerie J

    2011-03-01

    The impact of fecal contamination from human and agricultural animal waste on water quality is a major public health concern. Identification of the dominant source(s) of fecal pollution in a watershed is necessary for assessing the safety of recreational water and protecting water resources. A field study was conducted using quantitative PCR (qPCR) for the 16S rRNA gene of Brevibacterium sp. LA35 to track feces-contaminated poultry litter in environmental samples. Based on sensitivity and specificity characteristics of the qPCR method, the Bayesian conditional probability that detection of the LA35 marker gene in a water sample represented a true-positive result was 93%. The marker's covariance with fecal indicator bacteria (FIB) and metals associated with poultry litter was also assessed in litter, runoff, surface water, and groundwater samples. LA35 was detected in water and soil samples collected throughout the watershed, and its concentration covaried with concentrations of Escherichia coli, enterococci, As, Cu, P, and Zn. Significantly greater concentrations of FIB, As, Cu, P, and Zn were observed in edge-of-field runoff samples in which LA35 was detected, compared to samples in which it was not detected. Furthermore, As, Cu, P, and Zn concentrations covaried in environmental samples in which LA35 was detected and typically did not in samples in which the marker gene was not detected. The covariance of the poultry-specific LA35 marker gene with these known contaminants from poultry feces provides further evidence that it is a useful tool for assessing the impact of poultry-derived fecal pollution in environmental waters.

  16. Application of Ultra-High-Performance Liquid Chromatography Coupled with LTQ-Orbitrap Mass Spectrometry for the Qualitative and Quantitative Analysis of Polygonum multiflorum Thumb. and Its Processed Products.

    Science.gov (United States)

    Wang, Teng-Hua; Zhang, Jing; Qiu, Xiao-Hui; Bai, Jun-Qi; Gao, You-Heng; Xu, Wen

    2015-12-26

    In order to quickly and simultaneously obtain the chemical profiles and control the quality of the root of Polygonum multiflorum Thumb. and its processed form, a rapid qualitative and quantitative method, using ultra-high-performance liquid chromatography coupled with electrospray ionization-linear ion trap-Orbitrap hybrid mass spectrometry (UHPLC-LTQ-Orbitrap MS(n)) has been developed. The analysis was performed within 10 min on an AcQuity UPLC™ BEH C18 column with a gradient elution of 0.1% formic acid-acetonitrile at a flow rate of 400 μL/min. According to the fragmentation mechanism and high resolution MS(n) data, a diagnostic ion searching strategy was used for rapid and tentative identification of main phenolic components and 23 compounds were simultaneously identified or tentatively characterized. The differences in chemical profiles between P. multiflorum and its processed preparation were observed by comparing the ion abundances of the main constituents in the MS spectra, and significant changes of eight metabolite biomarkers were detected in the P. multiflorum samples and their preparations. In addition, four of the representative phenols, namely gallic acid, trans-2,3,5,4'-tetra-hydroxystilbene-2-O-β-d-glucopyranoside, emodin and emodin-8-O-β-d-glucopyranoside were quantified by the validated UHPLC-MS/MS method. These phenols are considered to be major bioactive constituents in P. multiflorum, and are generally regarded as the index for quality assessment of this herb. The method was successfully used to quantify 10 batches of P. multiflorum and 10 batches of processed P. multiflorum. The results demonstrated that the method is simple, rapid, and suitable for the discrimination and quality control of this traditional Chinese herb.

  17. Application of Ultra-High-Performance Liquid Chromatography Coupled with LTQ-Orbitrap Mass Spectrometry for the Qualitative and Quantitative Analysis of Polygonum multiflorum Thumb. and Its Processed Products

    Directory of Open Access Journals (Sweden)

    Teng-Hua Wang

    2015-12-01

    Full Text Available In order to quickly and simultaneously obtain the chemical profiles and control the quality of the root of Polygonum multiflorum Thumb. and its processed form, a rapid qualitative and quantitative method, using ultra-high-performance liquid chromatography coupled with electrospray ionization-linear ion trap-Orbitrap hybrid mass spectrometry (UHPLC-LTQ-Orbitrap MSn) has been developed. The analysis was performed within 10 min on an AcQuity UPLC™ BEH C18 column with a gradient elution of 0.1% formic acid-acetonitrile at a flow rate of 400 μL/min. According to the fragmentation mechanism and high resolution MSn data, a diagnostic ion searching strategy was used for rapid and tentative identification of main phenolic components and 23 compounds were simultaneously identified or tentatively characterized. The differences in chemical profiles between P. multiflorum and its processed preparation were observed by comparing the ion abundances of the main constituents in the MS spectra, and significant changes of eight metabolite biomarkers were detected in the P. multiflorum samples and their preparations. In addition, four of the representative phenols, namely gallic acid, trans-2,3,5,4′-tetra-hydroxystilbene-2-O-β-d-glucopyranoside, emodin and emodin-8-O-β-d-glucopyranoside were quantified by the validated UHPLC-MS/MS method. These phenols are considered to be major bioactive constituents in P. multiflorum, and are generally regarded as the index for quality assessment of this herb. The method was successfully used to quantify 10 batches of P. multiflorum and 10 batches of processed P. multiflorum. The results demonstrated that the method is simple, rapid, and suitable for the discrimination and quality control of this traditional Chinese herb.

  18. Quantitative sociodynamics stochastic methods and models of social interaction processes

    CERN Document Server

    Helbing, Dirk

    1995-01-01

    Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioural changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics but they have very often proved their explanatory power in chemistry, biology, economics and the social sciences. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces the most important concepts from nonlinear dynamics (synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches a very fundamental dynamic model is obtained which seems to open new perspectives in the social sciences. It includes many established models as special cases, e.g. the log...

  19. Quantitative Sociodynamics Stochastic Methods and Models of Social Interaction Processes

    CERN Document Server

    Helbing, Dirk

    2010-01-01

    This new edition of Quantitative Sociodynamics presents a general strategy for interdisciplinary model building and its application to a quantitative description of behavioral changes based on social interaction processes. Originally, the crucial methods for the modeling of complex systems (stochastic methods and nonlinear dynamics) were developed in physics and mathematics, but they have very often proven their explanatory power in chemistry, biology, economics and the social sciences as well. Quantitative Sociodynamics provides a unified and comprehensive overview of the different stochastic methods, their interrelations and properties. In addition, it introduces important concepts from nonlinear dynamics (e.g. synergetics, chaos theory). The applicability of these fascinating concepts to social phenomena is carefully discussed. By incorporating decision-theoretical approaches, a fundamental dynamic model is obtained, which opens new perspectives in the social sciences. It includes many established models a...

  20. Quantitative imaging methods in osteoporosis.

    Science.gov (United States)

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.
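
    For context on the DXA-derived BMD mentioned above, clinical reporting converts BMD into a T-score against a young-adult reference population. The sketch below uses invented reference values, not any scanner's actual reference database.

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """DXA T-score: standard deviations of the measured BMD from the
    young-adult reference mean (T <= -2.5 is the WHO osteoporosis threshold)."""
    return (bmd - young_adult_mean) / young_adult_sd

# Illustrative lumbar-spine reference values in g/cm^2.
print(round(t_score(0.85, 1.05, 0.12), 1))   # -1.7, i.e. the osteopenic range
```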

  1. Application of a Low-Level Laser Therapy and the Purified Protein from Natural Latex (Hevea brasiliensis in the Controlled Crush Injury of the Sciatic Nerve of Rats: A Morphological, Quantitative, and Ultrastructural Study

    Directory of Open Access Journals (Sweden)

    Fernando José Dias

    2013-01-01

    Full Text Available This study analyzed the effects of low-level laser therapy (LLLT, 15 J/cm2, 780 nm wavelength) and the natural latex protein (P1, 0.1%) on the sciatic nerve after crush injury (15 Kgf, axonotmesis) in rats. Sixty rats (male, 250 g) were allocated into 6 groups (n=10): CG—control group; EG—nerve exposed; IG—injured nerve without treatment; LG—crushed nerve treated with LLLT; PG—injured nerve treated with P1; and LPG—injured nerve treated with LLLT and P1. After 4 or 8 weeks, the nerve samples were processed for morphological, histological quantification and ultrastructural analysis. After 4 weeks, the myelin density and morphological characteristics improved in groups LG, PG, and LPG compared to IG. After 8 weeks, PG and LPG were similar to CG, and the capillary density was higher in LG, PG, and LPG. In the ultrastructural analysis, PG and LPG had characteristics similar to CG. The application of LLLT and/or P1 improved recovery from the nerve crush injury; in the long term, the P1 protein was the more effective treatment, since LLLT alone did not achieve the same results and the two treatments applied together did not potentiate the recovery.

  2. Estimation des quantités d'émissions azotées et des courbes de coût marginal d'épuration associées dans les secteurs et les régions du bassin d'un cours d'eau : une application pour le bassin rhénan

    OpenAIRE

    Saulnier, J.

    2008-01-01

    In this article, we address questions related to the estimation of nitrogen emission quantities and of marginal abatement cost curves within a river basin. The application and the empirical calculations are carried out for the activity sectors and the regions of the Rhine basin. We first review the environmental objectives formulated by the International Commission for the Protection of the Rhine. The integration of existing regulatory constraints...

  3. Evaluation of postprocessing dual-energy quantitative computed tomography

    NARCIS (Netherlands)

    C. van Kuijk (Cornelis)

    1991-01-01

    CT scanners can be used to provide quantitative information on body composition. Its main application is for bone mineral content estimation within the lumbar vertebral body. This is usually done with a single-energy technique. The estimates obtained with this technique are influenced

  4. Quantitative Indicators for Behaviour Drift Detection from Home Automation Data.

    Science.gov (United States)

    Veronese, Fabio; Masciadri, Andrea; Comai, Sara; Matteucci, Matteo; Salice, Fabio

    2017-01-01

    The diffusion of Smart Homes provides an opportunity to implement elderly monitoring, extending seniors' independence and avoiding unnecessary assistance costs. Information concerning the inhabitant's behaviour is contained in home automation data and can be extracted by means of quantitative indicators. The application of such an approach shows that it can reveal behaviour changes.

  5. Quantitative Live Cell FLIM Imaging in Three Dimensions.

    Science.gov (United States)

    Le Marois, Alix; Suhling, Klaus

    2017-01-01

    In this chapter, the concept of fluorescence lifetime and its utility in quantitative live cell imaging will be introduced, along with methods to record and analyze FLIM data. Relevant applications in 3D tissue and live cell imaging, including multiplexed FLIM detection, will also be detailed.

  6. A potential quantitative method for assessing individual tree performance

    Science.gov (United States)

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  7. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…
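
    The footprint arithmetic behind such an exercise is simple activity-times-emission-factor bookkeeping. The sketch below illustrates the idea in Python; the activity categories and emission factors are assumed placeholder values, not figures from the exercise described above.

        # Minimal sketch of a carbon-footprint calculation of the kind described above.
        # The activity categories and emission factors are illustrative placeholders.

        EMISSION_FACTORS = {          # kg CO2e per unit of activity (assumed values)
            "car_km": 0.19,           # per km driven
            "electricity_kwh": 0.45,  # per kWh consumed
            "flight_km": 0.15,        # per km flown
            "beef_kg": 27.0,          # per kg of beef eaten
        }

        def carbon_footprint(activities):
            """Sum activity amounts times their emission factors (kg CO2e)."""
            return sum(EMISSION_FACTORS[k] * amount for k, amount in activities.items())

        weekly_activities = {"car_km": 120, "electricity_kwh": 35, "flight_km": 0, "beef_kg": 0.5}
        baseline = carbon_footprint(weekly_activities)

        # Evaluate one behaviour choice: halving car travel.
        changed = dict(weekly_activities, car_km=60)
        print(f"baseline: {baseline:.1f} kg CO2e/week, with less driving: {carbon_footprint(changed):.1f}")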

  8. Factors Influencing Students' Perceptions of Their Quantitative Skills

    Science.gov (United States)

    Matthews, Kelly E.; Hodgson, Yvonne; Varsavsky, Cristina

    2013-01-01

    There is international agreement that quantitative skills (QS) are an essential graduate competence in science. QS refer to the application of mathematical and statistical thinking and reasoning in science. This study reports on the use of the Science Students Skills Inventory to capture final year science students' perceptions of their QS across…

  9. Quantitative evaluation of the enamel caries which were treated with ...

    African Journals Online (AJOL)

    Objectives: The aim of this in vivo study was to quantitatively evaluate the remineralization of the enamel caries on smooth and occlusal surfaces using DIAGNOdent, after daily application of casein phosphopeptide‑amorphous calcium fluoride phosphate (CPP‑ACFP). Materials and Methods: Thirty volunteers, aged 18–30 ...

  10. An introduction to quantitative remote sensing. [data processing

    Science.gov (United States)

    Lindenlaub, J. C.; Russell, J.

    1974-01-01

    The quantitative approach to remote sensing is discussed along with the analysis of remote sensing data. Emphasis is placed on the application of pattern recognition in numerically oriented remote sensing systems. A common background and orientation for users of the LARS computer software system is provided.

  11. Quantitative effects of phosphorus on maize canopy photosynthesis and biomass

    Science.gov (United States)

    Simulation models for maize can assess the uptake and utilization of nitrogen (N) and phosphorus (P) and help in better managing application rates to improve nutrient use efficiency. Quantitative data, however, are needed to develop and test these models. The purpose of this experiment was to quan...

  12. Development of an UPLC-MS/MS method for simultaneous quantitation of 11 d-amino acids in different regions of rat brain: Application to a study on the associations of d-amino acid concentration changes and Alzheimer's disease.

    Science.gov (United States)

    Li, Zhe; Xing, Yuping; Guo, Xingjie; Cui, Yan

    2017-07-15

    There are significant differences in d-amino acid concentrations between healthy people and Alzheimer's disease patients. In order to investigate the potential correlation between d-amino acids and Alzheimer's disease, a simple and sensitive ultra-high-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) method has been developed. The method was applied to the simultaneous determination of 11 d-amino acids in different regions of rat brain. Rat brain homogenates were first pretreated with a protein precipitation procedure and then derivatized with (S)-N-(4-nitrophenoxycarbonyl) phenylalanine methoxyethyl ester [(S)-NIFE]. Baseline separation of the derivatives was achieved on an ACQUITY UPLC BEH C18 column (2.1 mm × 50 mm, 1.7 μm). The mobile phase consisted of acetonitrile and water (containing 8 mM ammonium hydrogen carbonate) and the flow rate was 0.6 mL min-1. The derivatized analytes were sensitively detected by multiple reaction monitoring in the positive ion mode. The lower limits of quantitation ranged from 0.06 to 10 ng mL-1 with excellent linearity (r ≥ 0.9909). The intra- and inter-day RSD were in the range of 3.6-12% and 5.7-12%, respectively. The recovery rate was 82.5%-95.3%. With this UPLC-MS/MS method, the 11 d-amino acids in hippocampus, cerebral cortex, olfactory bulb and cerebellum from Alzheimer's disease rats and age-matched controls could be simultaneously determined. Compared with the normal controls, the concentrations of d-serine, d-alanine, d-leucine and d-proline in the hippocampus and cerebral cortex of Alzheimer's disease rat brain were significantly decreased, while no differences were observed for any of the d-amino acids in the olfactory bulb and cerebellum. The different amounts and distribution of d-amino acids in the brain between the two groups, regulated by the particular pathological changes of Alzheimer's disease, give new insights for further study of the neuropathogenesis and may provide novel therapeutic targets of Alzheimer's disease.
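
    As an illustration of the kind of back-calculation such a validated method relies on, the sketch below fits a linear calibration curve and converts a measured peak-area ratio into a concentration. The standard concentrations and responses are invented, and the unweighted fit is a simplification of typical bioanalytical practice.

        # Generic calibration-curve quantitation sketch (not the authors' processing pipeline).
        # Concentrations and peak-area ratios are invented illustrative numbers.
        import numpy as np

        std_conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])              # ng/mL of a d-amino acid standard
        std_response = np.array([0.021, 0.105, 0.198, 1.02, 1.99])   # analyte/internal-standard area ratio

        slope, intercept = np.polyfit(std_conc, std_response, 1)     # plain fit; weighted fits are common in practice
        r = np.corrcoef(std_conc, std_response)[0, 1]

        def back_calculate(response):
            """Convert a measured peak-area ratio into concentration (ng/mL)."""
            return (response - intercept) / slope

        print(f"r = {r:.4f}")
        print(f"sample at response 0.45 -> {back_calculate(0.45):.2f} ng/mL")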

  13. Quantitative sexing (Q-Sexing) and relative quantitative sexing (RQ ...

    African Journals Online (AJOL)

    samer

    Here we report a mammal sexing procedure based on the detection of quantitative differences between.

  14. Quantitative Easing and Bank Lending

    OpenAIRE

    Assenga-Amara, Edgar

    2015-01-01

    There is a growing body of literature analysing the effects of quantitative easing (QE), especially in the wake of the 2008 financial crisis, as large asset purchase programmes were implemented in the USA and the UK. Traditionally, studies analysing the effects of QE have tended to focus on its impact on the financial markets as a whole or on interest rates. However, the manner in which quantitative easing works puts banks at the centre of how it operates. Therefore it is important to...

  15. The effect of plant growth promoting rhizobacteria (PGPR on quantitative and qualitative characteristics of Sesamum indicum L. with application of cover crops of Lathyrus sp. and Persian clover (Trifolium resopinatum L.

    Directory of Open Access Journals (Sweden)

    M. Jahan

    2016-05-01

    Full Text Available Cover crop cultivation and the application of plant growth-promoting rhizobacteria are key factors in enhancing agroecosystem health. A field experiment was conducted at the Research Farm of the Faculty of Agriculture, Ferdowsi University of Mashhad, Iran, during the 2009-2010 growing season. A split-plot arrangement based on a complete randomized block design with three replications was used. Cultivation and no cultivation of Lathyrus sp. and Persian clover (Trifolium resopinatum) in autumn were assigned to the main plots. The sub-plot factor consisted of three different types of biofertilizers plus a control: (1) nitroxin (containing Azotobacter sp. and Azospirillum sp.), (2) phosphate-solubilizing bacteria (PSB, containing Bacillus sp. and Pseudomonas sp.), (3) biosulfur (containing Thiobacillus spp.) and (4) control (no fertilizer). The results showed that the effect of cover crops on seed number and seed weight per plant and on biological and seed yield was significant, with seed yield increasing by 9%. In general, the biofertilizers were superior to the control for most of the studied traits. Nitroxin, PSB and biosulfur increased biological yield by 44, 28 and 26% compared to the control, respectively. Cover crop and biofertilizer interactions showed a significant effect on all studied traits; the highest and lowest harvest indices resulted from cover crops combined with biofertilizers (22.1%) and from cultivation or no cultivation of cover crops combined with the control (15.3%), respectively. The highest seed oil and protein contents resulted from cover crops plus biofertilizers (42.4%) and cover crops plus PSB (22.5%), respectively. In general, the results showed that cover crop cultivation in combination with biofertilizer application could be an ecological alternative to chemical fertilizers, in addition to achieving the advantages of cover crops. According to the results, it should be possible to design an ecological cropping system and produce appropriate and healthy

  16. EFEITOS QUALITATIVO E QUANTITATIVO DE APLICAÇÃO DO ZINCO NO CAPIM TANZÂNIA-1 QUALITATIVE AND QUANTITATIVE EFFECTS OF THE ZINC SULPHATE APPLICATION ON TANZÂNIA-1 GRASS

    Directory of Open Access Journals (Sweden)

    Cideon Donizete de Faria

    2007-09-01

    latosol, with the objective of evaluating the effect of doses of 0, 10, 20, 40 and 80 kg ha-1 of zinc sulphate on the productivity, quality and leaf chemical composition of Tanzânia-1 grass. The soil was prepared with a heavy harrow at the beginning of the rainy season. As basic fertilization, 20 kg of N, 50 kg of P2O5 and 30 kg of K2O ha-1 were applied as ammonium sulphate, commercial Yoorin and potassium chloride, respectively. The plant height and number of budding were evaluated at 60 days after germination. Green mass, dry matter, gross protein, fiber in neutral detergent and leaf mineral nutrients were determined just after the crop. Although not significant, the dose of 20 kg ha-1 of zinc sulphate influenced the forage produced both qualitatively and quantitatively.

    KEY-WORDS: Pasture; soil; fertility; budding; dry matter.

  17. EFEITOS QUALITATIVO E QUANTITATIVO DA APLICAÇÃO DE FÓSFORO NO CAPIM TANZÂNIA-1 QUALITATIVE AND QUANTITATIVE EFFECTS OF PHOSPHORUS APPLICATION ON TANZÂNIA-1 GRASS

    Directory of Open Access Journals (Sweden)

    Renato Sérgio Mota dos Santos

    2007-09-01

    Full Text Available

    The correction of soil phosphorus deficiencies is considered indispensable for raising the animal carrying capacity of a pasture. To determine the qualitative and quantitative effect of phosphorus on the forage Tanzânia-1, an experiment was carried out on a dark-red latosol in Santo Antônio de Goiás, in the State of Goiás. The soil was prepared with one harrowing and one ploughing and, one week later, sowing was carried out. The treatments included 0, 50 and 100 kg of P ha-1 of commercial thermophosphate, applied as topdressing, on land previously amended with 3 t ha-1 of dolomitic limestone. With increasing phosphorus, the mean values of plant height, number of tillers, green mass and dry matter increased, while crude protein contents decreased and fiber concentrations increased. Phosphorus did not affect the contents of potassium, calcium, zinc and manganese in the leaf tissue, but it reduced the phosphorus, copper and iron contents and increased the magnesium contents.

    KEY-WORDS: Cerrado; animal nutrition; Panicum maximum; pasture.


  18. Quantitative reactive modeling and verification.

    Science.gov (United States)

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  19. Time-resolved quantitative phosphoproteomics

    DEFF Research Database (Denmark)

    Verano-Braga, Thiago; Schwämmle, Veit; Sylvester, Marc

    2012-01-01

    proteins involved in the Ang-(1-7) signaling, we performed a mass spectrometry-based time-resolved quantitative phosphoproteome study of human aortic endothelial cells (HAEC) treated with Ang-(1-7). We identified 1288 unique phosphosites on 699 different proteins with 99% certainty of correct peptide...

  20. Compositional and Quantitative Model Checking

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand

    2010-01-01

    This paper gives a survey of a compositional model checking methodology and its successful instantiation to the model checking of networks of finite-state, timed, hybrid and probabilistic systems with respect to suitable quantitative versions of the modal mu-calculus [Koz82]. The method is based...

  1. Quantitative Characterisation of Surface Texture

    DEFF Research Database (Denmark)

    De Chiffre, Leonardo; Lonardo, P.M.; Trumpold, H.

    2000-01-01

    This paper reviews the different methods used to give a quantitative characterisation of surface texture. The paper contains a review of conventional 2D as well as 3D roughness parameters, with particular emphasis on recent international standards and developments. It presents new texture...

  2. La quantité en islandais moderne (Quantity in Modern Icelandic)

    Directory of Open Access Journals (Sweden)

    Magnús Pétursson

    1978-12-01

    Full Text Available The phonetic realization of quantity in stressed syllables in the reading of two continuous texts. The problem of quantity is one of the most studied problems in the phonology of Modern Icelandic. From the phonological point of view it seems that nothing new can be expected, the theoretical possibilities having been practically exhausted, as we recalled in our recent study (Pétursson 1978, pp. 76-78). The most unexpected result of the research of recent years is without doubt the discovery of a quantitative differentiation between the North and the South of Iceland (Pétursson 1976a). It is nevertheless still premature to speak of true quantitative zones, since neither their limits nor their geographical extent are yet known.

  3. Protein Quantitation Using Mass Spectrometry

    Science.gov (United States)

    Zhang, Guoan; Ueberheide, Beatrix M.; Waldemarson, Sofia; Myung, Sunnie; Molloy, Kelly; Eriksson, Jan; Chait, Brian T.; Neubert, Thomas A.; Fenyö, David

    2013-01-01

    Mass spectrometry is a method of choice for quantifying low-abundance proteins and peptides in many biological studies. Here, we describe a range of computational aspects of protein and peptide quantitation, including methods for finding and integrating mass spectrometric peptide peaks, and detecting interference to obtain a robust measure of the amount of proteins present in samples. PMID:20835801
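
    One of the computational steps mentioned above, integrating a peptide peak from an extracted-ion chromatogram, can be sketched as follows. The retention times and intensities are synthetic, and real pipelines add peak detection, baseline modeling and interference checks.

        # Sketch of peptide peak integration from an extracted-ion chromatogram.
        # Retention times and intensities are invented illustrative values.
        import numpy as np

        rt = np.array([20.0, 20.1, 20.2, 20.3, 20.4, 20.5, 20.6])        # retention time (min)
        intensity = np.array([1e3, 5e3, 2.1e4, 4.0e4, 2.3e4, 6e3, 1.2e3])  # ion counts

        baseline = np.minimum(intensity[0], intensity[-1])                # crude flat baseline estimate
        peak_area = np.trapz(np.clip(intensity - baseline, 0, None), rt)  # trapezoidal integration
        print(f"integrated peak area: {peak_area:.3g}")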

  4. Equilibria in Quantitative Reachability Games

    Science.gov (United States)

    Brihaye, Thomas; Bruyère, Véronique; de Pril, Julie

    In this paper, we study turn-based quantitative multiplayer non-zero-sum games played on finite graphs with reachability objectives. In this framework each player aims at reaching his own goal as soon as possible. We prove the existence of finite-memory Nash (resp. secure) equilibria in multiplayer (resp. two-player) games.

  5. Quantitative analysis of human behavior.

    Science.gov (United States)

    Iacovitti, G

    2010-01-01

    Many aspects of individual as well as social behaviours of human beings can be analyzed in a quantitative way using typical scientific methods, based on empirical measurements and mathematical inference. Measurements are made possible today by the large variety of sensing devices, while formal models are synthesized using modern system and information theories.

  6. Group Affiliation and Quantitative Sociolinguistics.

    Science.gov (United States)

    Rampton, M. B. H.

    Five prominent issues in quantitative sociolinguistic discussions of speaker classification are summarized and discussed, and a case study that attempts to extend the available methodology is examined. The five issues include the following: (1) to what extent are speaker categories emic or etic? (2) do speaker categories encode local inter-speaker…

  7. Some experiments on the high-low transition of quartz; Recherches experimentales sur une transformation du quartz

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, G. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1959-12-15

    First section. - We expose on the one hand a theory of specific heat, thermal expansion and variations of elastic constants as functions of temperature, which is applicable only in the absence of transformation phenomena affecting the symmetry or periodicity of the crystal lattice. On the other hand, we discuss some theories relative to the phenomena which accompany phase transformations. Second section. - We have gathered together numerical results concerning elastic, piezoelectric and optical properties of quartz. Some have been collected from the literature, others have been obtained in our laboratories with the help of experimental methods which we describe. As a result, we are able to present a complete picture of the evolution of these constants in a large temperature range containing the critical temperature of 574 deg. C at which these constants exhibit discontinuities. New phenomena have been observed in the course of these studies. Third section. - We show that the evolution of the two piezoelectric and elastic constants which cancel out in the high-temperature form is described by the same function. With the inclusion of one other function, it is possible to explain quantitatively the behaviour in the transformation range of all the other constants under study. With the help of crystallographic considerations and of hypotheses concerning the nature of the transformation entropy, we finally try to account for the experimental values of these two functions. (author)

  8. Quantitative imaging with PET. Performance and applications of 76Br, 52Fe, 110mIn and 134La

    CERN Document Server

    Lubberink, M

    2001-01-01

    count rates are considerably lower for 76Br than for 18F at clinically relevant radioactivity concentrations. A method to correct 52Fe patient data for the contribution of the decay daughter 52mMn is discussed. The use of 110mIn is evaluated in a patient study and compared to SPECT imaging with 111In. A dosimetric and PET evaluation of the use of 134Ce/134La for radionuclide therapy and dosimetry is presented. Dosimetry of 76Br-labelled antibodies is evaluated in an animal study. Finally, the possibility to use PET for dosimetry during radionuclide therapy is studied and a dose image calculation program, based on PET measurements, is presented. The use of positron emission tomography (PET) has so far been limited to a few nuclides with short half-lives such as 18F and 11C. Certain applications require nuclides with longer half-lives, such as 76Br and 52...

  9. Quantiprot - a Python package for quantitative analysis of protein sequences.

    Science.gov (United States)

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
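
    Two of the characteristics named above, an n-gram distribution and a Zipf-style exponent, can be computed directly as in the sketch below. This is a generic illustration written without the Quantiprot API, whose exact functions are not reproduced here; the example sequence is arbitrary.

        # Illustrative computation of an n-gram distribution and a Zipf-style exponent
        # for a protein sequence, written directly rather than with the Quantiprot API.
        from collections import Counter
        import math

        def ngram_counts(seq, n=2):
            """Count overlapping n-grams in a protein sequence."""
            return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

        def zipf_exponent(counts):
            """Slope of log(frequency) vs log(rank) for the ranked n-gram counts."""
            freqs = sorted(counts.values(), reverse=True)
            xs = [math.log(r) for r in range(1, len(freqs) + 1)]
            ys = [math.log(f) for f in freqs]
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

        seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQVKVKALPDAQFEVVHSLAKWKR"
        counts = ngram_counts(seq, 2)
        print(counts.most_common(3))
        print(f"Zipf-style exponent: {zipf_exponent(counts):.2f}")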

  10. Quantitative Adverse Outcome Pathways and Their ...

    Science.gov (United States)

    A quantitative adverse outcome pathway (qAOP) consists of one or more biologically based, computational models describing key event relationships linking a molecular initiating event (MIE) to an adverse outcome. A qAOP provides quantitative, dose–response, and time-course predictions that can support regulatory decision-making. Herein we describe several facets of qAOPs, including (a) motivation for development, (b) technical considerations, (c) evaluation of confidence, and (d) potential applications. The qAOP used as an illustrative example for these points describes the linkage between inhibition of cytochrome P450 19A aromatase (the MIE) and population-level decreases in the fathead minnow (FHM; Pimephales promelas). The qAOP consists of three linked computational models for the following: (a) the hypothalamic-pituitary-gonadal axis in female FHMs, where aromatase inhibition decreases the conversion of testosterone to 17β-estradiol (E2), thereby reducing E2-dependent vitellogenin (VTG; egg yolk protein precursor) synthesis, (b) VTG-dependent egg development and spawning (fecundity), and (c) fecundity-dependent population trajectory. While development of the example qAOP was based on experiments with FHMs exposed to the aromatase inhibitor fadrozole, we also show how a toxic equivalence (TEQ) calculation allows use of the qAOP to predict effects of another, untested aromatase inhibitor, iprodione. While qAOP development can be resource-intensive, the quan
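
    The toxic-equivalence step mentioned above can be sketched as scaling the concentration of an untested inhibitor by an assumed relative potency before reusing the reference dose-response. The potency factor and the Hill-type response below are placeholder values, not parameters from the qAOP itself.

        # Sketch of the toxic-equivalence (TEQ) idea: express an untested aromatase inhibitor
        # as an equivalent concentration of the reference compound (fadrozole) via a relative
        # potency factor, then reuse the reference dose-response. All numbers are placeholders.

        REL_POTENCY = {"fadrozole": 1.0, "iprodione": 0.004}   # assumed relative potencies

        def fadrozole_equivalent(chemical, concentration_ug_per_l):
            """Convert a concentration of another inhibitor to fadrozole equivalents."""
            return REL_POTENCY[chemical] * concentration_ug_per_l

        def fractional_aromatase_inhibition(fadrozole_eq, ic50=10.0, hill=1.0):
            """Placeholder Hill-type dose-response for the molecular initiating event."""
            return fadrozole_eq**hill / (ic50**hill + fadrozole_eq**hill)

        teq = fadrozole_equivalent("iprodione", 5000.0)
        print(f"5000 ug/L iprodione ~ {teq:.1f} ug/L fadrozole equivalents")
        print(f"predicted aromatase inhibition: {fractional_aromatase_inhibition(teq):.2f}")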

  11. Quantitative analysis method for ship construction quality

    Directory of Open Access Journals (Sweden)

    FU Senzong

    2017-03-01

    Full Text Available The excellent performance of a ship is assured by the accurate evaluation of its construction quality. For a long time, research into the construction quality of ships has mainly focused on qualitative analysis due to a shortage of process data, which results from limited samples, varied process types and non-standardized processes. Aiming at predicting and controlling the influence of the construction process on the construction quality of ships, this article proposes a quantitative reliability analysis flow path for the ship construction process and a fuzzy calculation method. Based on the process-quality factor model proposed by the Function-Oriented Quality Control (FOQC) method, we combine fuzzy mathematics with the expert grading method to deduce formulations calculating the fuzzy process reliability of the ordinal connection model, series connection model and mixed connection model. The quantitative analysis method is applied in analyzing the process reliability of a ship's shaft gear box installation, which proves the applicability and effectiveness of the method. The analysis results can be a useful reference for setting key quality inspection points and optimizing key processes.
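
    A minimal sketch of the kind of fuzzy aggregation involved is shown below: expert grades are treated as triangular fuzzy numbers, averaged and defuzzified into process reliabilities, which are then combined for a series connection. This is a simplified stand-in, not the paper's FOQC formulation, and all grades are invented.

        # Simplified fuzzy reliability sketch: expert grades -> triangular fuzzy numbers ->
        # defuzzified process reliabilities -> series combination. Not the FOQC formulation.

        # Triangular fuzzy numbers (low, mode, high) assigned to linguistic expert grades.
        GRADE_TFN = {"poor": (0.0, 0.2, 0.4), "fair": (0.3, 0.5, 0.7),
                     "good": (0.6, 0.8, 0.9), "excellent": (0.8, 0.95, 1.0)}

        def fuzzy_mean(grades):
            """Average the experts' triangular fuzzy numbers component-wise."""
            tfns = [GRADE_TFN[g] for g in grades]
            return tuple(sum(v[i] for v in tfns) / len(tfns) for i in range(3))

        def defuzzify(tfn):
            """Centroid defuzzification of a triangular fuzzy number."""
            return sum(tfn) / 3.0

        def series_reliability(process_reliabilities):
            """In a series connection every process must succeed, so reliabilities multiply."""
            r = 1.0
            for value in process_reliabilities:
                r *= value
            return r

        shaft_alignment = defuzzify(fuzzy_mean(["good", "good", "excellent"]))
        gear_box_fitting = defuzzify(fuzzy_mean(["fair", "good", "good"]))
        print(f"series reliability: {series_reliability([shaft_alignment, gear_box_fitting]):.3f}")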

  12. Review of progress in quantitative nondestructive evaluation

    CERN Document Server

    Chimenti, Dale

    1999-01-01

    This series provides a comprehensive review of the latest research results in quantitative nondestructive evaluation (NDE). Leading investigators working in government agencies, major industries, and universities present a broad spectrum of work extending from basic research to early engineering applications. An international assembly of noted authorities in NDE thoroughly cover such topics as: elastic waves, guided waves, and eddy-current detection, inversion, and modeling; radiography and computed tomography, thermal techniques, and acoustic emission; laser ultrasonics, optical methods, and microwaves; signal processing and image analysis and reconstruction, with an emphasis on interpretation for defect detection; and NDE sensors and fields, both ultrasonic and electromagnetic; engineered materials and composites, bonded joints, pipes, tubing, and biomedical materials; linear and nonlinear properties, ultrasonic backscatter and microstructure, coatings and layers, residual stress and texture, and constructi...

  13. Quantitative self-powered electrochromic biosensors.

    Science.gov (United States)

    Pellitero, Miguel Aller; Guimerà, Anton; Kitsara, Maria; Villa, Rosa; Rubio, Camille; Lakard, Boris; Doche, Marie-Laure; Hihn, Jean-Yves; Javier Del Campo, F

    2017-03-01

    Self-powered sensors are analytical devices able to generate their own energy, either from the sample itself or from their surroundings. The conventional approaches rely heavily on silicon-based electronics, which results in increased complexity and cost, and prevents the broader use of these smart systems. Here we show that electrochromic materials can overcome the existing limitations by simplifying device construction and avoiding the need for silicon-based electronics entirely. Electrochromic displays can be built into compact self-powered electrochemical sensors that give quantitative information readable by the naked eye, simply controlling the current path inside them through a combination of specially arranged materials. The concept is validated by a glucose biosensor coupled horizontally to a Prussian blue display designed as a distance-meter proportional to (glucose) concentration. This approach represents a breakthrough for self-powered sensors, and extends the application of electrochromic materials beyond smart windows and displays, into sensing and quantification.

  14. Quantitative dual-channel FRET microscopy.

    Science.gov (United States)

    Wei, Lichun; Zhang, Jiang; Mai, Zihao; Yang, Fangfang; Du, Mengyan; Lin, Fangrui; Qu, Junle; Chen, Tongsheng

    2017-10-16

    Acceptor-sensitized quantitative Förster resonance energy transfer (FRET) measurement (E-FRET) is mainly impeded by donor emission crosstalk and acceptor direct excitation crosstalk. In this paper, we develop a novel E-FRET approach (Lux-E-FRET) based on linear unmixing (Lux) of the fluorescence intensity ratio between two detection channels with each excitation of two different wavelengths. The two detection channels need not selectively collect the emission of the donor or the acceptor, and the excitation wavelengths need not selectively excite the donor or the acceptor. For a tandem FRET sensor, Lux-E-FRET needs only a single excitation wavelength. We performed Lux-E-FRET measurements on our dual-channel wide-field fluorescence microscope for FRET constructs in living cells and obtained FRET efficiencies consistent with those measured by other methods. Collectively, Lux-E-FRET completely overcomes all spectral crosstalks and is thus applicable to donor-acceptor pairs with larger spectral overlap.
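
    The core idea, expressing measured channel intensities as a linear mix of donor and acceptor contributions and inverting that relation, can be sketched generically as below. These are not the Lux-E-FRET equations themselves, and the channel signatures and intensities are invented reference values.

        # Generic two-component linear unmixing: measured intensities in two detection channels
        # are modelled as a linear mix of donor and acceptor contributions with known
        # per-channel signatures (measured beforehand from donor-only and acceptor-only samples).
        import numpy as np

        # Columns: relative signal of pure donor and pure acceptor in channels 1 and 2.
        M = np.array([[0.80, 0.15],
                      [0.20, 0.85]])

        measured = np.array([1200.0, 2600.0])            # intensities in channels 1 and 2

        donor, acceptor = np.linalg.solve(M, measured)   # unmixed contributions
        print(f"donor contribution: {donor:.0f}, acceptor contribution: {acceptor:.0f}")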

  15. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  16. Quantitative Fluorescence Measurements with Multicolor Flow Cytometry.

    Science.gov (United States)

    Wang, Lili; Gaigalas, Adolfas K; Wood, James

    2018-01-01

    Multicolor flow cytometer assays are routinely used in clinical laboratories for immunophenotyping, monitoring disease and treatment, and determining prognostic factors. However, existing methods for quantitative measurements have not yet produced satisfactory results independent of flow cytometers used. This chapter details a procedure for quantifying surface and intracellular protein biomarkers by calibrating the output of a multicolor flow cytometer in units of antibodies bound per cell (ABC). The procedure includes the following critical steps: (a) quality control (QC) and performance characterization of the multicolor flow cytometer, (b) fluorescence calibration using hard dyed microspheres assigned with fluorescence intensity values in equivalent number of reference fluorophores (ERF), (c) compensation for correction of fluorescence spillover, and (d) application of a biological reference standard for translating the ERF scale to the ABC scale. The chapter also points out current efforts for implementing quantification of biomarkers in a manner which is independent of instrument platforms and reagent differences.
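
    A minimal sketch of the calibration chain in steps (b) and (d) is given below: bead data relate channel intensity to ERF units, and a biological reference with an assigned ABC value translates ERF into ABC. All bead, reference and sample values are invented for illustration.

        # Sketch of the MFI -> ERF -> ABC calibration chain described above; all numbers invented.
        import numpy as np

        bead_mfi = np.array([50.0, 430.0, 3.9e3, 3.5e4, 2.8e5])   # median fluorescence intensities
        bead_erf = np.array([120.0, 1.1e3, 9.8e3, 8.9e4, 7.2e5])  # assigned ERF values of the beads

        # Linear calibration in log-log space: log10(ERF) = a*log10(MFI) + b
        a, b = np.polyfit(np.log10(bead_mfi), np.log10(bead_erf), 1)

        def mfi_to_erf(mfi):
            return 10 ** (a * np.log10(mfi) + b)

        # Biological reference: a cell population with an assigned ABC value.
        reference_abc = 45_000.0
        reference_erf = mfi_to_erf(1.8e4)
        erf_per_abc = reference_erf / reference_abc

        sample_abc = mfi_to_erf(5.2e3) / erf_per_abc
        print(f"sample biomarker expression: {sample_abc:.0f} ABC")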

  17. Quantitative linking hypotheses for infant eye movements.

    Directory of Open Access Journals (Sweden)

    Daniel Yurovsky

    Full Text Available The study of cognitive development hinges, largely, on the analysis of infant looking. But analyses of eye gaze data require the adoption of linking hypotheses: assumptions about the relationship between observed eye movements and underlying cognitive processes. We develop a general framework for constructing, testing, and comparing these hypotheses, and thus for producing new insights into early cognitive development. We first introduce the general framework--applicable to any infant gaze experiment--and then demonstrate its utility by analyzing data from a set of experiments investigating the role of attentional cues in infant learning. The new analysis uncovers significantly more structure in these data, finding evidence of learning that was not found in standard analyses and showing an unexpected relationship between cue use and learning rate. Finally, we discuss general implications for the construction and testing of quantitative linking hypotheses. MATLAB code for sample linking hypotheses can be found on the first author's website.

  18. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from Scanning Electron Microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) Mathematical Morphology; 2) Distance transforms and applications; and 3) Fractal geometry. Image analysis opens in general the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  19. Rational Formulation of Alternative Fuels using QSPR Methods: Application to Jet Fuels Développement d’un outil d’aide à la formulation des carburants alternatifs utilisant des méthodes QSPR (Quantitative Structure Property Relationship: application aux carburéacteurs

    Directory of Open Access Journals (Sweden)

    Saldana D.A.

    2013-06-01

    development and the application of QSPR (Quantitative Structure Property Relationship) methods that relate the structure of a molecule to its properties. The products studied are hydrocarbons (normal and iso-paraffins, naphthenes, aromatics, etc.) and oxygenates such as alcohols and esters. The targeted properties are those appearing in fuel specifications, such as flash point, cetane number, density and viscosity. The predictive models for pure-compound properties were established from reference experimental data drawn largely from the literature. The usefulness of such models for selecting compounds of interest can be shown, for example, by finding the best compromise to satisfy the cold-flow and density criteria for paraffins. If the carbon chain is too long, the cold-flow criterion may not be met. It then becomes necessary to favour branching or to add blending components with good cold-flow behaviour, such as certain naphthenes or alkylated monoaromatics. However, this often leads to a density that is too low with respect to the specification. Here again, adding naphthenes or alkylated monoaromatics derived from biomass can be of interest.

  20. Studying learning in the healthcare setting: the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples’ experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the

  1. Does Homework Really Matter for College Students in Quantitatively-Based Courses?

    Science.gov (United States)

    Young, Nichole; Dollman, Amanda; Angel, N. Faye

    2016-01-01

    This investigation was initiated by two students in an Advanced Computer Applications course. They sought to examine the influence of graded homework on final grades in quantitatively-based business courses. They were provided with data from three quantitatively-based core business courses over a period of five years for a total of 10 semesters of…

  2. Studying learning in the healthcare setting : the potential of quantitative diary methods

    NARCIS (Netherlands)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-01-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the

  3. Label-free quantitative cell division monitoring of endothelial cells by digital holographic microscopy

    Science.gov (United States)

    Kemper, Björn; Bauwens, Andreas; Vollmer, Angelika; Ketelhut, Steffi; Langehanenberg, Patrik; Müthing, Johannes; Karch, Helge; von Bally, Gert

    2010-05-01

    Digital holographic microscopy (DHM) enables quantitative multifocus phase contrast imaging for nondestructive technical inspection and live cell analysis. Time-lapse investigations on human brain microvascular endothelial cells demonstrate the use of DHM for label-free dynamic quantitative monitoring of cell division of mother cells into daughter cells. Cytokinetic DHM analysis provides future applications in toxicology and cancer research.

  4. An Inside View: The Utility of Quantitative Observation in Understanding College Educational Experiences

    Science.gov (United States)

    Campbell, Corbin M.

    2017-01-01

    This article describes quantitative observation as a method for understanding college educational experiences. Quantitative observation has been used widely in several fields and in K-12 education, but has had limited application to research in higher education and student affairs to date. The article describes the central tenets of quantitative…

  5. Quantitative phase imaging of arthropods

    Science.gov (United States)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Abstract. Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  6. The Joy of Quantitative Reasoning

    Directory of Open Access Journals (Sweden)

    Caren Diefenderfer

    2012-01-01

    Full Text Available One of the advantages of focusing on quantitative reasoning is that it spans a wide variety of topics. As incoming president of the National Numeracy Network, I would like to take the opportunity of this editorial to tell my story of intellectual reward from finding common purpose in quantitative reasoning with colleagues from disciplines outside of mathematics. The story starts with an NSF-funded faculty development project (DUE-9952807) to further a QR across-the-curriculum program and the finding from that program that merging authentic context with mathematics brings interaction and collaboration. That joy in learning from and working with colleagues in other disciplines has now expanded to seeking authentic context for all of my mathematics courses and being open to new ways of thinking.

  7. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    National Research Council Canada - National Science Library

    Karpievitch, Yuliya; Stanley, Jeff; Taverner, Thomas; Huang, Jianhua; Adkins, Joshua N; Ansong, Charles; Heffron, Fred; Metz, Thomas O; Qian, Wei-Jun; Yoon, Hyunjin; Smith, Richard D; Dabney, Alan R

    2009-01-01

    .... The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based, algorithms for filtering of proteins and peptides as well as imputation of missing values...

  8. Quantitative multimodality imaging in cancer research and therapy.

    Science.gov (United States)

    Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad

    2014-11-01

    Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.

  9. Maths meets myths quantitative approaches to ancient narratives

    CERN Document Server

    MacCarron, Máirín; MacCarron, Pádraig

    2017-01-01

    With an emphasis on exploring measurable aspects of ancient narratives, Maths Meets Myths sets out to investigate age-old material with new techniques. This book collects, for the first time, novel quantitative approaches to studying sources from the past, such as chronicles, epics, folktales, and myths. It contributes significantly to recent efforts in bringing together natural scientists and humanities scholars in investigations aimed at achieving greater understanding of our cultural inheritance. Accordingly, each contribution reports on a modern quantitative approach applicable to narrative sources from the past, or describes those which would be amenable to such treatment and why they are important. This volume is a unique state-of-the-art compendium on an emerging research field which also addresses anyone with interests in quantitative approaches to humanities.

  10. Quantitative SLM-based Differential Interference Contrast imaging.

    Science.gov (United States)

    McIntyre, Timothy J; Maurer, Christian; Fassl, Stephanie; Khan, Saranjam; Bernet, Stefan; Ritsch-Marte, Monika

    2010-06-21

    We describe the implementation of quantitative Differential Interference Contrast (DIC) Microscopy using a spatial light modulator (SLM) as a flexible Fourier filter in the optical path. The experimental arrangement allows for the all-electronic acquisition of multiple phase shifted DIC-images at video rates which are analyzed to yield the optical path length variation of the sample. The resolution of the technique is analyzed by retrieving the phase profiles of polystyrene spheres in immersion oil, and the method is then applied for quantitative imaging of biological samples. By reprogramming the diffractive structure displayed at the SLM it is possible to record the whole set of phase shifted DIC images simultaneously in different areas of the same camera chip. This allows for quantitative snap-shot imaging of a sample, which has applications for the investigation of dynamic processes.
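
    A generic four-step phase-shifting reconstruction, of the kind used to turn phase-shifted interference images into a phase map, is sketched below. The paper's own analysis may differ in detail; the test object here is synthetic.

        # Generic four-step phase-shifting reconstruction; the frames are synthetic placeholders.
        import numpy as np

        def four_step_phase(i0, i1, i2, i3):
            """Wrapped phase from four frames recorded at shifts 0, pi/2, pi, 3*pi/2."""
            return np.arctan2(i3 - i1, i0 - i2)

        # Synthetic test object: a smooth phase bump sampled at the four phase shifts.
        x = np.linspace(-1, 1, 256)
        phi_true = 1.5 * np.exp(-(x[None, :] ** 2 + x[:, None] ** 2) / 0.2)
        frames = [1.0 + 0.8 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]

        phi = four_step_phase(*frames)
        print(f"max reconstruction error: {np.abs(phi - phi_true).max():.3f} rad")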

  11. The Emergence of Quantitative Sintering Theory from 1945 to 1955

    Science.gov (United States)

    German, Randall M.

    2017-04-01

    Particles flow and pack under stress, allowing shaping of the particles into target engineering geometries. Subsequently, in a process termed sintering, the particles are heated to induce bonding that results in a strong solid. Although first practiced 26,000 years ago, sintering was largely unexplained until recent times. Sintering science moved from an empirical and largely qualitative notion into a quantitative theory over a relatively short time period following World War II. That conceptual transition took place just as commercial applications for sintered materials underwent significant growth. This article highlights the key changes in sintering concepts that occurred in the 1945-1955 time period. This time span starts with the first quantitative neck growth model from Frenkel and ends with the quantitative shrinkage model from Kingery and Berg that includes several transport mechanisms.

  12. Quantitative biological imaging by ptychographic X-ray diffraction microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Giewekemeyer, Klaus; Kalbfleisch, Sebastian; Beerlink, Andre; Salditt, Tim [Institut fuer Roentgenphysik, Georg-August-Universitaet Goettingen (Germany); Thibault, Pierre; Dierolf, Martin; Pfeiffer, Franz [Department Physik (E17), Technische Universitaet Muenchen, Garching (Germany); Kewish, Cameron M. [Paul Scherrer Institut, Villigen PSI (Switzerland)

    2010-07-01

    Mesoscopic structures with specific functions are abundant in many cellular systems and have been well characterized by electron microscopy in the past. However, the quantitative study of the three-dimensional structure and density of subcellular components remains a difficult problem. In this contribution we show how these limitations could be overcome in the future by the application of recently introduced and now rapidly evolving coherent X-ray imaging techniques for quantitative biological imaging on the nanoscale. More specifically, we report on a recent scanning (ptychographic) diffraction experiment on unstained and unsliced freeze-dried cells of the bacterium Deinococcus radiodurans using only a pinhole as beam-defining optical element. As a result quantitative density projections well below optical resolution have been achieved.

  13. [Research progress of real-time quantitative PCR method for group A rotavirus detection].

    Science.gov (United States)

    Guo, Yan-Qing; Li, Dan-Di; Duan, Zhao-Jun

    2013-11-01

    Group A rotavirus is one of the most significant etiological agents causing acute gastroenteritis among infants and young children worldwide. So far, several methods, including electron microscopy (EM), enzyme immunoassay (EIA), reverse transcription-polymerase chain reaction (RT-PCR) and real-time quantitative PCR, have been established for the detection of rotavirus. Compared with other methods, real-time quantitative PCR has advantages in specificity, sensitivity, genotyping and quantitative accuracy. This article presents an overview of the application of the real-time quantitative PCR technique to detect group A rotavirus.
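
    Absolute quantification with a standard curve, which such real-time PCR assays typically rely on, can be sketched as below. The Ct values and copy numbers are invented illustrative data.

        # Sketch of absolute quantification with a real-time qPCR standard curve; data invented.
        import numpy as np

        std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])     # RNA standards (copies/reaction)
        std_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])    # measured threshold cycles

        slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
        efficiency = 10 ** (-1 / slope) - 1                   # amplification efficiency from the slope

        def copies_from_ct(ct):
            """Back-calculate copy number of an unknown sample from its Ct value."""
            return 10 ** ((ct - intercept) / slope)

        print(f"PCR efficiency: {efficiency:.1%}")
        print(f"sample with Ct 27.5 -> {copies_from_ct(27.5):.2e} copies/reaction")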

  14. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    Science.gov (United States)

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  15. Quantitative rotating frame relaxometry methods in MRI.

    Science.gov (United States)

    Gilani, Irtiza Ali; Sepponen, Raimo

    2016-06-01

    Macromolecular degeneration and biochemical changes in tissue can be quantified using rotating frame relaxometry in MRI. It has been shown in several studies that the rotating frame longitudinal relaxation rate constant (R1ρ ) and the rotating frame transverse relaxation rate constant (R2ρ ) are sensitive biomarkers of phenomena at the cellular level. In this comprehensive review, existing MRI methods for probing the biophysical mechanisms that affect the rotating frame relaxation rates of the tissue (i.e. R1ρ and R2ρ ) are presented. Long acquisition times and high radiofrequency (RF) energy deposition into tissue during the process of spin-locking in rotating frame relaxometry are the major barriers to the establishment of these relaxation contrasts at high magnetic fields. Therefore, clinical applications of R1ρ and R2ρ MRI using on- or off-resonance RF excitation methods remain challenging. Accordingly, this review describes the theoretical and experimental approaches to the design of hard RF pulse cluster- and adiabatic RF pulse-based excitation schemes for accurate and precise measurements of R1ρ and R2ρ . The merits and drawbacks of different MRI acquisition strategies for quantitative relaxation rate measurement in the rotating frame regime are reviewed. In addition, this review summarizes current clinical applications of rotating frame MRI sequences. Copyright © 2016 John Wiley & Sons, Ltd.
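
    As an illustration of how a rotating frame relaxation rate is typically extracted, the sketch below fits a mono-exponential spin-lock decay, S(TSL) = S0*exp(-TSL*R1rho), by log-linear least squares. The signal values are synthetic and the simple fit stands in for the pixel-wise fitting used in practice.

        # Sketch of extracting R1rho from spin-lock-prepared signals; data are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        tsl_ms = np.array([2.0, 10.0, 20.0, 40.0, 60.0, 80.0])   # spin-lock durations (ms)
        signal = 1000.0 * np.exp(-tsl_ms / 55.0) * (1 + 0.01 * rng.standard_normal(tsl_ms.size))

        # Log-linear least squares: ln S = ln S0 - R1rho * TSL
        slope, log_s0 = np.polyfit(tsl_ms, np.log(signal), 1)
        r1rho = -slope          # in 1/ms; multiply by 1000 for s^-1
        print(f"fitted R1rho: {r1rho * 1000:.1f} s^-1 (true value {1000 / 55.0:.1f} s^-1)")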

  16. Magnetic force microscopy: quantitative issues in biomaterials.

    Science.gov (United States)

    Passeri, Daniele; Dong, Chunhua; Reggente, Melania; Angeloni, Livia; Barteri, Mario; Scaramuzzo, Francesca A; De Angelis, Francesca; Marinelli, Fiorenzo; Antonelli, Flavia; Rinaldi, Federica; Marianecci, Carlotta; Carafa, Maria; Sorbo, Angela; Sordi, Daniela; Arends, Isabel Wce; Rossi, Marco

    2014-01-01

    Magnetic force microscopy (MFM) is an atomic force microscopy (AFM) based technique in which an AFM tip with a magnetic coating is used to probe local magnetic fields with the typical AFM spatial resolution, thus allowing one to acquire images reflecting the local magnetic properties of the samples at the nanoscale. Being a well established tool for the characterization of magnetic recording media, superconductors and magnetic nanomaterials, MFM is finding constantly increasing application in the study of magnetic properties of materials and systems of biological and biomedical interest. After reviewing these latter applications, three case studies are presented in which MFM is used to characterize: (i) magnetoferritin synthesized using apoferritin as molecular reactor; (ii) magnetic nanoparticles loaded niosomes to be used as nanocarriers for drug delivery; (iii) leukemic cells labeled using folic acid-coated core-shell superparamagnetic nanoparticles in order to exploit the presence of folate receptors on the cell membrane surface. In these examples, MFM data are quantitatively analyzed evidencing the limits of the simple analytical models currently used. Provided that suitable models are used to simulate the MFM response, MFM can be used to evaluate the magnetic momentum of the core of magnetoferritin, the iron entrapment efficiency in single vesicles, or the uptake of magnetic nanoparticles into cells.

  17. Quantitative methods for the analysis of zoosporic fungi.

    Science.gov (United States)

    Marano, Agostina V; Gleason, Frank H; Bärlocher, Felix; Pires-Zottarelli, Carmen L A; Lilje, Osu; Schmidt, Steve K; Rasconi, Serena; Kagami, Maiko; Barrera, Marcelo D; Sime-Ngando, Télesphore; Boussiba, Sammy; de Souza, José I; Edwards, Joan E

    2012-04-01

    Quantitative estimations of zoosporic fungi in the environment have historically received little attention, primarily due to methodological challenges and their complex life cycles. Conventional methods for quantitative analysis of zoosporic fungi to date have mainly relied on direct observation and baiting techniques, with subsequent fungal identification in the laboratory using morphological characteristics. Although these methods are still fundamentally useful, there has been an increasing preference for quantitative microscopic methods based on staining with fluorescent dyes, as well as the use of hybridization probes. More recently, however, PCR-based methods for profiling and quantification (semi- and absolute) have proven to be rapid and accurate diagnostic tools for assessing zoosporic fungal assemblages in environmental samples. Further application of next-generation sequencing technologies will not only advance our quantitative understanding of zoosporic fungal ecology, but also their function through the analysis of their genomes and gene expression as resources and databases expand in the future. Nevertheless, it is still necessary to complement these molecular-based approaches with cultivation-based methods in order to gain a fuller quantitative understanding of the ecological and physiological roles of zoosporic fungi. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Qualitative and Quantitative Sentiment Proxies

    DEFF Research Database (Denmark)

    Zhao, Zeyan; Ahmad, Khurshid

    2015-01-01

    Sentiment analysis is a content-analytic investigative framework for researchers, traders and the general public involved in financial markets. This analysis is based on carefully sourced and elaborately constructed proxies for market sentiment and has emerged as a basis for analysing movements...... and trading volumes. The case study we use is a small market index (the Danish Stock Exchange Index, OMXC 20), together with prevailing sentiment in Denmark, to evaluate the impact of sentiment on OMXC 20. Furthermore, we introduce a rather novel quantitative sentiment proxy, that is, the use of the index...

  19. Quantitative assessment of increasing complexity

    Science.gov (United States)

    Csernai, L. P.; Spinnangr, S. F.; Velle, S.

    2017-05-01

    We study the build-up of complexity using the example of 1 kg of matter in different forms. We start with the simplest example of ideal gases, and then continue with more complex chemical, biological, living, social and technical structures. We assess the complexity of these systems quantitatively, based on their entropy. We present a method to attribute the same entropy measure to known physical systems and to complex organic molecules, up to a DNA molecule. The important steps in this program and the basic obstacles are discussed.
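    As a rough illustration of the entropy-based bookkeeping described above, the sketch below evaluates the Sackur-Tetrode entropy of 1 kg of a monatomic ideal gas; the choice of argon, the temperature and the pressure are illustrative assumptions and are not taken from the paper.

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23      # Boltzmann constant, J/K
h   = 6.62607015e-34    # Planck constant, J*s
N_A = 6.02214076e23     # Avogadro constant, 1/mol
R   = k_B * N_A         # gas constant, J/(mol*K)

# Illustrative assumptions: 1 kg of argon at room temperature and 1 atm
M = 0.039948            # molar mass of argon, kg/mol
mass = 1.0              # kg
T = 298.15              # K
p = 101325.0            # Pa

n = mass / M            # amount of substance, mol
N = n * N_A             # number of atoms
m = M / N_A             # mass of one atom, kg
V = n * R * T / p       # ideal-gas volume, m^3

# Sackur-Tetrode equation for the entropy of a monatomic ideal gas
thermal_term = (2.0 * math.pi * m * k_B * T / h**2) ** 1.5
S = N * k_B * (math.log((V / N) * thermal_term) + 2.5)

print(f"N = {N:.3e} atoms, V = {V:.3f} m^3")
print(f"Sackur-Tetrode entropy of 1 kg argon: {S:.1f} J/K ({S / n:.1f} J/(mol*K))")
```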

  20. A quantitative framework for assessing ecological resilience

    Science.gov (United States)

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be dec...

  1. Compact, common path quantitative phase microscopic techniques ...

    Indian Academy of Sciences (India)

    2014-01-05

    Jan 5, 2014 ... Quantitative phase contrast techniques, which directly provide information about the phase of the object wavefront, can be used to quantitatively image the object under investigation. Typically, interferometric techniques are used for quantitative phase imaging. 2. Digital holographic microscopy. Holograms ...

  2. Quality control for quantitative geophysical logging

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Kyu; Hwang, Se Ho; Hwang, Hak Soo; Park, In Hwa [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    Despite the great availability of geophysical data obtained from boreholes, their interpretation is subject to significant uncertainties. Obtaining more accurate data with smaller statistical uncertainties requires more quantitative techniques in both log acquisition and interpretation. The long-term objective of this project is the development of techniques for both quality control of log measurement and quantitative interpretation. In the first year, the goals of the project include establishing a log acquisition procedure using various tests, analysing the effect of logging velocity changes on the logging data, examining repeatability and reproducibility, analysing the effect of filtering on the log measurements, and finally the zonation and correlation of single- and inter-well log data. To establish the logging procedure, we tested the multiple factors affecting depth accuracy. The factors are divided into two groups: human and mechanical. They include the zero setting of depth, the calculation of the offset for the sonde, the stretching of the cable, and measuring wheel accuracy. We conclude that the error in depth setting results primarily from human factors, and in part from the stretching of the cable. The statistical fluctuation of log measurements increases with increasing logging speed in zones of lower natural gamma. Thus, while logging speed is a minor issue for resource exploration applications, logging should be run more slowly to reduce the statistical fluctuation of natural gamma when lithologic correlation is intended. The repeatability and reproducibility of logging measurements were also tested. The results of the repeatability test for the natural gamma sonde are qualitatively acceptable; in the reproducibility test, errors occur in the logging data between two operators and between successive trials. We conclude that the errors result from the
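    The dependence of the statistical fluctuation on logging speed follows from Poisson counting statistics: a slower sonde spends more time per depth sample and therefore accumulates more gamma counts. A minimal sketch (the count rates, sample interval and speeds are assumed example values, not figures from the report):

```python
import math

def relative_error(count_rate_cps, sample_interval_m, logging_speed_m_per_min):
    """Relative statistical error of a natural gamma reading for one depth
    sample, assuming Poisson counting statistics."""
    dwell_time_s = sample_interval_m / (logging_speed_m_per_min / 60.0)
    counts = count_rate_cps * dwell_time_s
    return 1.0 / math.sqrt(counts)

# Assumed values: a low-gamma zone (20 cps) vs a higher-gamma zone (200 cps),
# 0.1 m depth samples, two logging speeds.
for rate in (20.0, 200.0):
    for speed in (3.0, 12.0):  # m/min
        err = relative_error(rate, 0.1, speed)
        print(f"rate={rate:5.0f} cps, speed={speed:4.1f} m/min -> "
              f"relative error {100*err:.1f}%")
```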

  3. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    Science.gov (United States)

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. Firstly, the solubility of MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into the LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test line signal and the MET concentration over the range 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with an HCG antibody; the MWCNT-based conjugate is 10 times more sensitive than one based on classical colloidal gold nanoparticles. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection which can be used in forensic analysis.
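    As an illustration of how such a linear test-line response can be turned into a quantitative readout, the sketch below fits a straight-line calibration over the reported 62.5-1500 ng/mL range and inverts it for an unknown sample; the signal values are invented for the example and do not reproduce the paper's data.

```python
import numpy as np

# Hypothetical calibration data: MET concentration (ng/mL) vs test line signal
conc = np.array([62.5, 125, 250, 500, 1000, 1500])
signal = np.array([0.08, 0.15, 0.29, 0.55, 1.05, 1.52])   # assumed readings

# Linear calibration: signal = a * conc + b
a, b = np.polyfit(conc, signal, 1)

def concentration_from_signal(s):
    """Invert the linear calibration to estimate MET concentration."""
    return (s - b) / a

r = np.corrcoef(conc, signal)[0, 1]
print(f"slope={a:.3e}, intercept={b:.3f}, r^2={r**2:.4f}")
print(f"unknown sample with signal 0.40 -> {concentration_from_signal(0.40):.0f} ng/mL")
```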

  4. Quantitative genetics in conservation biology.

    Science.gov (United States)

    Frankham, R

    1999-12-01

    Most of the major genetic concerns in conservation biology, including inbreeding depression, loss of evolutionary potential, genetic adaptation to captivity and outbreeding depression, involve quantitative genetics. Small population size leads to inbreeding and loss of genetic diversity and so increases extinction risk. Captive populations of endangered species are managed to maximize the retention of genetic diversity by minimizing kinship, with subsidiary efforts to minimize inbreeding. There is growing evidence that genetic adaptation to captivity is a major issue in the genetic management of captive populations of endangered species as it reduces reproductive fitness when captive populations are reintroduced into the wild. This problem is not currently addressed, but it can be alleviated by deliberately fragmenting captive populations, with occasional exchange of immigrants to avoid excessive inbreeding. The extent and importance of outbreeding depression is a matter of controversy. Currently, an extremely cautious approach is taken to mixing populations. However, this cannot continue if fragmented populations are to be adequately managed to minimize extinctions. Most genetic management recommendations for endangered species arise directly, or indirectly, from quantitative genetic considerations.

  5. Quantitative criticism of literary relationships

    Science.gov (United States)

    Dexter, Joseph P.; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A.; Bonilla Lopez, Jorge A.; Schroeder, Lea A.; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-01-01

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships (“intertextuality”) and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term “quantitative criticism,” focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca’s main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions. PMID:28373557

  6. Quantitative evaluation of Alzheimer's disease

    Science.gov (United States)

    Duchesne, S.; Frisoni, G. B.

    2009-02-01

    We propose a single, quantitative metric called the disease evaluation factor (DEF) and assess its efficiency at estimating disease burden in normal control subjects (CTRL) and probable Alzheimer's disease (AD) patients. The study group consisted of 75 patients with a diagnosis of probable AD and 75 age-matched normal CTRL without neurological or neuropsychological deficit. We calculated a reference eigenspace of MRI appearance from reference data, onto which our CTRL and probable AD subjects were projected. We then calculated the multi-dimensional hyperplane separating the CTRL and probable AD groups. The DEF was estimated via a multidimensional weighted distance between a given subject's eigencoordinates and the CTRL group mean, along the salient principal components forming the separating hyperplane. We used quantile plots, Kolmogorov-Smirnov and χ2 tests to compare the DEF values and test that their distribution was normal. We used a linear discriminant test to separate CTRL from probable AD based on the DEF factor, and reached an accuracy of 87%. A quantitative biomarker in AD would act as an important surrogate marker of disease status and progression.
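    The DEF is described as a weighted distance from the CTRL group mean along selected principal components. The sketch below shows one generic way such a score can be computed (PCA of the control group via SVD, then an inverse-variance-weighted distance along the leading components); it is a schematic stand-in on random example data, not the authors' exact eigenspace construction or weighting.

```python
import numpy as np

def disease_score(ctrl_features, subject_features, n_components=5):
    """Weighted distance of a subject from the control mean along the leading
    principal components of the control group (schematic DEF-like score)."""
    mu = ctrl_features.mean(axis=0)
    Xc = ctrl_features - mu
    # Principal axes and variances of the control group
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = (s ** 2) / (len(ctrl_features) - 1)
    comps = Vt[:n_components]                # leading principal axes
    w = 1.0 / var[:n_components]             # weights: distance in units of control spread
    proj = comps @ (subject_features - mu)   # subject eigencoordinates
    return np.sqrt(np.sum(w * proj ** 2))

rng = np.random.default_rng(0)
ctrl = rng.normal(size=(75, 50))             # 75 controls, 50 image-derived features
subject = rng.normal(size=50) + 0.5          # a subject shifted away from controls
print(f"DEF-like score: {disease_score(ctrl, subject):.2f}")
```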

  7. Quantitative criticism of literary relationships.

    Science.gov (United States)

    Dexter, Joseph P; Katz, Theodore; Tripuraneni, Nilesh; Dasgupta, Tathagata; Kannan, Ajay; Brofos, James A; Bonilla Lopez, Jorge A; Schroeder, Lea A; Casarez, Adriana; Rabinovich, Maxim; Haimson Lushkov, Ayelet; Chaudhuri, Pramit

    2017-04-18

    Authors often convey meaning by referring to or imitating prior works of literature, a process that creates complex networks of literary relationships ("intertextuality") and contributes to cultural evolution. In this paper, we use techniques from stylometry and machine learning to address subjective literary critical questions about Latin literature, a corpus marked by an extraordinary concentration of intertextuality. Our work, which we term "quantitative criticism," focuses on case studies involving two influential Roman authors, the playwright Seneca and the historian Livy. We find that four plays related to but distinct from Seneca's main writings are differentiated from the rest of the corpus by subtle but important stylistic features. We offer literary interpretations of the significance of these anomalies, providing quantitative data in support of hypotheses about the use of unusual formal features and the interplay between sound and meaning. The second part of the paper describes a machine-learning approach to the identification and analysis of citational material that Livy loosely appropriated from earlier sources. We extend our approach to map the stylistic topography of Latin prose, identifying the writings of Caesar and his near-contemporary Livy as an inflection point in the development of Latin prose style. In total, our results reflect the integration of computational and humanistic methods to investigate a diverse range of literary questions.

  8. Quantitative approach of Min protein researches and applications ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-12-29

    Dec 29, 2009 ... The Min protein system consists of MinC, MinD and MinE expressed from the MinB operon (de Boer et al., 1989). They are ..... models were based on macroscopic nonlinear reaction-diffusion equations (RDE) and were solved using a conventional grid-based finite difference method (Strikwerda, 1989).
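    For readers unfamiliar with the grid-based approach mentioned in the snippet, the sketch below advances a generic one-dimensional reaction-diffusion equation du/dt = D d2u/dx2 + f(u) by explicit finite-difference steps; the logistic reaction term and all parameter values are illustrative and are not the Min-protein model itself.

```python
import numpy as np

def reaction_diffusion_step(u, D, dx, dt, reaction):
    """One explicit (forward Euler) finite-difference step of
    du/dt = D * d2u/dx2 + reaction(u) on a 1D grid with zero-flux boundaries."""
    lap = np.empty_like(u)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    lap[0] = 2.0 * (u[1] - u[0]) / dx**2      # mirror (zero-flux) boundary
    lap[-1] = 2.0 * (u[-2] - u[-1]) / dx**2
    return u + dt * (D * lap + reaction(u))

# Illustrative parameters (explicit stability requires dt <= dx^2 / (2 D))
D, dx, dt = 1.0, 0.1, 0.004
x = np.arange(0.0, 10.0, dx)
u = np.exp(-((x - 5.0) ** 2))                 # initial bump of concentration

logistic = lambda u: u * (1.0 - u)            # simple illustrative reaction term
for _ in range(1000):
    u = reaction_diffusion_step(u, D, dx, dt, logistic)

print(f"total amount after 1000 steps: {u.sum() * dx:.3f}, max: {u.max():.3f}")
```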

  9. Application of evolutionary computation on ensemble forecast of quantitative precipitation

    Science.gov (United States)

    Dufek, Amanda S.; Augusto, Douglas A.; Dias, Pedro L. S.; Barbosa, Helio J. C.

    2017-09-01

    An evolutionary computation algorithm known as genetic programming (GP) has been explored as an alternative tool for improving the ensemble forecast of 24-h accumulated precipitation. Three GP versions and six ensembles' languages were applied to several real-world datasets over southern, southeastern and central Brazil during the rainy period from October to February of 2008-2013. According to the results, the GP algorithms performed better than two traditional statistical techniques, with errors 27-57% lower than simple ensemble mean and the MASTER super model ensemble system. In addition, the results revealed that GP algorithms outperformed the best individual forecasts, reaching an improvement of 34-42%. On the other hand, the GP algorithms had a similar performance with respect to each other and to the Bayesian model averaging, but the former are far more versatile techniques. Although the results for the six ensembles' languages are almost indistinguishable, our most complex linear language turned out to be the best overall proposal. Moreover, some meteorological attributes, including the weather patterns over Brazil, seem to play an important role in the prediction of daily rainfall amount.

  10. Quantitative Application Data Flow Characterization for Heterogeneous Multicore Architectures

    NARCIS (Netherlands)

    Ostadzadeh, S.A.

    2012-01-01

    Recent trends show a steady increase in the utilization of heterogeneous multicore architectures in order to address the ever-growing need for computing performance. These emerging architectures pose specific challenges with regard to their programmability. In addition, they require efficient

  11. Quantitative Analysis of Musculoskeletal Ultrasound: Techniques and Clinical Applications

    National Research Council Canada - National Science Library

    Qing Wang; Qing-Hua Huang; John T. W. Yeow; Mark R. Pickering; Simo Saarakkala

    2017-01-01

      [...] with advanced imaging modalities such as CT and MRI, advantages of ultrasound include the readily available bedside ultrasound equipment, the relatively low cost of the exam procedure, the capacity...

  12. Statistical Applications and Quantitative Design for Injury Prevention ...

    African Journals Online (AJOL)

    The Medical Research Council and University of South Africa's Crime Violence and Injury Lead Programme (MRC–UNISA CVILP) was privileged to host renowned scholar, Professor Shrikant Bangdiwala, Research Professor in Biostatistics from the University of North Carolina, Chapel Hill, United States of America.

  13. Quantitative Assessment of Interutterance Stability: Application to Dysarthria

    Science.gov (United States)

    Cummins, Fred; Lowit, Anja; van Brenk, Frits

    2014-01-01

    Purpose: Following recent attempts to quantify articulatory impairment in speech, the present study evaluates the usefulness of a novel measure of motor stability to characterize dysarthria. Method: The study included 8 speakers with ataxic dysarthria (AD), 16 speakers with hypokinetic dysarthria (HD) as a result of Parkinson's disease, and…

  14. Clinical applications of quantitative acid-base chemistry.

    Science.gov (United States)

    Whitehair, K J; Haskins, S C; Whitehair, J G; Pascoe, P J

    1995-01-01

    Stewart used physicochemical principles of aqueous solutions to develop an understanding of variables that control hydrogen ion concentration (H+) in body fluids. He proposed that H+ concentration in body fluids was determined by PCO2, strong ion difference (SID = sum of strong positive ion concentrations minus the sum of the strong anion concentrations) and the total concentration of nonvolatile weak acid (Atot) under normal circumstances. Albumin is the major weak acid in plasma and represents the majority of Atot. These 3 variables were defined as independent variables, which determined the values of all other relevant variables (dependent) in plasma, including H+. The major strong ions in plasma are sodium and chloride. The difference between Na+ and Cl- may be used as an estimation of SID. A decrease in SID below normal results in acidosis (increase in H+) and an increase in SID above normal results in alkalosis (decrease in H+). Unidentified strong anions such as lactate will decrease the SID, if present. Equations developed by Fencl allow Stewart's work to be easily applied clinically for evaluating the metabolic (nonrespiratory) contribution to acid-base balance. This approach separates the net metabolic abnormality into components, and allows one to easily detect mixed metabolic acid-base abnormalities. The Fencl approach provides insight into the nature and severity of the disturbances that exist in the patient. Sodium, chloride, protein, and unidentified anion derangements may contribute to the observed metabolic acid-base imbalance.
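    As a small worked example of the quantities involved, the sketch below computes an approximate strong ion difference from measured electrolytes and flags the direction of the metabolic disturbance. The simple Na minus Cl idea follows the description above, while the lactate-corrected variant, the reference range and the patient values are illustrative assumptions rather than figures from the paper.

```python
def strong_ion_difference(na, cl, k=0.0, lactate=0.0):
    """Approximate strong ion difference (mEq/L).
    The simplest estimate is Na - Cl; adding K and subtracting lactate
    gives a slightly fuller 'apparent' SID."""
    return (na + k) - (cl + lactate)

# Assumed reference range for this simplified SID (illustrative only)
SID_NORMAL_LOW, SID_NORMAL_HIGH = 36.0, 42.0

def interpret_sid(sid):
    if sid < SID_NORMAL_LOW:
        return "SID decreased -> metabolic (strong ion) acidosis"
    if sid > SID_NORMAL_HIGH:
        return "SID increased -> metabolic (strong ion) alkalosis"
    return "SID within the assumed normal range"

# Example patient: hyperchloraemia with mild lactate elevation (made-up numbers)
sid = strong_ion_difference(na=140.0, cl=112.0, k=4.0, lactate=4.0)
print(f"apparent SID = {sid:.1f} mEq/L -> {interpret_sid(sid)}")
```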

  15. Quantitative methods and socio-economic applications in GIS

    CERN Document Server

    Wang, Fahui

    2014-01-01

    GIS AND BASIC SPATIAL ANALYSIS TASKS: Getting Started with ArcGIS: Data Management and Basic Spatial Analysis Tools; Spatial and Attribute Data Management in ArcGIS; Spatial Analysis Tools in ArcGIS: Queries, Spatial Joins, and Map Overlays; Case Study 1: Mapping and Analyzing Population Density Pattern in Baton Rouge, Louisiana; Summary; Identifying Contiguous Polygons by Spatial Analysis Tools; Measuring Distance and Time; Measures of Distance; Computing Network Distance and Time; The Distance Decay Rule; Case Study 2: Computing Distances and Tra...

  16. Trois essais de méthodologie quantitative

    CERN Document Server

    Laurencelle, Louis; Allaire, Denis

    1994-01-01

    This volume testifies in its own way to the fertility of the field of quantitative methodology by presenting three texts covering very different applications: correlation, analysis of variance, and the precision of a statistical estimator.

  17. Quantitative autoradiographic microimaging in the development and evaluation of radiopharmaceuticals

    Energy Technology Data Exchange (ETDEWEB)

    Som, P. [Brookhaven National Lab., Upton, NY (United States); Oster, Z.H. [State Univ. of New York, Stony Brook, NY (United States)

    1994-04-01

    Autoradiographic (ARG) microimaging is the method for depicting the biodistribution of radiocompounds with the highest spatial resolution. ARG is applicable to gamma, positron and negatron emitting radiotracers. Dual- or multiple-isotope studies can be performed using half-lives and energies for discrimination of isotopes. Quantitation can be performed by digital videodensitometry and by newer filmless technologies. ARGs obtained at different time intervals provide the time dimension for the determination of kinetics.

  18. Quantitative fabrication of functional polymer surfaces

    Science.gov (United States)

    Rengifo, Hernan R.

    Polymeric surfaces and films have very broad applications in industry. They have been employed as anticorrosive, abrasive and decorative coatings for many years. More recently, the applications of functional polymer films in microelectronics, optics, nanocomposites, DNA microarrays, and enzyme immobilization have drawn a lot of attention. There are a number of challenges associated with the implementation of functional polymeric surfaces, and these challenges are especially important in the field of surface modification. In this thesis, three different challenges in the field of polymeric functional surfaces are addressed. First, a set of rules for molecular design according to the surface needs is presented in chapters 3 and 4. Second, some latent energy source must be incorporated into the material design to quantitatively modify a surface. Third, the morphology of the surface, the method used to fabricate the designed surface, and their new applications are presented in chapters 4 and 5. The new polymeric surface functionalization method described in Chapter 3 is based upon an end-functionalized diblock copolymer designed to self-assemble on both hard and soft surfaces. It is demonstrated that alkyne end-functional diblock copolymers can be used to provide precise control over areal densities of reactive functionality. The areal density of alkyne functional groups is precisely controlled by adjusting the thickness of the block copolymer monolayer, which is accomplished by changing either the spin coating conditions (i.e., rotational speed and solution concentration) or the copolymer molecular weight. The modified surfaces are characterized by atomic force microscopy (AFM), contact angle, ellipsometry, fluorescent imaging and angle-dependent X-ray photoelectron spectroscopy (ADXPS) measurements. In Chapter 4, a simple means is demonstrated to covalently bond DNA to polymer-modified substrates; the method provides quantitative control of the DNA

  19. Computer architecture a quantitative approach

    CERN Document Server

    Hennessy, John L

    2019-01-01

    Computer Architecture: A Quantitative Approach, Sixth Edition has been considered essential reading by instructors, students and practitioners of computer design for over 20 years. The sixth edition of this classic textbook is fully revised with the latest developments in processor and system architecture. It now features examples from the RISC-V (RISC Five) instruction set architecture, a modern RISC instruction set developed and designed to be a free and openly adoptable standard. It also includes a new chapter on domain-specific architectures and an updated chapter on warehouse-scale computing that features the first public information on Google's newest WSC. True to its original mission of demystifying computer architecture, this edition continues the longstanding tradition of focusing on areas where the most exciting computing innovation is happening, while always keeping an emphasis on good engineering design.

  20. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence demonstrating that artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We also have found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain of the features, this does not mean that the paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained from only applying the equations of geometrical optics.

  1. Quantitative Evaluation of Veto Power

    Directory of Open Access Journals (Sweden)

    Michela Chessa

    2011-01-01

    Full Text Available The decisiveness index and the loose protectionism index for a single player have been introduced, starting from the decisiveness and loose protectionism indices for a collective decision-making mechanism defined by Carreras. Attention was mainly focused on the latter index, which is proposed as a quantitative measure of the veto power of each agent. According to this index, a veto player has veto power equal to one, while every other player has a fractional power according to her/his likelihood of blocking a given proposal. Such an index coincides with the expected payoff at the Bayesian equilibrium of a suitable Bayesian game, which illustrates the non-cooperative point of view of a decision-making mechanism. (original abstract)
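    One plausible way to read the blocking-power idea sketched above is as the probability, over equiprobable yes/no votes of the other agents, that a proposal which would pass with player i's support fails without it; under this reading a veto player scores exactly one. The enumeration below implements that reading for a small weighted majority game. The weights and quota are invented, and the formula is offered as an illustration, not as Carreras's exact definition.

```python
from itertools import combinations

def blocking_power(weights, quota):
    """For each player i: among coalitions S of the other players such that
    S + {i} reaches the quota, the fraction for which S alone does not.
    A veto player (present in every winning coalition) scores 1.0."""
    n = len(weights)
    power = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        passing_with_i = blocked_without_i = 0
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                w = sum(weights[j] for j in S)
                if w + weights[i] >= quota:      # passes with i's support...
                    passing_with_i += 1
                    if w < quota:                # ...but fails without it
                        blocked_without_i += 1
        power.append(blocked_without_i / passing_with_i if passing_with_i else 0.0)
    return power

# Illustrative game: quota 51 out of 100 votes; player 0 holds a majority share
weights = [50, 20, 20, 10]
print(blocking_power(weights, quota=51))   # player 0 is a veto player -> 1.0
```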

  2. Quantitative techniques for musculoskeletal MRI at 7 Tesla.

    Science.gov (United States)

    Bangerter, Neal K; Taylor, Meredith D; Tarbox, Grayson J; Palmer, Antony J; Park, Daniel J

    2016-12-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems.

  3. Quantitative wood anatomy - practical guidelines

    Directory of Open Access Journals (Sweden)

    Georg evon Arx

    2016-06-01

    Full Text Available Quantitative wood anatomy analyzes the variability of xylem anatomical features in trees, shrubs and herbaceous species to address research questions related to plant functioning, growth and environment. Among the more frequently considered anatomical features are the lumen dimensions and wall thickness of conducting cells, fibers and several ray properties. The structural properties of each xylem anatomical feature are mostly fixed once they are formed, and define to a large extent its functionality, including the transport and storage of water, nutrients, sugars and hormones, and the provision of mechanical support. The anatomical features can often be localized within an annual growth ring, which makes it possible to establish intra-annual past and present structure-function relationships and their sensitivity to environmental variability. However, there are many methodological obstacles to overcome when aiming at producing (large) data sets of xylem anatomical data. Here we describe the different steps from wood sample collection to xylem anatomical data, provide guidance and identify pitfalls, and present different image-analysis tools for the quantification of anatomical features, in particular conducting cells. We show that each data production step, from sample collection in the field, microslide preparation in the lab, and image capturing through an optical microscope to image analysis with specific tools, can readily introduce measurement errors of between 5 and 30% or more, with the magnitude usually increasing as the anatomical features become smaller. Such measurement errors, if not avoided or corrected, may make it impossible to extract meaningful xylem anatomical data in light of the rather small range of variability in many anatomical features as observed, for example, within time series of individual plants. Following a rigorous protocol and quality control as proposed in this paper is thus mandatory to use quantitative data of xylem anatomical features as a powerful

  4. Statistical significance of quantitative PCR

    Directory of Open Access Journals (Sweden)

    Mazza Christian

    2007-04-01

    Full Text Available Abstract. Background: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods, and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. Results: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. Conclusion: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimations of relative expression ratios of two-fold or higher, and our analysis provides an estimation of the number of biological samples that have to be analyzed to achieve a given precision.
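    A common efficiency-aware strategy of the kind discussed above is the efficiency-corrected ratio method (Pfaffl), in which the amplification efficiency of each amplicon enters the relative expression estimate explicitly. The sketch below shows that calculation with made-up Ct values and efficiencies; it is offered as background, not as one of the specific strategies evaluated in the paper.

```python
def relative_expression(e_target, e_ref, ct_target_ctrl, ct_target_sample,
                        ct_ref_ctrl, ct_ref_sample):
    """Efficiency-corrected relative expression ratio (Pfaffl-style):
    ratio = E_target^(dCt_target) / E_ref^(dCt_ref),
    with dCt = Ct(control) - Ct(sample) and E close to 2 for ideal PCR."""
    d_ct_target = ct_target_ctrl - ct_target_sample
    d_ct_ref = ct_ref_ctrl - ct_ref_sample
    return (e_target ** d_ct_target) / (e_ref ** d_ct_ref)

# Made-up example: the target gene comes up ~2 cycles earlier after treatment,
# while the reference gene is unchanged.
ratio = relative_expression(e_target=1.95, e_ref=2.0,
                            ct_target_ctrl=24.0, ct_target_sample=22.0,
                            ct_ref_ctrl=18.0, ct_ref_sample=18.0)
print(f"relative expression ratio: {ratio:.2f}")   # about 3.8-fold up-regulation
```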

  5. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four Sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand the driving forces behind social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented towards recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  6. Semi-automatic quantitative measurements of intracranial internal carotid artery stenosis and calcification using CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Bleeker, Leslie; Berg, Rene van den; Majoie, Charles B. [Academic Medical Center, Department of Radiology, Amsterdam (Netherlands); Marquering, Henk A. [Academic Medical Center, Department of Radiology, Amsterdam (Netherlands); Academic Medical Center, Department of Biomedical Engineering and Physics, Amsterdam (Netherlands); Nederkoorn, Paul J. [Academic Medical Center, Department of Neurology, Amsterdam (Netherlands)

    2012-09-15

    Intracranial carotid artery atherosclerotic disease is an independent predictor of recurrent stroke. However, its quantitative assessment is not routinely performed in clinical practice. In this diagnostic study, we present and evaluate a novel semi-automatic application to quantitatively measure intracranial internal carotid artery (ICA) degree of stenosis and calcium volume in CT angiography (CTA) images. In this retrospective study involving CTA images of 88 consecutive patients, intracranial ICA stenosis was quantitatively measured by two independent observers. Stenoses were categorized with cutoff values of 30% and 50%. Calcification in the intracranial ICA was qualitatively categorized as absent, mild, moderate, or severe and quantitatively measured using the semi-automatic application. Linear weighted kappa values were calculated to assess the interobserver agreement of the stenosis and calcium categorization. The average and standard deviation of the quantitative calcium volume were calculated for each calcium category. For the stenosis measurements, the CTA images of 162 arteries yielded an interobserver correlation of 0.78 (P < 0.001). Kappa values for the categorized stenosis measurements were moderate: 0.45 and 0.58 for cutoff values of 30% and 50%, respectively. The kappa value for the calcium categorization was 0.62, with good agreement between the qualitative and quantitative calcium assessment. Quantitative measurement of the degree of stenosis of the intracranial ICA on CTA is feasible with good interobserver agreement. Qualitative calcium categorization agrees well with quantitative measurements. (orig.)
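    Linear weighted kappa for categorized gradings of the kind reported above can be computed directly with scikit-learn; the two observers' category labels below are invented for illustration.

```python
from sklearn.metrics import cohen_kappa_score

# Invented stenosis categories (0: <30%, 1: 30-50%, 2: >50%) for two observers
observer_1 = [0, 0, 1, 2, 1, 0, 2, 1, 0, 2, 1, 1]
observer_2 = [0, 1, 1, 2, 1, 0, 1, 1, 0, 2, 2, 1]

kappa = cohen_kappa_score(observer_1, observer_2, weights="linear")
print(f"linear weighted kappa: {kappa:.2f}")
```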

  7. PPINGUIN: Peptide Profiling Guided Identification of Proteins improves quantitation of iTRAQ ratios

    Directory of Open Access Journals (Sweden)

    Bauer Chris

    2012-02-01

    Full Text Available Abstract. Background: Recent development of novel technologies paved the way for quantitative proteomics. One of the most important among them is iTRAQ, employing isobaric tags for relative or absolute quantitation. Despite large progress in technology development, many challenges still remain for the derivation and interpretation of quantitative results. One of these challenges is the consistent assignment of peptides to proteins. Results: We have developed Peptide Profiling Guided Identification of Proteins (PPINGUIN), a statistical analysis workflow for iTRAQ data addressing the problem of ambiguous peptide quantitations. Motivated by the assumption that peptides uniquely derived from the same protein are correlated, our method employs clustering as a very early step in data processing, prior to protein inference. Our method increases experimental reproducibility and decreases the variability of quantitations of peptides assigned to the same protein. Giving further support to our method, application to a type 2 diabetes dataset identifies a list of protein candidates that is in very good agreement with a previously performed transcriptomics meta-analysis. Making use of quantitative properties of the signal patterns identified, PPINGUIN can reveal new isoform candidates. Conclusions: Regarding the increasing importance of quantitative proteomics, we think that this method will be useful in practical applications like model fitting or functional enrichment analysis. We recommend using this method if quantitation is a major objective of the research.

  8. PPINGUIN: Peptide Profiling Guided Identification of Proteins improves quantitation of iTRAQ ratios.

    Science.gov (United States)

    Bauer, Chris; Kleinjung, Frank; Rutishauser, Dorothea; Panse, Christian; Chadt, Alexandra; Dreja, Tanja; Al-Hasani, Hadi; Reinert, Knut; Schlapbach, Ralph; Schuchhardt, Johannes

    2012-02-16

    Recent development of novel technologies paved the way for quantitative proteomics. One of the most important among them is iTRAQ, employing isobaric tags for relative or absolute quantitation. Despite large progress in technology development, still many challenges remain for derivation and interpretation of quantitative results. One of these challenges is the consistent assignment of peptides to proteins. We have developed Peptide Profiling Guided Identification of Proteins (PPINGUIN), a statistical analysis workflow for iTRAQ data addressing the problem of ambiguous peptide quantitations. Motivated by the assumption that peptides uniquely derived from the same protein are correlated, our method employs clustering as a very early step in data processing prior to protein inference. Our method increases experimental reproducibility and decreases variability of quantitations of peptides assigned to the same protein. Giving further support to our method, application to a type 2 diabetes dataset identifies a list of protein candidates that is in very good agreement with previously performed transcriptomics meta analysis. Making use of quantitative properties of signal patterns identified, PPINGUIN can reveal new isoform candidates. Regarding the increasing importance of quantitative proteomics we think that this method will be useful in practical applications like model fitting or functional enrichment analysis. We recommend to use this method if quantitation is a major objective of research.
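    The clustering-before-inference idea can be illustrated with a few lines of SciPy: peptide quantitation profiles across the reporter channels are grouped by correlation distance, so that peptides whose profiles do not co-vary with the rest of a protein's peptides end up in separate clusters. This is a generic sketch of the idea on random example data, not the PPINGUIN implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# Example data: 8 peptides x 4 iTRAQ reporter channels (log ratios).
# Peptides 0-5 follow one regulation profile, peptides 6-7 another.
base_a = np.array([0.0, 0.8, 1.6, 0.4])
base_b = np.array([0.0, -0.5, -1.0, -0.2])
profiles = np.vstack([base_a + 0.1 * rng.normal(size=4) for _ in range(6)] +
                     [base_b + 0.1 * rng.normal(size=4) for _ in range(2)])

# Hierarchical clustering with correlation distance (1 - Pearson r)
dist = pdist(profiles, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=0.3, criterion="distance")
print("cluster assignment per peptide:", clusters)
```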

  9. Quantitative NDE of Composite Structures at NASA

    Science.gov (United States)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

    The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over a long time. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include the investigation of conventional, guided wave, and phase sensitive ultrasonic methods, infrared thermography and x-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper will focus on current research activities related to large area NDE for rapidly characterizing aerospace composites.

  10. A Quantitative Index of Forest Structural Sustainability

    Directory of Open Access Journals (Sweden)

    Jonathan A. Cale

    2014-07-01

    Full Text Available Forest health is a complex concept including many ecosystem functions, interactions and values. We develop a quantitative system applicable to many forest types to assess tree mortality with respect to stable forest structure and composition. We quantify the impacts of observed tree mortality on structure by comparison to baseline mortality, and then develop a system that distinguishes between structurally stable and unstable forests. An empirical multivariate index of structural sustainability and a threshold value (70.6), derived from 22 nontropical tree species’ datasets, differentiated structurally sustainable from unsustainable diameter distributions. Twelve of the 22 species populations were sustainable, with a mean score of 33.2 (median = 27.6). Ten species populations were unsustainable, with a mean score of 142.6 (median = 130.1). Among them, the unsustainability of Fagus grandifolia, Pinus lambertiana, P. ponderosa, and Nothofagus solandri was attributable to known disturbances, whereas that of Abies balsamea, Acer rubrum, Calocedrus decurrens, Picea engelmannii, P. rubens, and Prunus serotina populations was not. This approach provides the ecological framework for rational management decisions using routine inventory data to objectively: determine the scope and direction of change in structure and composition, assess excessive or insufficient mortality, compare disturbance impacts in time and space, and prioritize management needs and the allocation of scarce resources.

  11. Expert judgement models in quantitative risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rosqvist, T. [VTT Automation, Helsinki (Finland); Tuominen, R. [VTT Automation, Tampere (Finland)

    1999-12-01

    Expert judgement is a valuable source of information in risk management. Especially, risk-based decision making relies significantly on quantitative risk assessment, which requires numerical data describing the initiator event frequencies and conditional probabilities in the risk model. This data is seldom found in databases and has to be elicited from qualified experts. In this report, we discuss some modelling approaches to expert judgement in risk modelling. A classical and a Bayesian expert model is presented and applied to real case expert judgement data. The cornerstone in the models is the log-normal distribution, which is argued to be a satisfactory choice for modelling degree-of-belief type probability distributions with respect to the unknown parameters in a risk model. Expert judgements are qualified according to bias, dispersion, and dependency, which are treated differently in the classical and Bayesian approaches. The differences are pointed out and related to the application task. Differences in the results obtained from the different approaches, as applied to real case expert judgement data, are discussed. Also, the role of a degree-of-belief type probability in risk decision making is discussed.
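    A minimal illustration of the log-normal treatment described above: if each expert's estimate of an event frequency is read as a sample from a log-normal degree-of-belief distribution, a simple equal-weight, bias-free pooling works with the logarithms of the estimates. The numbers and the equal-weight pooling are illustrative assumptions, not the classical or Bayesian models of the report.

```python
import math
import statistics

# Three experts' estimates of an initiator event frequency (events per year)
estimates = [1e-4, 3e-4, 2e-5]

logs = [math.log(x) for x in estimates]
mu = statistics.mean(logs)                 # central value in log space
sigma = statistics.stdev(logs)             # dispersion in log space

median = math.exp(mu)                      # for a log-normal, median = exp(mu)
mean = math.exp(mu + 0.5 * sigma**2)       # mean of the pooled log-normal

print(f"pooled log-normal: median = {median:.2e}/yr, mean = {mean:.2e}/yr, "
      f"sigma(log) = {sigma:.2f}")
```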

  12. Light-induced quantitative microprinting of biomolecules

    Science.gov (United States)

    Strale, Pierre-Olivier; Azioune, Ammar; Bugnicourt, Ghislain; Lecomte, Yohan; Chahid, Makhlad; Studer, Vincent

    2017-02-01

    Printing of biomolecules on substrates has developed tremendously in the past few years. Existing methods either rely on slow serial writing processes or on parallelized photolithographic techniques where cumbersome mask alignment procedures usually impair the ability to generate multi-protein patterns. We recently developed a new technology allowing for high-resolution multi-protein micro-patterning. This technology, named "Light-Induced Molecular Adsorption of Proteins" (LIMAP), is based on a water-soluble photo-initiator able to reverse the antifouling property of polymer brushes when exposed to UV light. We developed a wide-field pattern projection system based on a DMD coupled to a conventional microscope, which makes it possible to generate arbitrary grayscale patterns of UV light at the micron scale. Interestingly, the density of adsorbed molecules scales with the dose of UV light, thus allowing the quantitative patterning of biomolecules. The very low non-specific background of biomolecules outside of the UV-exposed areas allows for the sequential printing of multiple proteins without alignment procedures. Protein patterns ranging from 500 nm up to 1 mm can be produced within seconds, as well as gradients of arbitrary shape. The range of applications of the LIMAP approach extends from the single-molecule up to the multicellular scale, with exquisite control over local protein density. We show that it can be used to generate complex protein landscapes useful for studying protein-protein, cell-cell and cell-matrix interactions.

  13. Quantitative approach of speleothems fluorescence

    Science.gov (United States)

    Quiers, Marine; Perrette, Yves; Poulenard, Jérôme; Chalmin, Emilie; Revol, Morgane

    2014-05-01

    In this study, we propose a framework for quantitatively interpreting the fluorescence of speleothem organic matter (OM) by means of a bank of water-extracted organic matter. Because of its efficiency in describing dissolved organic matter (DOM) characteristics, fluorescence has been used to determine DOM signatures in natural systems, water circulation, OM transfer from soils, OM evolution in soils and, more recently, DOM changes in engineered treatment systems. Fluorescence has also been used in speleothem studies, mainly as a growth indicator; only a few studies interpret it as an environmental proxy. Indeed, the fluorescence of OM provides information on the type of organic molecules trapped in speleothems and their evolution, but the most direct information given by fluorescence is the variation in OM quantity. An increase in fluorescence intensity is generally related to an increase in OM quantity, but it may also be induced by calcite optical effects or by a qualitative change in the OM. However, the analytical techniques used in water environments cannot be applied directly to speleothem samples. In this study we therefore propose a framework for interpreting the fluorescence signal of speleothems quantitatively. Three stalagmite samples from the French northern Prealps were used. To allow quantification of the fluorescence signal, we need to measure the fluorescence and the quantity of organic matter on the same sample. The OM of the speleothems was extracted by an acid digestion method and analysed with a spectrofluorimeter. However, it was not possible to quantify the OM directly, as the extraction solvent was a highly concentrated acid. To solve this problem, a calibration using soil extracts was carried out. Soils were chosen to represent the diversity of OM present in the environment above the caves, with attention focused on soil and vegetation types and land use. Organic material was water-extracted from the soils and its fluorescence was also measured. Total organic carbon was performed on the
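    The soil-extract calibration described above amounts to a regression of organic matter content against fluorescence intensity, which is then applied to the speleothem extracts; the sketch below shows that step with scipy.stats.linregress on invented numbers.

```python
import numpy as np
from scipy.stats import linregress

# Invented calibration data from water-extracted soil OM:
# total organic carbon (mg/L) vs fluorescence intensity (arbitrary units)
toc = np.array([1.0, 2.5, 5.0, 7.5, 10.0, 15.0])
fluo = np.array([12.0, 31.0, 58.0, 90.0, 118.0, 176.0])

fit = linregress(fluo, toc)   # calibrate TOC as a function of fluorescence
print(f"TOC = {fit.slope:.3f} * F + {fit.intercept:.3f}  (r^2 = {fit.rvalue**2:.3f})")

# Apply the calibration to a speleothem extract measurement (assumed intensity)
speleothem_fluorescence = 44.0
estimated_toc = fit.slope * speleothem_fluorescence + fit.intercept
print(f"estimated OM content of the speleothem extract: {estimated_toc:.2f} mg/L TOC")
```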

  14. Quantitative phase-field model for phase transformations in multi-component alloys

    Energy Technology Data Exchange (ETDEWEB)

    Choudhury, Abhik Narayan

    2013-08-01

    Phase-field modeling has spread to a variety of applications involving phase transformations. While the method has wide applicability, the derivation of quantitative predictions requires a deeper understanding of the coupling between the system and the model parameters. The present work highlights a novel phase-field model based on a grand-potential formalism allowing for an elegant and efficient solution to problems in phase transformations. In particular, applications involving single- and multi-phase, multi-component solidification have been investigated, and the quantitative modeling of these problems has been examined thoroughly.

  15. A transformative model for undergraduate quantitative biology education.

    Science.gov (United States)

    Usher, David C; Driscoll, Tobin A; Dhurjati, Prasad; Pelesko, John A; Rossi, Louis F; Schleiniger, Gilberto; Pusecker, Kathleen; White, Harold B

    2010-01-01

    The BIO2010 report recommended that students in the life sciences receive a more rigorous education in mathematics and physical sciences. The University of Delaware approached this problem by (1) developing a bio-calculus section of a standard calculus course, (2) embedding quantitative activities into existing biology courses, and (3) creating a new interdisciplinary major, quantitative biology, designed for students interested in solving complex biological problems using advanced mathematical approaches. To develop the bio-calculus sections, the Department of Mathematical Sciences revised its three-semester calculus sequence to include differential equations in the first semester and, rather than using examples traditionally drawn from application domains that are most relevant to engineers, drew models and examples heavily from the life sciences. The curriculum of the B.S. degree in Quantitative Biology was designed to provide students with a solid foundation in biology, chemistry, and mathematics, with an emphasis on preparation for research careers in life sciences. Students in the program take core courses from biology, chemistry, and physics, though mathematics, as the cornerstone of all quantitative sciences, is given particular prominence. Seminars and a capstone course stress how the interplay of mathematics and biology can be used to explain complex biological systems. To initiate these academic changes required the identification of barriers and the implementation of solutions.

  16. Automated identification of pathways from quantitative genetic interaction data

    Science.gov (United States)

    Battle, Alexis; Jonikas, Martin C; Walter, Peter; Weissman, Jonathan S; Koller, Daphne

    2010-01-01

    High-throughput quantitative genetic interaction (GI) measurements provide detailed information regarding the structure of the underlying biological pathways by reporting on functional dependencies between genes. However, the analytical tools for fully exploiting such information lag behind the ability to collect these data. We present a novel Bayesian learning method that uses quantitative phenotypes of double knockout organisms to automatically reconstruct detailed pathway structures. We applied our method to a recent data set that measures GIs for endoplasmic reticulum (ER) genes, using the unfolded protein response as a quantitative phenotype. The results provided reconstructions of known functional pathways including N-linked glycosylation and ER-associated protein degradation. It also contained novel relationships, such as the placement of SGT2 in the tail-anchored biogenesis pathway, a finding that we experimentally validated. Our approach should be readily applicable to the next generation of quantitative GI data sets, as assays become available for additional phenotypes and eventually higher-level organisms. PMID:20531408

  17. Fully automated quantitative cephalometry using convolutional neural networks.

    Science.gov (United States)

    Arık, Sercan Ö; Ibragimov, Bulat; Xing, Lei

    2017-01-01

    Quantitative cephalometry plays an essential role in clinical diagnosis, treatment, and surgery. Development of fully automated techniques for these procedures is important to enable consistently accurate computerized analyses. We study the application of deep convolutional neural networks (CNNs) for fully automated quantitative cephalometry for the first time. The proposed framework utilizes CNNs for detection of landmarks that describe the anatomy of the depicted patient and yield quantitative estimation of pathologies in the jaws and skull base regions. We use a publicly available cephalometric x-ray image dataset to train CNNs for recognition of landmark appearance patterns. CNNs are trained to output probabilistic estimations of different landmark locations, which are combined using a shape-based model. We evaluate the overall framework on the test set and compare with other proposed techniques. We use the estimated landmark locations to assess anatomically relevant measurements and classify them into different anatomical types. Overall, our results demonstrate high anatomical landmark detection accuracy ([Formula: see text] to 2% higher success detection rate for a 2-mm range compared with the top benchmarks in the literature) and high anatomical type classification accuracy ([Formula: see text] average classification accuracy for test set). We demonstrate that CNNs, which merely input raw image patches, are promising for accurate quantitative cephalometry.

  18. Quantitative methods for analyzing cell-cell adhesion in development.

    Science.gov (United States)

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis, it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually requires working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Quantitative Skills as a Graduate Learning Outcome of University Science Degree Programmes: Student Performance Explored through the "Planned-Enacted-Experienced" Curriculum Model

    Science.gov (United States)

    Matthews, Kelly E.; Adams, Peter; Goos, Merrilyn

    2016-01-01

    Application of mathematical and statistical thinking and reasoning, typically referred to as quantitative skills, is essential for university bioscience students. First, this study developed an assessment task intended to gauge graduating students' quantitative skills. The Quantitative Skills Assessment of Science Students (QSASS) was the result,…

  20. Quantitative magnetic resonance imaging phantoms: A review and the need for a system phantom.

    Science.gov (United States)

    Keenan, Kathryn E; Ainslie, Maureen; Barker, Alex J; Boss, Michael A; Cecil, Kim M; Charles, Cecil; Chenevert, Thomas L; Clarke, Larry; Evelhoch, Jeffrey L; Finn, Paul; Gembris, Daniel; Gunter, Jeffrey L; Hill, Derek L G; Jack, Clifford R; Jackson, Edward F; Liu, Guoying; Russek, Stephen E; Sharma, Samir D; Steckner, Michael; Stupic, Karl F; Trzasko, Joshua D; Yuan, Chun; Zheng, Jie

    2018-01-01

    The MRI community is using quantitative mapping techniques to complement qualitative imaging. For quantitative imaging to reach its full potential, it is necessary to analyze measurements across systems and longitudinally. Clinical use of quantitative imaging can be facilitated through adoption and use of a standard system phantom, a calibration/standard reference object, to assess the performance of an MRI machine. The International Society of Magnetic Resonance in Medicine AdHoc Committee on Standards for Quantitative Magnetic Resonance was established in February 2007 to facilitate the expansion of MRI as a mainstream modality for multi-institutional measurements, including, among other things, multicenter trials. The goal of the Standards for Quantitative Magnetic Resonance committee was to provide a framework to ensure that quantitative measures derived from MR data are comparable over time, between subjects, between sites, and between vendors. This paper, written by members of the Standards for Quantitative Magnetic Resonance committee, reviews standardization attempts and then details the need, requirements, and implementation plan for a standard system phantom for quantitative MRI. In addition, application-specific phantoms and implementation of quantitative MRI are reviewed. Magn Reson Med 79:48-61, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  1. Quantitative relationships in delphinid neocortex.

    Science.gov (United States)

    Mortensen, Heidi S; Pakkenberg, Bente; Dam, Maria; Dietz, Rune; Sonne, Christian; Mikkelsen, Bjarni; Eriksen, Nina

    2014-01-01

    Possessing large brains and complex behavioral patterns, cetaceans are believed to be highly intelligent. Their brains, which are the largest in the Animal Kingdom and have enormous gyrification compared with terrestrial mammals, have long been of scientific interest. Few studies, however, report total number of brain cells in cetaceans, and even fewer have used unbiased counting methods. In this study, using stereological methods, we estimated the total number of cells in the neocortex of the long-finned pilot whale (Globicephala melas) brain. For the first time, we show that a species of dolphin has more neocortical neurons than any mammal studied to date including humans. These cell numbers are compared across various mammals with different brain sizes, and the function of possessing many neurons is discussed. We found that the long-finned pilot whale neocortex has approximately 37.2 × 10^9 neurons, which is almost twice as many as humans, and 127 × 10^9 glial cells. Thus, the absolute number of neurons in the human neocortex is not correlated with the superior cognitive abilities of humans (at least compared to cetaceans) as has previously been hypothesized. However, as neuron density in long-finned pilot whales is lower than that in humans, their higher cell number appears to be due to their larger brain. Accordingly, our findings make an important contribution to the ongoing debate over quantitative relationships in the mammalian brain.

  2. Quantitative relationships in delphinid neocortex

    Directory of Open Access Journals (Sweden)

    Heidi S Mortensen

    2014-11-01

    Possessing large brains and complex behavioural patterns, cetaceans are believed to be highly intelligent. Their brains, which are the largest in the Animal Kingdom and have enormous gyrification compared with terrestrial mammals, have long been of scientific interest. Few studies, however, report total number of brain cells in cetaceans, and even fewer have used unbiased counting methods. In this study, using stereological methods, we estimated the total number of cells in the long-finned pilot whale (Globicephala melas) brain. For the first time, we show that a species of dolphin has more neocortical neurons than any mammal studied to date including humans. These cell numbers are compared across various mammals with different brain sizes, and the function of possessing many neurons is discussed. We found that the long-finned pilot whale neocortex has approximately 37.2 × 10⁹ neurons, which is almost twice as many as humans, and 127 × 10⁹ glial cells. Thus, the absolute number of neurons in the human neocortex is not correlated with the superior cognitive abilities of humans (at least compared to cetaceans) as has previously been hypothesized. However, as neuron density in long-finned pilot whales is lower than that in humans, their higher cell number appears to be due to their larger brain. Accordingly, our findings make an important contribution to the ongoing debate over quantitative relationships in the mammalian brain.

  3. A quantitative philology of introspection

    Directory of Open Access Journals (Sweden)

    Carlos Diuk

    2012-09-01

    The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the "Axial Age", saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy - which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single "arrow of time" in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the 20th century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology, and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus.
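
    The abstract does not spell out which similarity measure was used; as a minimal illustration of the general idea, the Python sketch below computes cosine similarity between bag-of-words frequency vectors of two short texts. This is an assumed, generic metric chosen for illustration, not necessarily the one the authors applied.

        # Minimal sketch of one common semantic-similarity measure: cosine similarity
        # between word-frequency vectors. An illustrative assumption, not the authors' metric.
        from collections import Counter
        import math

        def cosine_similarity(text_a, text_b):
            a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
            dot = sum(a[w] * b[w] for w in set(a) & set(b))
            norm = (math.sqrt(sum(v * v for v in a.values()))
                    * math.sqrt(sum(v * v for v in b.values())))
            return dot / norm if norm else 0.0

        print(cosine_similarity("know thyself and reflect on thought",
                                "reflect on what you know of yourself"))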

  4. Corporate qualitative and quantitative assessment

    Directory of Open Access Journals (Sweden)

    Maria-Monica Haralambie

    2016-08-01

    After the financial crisis, two key concerns have been raised regarding banks’ activities: “too little, too late” provisioning for loan losses and “too big to fail”. The credit risk management subject became not only a compliance exercise for banks, but also a key item considered when establishing the strategy and execution path. Our intention within this paper is to discuss some of the specific issues related to credit risk management, considered by commercial banks when analysing a corporate client. The result of this research is a web application named CISS (Credit Institution Scoring System), which represents a proof of concept for a bank credit scoring system. The application was developed using HTML + MySQL + PHP solutions.

  5. Number Sense: The Underpinning Understanding for Early Quantitative Literacy

    Directory of Open Access Journals (Sweden)

    Effie Maclellan

    2012-07-01

    The fundamental meaning of Quantitative Literacy (QL) as the application of quantitative knowledge or reasoning in new/unfamiliar contexts is problematic because how we acquire knowledge, and transfer it to new situations, is not straightforward. This article argues that in the early development of QL, there is a specific corpus of numerical knowledge which learners need to integrate into their thinking, and to which teachers should attend. The paper is a rebuttal to historically prevalent (and simplistic) views that the terrain of early numerical understanding is little more than simple counting devoid of cognitive complexity. Rather, the knowledge upon which early QL develops comprises interdependent dimensions: Number Knowledge, Counting Skills and Principles, Nonverbal Calculation, Number Combinations and Story Problems - summarised as Number Sense. In order to derive the findings for this manuscript, a realist synthesis of recent Education and Psychology literature was conducted. The findings are of use not only when teaching very young children, but also when teaching learners who are experiencing learning difficulties through the absence of prerequisite numerical knowledge. As well as distilling fundamental quantitative knowledge for teachers to integrate into practice, the review emphasises that improved pedagogy is less a function of literal application of reported interventions, adopted on the grounds of perceived efficacy elsewhere, than of refinements in teachers' own understandings. Because teachers need to adapt instructional sequences to the actual thinking and learning of learners in their charge, they need knowledge that allows them to develop their own theoretical understanding rather than didactic exhortations.

  6. MR urography: Anatomical and quantitative information on ...

    African Journals Online (AJOL)

    MR urography: Anatomical and quantitative information on congenital malformations in children. Maria Karaveli, Dimitrios Katsanidis, Ioannis Kalaitzoglou, Afroditi Haritanti, Anastasios Sioundas, Athanasios Dimitriadis, Kyriakos Psarrakos ...

  7. Quantitative Wood Anatomy-Practical Guidelines

    National Research Council Canada - National Science Library

    von Arx, Georg; Crivellaro, Alan; Prendin, Angela L; Čufar, Katarina; Carrer, Marco

    2016-01-01

    Quantitative wood anatomy analyzes the variability of xylem anatomical features in trees, shrubs, and herbaceous species to address research questions related to plant functioning, growth, and environment...

  8. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    Science.gov (United States)

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  9. A new trend to determine biochemical parameters by quantitative FRET assays.

    Science.gov (United States)

    Liao, Jia-yu; Song, Yang; Liu, Yan

    2015-12-01

    Förster resonance energy transfer (FRET) has been widely used in biological and biomedical research because it can determine molecule or particle interactions within a range of 1-10 nm. The sensitivity and efficiency of FRET strongly depend on the distance between the FRET donor and acceptor. Historically, FRET assays have been used to quantitatively deduce molecular distances. However, another major potential application of the FRET assay has not been fully exploited, that is, the use of FRET signals to quantitatively describe molecular interactive events. In this review, we discuss the use of quantitative FRET assays for the determination of biochemical parameters, such as the protein interaction dissociation constant (Kd), enzymatic velocity (kcat) and Km. We also describe fluorescent microscopy-based quantitative FRET assays for protein interaction affinity determination in cells as well as fluorimeter-based quantitative FRET assays for protein interaction and enzymatic parameter determination in solution.
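
    To make the Kd determination concrete, the Python sketch below fits a simple one-site binding isotherm, signal = Fmax·[L]/(Kd + [L]), to a hypothetical FRET titration. The model choice, the data, and the scipy-based fit are illustrative assumptions, not the protocols described in the review.

        # Sketch (assumed model and made-up data): estimate Kd by fitting a one-site
        # binding isotherm to background-corrected FRET signal vs. ligand concentration.
        import numpy as np
        from scipy.optimize import curve_fit

        def one_site_binding(conc, f_max, k_d):
            return f_max * conc / (k_d + conc)

        conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])     # ligand, µM
        fret = np.array([0.08, 0.21, 0.48, 0.74, 0.90, 0.97, 0.99])  # FRET signal, a.u.

        (f_max, k_d), _ = curve_fit(one_site_binding, conc, fret, p0=[1.0, 1.0])
        print(f"Fitted F_max = {f_max:.2f} a.u., Kd = {k_d:.2f} µM")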

  10. Quantitative diagnostics of stratospheric mixing

    Science.gov (United States)

    Sobel, Adam Harrison

    1998-12-01

    This thesis addresses the planetary-scale mixing of tracers along isentropic surfaces in the extratropical winter stratosphere. The primary goal is a more fully quantitative understanding of the mixing than is available at present. The general problem of representing eddy mixing in a one-dimensional mean representation of a two-dimensional flow is discussed. The limitations of the eddy diffusion model are reviewed, and alternatives explored. The stratosphere may, for some purposes, be viewed as consisting of relatively well-mixed regions separated by moving, internal transport barriers. Methods for diagnosing transport across moving surfaces, such as tracer isosurfaces, from given flow and tracer fields are reviewed. The central results of the thesis involve diagnostic studies of output from a shallow water model of the stratosphere. It is first proved that in an inviscid shallow water atmosphere subject to mass sources and sinks, if the mass enclosed by a potential vorticity (PV) contour is steady in time, then the integral of the mass source over the area enclosed by the contour must be zero. Next, two different approaches are used to diagnose the time-averaged transport across PV contours in the model simulations. The first is the modified Lagrangian mean (MLM) approach, which relates the transport across PV contours to PV sources and sinks. The second is called 'local gradient reversal' (LGR), and is similar to contour advection with surgery. The model includes a sixth-order hyperdiffusion on the vorticity field. Except in a thin outer 'entrainment zone', the hyperdiffusion term has only a very weak effect on the MLM mass budget of the polar vortex edge. In the entrainment zone, the hyperdiffusion term has a significant effect. The LGR results capture this behavior, providing good quantitative estimates of the hyperdiffusion term, which is equivalent to the degree of radiative disequilibrium at a PV contour. This agreement shows that the main role of the
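
    One way to write the stated mass-budget result in symbols (a sketch reconstructed from the abstract's own wording, not quoted from the thesis): for an inviscid shallow water layer of depth h with mass source S, potential vorticity is materially conserved, so a PV contour is a material curve enclosing area A(t), and

        \frac{d}{dt} \iint_{A(t)} h \, dA = \iint_{A(t)} S \, dA .

    If the mass enclosed by the contour is steady in time, the left-hand side vanishes, and hence \iint_{A(t)} S \, dA = 0, matching the statement quoted in the abstract.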

  11. Gold nanoparticle immunochromatographic assay for quantitative detection of urinary RBP

    Directory of Open Access Journals (Sweden)

    XU Kuan

    2013-04-01

    A rapid quantitative detection of urinary RBP was established by using nano-gold immunochromatography (sandwich method and trisodium citrate reduction method), and a rapid immunochromatographic test strip was developed. The immunochromatographic test strip can quantitatively detect RBP within 15 minutes. The detection limit was 150 ng/mL and the detection range was from 150 to 5000 ng/mL. There were no cross-reactions with other kidney disease markers, such as urinary albumin (ALB), transferrin protein (TRF), β2-microglobulin (β2-MG), urinary fiber connecting protein (FN), and lysozyme (LZM). The results indicate that it is a quick and simple method with strong specificity, high sensitivity, and wide detection range. The rapid detection method will have extensive clinical applications in the early diagnosis of proximal tubular damage, kidney disease, diabetic nephropathy, and process monitoring.
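
    Quantitation with a strip of this kind is typically done by reading the test-line signal against a calibration curve; the Python sketch below fits a four-parameter logistic standard curve and inverts it to convert a measured signal into an RBP concentration. The calibration data and the 4PL model are illustrative assumptions, not the assay's published calibration.

        # Sketch (made-up calibration data, assumed 4PL model): convert test-line signal
        # into an RBP concentration via a fitted standard curve.
        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, bottom, top, ec50, hill):
            return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

        conc = np.array([150.0, 300.0, 600.0, 1200.0, 2500.0, 5000.0])  # ng/mL standards
        signal = np.array([0.05, 0.11, 0.24, 0.45, 0.71, 0.88])         # strip reader units

        (bottom, top, ec50, hill), _ = curve_fit(four_pl, conc, signal,
                                                 p0=[0.0, 1.0, 1000.0, 1.0])

        def signal_to_conc(y):
            # Invert the fitted 4PL curve (valid for bottom < y < top).
            return ec50 * ((top - bottom) / (y - bottom) - 1.0) ** (-1.0 / hill)

        print(f"Signal 0.30 -> approx. {signal_to_conc(0.30):.0f} ng/mL")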

  12. Quantitative single-molecule imaging by confocal laser scanning microscopy.

    Science.gov (United States)

    Vukojevic, Vladana; Heidkamp, Marcus; Ming, Yu; Johansson, Björn; Terenius, Lars; Rigler, Rudolf

    2008-11-25

    A new approach to quantitative single-molecule imaging by confocal laser scanning microscopy (CLSM) is presented. It relies on fluorescence intensity distribution to analyze the molecular occurrence statistics captured by digital imaging and enables direct determination of the number of fluorescent molecules and their diffusion rates without resorting to temporal or spatial autocorrelation analyses. Digital images of fluorescent molecules were recorded by using fast scanning and avalanche photodiode detectors. In this way the signal-to-background ratio was significantly improved, enabling direct quantitative imaging by CLSM. The potential of the proposed approach is demonstrated by using standard solutions of fluorescent dyes, fluorescently labeled DNA molecules, quantum dots, and the Enhanced Green Fluorescent Protein in solution and in live cells. The method was verified by using fluorescence correlation spectroscopy. The relevance for biological applications, in particular, for live cell imaging, is discussed.

  13. DySCo: Quantitating Associations of Membrane Proteins Using Two-Color Single-Molecule Tracking

    OpenAIRE

    Dunne, Paul D; Fernandes, Ricardo A; McColl, James; Yoon, Ji Won; James, John R.; Davis, Simon J.; Klenerman, David

    2009-01-01

    We present a general method called dynamic single-molecule colocalization for quantitating the associations of single cell surface molecules labeled with distinct autofluorescent proteins. The chief advantages of the new quantitative approach are that, in addition to stable interactions, it is capable of measuring nonconstitutive associations, such as those induced by the cytoskeleton, and it is applicable to situations where the number of molecules is small.

  14. Hyperspectral and differential CARS microscopy for quantitative chemical imaging in human adipocytes

    OpenAIRE

    Di Napoli, Claudia; Pope, Iestyn; Masia, Francesco; Watson, Peter; Langbein, Wolfgang; Borri, Paola

    2014-01-01

    In this work, we demonstrate the applicability of coherent anti-Stokes Raman scattering (CARS) micro-spectroscopy for quantitative chemical imaging of saturated and unsaturated lipids in human stem-cell derived adipocytes. We compare dual-frequency/differential CARS (D-CARS), which enables rapid imaging and simple data analysis, with broadband hyperspectral CARS microscopy analyzed using an unsupervised phase-retrieval and factorization method recently developed by us for quantitative chemica...

  15. Relative quantitation of proteins fractionated by the ProteomeLab PF 2D system using isobaric tags for relative and absolute quantitation (iTRAQ).

    Science.gov (United States)

    Skalnikova, Helena; Rehulka, Pavel; Chmelik, Josef; Martinkova, Jirina; Zilvarova, Michaela; Gadher, Suresh Jivan; Kovarova, Hana

    2007-11-01

    We describe an optimised protocol for application of isobaric tags for relative and absolute quantitation (iTRAQ) and tandem mass spectrometry to obtain relative quantitative data from peptides derived from tryptic digestions of proteins fractionated by using the 2D liquid-phase ProteomeLab PF 2D technique. This methodology is suitable for the quantitation of proteins from a pool of co-eluting proteins which are often difficult to identify for the purpose of candidate protein selection for biologically relevant qualitative/quantitative changes under experimental conditions or in disease states. iTRAQ quantitation also facilitates the possibility of result to result comparison using other methodologies such as UV protein quantitation via the ProteomeLab PF 2D technique. The optimised protocol outlined here allows relative quantitation by MALDI-TOF/TOF mass spectrometry with high sensitivity and without the need to perform 2D HPLC separation of labelled peptides. The overall outcome is the simplification in the data complexity and the ease of use of the labelling protocol.
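
    As a concrete illustration of the kind of relative quantitation iTRAQ reporter ions provide, the Python sketch below turns hypothetical peptide-level reporter-ion intensities into per-protein ratios against a reference channel. The channel layout, the normalization choice, and the median summary are generic assumptions, not the optimised protocol described above.

        # Generic sketch (made-up intensities, assumed workflow): peptide-level iTRAQ
        # reporter-ion intensities -> per-protein ratios relative to a reference channel.
        import numpy as np

        # Rows = peptides of one protein; columns = iTRAQ channels 114, 115, 116, 117.
        peptide_intensities = np.array([
            [1200.0, 1350.0, 2400.0, 1180.0],
            [ 980.0, 1010.0, 1995.0,  960.0],
            [1500.0, 1620.0, 3100.0, 1450.0],
        ])

        reference = 0  # treat channel 114 as the reference condition
        ratios = peptide_intensities / peptide_intensities[:, [reference]]
        protein_ratio = np.median(ratios, axis=0)  # robust per-protein summary
        print(dict(zip(["114", "115", "116", "117"], protein_ratio.round(2))))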

  16. 78 FR 64202 - Quantitative Messaging Research

    Science.gov (United States)

    2013-10-28

    ... COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures Trading Commission. ACTION: Notice... comments using only one method and identify that it is for the "Quantitative Messaging Research." All... message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...

  17. Compact, common path quantitative phase microscopic techniques ...

    Indian Academy of Sciences (India)

    2014-01-05

    One of the ways to retrieve object height/thickness information is to employ quantitative phase microscopic (QPM) techniques. Interferometric QPM techniques are widely used for this. Digital holographic microscopy (DHM) is one of the state-of-the-art methods for quantitative three-dimensional (3D) imaging.

  18. Quantitative Resistance to Biotrophic Filamentous Plant Pathogens

    NARCIS (Netherlands)

    Niks, R.E.; Qi, Xiaoquan; Marcel, T.C.

    2015-01-01

    Quantitative resistance (QR) refers to a resistance that is phenotypically incomplete and is based on the joined effect of several genes, each contributing quantitatively to the level of plant defense. Often, QR remains durably effective, which is the primary driver behind the interest in it. The

  19. Quantitative genetics in the age of omics

    NARCIS (Netherlands)

    Keurentjes, J.J.B.; Koornneef, M.; Vreugdenhil, D.

    2008-01-01

    The use of natural variation in the genetic dissection of quantitative traits has a long-standing tradition. Recent advances in high-throughput technologies for the quantification of biological molecules have shifted the focus in quantitative genetics from single traits to comprehensive large-scale

  20. Applying Knowledge of Quantitative Design and Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…