WorldWideScience

Sample records for quantitative systems analysis

  1. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    …phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms to take into account the behavior of systems in the presence of approximate, or quantitative, information provided… by the environment in which they are embedded. This thesis studies the semantics and properties of a model-based framework for reactive systems, in which models and specifications are assumed to contain quantifiable information, such as references to time or energy. Our goal is to develop a theory of approximation… by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state of the art of systems verification…

  2. Quantitative Models and Analysis for Reactive Systems

    DEFF Research Database (Denmark)

    Thrane, Claus

    …phones and websites. Acknowledging that now more than ever, systems come in contact with the physical world, we need to revise the way we construct models and verification algorithms to take into account the behavior of systems in the presence of approximate, or quantitative, information provided…, allowing verification procedures to quantify judgements on how suitable a model is for a given specification — hence mitigating the usual harsh distinction between satisfactory and non-satisfactory system designs. This information, among other things, allows us to evaluate the robustness of our framework… by studying how small changes to our models affect the verification results. A key source of motivation for this work can be found in The Embedded Systems Design Challenge [HS06] posed by Thomas A. Henzinger and Joseph Sifakis. It contains a call for advances in the state of the art of systems verification…

  3. QUANTITATIVE METHODOLOGY FOR STABILITY ANALYSIS OF NONLINEAR ROTOR SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    ZHENG Hui-ping; XUE Yu-sheng; CHEN Yu-shu

    2005-01-01

    Rotor-bearing systems, widely applied in industry, are nonlinear dynamic systems with multiple degrees of freedom. Modern concepts of design and maintenance call for quantitative stability analysis. Using trajectory-based, stability-preserving dimensional reduction, a quantitative stability analysis method for rotor systems is presented. First, an n-dimensional nonlinear non-autonomous rotor system is decoupled into n subsystems after numerical integration. Each of them has only one degree of freedom and contains time-varying parameters to represent all other state variables. In this way, the n-dimensional trajectory is mapped into a set of one-dimensional trajectories. The dynamic central point (DCP) of a subsystem is then defined on the extended phase plane, namely, the force-position plane. Characteristics of curves on the extended phase plane and the DCP's kinetic-energy difference sequence for general motion in rotor systems are studied. The corresponding stability margins of the trajectory are evaluated quantitatively. By means of the margin and its sensitivity analysis, the critical parameters of the period-doubling bifurcation and the Hopf bifurcation in a flexible rotor supported by two short journal bearings with nonlinear suspension are determined.

  4. Segregation Analysis on Genetic System of Quantitative Traits in Plants

    Institute of Scientific and Technical Information of China (English)

    Gai Junyi

    2006-01-01

    Based on the traditional polygene inheritance model of quantitative traits, the author suggests a mixed major-gene and polygene inheritance model. This model is considered a general one, of which the pure major-gene and pure polygene inheritance models are specific cases. Based on the proposed theory, the author established a segregation analysis procedure to study the genetic system of quantitative traits in plants. At present, this procedure can be used to evaluate the genetic effect of individual major genes (up to two to three major genes), the collective genetic effect of polygenes, and their heritability values. This paper introduces how the procedure was established, its main achievements, and its applications. An example is given to illustrate the steps, methods, and effectiveness of the procedure.

  5. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  7. Quantitative analysis with the optoacoustic/ultrasound system OPUS

    Science.gov (United States)

    Haisch, Christoph; Zell, Karin; Sperl, Jonathan; Vogel, Mika W.; Niessner, Reinhard

    2009-02-01

    The OPUS (Optoacoustic plus Ultrasound) system combines a medical ultrasound scanner with a high-repetition-rate, wavelength-tunable laser system and a suitable triggering interface to synchronize the laser and the ultrasound system. The pulsed laser generates an optoacoustic (OA), or photoacoustic (PA), signal which is detected by the ultrasound system. Alternatively, imaging in conventional ultrasound mode can be performed, and both imaging modes can be superimposed. The laser light is coupled into the tissue laterally, parallel to the ultrasound transducer, which does not require any major modification to the transducer or the ultrasound beam forming. This was a basic requirement on the instrument, as the intention of the project was to establish the optoacoustic imaging modality as an add-on to a conventional standard ultrasound instrument. We believe that this approach may foster the introduction of OA imaging as a routine tool in medical diagnosis. Another key aspect of the project was to exploit the capabilities of OA imaging for quantitative analysis. The intention of the presented work is to summarize all the steps necessary to extract from the PA raw data the significant information required for the quantification of local absorber distributions. We show results of spatially resolved absorption measurements in scattering samples and a comparison of four different image reconstruction algorithms regarding their influence on lateral resolution as well as on the signal-to-noise ratio for different sample depths and absorption values. The reconstruction algorithms are based on Fourier transformation, on a generalized 2D Hough transformation, on circular back-projection, and on the classical delay-and-sum approach which is implemented in most ultrasound scanners. Furthermore, we discuss the influence of a newly developed laser source combining diode and flash-lamp pumping. Compared to all-flash-lamp pumped systems it features a significantly higher…
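
The classical delay-and-sum reconstruction named in this abstract can be sketched in a few lines: each image pixel accumulates, from every transducer element, the sample recorded at that pixel's one-way acoustic time of flight. This is a simplified illustration with our own function and parameter names, not the OPUS implementation:

```python
import math

def delay_and_sum(signals, element_x, pixels, c=1500.0, fs=40e6):
    """Reconstruct pixel amplitudes by summing, for each element, the
    sample at the photoacoustic (one-way) time of flight from the pixel.
    signals: per-element sampled waveforms; element_x: element x-positions
    on the z=0 transducer line; pixels: (x, z) image points in metres."""
    image = []
    for (px, pz) in pixels:
        acc = 0.0
        for sig, ex in zip(signals, element_x):
            t = math.hypot(px - ex, pz) / c   # time of flight at sound speed c
            i = int(round(t * fs))            # nearest sample index
            if 0 <= i < len(sig):
                acc += sig[i]
        image.append(acc)
    return image
```

A point absorber shows up as a coherent sum only at the pixel whose delays line up across all elements; elsewhere the contributions miss the recorded wavefront.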

  8. Quantitative Analysis of AGV System in FMS Cell Layout

    Directory of Open Access Journals (Sweden)

    B. Ramana

    1997-01-01

    Material handling is a specialised activity for a modern manufacturing concern. Automated guided vehicles (AGVs) are invariably used for material handling in flexible manufacturing systems (FMSs) due to their flexibility. Quantitative analysis of an AGV system is useful for determining the material flow rates, operation times, length of delivery, length of empty move of the AGV, and the number of AGVs required for a typical FMS cell layout. The efficiency of a material handling system such as an AGV can be improved by reducing the length of empty move, which depends upon dispatching and scheduling methods. If these methods are not properly planned, the length of empty move of the AGV is greater than the length of delivery. This results in increased material handling time, which in turn increases the number of AGVs required in the FMS cell. This paper presents a method for optimising the length of empty travel of an AGV in a typical FMS cell layout.

  9. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state transition matrix of the information system may be obtained. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the resulting occurrence probabilities of each feasible situation help the players quantitatively judge the probability of reaching the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of a system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model may be effectively applied to quantitative survivability analysis and has good application prospects in practice.
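
The Markov analysis step described in this abstract amounts to propagating a state distribution through the overall state transition matrix. A minimal sketch (the matrix values below are illustrative, not drawn from the paper):

```python
def step_distribution(p0, T, steps):
    """Evolve an initial state distribution p0 through a row-stochastic
    transition matrix T for a given number of steps; the result gives the
    occurrence probability of each feasible situation."""
    p = list(p0)
    for _ in range(steps):
        p = [sum(p[i] * T[i][j] for i in range(len(p)))
             for j in range(len(T[0]))]
    return p

# two-situation example: stay/leave probabilities per state
T = [[0.9, 0.1],
     [0.5, 0.5]]
```

Iterating until the distribution stops changing yields the long-run probabilities of each situation, which is what the players compare when judging which outcomes are likely.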

  10. A microcomputer-based system for quantitative petrographic analysis

    Science.gov (United States)

    Starkey, John; Samantaray, Abani Kanta

    1994-11-01

    An imaging system based on a video camera and frame grabber is described which is capable of capturing and analyzing composite images. Individual images are captured interactively; this permits manipulation of the illumination to emphasize selected features of interest in sequentially captured images. Data from the sequential images are accumulated to form a synoptic image, which allows analysis to proceed in a manner that emulates the techniques of manual, polarized-light microscopy. The effects of rotating a thin section in plane and crossed polarized light can be simulated, so that mineral boundaries can be detected across which there is a lack of contrast at some orientations. The imaging system implements algorithms for digital filtering and boundary identification and incorporates facilities for image editing. Mathematical functions are provided for the interpolation of boundaries which are not detected in their entirety, in a way analogous to visual interpretation. The image data are written to 256-color PCX image files which can be manipulated by other software or transmitted electronically. The locations of the boundaries of the features of interest are available as lists of (x, y) coordinates and as chain codes. From these, the size, shape, and spatial parameters are computed. In addition, the gray-level and segmented images are used to obtain texture information. The imaging system is illustrated by application to the analysis of grain boundaries, modal composition, and grain shapes in petrographic thin sections. The analytical results are compared with results obtained by traditional petrographic analyses.
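
Size and shape parameters of the kind mentioned here can be derived directly from the (x, y) boundary lists. A minimal sketch (shoelace area, perimeter, and a circularity index; the function name is ours, not the system's API):

```python
import math

def polygon_metrics(boundary):
    """Area (shoelace formula), perimeter, and circularity from an
    ordered list of (x, y) boundary coordinates, as produced by
    boundary tracing of a grain outline."""
    n = len(boundary)
    area2, perim = 0.0, 0.0
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]        # wrap to close the outline
        area2 += x1 * y2 - x2 * y1
        perim += math.hypot(x2 - x1, y2 - y1)
    area = abs(area2) / 2.0
    # circularity: 1.0 for a circle, smaller for elongated grains
    circ = 4.0 * math.pi * area / (perim * perim) if perim else 0.0
    return area, perim, circ
```

The same traversal works on coordinates decoded from a chain code, since a chain code is just a compact encoding of successive boundary steps.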

  11. Public Library System in Ankara: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bülent Yılmaz

    2014-12-01

    This study quantitatively investigates 42 public libraries in 25 central districts within the boundaries of the Ankara Metropolitan Municipality with respect to five factors drawn from national and international standards. The findings show that public libraries in Ankara are insufficient with respect to the number of buildings, users, staff, and collections, as well as in terms of the standards. Therefore, urgent planning for the public libraries of Ankara is suggested.

  12. Experimental validation and clinical comparison of quantitative coronary analysis systems

    NARCIS (Netherlands)

    J. Haase (Jürgen)

    1993-01-01

    The kernel topic of this thesis is the validation of QCA systems by a new experimental approach involving the percutaneous insertion of coronary stenosis phantoms into swine coronary arteries. The reliability of digital as well as cinefilm-based QCA systems has been compared on the basis…

  13. Quantitative Analysis of Strategic Voting in Anonymous Voting Systems

    Science.gov (United States)

    Wang, Tiance

    2016-01-01

    Democratically choosing a single preference from three or more candidate options is not a straightforward matter. There are many competing ideas on how to aggregate rankings of candidates. However, the Gibbard-Satterthwaite theorem implies that no fair voting system (equality among voters and equality among candidates) is immune to strategic…

  14. Quantitative investment analysis

    CERN Document Server

    DeFusco, Richard

    2007-01-01

    In the "Second Edition" of "Quantitative Investment Analysis," financial experts Richard DeFusco, Dennis McLeavey, Jerald Pinto, and David Runkle outline the tools and techniques needed to understand and apply quantitative methods to today's investment process.

  15. Quantitative analysis of transformed ray transferences of optical systems in a space of augmented Hamiltonian matrices*

    Directory of Open Access Journals (Sweden)

    W. F. Harris

    2007-01-01

    There is a need for methods for quantitative analysis of the first-order optical character of optical systems, including the eye and components of the eye. Because of their symplectic nature, ray transferences themselves are not closed under addition and multiplication by a scalar and, hence, are not amenable to conventional quantitative analysis such as the calculation of an arithmetic mean. However, transferences can be transformed into augmented Hamiltonian matrices which are amenable to such analysis. This paper provides a general methodology and, in particular, shows how to calculate means and variance-covariances representing the first-order optical character of optical systems. The systems may be astigmatic and may have decentred elements. An accompanying paper shows application to the cornea of the human eye with allowance for thickness.

  16. Quantitative analysis of saltwater-freshwater relationships in groundwater systems-A historical perspective

    Science.gov (United States)

    Reilly, T.E.; Goodman, A.S.

    1985-01-01

    Although much progress has been made toward the mathematical description of saltwater-freshwater relationships in groundwater systems since the late 19th century, the advective and dispersive mechanisms involved are still incompletely understood. This article documents the major historical advances in this subject and summarizes the major directions of current studies. From the time of Badon Ghyben and Herzberg, it has been recognized that density is important in mathematically describing saltwater-freshwater systems. Other mechanisms, such as hydrodynamic dispersion, were identified later and are still not fully understood. Quantitative analysis of a saltwater-freshwater system attempts to mathematically describe the physical system and the important mechanisms using reasonable simplifications and assumptions. This paper, in developing the history of quantitative analysis, discusses many of these simplifications and assumptions and their effect on describing and understanding the phenomenon. © 1985.
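
The density relationship credited to Badon Ghyben and Herzberg reduces, under sharp-interface assumptions, to the classic rule that the fresh/salt interface sits below sea level at roughly 40 times the freshwater head. A minimal sketch (typical density values assumed; the function name is ours):

```python
def ghyben_herzberg_depth(h_freshwater, rho_f=1000.0, rho_s=1025.0):
    """Depth of the fresh/salt interface below sea level under the
    sharp-interface Ghyben-Herzberg relation:
        z = rho_f / (rho_s - rho_f) * h
    where h is the freshwater head above sea level (same units as z)."""
    return rho_f / (rho_s - rho_f) * h_freshwater
```

With seawater at 1025 kg/m3 and fresh water at 1000 kg/m3, each metre of freshwater head implies about 40 m of fresh water below sea level, which is why small head changes matter so much in coastal aquifers.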

  17. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of it, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative-analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand-technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray; a robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative-analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice…

  18. A New 3-Dimensional Dynamic Quantitative Analysis System of Facial Motion: An Establishment and Reliability Test

    Science.gov (United States)

    Feng, Guodong; Zhao, Yang; Tian, Xu; Gao, Zhiqiang

    2014-01-01

    This study aimed to establish a 3-dimensional dynamic quantitative facial motion analysis system and then determine its accuracy and test-retest reliability. The system automatically reconstructs the motion of the observational points. A standardized T-shaped rod and L-shaped rods were used to evaluate the static and dynamic accuracy of the system. Nineteen healthy volunteers were recruited to test the reliability of the system. The average static distance-measurement error was 0.19 mm, and the average angular error was 0.29°. Accuracy decreased as the distance between the cameras and the objects increased, with 80 cm considered optimal. It took only 58 seconds to perform the full facial measurement process. The average intra-class correlation coefficients for distance and angular measurements were 0.973 and 0.794, respectively. The results demonstrated that we successfully established a practical 3-dimensional dynamic quantitative analysis system that is accurate and reliable enough to meet both clinical and research needs. PMID:25390881
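
The test-retest reliability reported here is an intra-class correlation coefficient. A one-way random-effects ICC(1,1) can be computed from repeated measurements as follows (a generic textbook sketch, not the authors' statistical pipeline):

```python
def icc_oneway(ratings):
    """ICC(1,1), one-way random effects. ratings is a list of subjects,
    each a list of k repeated measurements (e.g. test and retest)."""
    n = len(ratings)                 # number of subjects
    k = len(ratings[0])              # repeated measurements per subject
    grand = sum(sum(r) for r in ratings) / (n * k)
    # between-subjects and within-subject mean squares from one-way ANOVA
    ms_between = k * sum((sum(r) / k - grand) ** 2 for r in ratings) / (n - 1)
    ms_within = sum((x - sum(r) / k) ** 2
                    for r in ratings for x in r) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
```

Values near 1 mean repeated sessions agree almost perfectly, matching how the 0.973 and 0.794 figures above are interpreted.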

  19. Multivariate Quantitative Chemical Analysis

    Science.gov (United States)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.

  1. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    Science.gov (United States)

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature, and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with an efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. © 2013 Elsevier Ltd. All rights reserved.
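
The catchment efficiencies reported here act as runoff coefficients, so the runoff reaching a harvesting tank can be estimated as depth × area × efficiency. A simple sketch (the roof area and rainfall depth are assumed values; the 0.48 and 0.9 coefficients follow the abstract's figures):

```python
def runoff_volume(rain_mm, area_m2, efficiency):
    """Runoff volume in litres: rainfall depth (mm) x catchment area (m^2)
    x runoff coefficient; 1 mm over 1 m^2 equals 1 litre."""
    return rain_mm * area_m2 * efficiency

# compare a 100 m^2 extensive green roof with a concrete roof for a 20 mm event
green = runoff_volume(20.0, 100.0, 0.48)     # mid-range green-roof efficiency
concrete = runoff_volume(20.0, 100.0, 0.9)
```

The difference between the two volumes is the storm water the green roof retains, which is the quantity a tank-sizing algorithm would trade off against tank cost.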

  2. Quantitative Hydrocarbon Surface Analysis

    Science.gov (United States)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  3. An imaging system for standardized quantitative analysis of C. elegans behavior

    Directory of Open Access Journals (Sweden)

    Sternberg Paul W

    2004-08-01

    Background: The nematode Caenorhabditis elegans is widely used for the genetic analysis of neuronal cell biology, development, and behavior. Because traditional methods for evaluating behavioral phenotypes are qualitative and imprecise, there is a need for tools that allow quantitation and standardization of C. elegans behavioral assays. Results: Here we describe a tracking and imaging system for the automated analysis of C. elegans morphology and behavior. Using this system, it is possible to record the behavior of individual nematodes over long time periods and quantify 144 specific phenotypic parameters. Conclusions: These tools for phenotypic analysis will provide reliable, comprehensive scoring of a wide range of behavioral abnormalities and will make it possible to standardize assays such that behavioral data from different labs can readily be compared. In addition, this system will facilitate high-throughput collection of phenotypic data that can ultimately be used to generate a comprehensive database of C. elegans phenotypic information. Availability: The hardware configuration and software for the system are available from wschafer@ucsd.edu.

  4. Prediction of intracellular storage polymers using quantitative image analysis in enhanced biological phosphorus removal systems.

    Science.gov (United States)

    Mesquita, Daniela P; Leal, Cristiano; Cunha, Jorge R; Oehmen, Adrian; Amaral, A Luís; Reis, Maria A M; Ferreira, Eugénio C

    2013-04-03

    The present study focuses on predicting the concentration of intracellular storage polymers in enhanced biological phosphorus removal (EBPR) systems. For that purpose, quantitative image analysis techniques were developed for determining the intracellular concentrations of PHA (PHB and PHV) with Nile blue and glycogen with aniline blue staining. Partial least squares (PLS) were used to predict the standard analytical values of these polymers by the proposed methodology. Identification of the aerobic and anaerobic stages proved to be crucial for improving the assessment of PHA, PHB and PHV intracellular concentrations. Current Nile blue based methodology can be seen as a feasible starting point for further enhancement. Glycogen detection based on the developed aniline blue staining methodology combined with the image analysis data proved to be a promising technique, toward the elimination of the need for analytical off-line measurements. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Domestic Systemically Important Banks: A Quantitative Analysis for the Chinese Banking System

    Directory of Open Access Journals (Sweden)

    Yibing Chen

    2014-01-01

    This paper serves as a response to the official assessment approach proposed by the Basel Committee to identify domestic systemically important banks (D-SIBs) in China. Our analysis presents not only the current levels of domestic systemic importance of individual banks but also the changes. We also consider the systemic risk of the whole banking system by investigating, using copulas, how D-SIBs and non-D-SIBs were correlated before and after the recent financial crises. We find that the systemic importance of the major banks is decreasing, while some banks that are becoming more systemically important should require tighter regulation. D-SIBs as a whole subsystem display stronger correlation with non-D-SIBs than the individual D-SIBs do, which should alert regulators to "too-many-to-fail" problems. Contagion effects between D-SIBs and non-D-SIBs existed during the subprime crisis but not during the European debt crisis. This is a good signal of a more balanced banking system in China.

  6. A novel benzene quantitative analysis method using miniaturized metal ionization gas sensor and non-linear bistable dynamic system.

    Science.gov (United States)

    Tang, Xuxiang; Liu, Fuqi

    2015-01-01

    In this paper, a novel benzene quantitative analysis method utilizing a miniaturized metal-ionization gas sensor and a non-linear bistable dynamic system was investigated. An Al-plate anodic gas-ionization sensor was installed for electrical current-voltage data measurement, and the measurement data were analyzed by the non-linear bistable dynamic system. Results demonstrated that this method achieves quantitative determination of benzene concentration. The method is promising for laboratory safety management in benzene leak detection.

  7. Quantitative Analysis Method of Output Loss due to Restriction for Grid-connected PV Systems

    Science.gov (United States)

    Ueda, Yuzuru; Oozeki, Takashi; Kurokawa, Kosuke; Itou, Takamitsu; Kitamura, Kiyoyuki; Miyamoto, Yusuke; Yokota, Masaharu; Sugihara, Hiroyuki

    The voltage of a power distribution line will increase due to reverse power flow from grid-connected PV systems. In the case of high-density grid connection, the voltage increase will be higher than for a stand-alone grid-connected system. To prevent over-voltage of the power distribution line, a PV system's output will be restricted if the line voltage is close to the upper limit of the control range. Because of this interaction, the output loss will be larger in the high-density case. This research developed a quantitative analysis method for PV system output and losses to clarify the behavior of grid-connected PV systems. All the measured data are classified into loss factors using 1-minute averages of 1-second data instead of the typical 1-hour averages. The operating point on the I-V curve is estimated to quantify the loss due to output restriction, using module temperature, array output voltage, array output current, and solar irradiance. As a result, the loss due to output restriction is successfully quantified and the behavior of output restriction is clarified.
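
A loss-separation scheme of the kind described here can be sketched as follows. This is a simplified illustration with an assumed linear irradiance/temperature output model and invented parameter names, not the authors' operating-point estimation:

```python
def restriction_loss(records, p_rated, v_limit,
                     g_stc=1000.0, gamma=-0.004, t_stc=25.0):
    """Estimate energy lost to output restriction from per-minute records.
    records: (irradiance W/m^2, module temp C, grid volts, measured W).
    Expected output is scaled linearly from rated power by irradiance and
    a temperature coefficient gamma (per degree C); minutes where the grid
    voltage sits at the control limit and measured power falls short of
    the expectation are counted as restriction loss (in Wh)."""
    loss_wh = 0.0
    for g, t, v, p in records:
        p_expected = p_rated * (g / g_stc) * (1.0 + gamma * (t - t_stc))
        if v >= v_limit and p < p_expected:
            loss_wh += (p_expected - p) / 60.0   # one-minute sample -> Wh
    return loss_wh
```

Using 1-minute rather than 1-hour averages matters because restriction events last minutes; hourly averaging would smear the shortfall into other loss factors.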

  8. Systems analysis of quantitative shRNA-library screens identifies regulators of cell adhesion

    Directory of Open Access Journals (Sweden)

    Huang XiaoDong

    2008-06-01

    Background: High-throughput screens with RNA interference technology enable loss-of-function analyses of gene activities in mammalian cells. While the construction of genome-scale shRNA libraries has been successful, results of large-scale screening of those libraries can be difficult to analyze because of the relatively high noise levels and the fact that not all shRNAs in a library are equally effective in silencing gene expression. Results: We have screened a library consisting of 43,828 shRNAs directed against 8,500 human genes for functions that are necessary in cell detachment induced by a constitutively activated c-Abl tyrosine kinase. To deal with the issues of noise and uncertainty of knockdown efficiencies, we employed an analytical strategy that combines quantitative data analysis with biological knowledge, i.e. Gene Ontology and pathway information, to increase the power of the RNAi screening technique. Using this strategy we found 16 candidate genes to be involved in Abl-induced disruption of cell adhesion, and verified that the knockdown of IL6ST is associated with enhanced cell attachment. Conclusion: Our results suggest that the power of genome-wide quantitative shRNA screens can be significantly increased when analyzed using a systems-biology-based approach to identify functional gene networks.

  9. Monitoring intracellular polyphosphate accumulation in enhanced biological phosphorus removal systems by quantitative image analysis.

    Science.gov (United States)

    Mesquita, Daniela P; Amaral, A Luís; Leal, Cristiano; Carvalheira, Mónica; Cunha, Jorge R; Oehmen, Adrian; Reis, Maria A M; Ferreira, Eugénio C

    2014-01-01

    A rapid methodology for intracellular storage polyphosphate (poly-P) identification and monitoring in enhanced biological phosphorus removal (EBPR) systems is proposed based on quantitative image analysis (QIA). In EBPR systems, 4',6-diamidino-2-phenylindole (DAPI) is usually combined with fluorescence in situ hybridization to evaluate the microbial community. The proposed monitoring technique is based on a QIA procedure specifically developed for determining poly-P inclusions within a biomass suspension using solely DAPI by epifluorescence microscopy. Due to contradictory literature regarding DAPI concentrations used for poly-P detection, the present work assessed the optimal DAPI concentration for samples acquired at the end of the EBPR aerobic stage when the accumulation occurred. Digital images were then acquired and processed by means of image processing and analysis. A correlation was found between average poly-P intensity values and the analytical determination. The proposed methodology can be seen as a promising alternative procedure for quantifying intracellular poly-P accumulation in a faster and less labour-intensive way.

  10. Quantitative analysis of L-SPECT system for small animal brain imaging

    Science.gov (United States)

    Rahman, Tasneem; Tahtali, Murat; Pickering, Mark R.

    2016-03-01

    This paper investigates the performance of a newly proposed L-SPECT system for small animal brain imaging. The L-SPECT system consists of an array of 100 × 100 pinholes with micrometer-range diameters. The proposed detector module has a 48 mm × 48 mm active area, and the system is based on a pixelated array of NaI crystals (10 × 10 × 10 mm elements) coupled with an array of position-sensitive photomultiplier tubes (PSPMTs). The performance of this system was evaluated with pinhole radii of 50 μm, 60 μm and 100 μm. Monte Carlo simulation studies using the Geant4 Application for Tomographic Emission (GATE) software package validate the performance of this novel dual-head L-SPECT system, with a geometric mouse phantom used to investigate its performance. All SPECT data were obtained using 120 projection views from 0° to 360° in 3° steps. Slices were reconstructed using a conventional filtered back projection (FBP) algorithm. We evaluated image quality in terms of spatial resolution (FWHM) based on the line spread function, system sensitivity, and the point source response function. The sensitivity of the newly proposed L-SPECT system was about 4500 cps/μCi at 6 cm, with excellent FWHM using a 50 μm pinhole aperture at several radii of rotation. The analysis shows a combination of excellent spatial resolution and high detection efficiency over an energy range of 20-160 keV. The results demonstrate that SPECT imaging using a pixelated L-SPECT detector module is applicable to quantitative studies of mouse brain imaging.
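The FWHM figure of merit mentioned above is conventionally read off the line spread function. A small sketch (synthetic Gaussian LSF, not simulation output) of estimating it by interpolating the half-maximum crossings:

```python
# Illustrative sketch: FWHM of a sampled line spread function by linear
# interpolation of the half-maximum crossings (single-peaked profile assumed).
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]          # contiguous for a single peak
    i0, i1 = above[0], above[-1]
    # interpolate the crossings just outside the above-half region
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl

# Gaussian LSF with sigma = 1 mm -> analytic FWHM = 2*sqrt(2*ln 2) = 2.3548 mm
x = np.linspace(-5, 5, 1001)
y = np.exp(-x**2 / 2.0)
print(round(fwhm(x, y), 3))  # ≈ 2.355
```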

  11. A Creative Helicobacter pylori Diagnosis Scheme Based on Multiple Genetic Analysis System: Qualification and Quantitation.

    Science.gov (United States)

    Zhou, Lifang; Zhao, Fuju; Hu, Binjie; Fang, Yi; Miao, Yingxin; Huang, Yiqin; Ji, Da'nian; Zhang, Jinghao; Xu, Lingli; Zhang, Yanmei; Bao, Zhijun; Zhao, Hu

    2015-10-01

    Currently, several diagnostic assays for Helicobacter pylori (H. pylori) are available, but each has limitations, and a high-throughput quantitative assay is needed to assist clinical diagnosis and to monitor the effectiveness of therapy and of novel vaccine candidates. Three hundred and eighty-seven adult patients [nonulcer dyspepsia (NUD) 295, peptic ulcer disease (PUD) 77, gastric cancer (GC) 15] were enrolled for gastrointestinal endoscopies. Three biopsy samples from the gastric antrum were collected for the following tests: culture, rapid urease test (RUT), histopathology, conventional polymerase chain reaction (PCR), and the Multiple Genetic Analysis System (MGAS). The diagnostic capability of each method was evaluated using receiver operating characteristic (ROC) curves. Based on the gold standard, the sensitivity and specificity of MGAS were 92.9% and 92.4%, and the positive predictive value (PPV) and negative predictive value (NPV) were 96.0% and 87.1%, respectively. All of these parameters were higher for MGAS than for culture (except specificity), RUT, and histopathology, and close to those of conventional PCR. The area under the curve (AUC) was 0.7575 (culture), 0.8870 (RUT), 0.9000 (histopathology), 0.9496 (conventional PCR), and 0.9277 (MGAS). No statistically significant difference in H. pylori DNA load was observed between disease groups (p = .067). In contrast, statistically significant differences in H. pylori DNA copy number were observed by age (p = .043) and gender (p = .021). The data show that MGAS performs well in detecting H. pylori infection. Furthermore, the quantitative analysis showed that the H. pylori load differed significantly between age and gender groups. These results suggest that MGAS could be a potential alternative method for clinical detection and for monitoring the effectiveness of H. pylori therapy. © 2015 John Wiley & Sons Ltd.
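For reference, the reported accuracy figures follow from the standard 2×2 confusion-table definitions; a minimal sketch with invented counts (not the study's data):

```python
# Standard diagnostic-accuracy metrics from a 2x2 confusion table.
# tp/fp/fn/tn counts below are made up for illustration only.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = diagnostic_metrics(tp=92, fp=8, fn=7, tn=93)
for name, value in m.items():
    print(f"{name}: {value:.3f}")
```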

  12. The quantitative analysis of instantaneous floating-point amplifier in DFS-V seismic system

    Energy Technology Data Exchange (ETDEWEB)

    Xianiu, H.; Wenze, Y.; Yanyun, D.; Qi, F.

    1988-01-01

    On the basis of an analysis of the principle and circuitry of the instantaneous floating-point (IFP) amplifier in the DFS-V seismic system, the authors developed a set of mathematical models for quantitative analysis of the IFP amplifier and designed a program for computation and plotting. The amplitude characteristic curves were computed and plotted on a DUAL-6800 microcomputer using the original parameters of the DFS-V seismic system. The computation results show that the system can effectively suppress the overflow distortion caused by zero-passage dipping-top sampling, allowing high-fidelity transmission of seismic signals, but its measurement accuracy is on the low side. To overcome this shortcoming, the authors changed the original parameters of the IFP amplifier circuit and carried out extensive sweep computations, arriving at the following improved parameters: prediction time = 3 μs, upper reference levels = 6.8 V and 7.2 V, lower reference levels = 1.6 V and 1.7 V. With these parameters no overflow distortion occurs, and the IFP amplifier shows much higher measurement accuracy than before, which benefits the quality of the recorded seismic signals.

  13. A quantitative analysis of hydraulic interaction processes in stream-aquifer systems.

    Science.gov (United States)

    Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; Li, Junting; Duan, Lei; Wang, Zhoufeng; Zhu, Lin

    2016-01-28

    The hydraulic relationship between a stream and an aquifer can change from connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study quantitatively analyzes the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both theoretical analysis and laboratory tests demonstrate that the stream-aquifer system reaches a critical disconnection state when the horizontal hydraulic gradient at the free water surface equals zero and the vertical hydraulic gradient equals 1. A boundary-value problem for movement of the critical point of disconnection is established to obtain an analytical solution for the inverted water table movement beneath the stream. The result indicates that the maximum distance, or thickness, of the inverted water table equals the water depth in the stream, and that at a steady state of disconnection the maximum hydraulic gradient at the streambed center is 2. This study helps in understanding the hydraulics of water flow near streams and in accurately assessing surface water and groundwater resources.

  14. A novel strategy for quantitative analysis of the formulated complex system using chromatographic fingerprints combined with some chemometric techniques.

    Science.gov (United States)

    Zhong, Xuan; Yan, Jun; Li, Yan-Chun; Kong, Bo; Lu, Hong-Bing; Liang, Yi-Zeng

    2014-11-28

    In this work, a novel strategy based on chromatographic fingerprints and chemometric techniques is proposed for quantitative analysis of formulated complex systems. Here, a formulated complex system means a complicated analytical system containing more than one kind of raw material combined in certain concentration proportions according to a formula. The strategy is illustrated by the quantitative determination of mixtures consisting of three essential oils. The three key steps of the strategy are: (1) remove baselines from the chromatograms; (2) align retention times; (3) perform quantitative analysis using multivariate regression on the entire chromatographic profiles. By determining the concentration compositions of nine mixtures arranged by uniform design, the feasibility of the proposed strategy is validated, and the factors that influence the quantitative result are discussed. The strategy proves viable, and the validation indicates that the quantitative result depends mainly on the efficiency of the alignment method and on the chromatographic peak shapes. Previously, chromatographic fingerprints were used only for identification and/or recognition of products; this work demonstrates that, with the assistance of effective chemometric techniques, chromatographic fingerprints are also promising for solving quantitative problems in complex analytical systems.
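Step (3) can be sketched as ordinary least squares of the mixture chromatogram on the pure-component fingerprints. This is an illustrative toy (synthetic Gaussian peaks, not the paper's data or its exact regression method), assuming baselines are already removed and retention times aligned:

```python
# Toy sketch: recover mixture concentrations by regressing the whole
# chromatographic profile on pure-component fingerprints (synthetic data).
import numpy as np

t = np.linspace(0, 10, 500)

def peak(center, width=0.3):
    return np.exp(-((t - center) ** 2) / (2 * width ** 2))

# Pure-component "fingerprints" form the columns of the design matrix
pure = np.column_stack([peak(2.0) + 0.5 * peak(4.0),
                        peak(5.0),
                        0.8 * peak(7.5) + peak(8.5)])

true_conc = np.array([0.5, 0.3, 0.2])
mixture = pure @ true_conc + np.random.default_rng(0).normal(0, 0.001, t.size)

est, *_ = np.linalg.lstsq(pure, mixture, rcond=None)
print(np.round(est, 3))  # close to [0.5, 0.3, 0.2]
```

The quality of the fit degrades quickly if peaks are misaligned between fingerprint and mixture, which is why the abstract stresses the alignment step.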

  15. On-Orbit Quantitative Real-Time Gene Expression Analysis Using the Wetlab-2 System

    Science.gov (United States)

    Parra, Macarena; Jung, Jimmy; Almeida, Eduardo; Boone, Travis; Tran, Luan; Schonfeld, Julie

    2015-01-01

    NASA Ames Research Center's WetLab-2 project enables on-orbit quantitative reverse transcriptase PCR (qRT-PCR) analysis without the need for sample return. The WetLab-2 system can process sample types ranging from microbial cultures to animal tissues dissected on-orbit. The project developed an RNA preparation module that can lyse cells and extract RNA of sufficient quality and quantity for use as templates in qRT-PCR reactions. Our protocol has the advantage of using non-toxic chemicals and does not require alcohols or other organics. The resulting RNA is dispensed into reaction tubes that contain all lyophilized reagents needed to perform qRT-PCR reactions. System operations require only simple and limited crew actions: syringe pushes, valve turns, and pipette dispenses. The project selected the Cepheid SmartCycler™, a commercial off-the-shelf (COTS) qRT-PCR unit, for its rugged modular design, low power consumption, rapid thermal ramp times, and four-color multiplex detection. Single-tube multiplex assays can be used to normalize for RNA concentration and integrity, and to study multiple genes of interest in each module. The WetLab-2 system can downlink data from the ISS to the ground after a completed run and uplink new thermal cycling programs. The ability to conduct qRT-PCR and generate results on-orbit is an important step towards utilizing the ISS as a National Laboratory facility. Specifically, on-orbit data will give investigators the opportunity to adjust experimental parameters in real time, without the need for sample return and re-flight. On-orbit gene expression analysis can also eliminate the confounding effects of reentry stresses and shock acting on live cells and organisms, avoid concerns about RNA degradation in fixed samples, and provide on-orbit gene expression benchmarking prior to sample return. Finally, the system can also be used for analysis of

  16. Quantitative analysis of structural variations in corpus callosum in adults with multiple system atrophy (MSA)

    Science.gov (United States)

    Bhattacharya, Debanjali; Sinha, Neelam; Saini, Jitender

    2017-03-01

    Multiple system atrophy (MSA) is a rare, incurable, progressive neurodegenerative disorder affecting the nervous system and movement, and it poses a considerable diagnostic challenge. The corpus callosum (CC) is the largest white matter structure in the brain, enabling inter-hemispheric communication, so quantification of callosal atrophy may provide vital information at the earliest possible stages. The main objective is to identify disease-related differences in CC structure through quantitative analysis of the pattern of callosal atrophy. We report quantification of structural changes in regional anatomical thickness, area, and length of the CC in an MSA patient group relative to healthy controls. The method isolates the mid-sagittal CC and parcellates it into 100 segments along its length, measuring the width of each segment. It also measures areas within five geometrically defined callosal compartments of the well-known Witelson and Hofer-Frahm schemes. Statistical tests are performed on these callosal measurements. The analysis shows that, compared to healthy controls, width is reduced drastically throughout the CC in the MSA group, and the changes in area and length are also significant. The study is further extended to check whether thickness differs significantly between the two variants of MSA, Parkinsonian MSA and Cerebellar MSA, using the same methodology; however, no substantial difference in area or length is found between these two subgroups. The study was performed on twenty subjects in each of the control and MSA groups, all with T1-weighted MRI.
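A sketch of the per-segment comparison (synthetic width profiles with invented group means; a Welch t statistic per segment stands in for whatever tests the study actually applied):

```python
# Illustrative sketch: compare callosal width, segment by segment, between
# a control group and an MSA group (rows = subjects, columns = 100 segments).
import numpy as np

rng = np.random.default_rng(1)
n_seg = 100
controls = rng.normal(5.0, 0.4, size=(20, n_seg))   # widths in mm (invented)
msa = rng.normal(4.4, 0.4, size=(20, n_seg))        # uniformly thinner (invented)

def welch_t(a, b):
    """Per-segment Welch t statistic for two groups with unequal variances."""
    ma, mb = a.mean(0), b.mean(0)
    va, vb = a.var(0, ddof=1), b.var(0, ddof=1)
    return (ma - mb) / np.sqrt(va / a.shape[0] + vb / b.shape[0])

t_stats = welch_t(controls, msa)
print(f"segments with |t| > 2: {(np.abs(t_stats) > 2).sum()} / {n_seg}")
```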

  17. Submarine Pipeline Routing Risk Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐慧; 于莉; 胡云昌; 王金英

    2004-01-01

    A new method for quantitative risk analysis of submarine pipeline routing is presented, advancing the study from qualitative to quantitative analysis. The characteristics of the potential risks of the submarine pipeline system are considered, and grey-mode identification theory is used. The study comprises three parts: establishing the index system for quantitative routing risk analysis, establishing the grey-mode identification model for quantitative routing risk analysis, and establishing the standard for interpreting the identification results. A computed example shows that the model directly and concisely reflects the hazard level of a route, providing a basis for future route selection.

  18. Geothermal Power Plant Maintenance: Evaluating Maintenance System Needs Using Quantitative Kano Analysis

    Directory of Open Access Journals (Sweden)

    Reynir S. Atlason

    2014-07-01

    Full Text Available A quantitative Kano model is used in this study to identify which features top-level maintenance engineers within Icelandic geothermal power plants prefer to see implemented in a maintenance tool or software. Visits were conducted to the largest Icelandic energy companies operating geothermal power plants, and thorough interviews with chiefs of operations and maintenance served as the basis for a quantitative Kano analysis. Thirty-seven percent of all maintenance engineers at Reykjavik Energy and Landsvirkjun, responsible for 71.5% of the total energy production from geothermal resources in Iceland, answered the Kano questionnaire. Findings show that solutions focusing on (1) planning maintenance according to condition; (2) shortening documentation times; and (3) risk analysis are sought after by the energy companies but are not provided for the geothermal sector specifically.
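The abstract does not give the exact computation, but quantitative Kano analyses commonly summarize each feature with Berger-style "better/worse" coefficients; a sketch with invented response counts (feature names taken from the findings above, counts hypothetical):

```python
# Sketch of Berger-style Kano coefficients; all response counts are invented.
# A = attractive, O = one-dimensional, M = must-be, I = indifferent.

def kano_coefficients(A, O, M, I):
    total = A + O + M + I
    better = (A + O) / total       # satisfaction gain if the feature is present
    worse = -(O + M) / total       # dissatisfaction if the feature is absent
    return better, worse

features = {
    "condition-based planning": (9, 5, 2, 1),
    "faster documentation": (4, 8, 3, 2),
    "risk analysis": (6, 4, 5, 2),
}
for name, counts in features.items():
    b, w = kano_coefficients(*counts)
    print(f"{name}: better={b:.2f}, worse={w:.2f}")
```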

  19. Quantitative luminescence imaging system

    Science.gov (United States)

    Batishko, C. R.; Stahl, K. A.; Fecht, B. A.

    The goal of the Measurement of Chemiluminescence project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  20. The Isotropic Fractionator as a Tool for Quantitative Analysis in Central Nervous System Diseases.

    Science.gov (United States)

    Repetto, Ivan E; Monti, Riccardo; Tropiano, Marta; Tomasi, Simone; Arbini, Alessia; Andrade-Moraes, Carlos-Humberto; Lent, Roberto; Vercelli, Alessandro

    2016-01-01

    A major aim in quantitative and translational neuroscience is a precise, fast neuronal counting method that works at high throughput and gives reliable results. Here, we tested the isotropic fractionator (IF) method for evaluating neuronal and non-neuronal cell loss in different models of central nervous system (CNS) pathology. Sprague-Dawley rats underwent: (i) ischemic brain damage; (ii) intraperitoneal injection with kainic acid (KA) to induce epileptic seizures; and (iii) monolateral striatal injection with quinolinic acid (QA) mimicking human Huntington's disease. All specimens were processed by the IF method and cell loss was assessed. Hippocampi from KA-treated rats and striata from QA-treated rats were carefully dissected using a dissection microscope and a rat brain matrix. Ischemic rat brain slices were first processed for TTC staining and then for IF. In the ischemic group the cell loss corresponded to the neuronal loss, suggesting that hypoxia primarily affects neurons. Combining IF with TTC staining, we could correlate the lesion volume with the neuronal loss; by IF, we could show that neuronal loss also occurs contralateral to the ischemic side. In the epileptic group we observed a reduction of neurons in treated rats, and also evaluated the changes in the number of non-neuronal cells in response to the hippocampal damage. In the QA model, there was a robust reduction of neurons in the ipsilateral striatum. This neuronal loss was not reflected in a drastic change in the total number of cells, being offset by an increase in non-neuronal cells, suggesting that excitotoxic damage in the striatum strongly activates inflammation and glial proliferation. We concluded that the IF method could represent a simple and reliable quantitative technique to evaluate the effects of experimental lesions mimicking human diseases, and to assess the neuroprotective/anti-inflammatory effects of different treatments in the whole

  1. Quantitative analysis of active pharmaceutical ingredients (APIs) using a potentiometric electronic tongue in a SIA flow system

    OpenAIRE

    2016-01-01

    An advanced potentiometric electronic tongue and Sequential Injection Analysis (SIA) measurement system was applied for the quantitative analysis of mixtures containing three active pharmaceutical ingredients (APIs): acetaminophen, ascorbic acid and acetylsalicylic acid, in the presence of various amounts of caffeine as interferent. The flow-through sensor array was composed of miniaturized classical ion-selective electrodes based on plasticized PVC membranes containing only ion exchangers. P...

  2. Improved Protein Arrays for Quantitative Systems Analysis of the Dynamics of Signaling Pathway Interactions

    Energy Technology Data Exchange (ETDEWEB)

    YANG, CHIN-RANG [NHLBI, NIH

    2013-12-11

    Astronauts and workers in nuclear plants who are repeatedly exposed to low doses of ionizing radiation (IR, <10 cGy) are likely to incur specific changes in signal transduction and gene expression in various tissues. Remarkable advances in high-throughput genomics and proteomics technologies enable researchers to broaden their focus from single gene/protein kinetics to global gene/protein expression profiling and biological pathway analysis, namely systems biology. An ultimate goal of systems biology is to develop dynamic mathematical models of interacting biological systems capable of simulating living systems in a computer. This Glue Grant complements Dr. Boothman's existing DOE grant (No. DE-FG02-06ER64186), entitled "The IGF1/IGF-1R-MAPK-Secretory Clusterin (sCLU) Pathway: Mediator of a Low Dose IR-Inducible Bystander Effect", by developing sensitive and quantitative proteomic technology suitable for low-dose radiobiology research. An improved version of a quantitative protein array platform utilizing linear quantum dot signaling for systematically measuring protein levels and phosphorylation states for systems biology modeling is presented. The signals are amplified by a confocal laser quantum dot scanner, yielding roughly 1000-fold greater sensitivity than traditional Western blots and good linearity that HRP-amplified signals cannot achieve. This improved protein array technology is therefore suitable for detecting the weak responses elicited by low-dose radiation. Software was developed to facilitate quantitative readout of signaling network activities. The kinetics of EGFRvIII mutant signaling was analyzed to quantify cross-talk between EGFR and other signaling pathways.

  3. A pre-sample charge measurement system for quantitative NMP-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kristiansson, P., E-mail: Per.Kristiansson@nuclear.lu.s [Division of Nuclear Physics, Department of Physics, Lund University, Box 118, SE-221 00 Lund (Sweden); Borysiuk, M.; Arteaga-Marrero, N.; Elfman, M.; Nilsson, E.J.C.; Nilsson, C.; Pallon, J. [Division of Nuclear Physics, Department of Physics, Lund University, Box 118, SE-221 00 Lund (Sweden)

    2010-06-15

    In many IBA applications the main aim is to obtain quantitative figures characterizing the sample. Normally the charge, i.e. the number of probe particles, is used for normalization and is measured either by collecting the charge deposited in the sample, by collecting the particles in a post-sample Faraday cup, or by a combination of both. Both techniques have drawbacks, and results can be difficult to compare for samples with different matrix compositions. In this work, we present an upgraded design and test results for the Lund NMP pre-sample charge measurement system. The system is based on pre-sample beam deflection controlled by the beam scanning system of the nuclear microprobe. It can be operated in different modes, but during normal operation the beam is blanked once per pixel and the corresponding charge is collected during the beam-off period. The system does not merely measure an average of the beam current during data collection; pixel-by-pixel normalization is possible. Data on the system performance are presented, together with illustrations of how quantitative measurements for both PIXE and elastic scattering can be made more reliable.
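The pixel-by-pixel normalization idea can be sketched in one line: divide the per-pixel X-ray yield by the charge collected for that pixel instead of by a single run-averaged charge (all numbers below are invented):

```python
# Sketch of pixel-by-pixel charge normalization for a scanned microprobe map:
# each pixel's X-ray counts are divided by the charge measured for that pixel
# during the beam-off blanking period (invented 2x2 example).
import numpy as np

counts = np.array([[120.0, 80.0],
                   [60.0, 40.0]])        # X-ray counts per scan pixel
charge_pC = np.array([[1.2, 0.8],
                      [0.6, 0.4]])       # charge collected per pixel (pC)

normalized = counts / charge_pC          # counts per pC, beam-drift corrected
print(normalized)
```

With a single run-averaged charge, beam-current drift during the scan would bias some pixels; the per-pixel division removes that bias by construction.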

  4. Quantitative analysis of Hedgehog gradient formation using an inducible expression system

    Directory of Open Access Journals (Sweden)

    Brodsky Michael

    2007-05-01

    Full Text Available Abstract Background The Hedgehog (Hh) family of secreted growth factors are morphogens that act in development to direct growth and patterning. Mutations in human Hh and other Hh pathway components have been linked to human diseases. Analysis of Hh distribution during development indicates that cholesterol modification and receptor-mediated endocytosis affect the range of Hh signaling and the cellular localization of Hh. Results We have used an inducible, cell type-specific expression system to characterize the three-dimensional distribution of newly synthesized, GFP-tagged Hh in the developing Drosophila wing. Following induction of Hh-GFP expression in posterior producing cells, punctate structures containing Hh-GFP were observed in the anterior target cells. The distance of these particles from the expressing cells was quantified to determine the shape of the Hh gradient at different time points following induction. The majority of cholesterol-modified Hh-GFP was found associated with cells near the anterior/posterior (A/P) boundary, which express high levels of Hh target genes. Without cholesterol, the Hh gradient was flatter, with a lower percentage of particles near the source and a greater maximum distance. Inhibition of Dynamin-dependent endocytosis blocked formation of intracellular Hh particles, but did not prevent movement of newly synthesized Hh to the apical or basolateral surfaces of target cells. In the absence of both cholesterol and endocytosis, Hh particles accumulated in the extracellular space. Staining for the Hh receptor Ptc revealed four categories of Hh particles: cytoplasmic with and without Ptc, and cell surface with and without Ptc. Interestingly, mainly cholesterol-modified Hh is detected in the cytoplasmic particles lacking Ptc. Conclusion We have developed a system to quantitatively analyze Hh distribution during gradient formation. We directly demonstrate that inhibition of Dynamin-dependent endocytosis is not

  5. Interdiffusion of the aluminum magnesium system. Quantitative analysis and numerical model; Interdiffusion des Aluminium-Magnesium-Systems. Quantitative Analyse und numerische Modellierung

    Energy Technology Data Exchange (ETDEWEB)

    Seperant, Florian

    2012-03-21

    Aluminum coatings are a promising approach to protect magnesium alloys against corrosion and thereby making them accessible to a variety of technical applications. Thermal treatment enhances the adhesion of the aluminium coating on magnesium by interdiffusion. For a deeper understanding of the diffusion process at the interface, a quantitative description of the Al-Mg system is necessary. On the basis of diffusion experiments with infinite reservoirs of aluminum and magnesium, the interdiffusion coefficients of the intermetallic phases of the Al-Mg-system are calculated with the Sauer-Freise method for the first time. To solve contradictions in the literature concerning the intrinsic diffusion coefficients, the possibility of a bifurcation of the Kirkendall plane is considered. Furthermore, a physico-chemical description of interdiffusion is provided to interpret the observed phase transitions. The developed numerical model is based on a temporally varied discretization of the space coordinate. It exhibits excellent quantitative agreement with the experimentally measured concentration profile. This confirms the validity of the obtained diffusion coefficients. Moreover, the Kirkendall shift in the Al-Mg system is simulated for the first time. Systems with thin aluminum coatings on magnesium also exhibit a good correlation between simulated and experimental concentration profiles. Thus, the diffusion coefficients are also valid for Al-coated systems. Hence, it is possible to derive parameters for a thermal treatment by simulation, resulting in an optimized modification of the magnesium surface for technical applications.
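As a generic illustration of such a numerical model (not the thesis's scheme; the grid, time step, and the form of D(c) are arbitrary), Fick's second law with a composition-dependent interdiffusion coefficient can be integrated explicitly in conservation form, with fixed end concentrations playing the role of the infinite Al and Mg reservoirs:

```python
# Generic sketch: explicit finite-difference solution of Fick's second law
# for a diffusion couple with composition-dependent D(c). All values arbitrary.
import numpy as np

nx, dx, dt, steps = 200, 1.0, 0.1, 5000
c = np.zeros(nx)
c[:nx // 2] = 1.0                       # Al-rich left half, Mg-rich right half

def D(c):
    return 0.5 + 1.5 * c * (1 - c)      # peaks at mid-composition (arbitrary)

for _ in range(steps):
    # flux at cell interfaces, with D evaluated at the midpoint composition
    flux = -D(0.5 * (c[1:] + c[:-1])) * np.diff(c) / dx
    # conservation form: dc/dt = -d(flux)/dx; endpoints stay fixed (reservoirs)
    c[1:-1] -= dt * np.diff(flux) / dx

print(round(c[nx // 2], 3))             # composition just right of the interface
```

The explicit scheme is stable here because dt·max(D)/dx² ≈ 0.09 < 0.5; a production model (like the thesis's temporally varied discretization) would adapt the step instead of fixing it.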

  6. Quantitative analysis of vocal fold vibration during register change by high-speed digital imaging system

    Science.gov (United States)

    Kumada, Masanobu; Kobayashi, Noriko; Hirose, Hajime; Tayama, Niro; Imagawa, Hiroshi; Sakakibara, Ken-Ichi; Nito, Takaharu; Kakurai, Shin'ichi; Kumada, Chieko; Wada, Mamiko; Niimi, Seiji

    2002-05-01

    The physiological study of prosody is indispensable, not only for its physiological interest but also for the evaluation and treatment of pathological cases. In free conversation, changes of vocal fold vibration occur frequently, and these phenomena are important prosodic events. To analyze quantitatively the vocal fold vibration at register changes, as a model prosodic event, our high-speed digital imaging system was used at a rate of 4500 images of 256 × 256 pixels per second. Four healthy Japanese adults (2 males and 2 females) served as subjects. The tasks were sustained phonations containing register changes. Two major categories (A and B) were found in the way vocal fold vibration changes at the register change. In Category A, the change in vocal fold vibration was very smooth. In Category B, the change was less smooth, with additional events at the register change such as an anterior-posterior phase difference in the vibration, abduction of the vocal folds, or interruption of phonation. The number of subtypes of Category B would be expected to increase if more subjects with a wider range of variety were analyzed. For the study of prosody, our high-speed digital imaging system is a powerful tool for obtaining physiological information.

  7. Quantitative Synthesis and Component Analysis of Single-Participant Studies on the Picture Exchange Communication System

    Science.gov (United States)

    Tincani, Matt; Devis, Kathryn

    2011-01-01

    The "Picture Exchange Communication System" (PECS) has emerged as the augmentative communication intervention of choice for individuals with autism spectrum disorder (ASD), with a supporting body of single-participant studies. This report describes a meta-analysis of 16 single-participant studies on PECS with percentage of nonoverlapping data…

  8. Study on Sintering System of Calcium Barium Sulphoaluminate by XRD Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Jun Chang

    2015-11-01

    Full Text Available Calcium barium sulphoaluminate (CBSA), derived from calcium sulphoaluminate (CSA), has excellent cementitious properties. In this study, the sintering system of CBSA with the theoretical stoichiometry Ca3BaAl6SO16 was investigated. Rietveld refinement was performed using TOPAS 4.2 software to quantitatively calculate the content of CBSA and the actual ionic site occupancy of Ba2+. The results indicate that the content of Ca4−xBaxAl6SO16 increases with increasing sintering temperature in the 1200–1400 °C range. When sintered at 1400 °C for 180 min, the CBSA content reaches 88.4%. However, CBSA begins to decompose at 1440 °C, after which the content decreases. The replacement rate of Ba2+ also increased with higher sintering temperature and prolonged sintering time. Sintering at 1400 °C for 180 min is considered optimal when both the replacement rate of Ba2+ and the CBSA content are taken into account; under these conditions Ca3.2Ba0.8Al6SO16 with a content of 88.4% was synthesized.

  9. Quantitative Risk Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Helms, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
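    A consistent cross-hazard risk metric of the kind this abstract calls for is commonly expressed as risk = likelihood × consequence, so that dissimilar hazards can be ranked on one scale. A hedged sketch; the hazard names and numbers below are invented, not from the report:

    ```python
    # Toy all-hazards ranking for one asset: risk = annual likelihood x consequence.
    # All values are illustrative assumptions, not data from the LLNL report.
    hazards = {
        "flood":        {"likelihood": 0.02, "consequence": 50e6},   # $ loss if it occurs
        "cyber attack": {"likelihood": 0.10, "consequence": 20e6},
        "ice storm":    {"likelihood": 0.05, "consequence": 10e6},
    }

    ranked = sorted(hazards.items(),
                    key=lambda kv: kv[1]["likelihood"] * kv[1]["consequence"],
                    reverse=True)
    for name, h in ranked:
        print(f"{name}: expected annual loss = ${h['likelihood'] * h['consequence']:,.0f}")
    ```

    Ranking by expected annual loss is only one choice; a real methodology would also weigh uncertainty and the adaptive behaviour of an intelligent adversary.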

  10. Quantitative proteomic analysis of Burkholderia pseudomallei Bsa type III secretion system effectors using hypersecreting mutants.

    Science.gov (United States)

    Vander Broek, Charles W; Chalmers, Kevin J; Stevens, Mark P; Stevens, Joanne M

    2015-04-01

    Burkholderia pseudomallei is an intracellular pathogen and the causative agent of melioidosis, a severe disease of humans and animals. One of the virulence factors critical for early stages of infection is the Burkholderia secretion apparatus (Bsa) Type 3 Secretion System (T3SS), a molecular syringe that injects bacterial proteins, called effectors, into eukaryotic cells where they subvert cellular functions to the benefit of the bacteria. Although the Bsa T3SS itself is known to be important for invasion, intracellular replication, and virulence, only a few genuine effector proteins have been identified and the complete repertoire of proteins secreted by the system has not yet been fully characterized. We constructed a mutant lacking bsaP, a homolog of the T3SS "gatekeeper" family of proteins that exert control over the timing and magnitude of effector protein secretion. Mutants lacking BsaP, or the T3SS translocon protein BipD, were observed to hypersecrete the known Bsa effector protein BopE, providing evidence of their role in post-translational control of the Bsa T3SS and representing key reagents for the identification of its secreted substrates. Isobaric Tags for Relative and Absolute Quantification (iTRAQ), a gel-free quantitative proteomics technique, was used to compare the secreted protein profiles of the Bsa T3SS hypersecreting mutants of B. pseudomallei with the isogenic parent strain and a bsaZ mutant incapable of effector protein secretion. Our study provides one of the most comprehensive core secretomes of B. pseudomallei described to date and identified 26 putative Bsa-dependent secreted proteins that may be considered candidate effectors. Two of these proteins, BprD and BapA, were validated as novel effector proteins secreted by the Bsa T3SS of B. pseudomallei.

  11. Quantitative analysis of a transpressional system, El Biod Arch, Ghadames Basin, Algeria

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.R.; Krantz, R.W. (ARCO International Oil and Gas Co., Plano, TX (United States)); Akkache, K.; Messaoudi, M.

    1996-01-01

    Trap definition within the northern extension of the Hassi Touareg - Rhourde El Baguel fault zone in the western Ghadames Basin of Algeria is difficult due to complex structural geometries. The fault zone consists of a narrow system of discontinuous, locally en echelon faults. The zone trends north in the south and curves to a northeast trend in the north. Reserves associated with the southern portion of the system total 1500 MMBOR and 2 TCFG. Several lines of evidence support a strike-slip component of motion for the northern segment. Horizontal slickensides have been described in cores taken from wells within the fault trend. Fracture patterns measured from logs within the NE-SW fault trend show the clusters expected for right-lateral Riedel shears. Although complicated by an evaporite sequence at mid-level in the stratigraphic section, we interpret downward-converging faults imaged on recent 2D seismic as positive flower profiles. Map patterns are also interpreted as right-lateral, recognizing that the 2D grid cannot resolve all of the structural complexity. To confirm the component of strike-slip fault displacement, we applied a new quantitative method relating map-view structural orientations to the shear magnitude, the degree of convergence or divergence, and the magnitudes of horizontal and vertical strains. Strike-slip to convergence ratios ranging from 2:1 to 3:1 were measured in the study area. Higher ratios (10:1) measured above the salt may indicate a detachment. These ratios also fit the regional tectonic pattern: to the south, where the fault zone trends due north, structural geometries support dip-slip inversion indicative of east-west shortening. Applying the same shortening vector to the northeast-trending part of the zone suggests oblique right-lateral motion, with a strike-slip to convergence ratio of 2:1.

  13. Quantitative Analysis of Pyrolysis Effluents in an Open and Closed System

    Directory of Open Access Journals (Sweden)

    Behar F.

    2006-11-01

    The first part of this article describes an open pyrolysis system used to characterize complex organic matter such as kerogen, coal, rock and oil asphaltenes, and humic and fulvic substances. Pyrolysis effluents are recovered, fractionated quantitatively by liquid chromatography, and then analyzed by specific techniques such as gas chromatography and coupled gas chromatography/mass spectrometry. The second part describes a pyrolysis technique in a closed system, used for laboratory simulation of the thermal evolution of kerogens, asphaltenes or oils. A special effort has been made to establish mass and hydrogen balances for all pyrolysis products. For this, five classes of increasing molecular weight have been distinguished: C1, C2-C5, C6-C13, C14+ and coke. The quantitative recovery and separation of each of the five fractions allows a detailed molecular analysis of each of them.

  14. Quantitative analysis of the brain-targeted delivery of drugs and model compounds using nano-delivery systems.

    Science.gov (United States)

    Kozlovskaya, Luba; Stepensky, David

    2013-10-10

    The blood-brain barrier (BBB) limits drug permeability into the brain and thus the management of brain diseases. Specialized drug delivery systems (DDSs) are required to overcome this barrier and to achieve efficient delivery of therapeutic agents to the brain. For this purpose, drug-encapsulating nanoparticles or vesicles, drug conjugates and other types of DDSs are being developed by many research groups worldwide. However, the efficiency of brain drug/DDS delivery and targeting is usually reported in indirect and vague form, and it is hard to estimate quantitatively from the reported data. We searched for scientific papers published in 1970-2012 that reported delivery of drugs or model compounds to the brain following systemic administration of DDSs via parenteral routes and that contained quantitative data on brain drug/DDS delivery and targeting efficiency. We identified 123 publications that matched the search criteria and analyzed their experimental settings, formulation types, analytical methods, and the claimed efficiencies of drug/DDS brain targeting (brain/plasma or brain/tissue concentration ratios) and brain accumulation (% of the administered dose that accumulated in the brain). Based on the outcomes of this analysis, we describe the major research trends, discuss the efficiencies of the different drug/DDS brain targeting approaches, and provide recommendations for quantitative assessment of brain-targeting DDSs in appropriately designed studies.

  15. Salivary gland branching morphogenesis: a quantitative systems analysis of the Eda/Edar/NFκB paradigm

    Directory of Open Access Journals (Sweden)

    Melnick Michael

    2009-06-01

    Background: Ectodysplasin-A appears to be a critical component of branching morphogenesis. Mutations in mouse Eda or human EDA are associated with absent or hypoplastic sweat glands, sebaceous glands, lacrimal glands, salivary glands (SMGs), mammary glands and/or nipples, and mucous glands of the bronchial, esophageal and colonic mucosa. In this study, we utilized EdaTa (Tabby) mutant mice to investigate how a marked reduction in functional Eda propagates with time through a defined genetic subcircuit, and to test the proposition that canonical NFκB signaling is sufficient to account for the differential expression of developmentally regulated genes in the context of Eda polymorphism. Results: The quantitative systems analyses do not support the stated hypothesis. For most NFκB-regulated genes, the observed time course of gene expression is nearly unchanged in Tabby (EdaTa) as compared to wildtype mice, as is NFκB itself. Importantly, a subset of genes (Edar, Fgf8, Shh, Egf, Tgfa, Egfr) is dramatically differentially expressed in Tabby, strongly suggesting the existence of an alternative Eda-mediated transcriptional pathway pivotal for SMG ontogeny. Experimental and in silico investigations have identified C/EBPα as a promising candidate. Conclusion: In Tabby SMGs, upregulation of the Egf/Tgfα/Egfr pathway appears to mitigate the potentially severe abnormal phenotype predicted by the downregulation of Fgf8 and Shh. Others have suggested that the buffering of the phenotypic outcome that is coincident with variant Eda signaling could be a common mechanism permitting viable and diverse phenotypes, normal and abnormal. Our results support this proposition. Further, if branching epithelia use variations of a canonical developmental program, our results are likely applicable to understanding the phenotypes of other branching organs affected by Eda (EDA) mutation.

  16. Quantitative Caffeine Analysis Using a Surface Sampling Probe Electrospray Ionization Tandem Mass Spectrometry System

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Michael J [ORNL; Deibel, Michael A. [Earlham College; Tomkins, Bruce A [ORNL; Van Berkel, Gary J [ORNL

    2005-01-01

    Quantitative determination of caffeine on reversed-phase C8 thin-layer chromatography plates using a surface sampling electrospray ionization system with tandem mass spectrometry detection is reported. The thin-layer chromatography/electrospray tandem mass spectrometry method employed a deuterium-labeled caffeine internal standard and selected reaction monitoring detection. Up to nine parallel caffeine bands on a single plate were sampled in a single surface scanning experiment requiring 35 min at a surface scan rate of 44 µm/s. A reversed-phase HPLC/UV caffeine assay was developed in parallel to assess the mass spectrometry method's performance. Limits of detection for the HPLC/UV and thin-layer chromatography/electrospray tandem mass spectrometry methods, determined from the calibration curve statistics, were 0.20 ng injected (0.50 µL) and 1.0 ng spotted on the plate, respectively. Spike recoveries with standards and real samples ranged between 97 and 106% for both methods. The caffeine content of three diet soft drinks (Diet Coke, Diet Cherry Coke, Diet Pepsi) and three diet sport drinks (Diet Turbo Tea, Speed Stack Grape, Speed Stack Fruit Punch) was measured. The HPLC/UV and mass spectrometry determinations were in general agreement, and these values were consistent with the quoted values for two of the three diet colas. In the case of Diet Cherry Coke and the diet sports drinks, the determined caffeine amounts using both methods were consistently higher (by 8% or more) than the literature values.
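    Internal-standard quantitation of the kind described, a labeled caffeine internal standard with SRM detection, reduces to fitting a calibration line of the analyte/IS response ratio against the amount spotted and inverting it for unknowns. A minimal sketch; the calibration points and sample ratio below are invented, not the paper's data:

    ```python
    # Least-squares calibration: SRM peak-area ratio (caffeine / d-labeled IS)
    # versus ng spotted. All numbers are illustrative only.
    amounts = [1.0, 2.0, 5.0, 10.0, 20.0]       # ng caffeine spotted
    ratios  = [0.11, 0.21, 0.52, 1.01, 2.02]    # analyte/IS response ratio

    n = len(amounts)
    mx = sum(amounts) / n
    my = sum(ratios) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(amounts, ratios)) / \
            sum((x - mx) ** 2 for x in amounts)
    intercept = my - slope * mx

    unknown_ratio = 0.75                         # measured ratio for a sample band
    amount = (unknown_ratio - intercept) / slope
    print(f"estimated caffeine: {amount:.2f} ng")
    ```

    Because the internal standard experiences the same sampling and ionization losses as the analyte, the ratio cancels much of the run-to-run variability that raw peak areas would carry.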

  17. Qualitative insight and quantitative analysis of the effect of temperature on the coercivity of a magnetic system

    Directory of Open Access Journals (Sweden)

    Mariia Moskalenko

    2016-02-01

    The temperature dependence of the response of a magnetic system to an applied field can be understood qualitatively by considering variations in the energy surface characterizing the system, and estimated quantitatively with rate theory. In the system analysed here, an Fe/Sm-Co spring magnet, the width of the hysteresis loop is halved when the temperature is raised from 25 K to 300 K. This narrowing can be explained and reproduced quantitatively without invoking temperature dependence of the model parameters, as has typically been done in previous data analysis. The applied magnetic field lowers the energy barrier for reorientation of the magnetization, and thermal activation brings the system over the barrier. A 2-dimensional representation of the energy surface is developed and used to gain insight into the transition mechanism and to demonstrate how the applied field alters the transition path. Our results show the importance of explicitly including the effect of thermal activation when interpreting experiments involving the manipulation of magnetic systems at finite temperature.
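    The rate-theory picture invoked here, thermal activation over a field-lowered barrier, can be sketched with a Néel-Arrhenius estimate. All parameter values below are illustrative assumptions, not the paper's:

    ```python
    import math

    # Neel-Arrhenius escape rate over an energy barrier E at temperature T:
    # rate = f0 * exp(-E / (kB * T)). Parameters are assumed, not fitted.
    kB = 1.380649e-23        # Boltzmann constant, J/K
    f0 = 1e9                 # attempt frequency, 1/s (assumed)
    E0 = 1.0e-19             # barrier after field lowering, J (assumed)

    def escape_rate(E, T):
        return f0 * math.exp(-E / (kB * T))

    # The same barrier is crossed vastly more often at room temperature
    # than at 25 K, which is why the loop narrows on warming.
    for T in (25.0, 300.0):
        print(f"T = {T:5.1f} K  rate = {escape_rate(E0, T):.3e} 1/s")
    ```

    In a full treatment the barrier E itself depends on the applied field, so the coercive field is where the escape rate matches the measurement timescale.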

  18. Monotowns: A Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Shastitko Andrei

    2016-06-01

    The authors propose an empirical analysis of the current situation in monotowns. The study questions the perceived seriousness of the ‘monotown problem’ as well as the actual challenges it presents. The authors use a cluster analysis to divide monotowns into groups for further structural comparison. The structural differences in the available databases limit the possibilities of empirical analysis; hence, alternative approaches are required. The authors consider possible reasons for the limitations identified. Special attention is paid to the monotowns that were granted the status of advanced development territories. A comparative analysis makes it possible to study their general characteristics and socioeconomic indicators. The authors apply the theory of opportunistic behaviour to describe potential problems caused by the lack of unified criteria for granting monotowns the status of advanced development territories. The article identifies the main stakeholders and the character of their interaction; it describes a conceptual model built on principal/agent interactions and identifies the parametric space of mutually beneficial cooperation. The solution to the principal/agent problem suggested in the article contributes to the development of an alternative approach to the current situation and a rational approach to overcoming the ‘monotown problem’.

  19. Quantitative analysis of illusory movement : spatial filtering and line localization in the human visual system

    NARCIS (Netherlands)

    Jansonius, Nomdo M.; Stam, Lucas; de Jong, Tim; Pijpker, Ben A.

    2014-01-01

    A narrow bar or line (width around 1 arcmin) between two fields whose luminances are sinusoidally modulated in time, in counterphase, appears to make an oscillatory movement. It is possible to annihilate this illusory movement with a real movement and thus to analyze this phenomenon quantitatively.

  20. Extended statistical entropy analysis as a quantitative management tool for water resource systems

    Science.gov (United States)

    Sobantka, Alicja; Rechberger, Helmut

    2010-05-01

    Entropy has been applied to a variety of problems in hydrology and water resources. As water resource systems are inherently spatial and complex, a stochastic description of these systems is needed, and entropy theory enables the development of such a description by providing the least-biased probability distributions under limited knowledge and data. Entropy can also serve as a basis for risk and reliability analysis. Relative entropy has been variously interpreted as a measure of freedom of choice, uncertainty and disorder, information content, missing information, or information gain or loss. In the analysis of empirical data, entropy is another measure of dispersion, an alternative to the variance. As an evaluation tool, statistical entropy analysis (SEA) has been developed by previous workers to quantify the power of a process to concentrate chemical elements. Within this research programme, SEA is to be extended for application to chemical compounds and tested for its deficits and potentials in systems where water resources play an important role. The extended SEA (eSEA) will be developed first for the nitrogen balance in waste water treatment plants (WWTPs). Later applications to the emission of substances to water bodies such as groundwater (e.g. leachate from landfills) will also be possible. By applying eSEA to the nitrogen balance in a WWTP, all nitrogen compounds that may occur during the water treatment process are taken into account and quantified in their impact on the environment and human health. It has been shown that entropy-reducing processes are part of modern waste management; generally, materials management should be performed in a way that avoids a significant rise in entropy. The entropy metric might also be used to perform benchmarking on WWTPs. The result of this management tool would be the determination of the efficiency of WWTPs, by improving and optimizing the efficiency…
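    The core of statistical entropy analysis, measuring how strongly a process concentrates or dilutes a substance across its output flows, can be sketched with Shannon's formula. The nitrogen split below is invented for illustration, not data from the programme:

    ```python
    import math

    def statistical_entropy(fractions):
        """Shannon entropy H = -sum(p * log2(p)) of a substance's distribution
        over flows; 0 means the substance is fully concentrated in one flow."""
        return -sum(p * math.log2(p) for p in fractions if p > 0)

    # Hypothetical nitrogen partitioning of a WWTP across three output flows
    # (effluent, sludge, off-gas); fractions must sum to 1.
    influent = [1.0]                   # all nitrogen in a single input flow
    outputs = [0.15, 0.25, 0.60]

    print(statistical_entropy(influent))   # 0.0 (fully concentrated)
    print(statistical_entropy(outputs))    # > 0 (diluted across flows)
    ```

    In SEA-style benchmarking, a treatment step that lowers this entropy is concentrating the substance, which is generally the desired direction for pollutant management.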

  1. Quantitative Gait Analysis Using a Motorized Treadmill System Sensitively Detects Motor Abnormalities in Mice Expressing ATPase Defective Spastin.

    Science.gov (United States)

    Connell, James W; Allison, Rachel; Reid, Evan

    2016-01-01

    The hereditary spastic paraplegias (HSPs) are genetic conditions in which there is progressive axonal degeneration in the corticospinal tract. Autosomal dominant mutations, including nonsense, frameshift and missense changes, in the gene encoding the microtubule-severing ATPase spastin are the most common cause of HSP in North America and northern Europe. In this study we report quantitative gait analysis using a motorized treadmill system, carried out on mice knocked in for a disease-associated mutation affecting a critical residue in the Walker A motif of the spastin ATPase domain. At 4 months and at one year of age, homozygous mutant mice showed abnormalities in a number of gait parameters, including stride length and stride duration, compared to heterozygous and wild-type littermates. Gait parameters in heterozygous animals did not differ from wild-type littermates. We conclude that quantitative gait analysis using the DigiGait system sensitively detects motor abnormalities in a hereditary spastic paraplegia model and would be a useful method for analyzing the effects of pharmacological treatments for HSP.

  2. Quantitative Analysis of the Trends Exhibited by the Three Interdisciplinary Biological Sciences: Biophysics, Bioinformatics, and Systems Biology.

    Science.gov (United States)

    Kang, Jonghoon; Park, Seyeon; Venkat, Aarya; Gopinath, Adarsh

    2015-12-01

    New interdisciplinary biological sciences like bioinformatics, biophysics, and systems biology have become increasingly relevant in modern science. Many papers have suggested the importance of adding these subjects, particularly bioinformatics, to an undergraduate curriculum; however, most of their assertions have relied on qualitative arguments. In this paper, we will show our metadata analysis of a scientific literature database (PubMed) that quantitatively describes the importance of the subjects of bioinformatics, systems biology, and biophysics as compared with a well-established interdisciplinary subject, biochemistry. Specifically, we found that the development of each subject assessed by its publication volume was well described by a set of simple nonlinear equations, allowing us to characterize them quantitatively. Bioinformatics, which had the highest ratio of publications produced, was predicted to grow between 77% and 93% by 2025 according to the model. Due to the large number of publications produced in bioinformatics, which nearly matches the number published in biochemistry, it can be inferred that bioinformatics is almost equal in significance to biochemistry. Based on our analysis, we suggest that bioinformatics be added to the standard biology undergraduate curriculum. Adding this course to an undergraduate curriculum will better prepare students for future research in biology.
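    Growth in publication volume of the kind the authors capture with "simple nonlinear equations" is commonly modeled with a logistic curve, which saturates at a carrying capacity. A hedged sketch; the parameters below are invented, not the paper's fitted values:

    ```python
    import math

    def logistic(t, K, r, t0):
        """Logistic growth: cumulative count approaching carrying capacity K,
        with growth rate r and inflection year t0."""
        return K / (1.0 + math.exp(-r * (t - t0)))

    # Illustrative parameters only; the paper fits its own equations to PubMed counts.
    K, r, t0 = 100_000, 0.25, 2010
    for year in (2000, 2015, 2025):
        print(year, round(logistic(year, K, r, t0)))
    ```

    Comparing the fitted curves of two fields (e.g. bioinformatics against biochemistry) then gives the kind of projected-growth statements quoted in the abstract.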

  3. An integrated one-chip-sensor system for microRNA quantitative analysis based on digital droplet polymerase chain reaction

    Science.gov (United States)

    Tsukuda, Masahiko; Wiederkehr, Rodrigo Sergio; Cai, Qing; Majeed, Bivragh; Fiorini, Paolo; Stakenborg, Tim; Matsuno, Toshinobu

    2016-04-01

    A silicon microfluidic chip was developed for quantitative microRNA (miRNA) analysis. It sequentially performs reverse transcription and polymerase chain reaction in a digital droplet format. The individual processes take place in different cavities, and reagent and sample mixing is carried out on chip prior to entering each compartment. The droplets are generated in a T-junction channel before the polymerase chain reaction step. A miniaturized fluorescence detector was also developed, based on a digital versatile disc (DVD) optical pick-up head and a micro-photomultiplier tube. The chip integrated in the detection system was tested using synthetic miRNA at known concentrations ranging from 300 to 3,000 templates/µL. The results proved the functionality of the system.
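    Absolute quantification in digital droplet PCR rests on Poisson statistics: from the fraction of fluorescence-positive droplets one recovers the mean number of template copies per droplet, and from the droplet volume the concentration. A minimal sketch; the droplet volume and counts are assumed values, not from this chip:

    ```python
    import math

    def copies_per_microliter(positive, total, droplet_volume_nl=0.85):
        """Poisson-corrected ddPCR quantification.

        lambda = -ln(1 - p) is the mean copies per droplet for a fraction p
        of positive droplets (a positive droplet may hold >1 copy).
        The droplet volume is an assumed value, not this chip's geometry.
        """
        p = positive / total
        lam = -math.log(1.0 - p)
        return lam / (droplet_volume_nl * 1e-3)   # copies per microliter

    print(round(copies_per_microliter(4000, 20000)))
    ```

    The Poisson correction matters most at high positive fractions, where naively counting positives would undercount multiply occupied droplets.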

  4. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods currently used by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method for monitoring patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they may also be useful, separately or in combination, to elucidate the oncogenic process and determine the biological potential of tumors.

  5. Development of novel visual-plus quantitative analysis systems for studying DNA double-strand break repairs in zebrafish.

    Science.gov (United States)

    Liu, Jingang; Gong, Lu; Chang, Changqing; Liu, Cong; Peng, Jinrong; Chen, Jun

    2012-09-20

    The use of reporter systems to analyze DNA double-strand break (DSB) repair, based on the enhanced green fluorescent protein (EGFP) and a meganuclease such as I-Sce I, is usually carried out with cell lines. In this study, we developed three visual-plus quantitative assay systems for the homologous recombination (HR), non-homologous end joining (NHEJ) and single-strand annealing (SSA) DSB repair pathways at the organismal level in zebrafish embryos. To initiate DNA DSB repair, we used two I-Sce I recognition sites in opposite orientation rather than the usual single site. The NHEJ, HR and SSA repair pathways were separately triggered by injection of the three corresponding I-Sce I-cut constructs, and the repair of DNA lesions caused by I-Sce I could be tracked by EGFP expression in the embryos. Apart from monitoring the intensity of green fluorescence, the repair frequencies could also be precisely measured by quantitative real-time polymerase chain reaction (qPCR). Analysis of DNA sequences at the DSB sites showed that NHEJ was predominant among these three repair pathways in zebrafish embryos. Furthermore, while repair in the HR and SSA reporter systems could be effectively decreased by knockdown of rad51 and rad52, respectively, NHEJ could only be impaired by knockdown of ligaseIV (lig4) when the NHEJ construct was cut by I-Sce I in vivo. More interestingly, blocking NHEJ with lig4-MO increased the frequency of HR but decreased the frequency of SSA. Our studies demonstrate that the major mechanisms used to repair DNA DSBs are conserved from zebrafish to mammals, and that zebrafish provides an excellent model for studying and manipulating DNA DSB repair at the organismal level.

  7. Successful downstream application of the Paxgene Blood RNA system from small blood samples in paediatric patients for quantitative PCR analysis

    Directory of Open Access Journals (Sweden)

    Mankhambo Limangeni A

    2007-09-01

    Background: The challenge of gene expression studies is to reliably quantify levels of transcripts, but this is hindered by a number of factors including sample availability, handling and storage. The PAXgene™ Blood RNA System includes a stabilizing additive in a plastic evacuated tube, but requires 2.5 mL of blood, which makes routine implementation impractical for paediatric use. The aim of this study was to modify the PAXgene™ Blood RNA System kit protocol for application to small, sick children, without compromising RNA integrity, and subsequently to perform quantitative analysis of ICAM-1 and interleukin-6 gene expression. Aliquots of 0.86 mL PAXgene™ reagent were put into microtubes and 0.3 mL whole blood added, to maintain the same proportions recommended for the PAXgene™ evacuated tube system. RNA quality was assessed using the Agilent BioAnalyser 2100 and an in-house TaqMan™ assay which measures GAPDH transcript integrity by determining 3' to 5' ratios. qPCR analysis was performed on an additional panel of 7 housekeeping genes. Three reference genes (HPRT1, YWHAZ and GAPDH) were identified using the geNorm algorithm and subsequently used to normalise target gene expression levels. ICAM-1 and IL-6 gene expression were measured in 87 Malawian children with invasive pneumococcal disease. Results: Total RNA yield was between 1,114 and 2,950 ng, and the BioAnalyser 2100 showed discernible 18S and 28S bands. The cycle threshold values obtained for the seven housekeeping genes were between 15 and 30 and showed good consistency. Median relative ICAM-1 and IL-6 gene expression were significantly reduced in non-survivors compared to survivors (ICAM-1: 3.56 vs 4.41, p = 0.04; IL-6: 2.16 vs 6.73, p = 0.02). Conclusion: We have successfully modified the PAXgene™ blood collection system for use in small children and demonstrated preservation of RNA integrity and successful quantitative real-time PCR analysis.
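    Normalization against several reference genes, as done here with the geNorm-selected HPRT1, YWHAZ and GAPDH, typically divides the target quantity by the geometric mean of the reference-gene quantities derived from their Ct values. A minimal sketch with invented Ct values, assuming 100% amplification efficiency:

    ```python
    import math

    def relative_expression(ct_target, ct_refs, efficiency=2.0):
        """Target expression normalized to the geometric mean of several
        reference genes (geNorm-style). Quantity = efficiency^(-Ct);
        the Ct values and 2-fold-per-cycle efficiency are assumptions."""
        q_target = efficiency ** (-ct_target)
        q_refs = [efficiency ** (-ct) for ct in ct_refs]
        geo_mean = math.exp(sum(math.log(q) for q in q_refs) / len(q_refs))
        return q_target / geo_mean

    # Hypothetical sample: target-gene Ct and three reference-gene Cts
    print(relative_expression(24.0, [20.0, 21.0, 22.0]))  # 0.125
    ```

    Using the geometric rather than arithmetic mean keeps the normalization symmetric on the log scale where Ct values live, which is the rationale behind geNorm's normalization factor.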

  8. Quantitative and Descriptive Comparison of Four Acoustic Analysis Systems: Vowel Measurements

    Science.gov (United States)

    Burris, Carlyn; Vorperian, Houri K.; Fourakis, Marios; Kent, Ray D.; Bolt, Daniel M.

    2014-01-01

    Purpose: This study examines accuracy and comparability of 4 trademarked acoustic analysis software packages (AASPs): Praat, WaveSurfer, TF32, and CSL by using synthesized and natural vowels. Features of AASPs are also described. Method: Synthesized and natural vowels were analyzed using each of the AASP's default settings to secure 9…

  9. Examination of Modeling Languages to Allow Quantitative Analysis for Model-Based Systems Engineering

    Science.gov (United States)

    2014-06-01

    model of the system (Friedenthal, Moore and Steiner 2008, 17). The premise is that maintaining a logical and consistent model can be accomplished...Standard for Exchange of Product data (STEP) subgroup of ISO, and defines a standard data format for certain types of SE information (Johnson 2006...search.credoreference.com/content/entry/encyccs/formal_languages/0. Friedenthal, Sanford, Alan Moore, and Rick Steiner. 2008. A Practical Guide to SysML

  10. Quantitative analysis of arm movement smoothness

    Science.gov (United States)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement unit, fluidity and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke. Patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.

  11. Seniors' Online Communities: A Quantitative Content Analysis

    Science.gov (United States)

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  12. Quantitative analysis of glycated proteins.

    Science.gov (United States)

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
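
The doublet spacing rule quoted above (+6, +3 or +2 Da depending on charge state and number of glycation sites) follows from simple arithmetic: [13C6]-glucose adds a nominal 6 Da per glycated site, and the observed m/z spacing is that mass difference divided by the peptide charge. A minimal sketch using nominal masses only (the exact 13C-12C mass difference is slightly larger than 1 Da):

```python
def doublet_mz_spacing(n_glycation_sites, charge):
    """m/z spacing between the light ([12C6]) and heavy ([13C6]) partner
    of a glycated peptide: a nominal 6 Da per glycation site, divided by
    the charge state."""
    delta_mass = 6.0 * n_glycation_sites   # 13C6- vs 12C6-glucose per site
    return delta_mass / charge

# The +6, +3 and +2 Da spacings quoted in the abstract correspond to a
# singly glycated peptide observed at charge states 1, 2 and 3:
for z in (1, 2, 3):
    print(doublet_mz_spacing(1, z))  # 6.0, 3.0, 2.0
```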

  13. Dental functional status in a southern vietnamese adult population-a combined quantitative and qualitative classification system analysis

    NARCIS (Netherlands)

    Nguyen, T.C.; Witter, D.J.; Bronkhorst, E.M.; Pham, L.H.; Creugers, N.H.J.

    2011-01-01

    PURPOSE: The aim of this study was to explore the dental functional status of a Southern Vietnamese adult population using a new quantitative- and qualitative-based classification system. MATERIALS AND METHODS: The sample consisted of 2,809 dentate subjects aged ≥ 20 years from urban and rural

  14. Optimization of an Optical Inspection System Based on the Taguchi Method for Quantitative Analysis of Point-of-Care Testing

    Directory of Open Access Journals (Sweden)

    Chia-Hsien Yeh

    2014-09-01

    Full Text Available This study presents an optical inspection system for detecting a commercial point-of-care testing product and a new detection model spanning qualitative to quantitative analysis. Human chorionic gonadotropin (hCG) strips (cut-off value of the hCG commercial product is 25 mIU/mL) were the detection target in our study. We used a complementary metal-oxide semiconductor (CMOS) sensor to detect the colors of the test line and control line in the specific strips and to reduce the observation errors by the naked eye. To achieve better linearity between the grayscale and the concentration, and to decrease the standard deviation (increase the signal-to-noise ratio, S/N), the Taguchi method was used to find the optimal parameters for the optical inspection system. The pregnancy test used the principles of the lateral flow immunoassay, and the colors of the test and control lines were produced by gold nanoparticles. Because of the sandwich immunoassay model, the color of the gold nanoparticles in the test line darkened with increasing hCG concentration. As the results reveal, the S/N increased from 43.48 dB to 53.38 dB, and the hCG concentration detection range increased from 6.25 to 50 mIU/mL with a standard deviation of less than 10%. With the optimal parameters to decrease the detection limit and to increase the linearity determined by the Taguchi method, the optical inspection system can be applied to various commercial rapid tests for the detection of ketamine, troponin I, and fatty acid binding protein (FABP).

  15. Optimization of an optical inspection system based on the Taguchi method for quantitative analysis of point-of-care testing.

    Science.gov (United States)

    Yeh, Chia-Hsien; Zhao, Zi-Qi; Shen, Pi-Lan; Lin, Yu-Cheng

    2014-09-01

    This study presents an optical inspection system for detecting a commercial point-of-care testing product and a new detection model spanning qualitative to quantitative analysis. Human chorionic gonadotropin (hCG) strips (cut-off value of the hCG commercial product is 25 mIU/mL) were the detection target in our study. We used a complementary metal-oxide semiconductor (CMOS) sensor to detect the colors of the test line and control line in the specific strips and to reduce the observation errors by the naked eye. To achieve better linearity between the grayscale and the concentration, and to decrease the standard deviation (increase the signal-to-noise ratio, S/N), the Taguchi method was used to find the optimal parameters for the optical inspection system. The pregnancy test used the principles of the lateral flow immunoassay, and the colors of the test and control lines were produced by gold nanoparticles. Because of the sandwich immunoassay model, the color of the gold nanoparticles in the test line darkened with increasing hCG concentration. As the results reveal, the S/N increased from 43.48 dB to 53.38 dB, and the hCG concentration detection range increased from 6.25 to 50 mIU/mL with a standard deviation of less than 10%. With the optimal parameters to decrease the detection limit and to increase the linearity determined by the Taguchi method, the optical inspection system can be applied to various commercial rapid tests for the detection of ketamine, troponin I, and fatty acid binding protein (FABP).
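
The S/N figures quoted above are Taguchi-style signal-to-noise ratios in dB. A minimal sketch of the nominal-the-best variant, which rewards low scatter relative to the mean, is shown below with hypothetical repeated grayscale readings; the record does not give raw data or state which S/N variant was used, so this is an assumed illustration:

```python
import math

def sn_nominal_the_best(values):
    """Taguchi nominal-the-best signal-to-noise ratio in dB:
    S/N = 10*log10(mean**2 / variance). A higher S/N means less
    relative scatter, the criterion the study optimizes."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(mean ** 2 / var)

grayscales = [100.2, 99.8, 100.1, 99.9]  # hypothetical repeated readings
print(round(sn_nominal_the_best(grayscales), 1))  # ~54.8 dB
```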

  16. Neurofilament protein defines regional patterns of cortical organization in the macaque monkey visual system: a quantitative immunohistochemical analysis

    Science.gov (United States)

    Hof, P. R.; Morrison, J. H.; Bloom, F. E. (Principal Investigator)

    1995-01-01

    Visual function in monkeys is subserved at the cortical level by a large number of areas defined by their specific physiological properties and connectivity patterns. For most of these cortical fields, a precise index of their degree of anatomical specialization has not yet been defined, although many regional patterns have been described using Nissl or myelin stains. In the present study, an attempt has been made to elucidate the regional characteristics, and to varying degrees boundaries, of several visual cortical areas in the macaque monkey using an antibody to neurofilament protein (SMI32). This antibody labels a subset of pyramidal neurons with highly specific regional and laminar distribution patterns in the cerebral cortex. Based on the staining patterns and regional quantitative analysis, as many as 28 cortical fields were reliably identified. Each field had a homogeneous distribution of labeled neurons, except area V1, where increases in layer IVB cell and in Meynert cell counts paralleled the increase in the degree of eccentricity in the visual field representation. Within the occipitotemporal pathway, areas V3 and V4 and fields in the inferior temporal cortex were characterized by a distinct population of neurofilament-rich neurons in layers II-IIIa, whereas areas located in the parietal cortex and part of the occipitoparietal pathway had a consistent population of large labeled neurons in layer Va. The mediotemporal areas MT and MST displayed a distinct population of densely labeled neurons in layer VI. Quantitative analysis of the laminar distribution of the labeled neurons demonstrated that the visual cortical areas could be grouped in four hierarchical levels based on the ratio of neuron counts between infragranular and supragranular layers, with the first (areas V1, V2, V3, and V3A) and third (temporal and parietal regions) levels characterized by low ratios and the second (areas MT, MST, and V4) and fourth (frontal regions) levels characterized by

  18. Quantitative analysis of urban pluvial flood alleviation by open surface water systems in new towns: Comparing Almere and Tianjin eco-city

    NARCIS (Netherlands)

    Zhou, Z.; Qu, L.; Zou, T.

    2015-01-01

    Increased surface runoff generated in urban areas due to a larger proportion of impervious surfaces has, in many cases, exceeded the capacity of urban drainage systems. In response to this challenge, this paper introduces a quantitative analysis of pluvial flood alleviation by open surface water systems

  20. Quantitative phosphoproteomic analysis reveals system-wide signaling pathways downstream of SDF-1/CXCR4 in breast cancer stem cells.

    Science.gov (United States)

    Yi, Tingfang; Zhai, Bo; Yu, Yonghao; Kiyotsugu, Yoshikawa; Raschle, Thomas; Etzkorn, Manuel; Seo, Hee-Chan; Nagiec, Michal; Luna, Rafael E; Reinherz, Ellis L; Blenis, John; Gygi, Steven P; Wagner, Gerhard

    2014-05-27

    Breast cancer is the leading cause of cancer-related mortality in women worldwide, with an estimated 1.7 million new cases and 522,000 deaths around the world in 2012 alone. Cancer stem cells (CSCs) are essential for tumor recurrence and metastasis, which is the major source of cancer lethality. G protein-coupled receptor chemokine (C-X-C motif) receptor 4 (CXCR4) is critical for tumor metastasis. However, stromal cell-derived factor 1 (SDF-1)/CXCR4-mediated signaling pathways in breast CSCs are largely unknown. Using isotopic reductive dimethylation and large-scale MS-based quantitative phosphoproteome analysis, we examined protein phosphorylation induced by SDF-1/CXCR4 signaling in breast CSCs. We quantified more than 11,000 phosphorylation sites in 2,500 phosphoproteins. Of these phosphosites, 87% were statistically unchanged in abundance in response to SDF-1/CXCR4 stimulation. In contrast, 545 phosphosites in 266 phosphoproteins were significantly increased, whereas 113 phosphosites in 74 phosphoproteins were significantly decreased. SDF-1/CXCR4 increases phosphorylation in 60 cell migration- and invasion-related proteins, 43 of which (>70%) were previously unrecognized. In addition, SDF-1/CXCR4 upregulates the phosphorylation of 44 previously uncharacterized kinases, 8 phosphatases, and 1 endogenous phosphatase inhibitor. Using computational approaches, we performed system-based analyses of the SDF-1/CXCR4-mediated phosphoproteome, including construction of the kinase-substrate network and feedback regulation loops downstream of SDF-1/CXCR4 signaling in breast CSCs. We identified a previously uncharacterized SDF-1/CXCR4-PKA-MAP2K2-ERK signaling pathway and demonstrated the feedback regulation on MEK, ERK1/2, δ-catenin, and PPP1Cα in SDF-1/CXCR4 signaling in breast CSCs. This study gives a system-wide view of phosphorylation events downstream of SDF-1/CXCR4 signaling in breast CSCs, providing a resource for the study of CSC-targeted cancer therapy.
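
The light/heavy quantitation step described above can be sketched as a per-phosphosite ratio test. The intensities and the 2-fold cutoff below are illustrative assumptions; the study combined fold change with statistical testing:

```python
import math

def classify_phosphosite(light, heavy, fold=2.0):
    """Classify one phosphosite from its light/heavy dimethyl-label
    intensities. 'heavy' is taken here as the SDF-1-stimulated channel;
    a site is called up- or down-regulated when the log2 ratio exceeds
    the fold-change cutoff (2-fold is an assumed threshold)."""
    log2_ratio = math.log2(heavy / light)
    if log2_ratio >= math.log2(fold):
        return "up"
    if log2_ratio <= -math.log2(fold):
        return "down"
    return "unchanged"

print(classify_phosphosite(1e6, 4.2e6))  # up
print(classify_phosphosite(1e6, 1.1e6))  # unchanged
```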

  1. Quantitative analysis of Boehm's GC

    Institute of Scientific and Technical Information of China (English)

    GUAN Xue-tao; ZHANG Yuan-rui; GOU Xiao-gang; CHENG Xu

    2003-01-01

    The term garbage collection describes the automated process of finding previously allocated memory that is no longer in use, in order to make the memory available to satisfy subsequent allocation requests. We have reviewed existing papers and implementations of GC, and especially analyzed Boehm's C code, which is a real-time mark-sweep GC running under Linux and the ANSI C standard. In this paper, we quantitatively analyze the performance of different configurations of Boehm's collector subjected to different workloads. Reported measurements demonstrate that a refined garbage collector is a viable alternative to traditional explicit memory management techniques, even for low-level languages. It is more a trade-off for a given system than an all-or-nothing proposition.
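
The mark-sweep strategy analyzed in this record can be illustrated with a toy collector over an explicit object graph; this is a pedagogical sketch, not Boehm's conservative C implementation:

```python
def mark_sweep(heap, roots):
    """Minimal mark-sweep sketch: 'heap' maps an object id to the list
    of ids it references; 'roots' are the directly reachable ids.
    Returns the set of ids a sweep would reclaim."""
    marked = set()
    stack = list(roots)
    while stack:                      # mark phase: trace from the roots
        obj = stack.pop()
        if obj not in marked:
            marked.add(obj)
            stack.extend(heap[obj])
    return set(heap) - marked         # sweep phase: unmarked ids are garbage

heap = {"a": ["b"], "b": [], "c": ["c"]}  # 'c' is an unreachable cycle
print(mark_sweep(heap, roots=["a"]))      # {'c'}
```

Note that the self-referencing cycle `"c"` is still collected, which is the classic advantage of tracing collectors over naive reference counting.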

  2. Quantitative structure-activity relationship analysis of substituted arylazo pyridone dyes in photocatalytic system: Experimental and theoretical study

    Energy Technology Data Exchange (ETDEWEB)

    Dostanić, J., E-mail: jasmina@nanosys.ihtm.bg.ac.rs [University of Belgrade, Institute of Chemistry, Technology and Metallurgy, Department of Catalysis and Chemical Engineering, Njegoševa 12, 11000 Belgrade (Serbia); Lončarević, D. [University of Belgrade, Institute of Chemistry, Technology and Metallurgy, Department of Catalysis and Chemical Engineering, Njegoševa 12, 11000 Belgrade (Serbia); Zlatar, M. [University of Belgrade, Institute of Chemistry, Technology and Metallurgy, Department of Chemistry, Njegoševa 12, 11000 Belgrade (Serbia); Vlahović, F. [University of Belgrade, Innovation center of the Faculty of Chemistry, 11000 Belgrade (Serbia); Jovanović, D.M. [University of Belgrade, Institute of Chemistry, Technology and Metallurgy, Department of Catalysis and Chemical Engineering, Njegoševa 12, 11000 Belgrade (Serbia)

    2016-10-05

    Highlights: • Electronic effects of para-substituted arylazo pyridone dyes. • Linear relationship between Hammett σ_p constants and dye photoreactivity. • The photocatalytic reactions are facilitated by electron acceptors and retarded by electron donors. • Fukui functions to analyze the reactivity of concurrent sites within a molecule. • Hydroxyl radical attack can occur at two reaction sites, depending on substituent type. - Abstract: A series of arylazo pyridone dyes was synthesized by changing the type of the substituent group in the diazo moiety, ranging from strongly electron-donating to strongly electron-withdrawing groups. The structural and electronic properties of the investigated dyes were calculated at the M062X/6-31+G(d,p) level of theory. The observed good linear correlations between atomic charges and Hammett σ_p constants provided a basis to discuss the transmission of electronic substituent effects through the dye framework. The reactivity of the synthesized dyes was tested through their decolorization efficiency in a TiO2 photocatalytic system (Degussa P-25). Quantitative structure-activity relationship analysis revealed a strong correlation between the reactivity of the investigated dyes and the Hammett substituent constants. The reaction was facilitated by electron-withdrawing groups and retarded by electron-donating ones. Quantum mechanical calculations were used to describe the mechanism of the photocatalytic oxidation reactions of the investigated dyes and to interpret their reactivities within the framework of Density Functional Theory (DFT). According to DFT-based reactivity descriptors, i.e. Fukui functions and local softness, the active site moves from the azo nitrogen atom linked to the benzene ring to the pyridone carbon atom linked to the azo bond, going from dyes with electron-donating groups to dyes with electron-withdrawing groups.
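
The Hammett-type correlation reported above is a linear free-energy relationship, log(k/k0) = ρ·σ_p: a positive fitted ρ means electron-withdrawing substituents (positive σ_p) accelerate the reaction, as the abstract observes. A least-squares sketch with hypothetical σ_p and rate values, not the paper's data:

```python
def hammett_fit(sigmas, log_rates):
    """Ordinary least-squares line log(k/k0) = rho * sigma_p + c,
    the linear free-energy relationship behind a Hammett plot."""
    n = len(sigmas)
    sx, sy = sum(sigmas), sum(log_rates)
    sxx = sum(s * s for s in sigmas)
    sxy = sum(s * y for s, y in zip(sigmas, log_rates))
    rho = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    c = (sy - rho * sx) / n                          # intercept
    return rho, c

# Hypothetical sigma_p values and log rate constants for four para
# substituents (e.g. OMe, H, Cl, NO2): positive rho indicates the
# reaction is facilitated by electron-withdrawing groups.
sigma_p = [-0.27, 0.0, 0.23, 0.78]
logk    = [-0.30, 0.0, 0.25, 0.80]
rho, c = hammett_fit(sigma_p, logk)
```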

  3. Quantitative resilience analysis through control design.

    Energy Technology Data Exchange (ETDEWEB)

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris (Sandia National Laboratories, Carlsbad, NM)

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  4. Qualitative and Quantitative Analysis of Congested Marine Traffic Environment – An Application Using Marine Traffic Simulation System

    Directory of Open Access Journals (Sweden)

    Kazuhiko Hasegawa

    2013-06-01

    Full Text Available Difficulty of sailing is a quite subjective matter. It depends on various factors. Using the Marine Traffic Simulation System (MTSS) developed by Osaka University, this challenging subject is discussed. In this system realistic traffic flow, including collision avoidance manoeuvres, can be reproduced in a given area. Simulations were run for the southward part of Tokyo Bay, the Strait of Singapore and the off-Shanghai area, changing traffic volume from 5 or 50% to 150 or 200% of the present volume. As a result, a strong proportional relation between near-miss ratio and traffic density per hour per sailed area is found, independent of traffic volume, area size and configuration. The quantitative evaluation index of the difficulty of sailing, here called the risk rate of the area, is defined using the traffic density and near-miss ratio thus defined.
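
The abstract's two key quantities can be sketched as follows; the exact formulas are not given in the record, so the reading below (risk rate as the proportionality constant between near-miss ratio and traffic density) is an assumption, and all numbers are hypothetical:

```python
def risk_rate(near_misses, encounters, n_ships, hours, area):
    """Assumed reading of the abstract's index: near-miss ratio divided
    by traffic density per hour per sailed area, i.e. the slope of the
    proportional relation the study reports."""
    near_miss_ratio = near_misses / encounters
    density = n_ships / (hours * area)   # ships per hour per unit area
    return near_miss_ratio / density

# Hypothetical simulation output for one area at one traffic volume:
print(round(risk_rate(12, 400, 80, 1.0, 50.0), 3))  # 0.019
```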

  5. Isobaric Tags for Relative and Absolute Quantitation-Based Proteomic Analysis of Patent and Constricted Ductus Arteriosus Tissues Confirms the Systemic Regulation of Ductus Arteriosus Closure.

    Science.gov (United States)

    Hong, Haifa; Ye, Lincai; Chen, Huiwen; Xia, Yu; Liu, Yue; Liu, Jinfen; Lu, Yanan; Zhang, Haibo

    2015-08-01

    We aimed to evaluate global changes in protein expression associated with patency by undertaking proteomic analysis of human constricted and patent ductus arteriosus (DA). Ten constricted and 10 patent human DAs were excised from infants with ductal-dependent heart disease during surgery. Using isobaric tags for relative and absolute quantitation-based quantitative proteomics, 132 differentially expressed proteins were identified. Of 132 proteins, voltage-gated sodium channel 1.3 (SCN3A), myosin 1d (Myo1d), Rho GTPase activating protein 26 (ARHGAP26), and retinitis pigmentosa 1 (RP1) were selected for validation by Western blot and quantitative real-time polymerase chain reaction analyses. Significant upregulation of SCN3A, Myo1d, and RP1 messenger RNA, and protein levels was observed in the patent DA group (all P ≤ 0.048). ARHGAP26 messenger RNA and protein levels were decreased in patent DA tissue (both P ≤ 0.018). Immunohistochemistry analysis revealed that Myo1d, ARHGAP26, and RP1 were specifically expressed in the subendothelial region of constricted DAs; however, diffuse expression of these proteins was noted in the patent group. Proteomic analysis revealed global changes in the expression of proteins that regulate oxygen sensing, ion channels, smooth muscle cell migration, nervous system, immune system, and metabolism, suggesting a basis for the systemic regulation of DA patency by diverse signaling pathways, which will be confirmed in further studies.

  6. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    Science.gov (United States)

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  7. Quantitative histogram analysis of images

    Science.gov (United States)

    Holub, Oliver; Ferreira, Sérgio T.

    2006-11-01

    A routine for histogram analysis of images has been written in the object-oriented, graphical development environment LabVIEW. The program converts an RGB bitmap image into an intensity-linear greyscale image according to selectable conversion coefficients. This greyscale image is subsequently analysed by plots of the intensity histogram and probability distribution of brightness, and by calculation of various parameters, including average brightness, standard deviation, variance, minimal and maximal brightness, mode, skewness and kurtosis of the histogram and the median of the probability distribution. The program allows interactive selection of specific regions of interest (ROI) in the image and definition of lower and upper threshold levels (e.g., to permit the removal of a constant background signal). The results of the analysis of multiple images can be conveniently saved and exported for plotting in other programs, which allows fast analysis of relatively large sets of image data. The program file accompanies this manuscript together with a detailed description of two application examples: the analysis of fluorescence microscopy images, specifically of tau-immunofluorescence in primary cultures of rat cortical and hippocampal neurons, and the quantification of protein bands by Western blot. The possibilities and limitations of this kind of analysis are discussed.
    Program summary
    Title of program: HAWGC
    Catalogue identifier: ADXG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXG_v1_0
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Computers: Mobile Intel Pentium III, AMD Duron
    Installations: No installation necessary; executable file together with necessary files for LabVIEW Run-time engine
    Operating systems or monitors under which the program has been tested: Windows ME/2000/XP
    Programming language used: LabVIEW 7.0
    Memory required to execute with typical data: ~16 MB for starting and ~160 MB used for
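
The histogram parameters listed above (mean, standard deviation, extrema, skewness, kurtosis) can be reproduced outside LabVIEW from a flat list of grayscale values; the following is a plain-Python sketch, not HAWGC's implementation:

```python
def histogram_stats(pixels):
    """Brightness statistics of the kind the HAWGC program reports,
    computed from a flat list of grayscale values (population moments,
    with kurtosis reported as excess kurtosis)."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n
    std = var ** 0.5
    skew = sum((p - mean) ** 3 for p in pixels) / (n * std ** 3)
    kurt = sum((p - mean) ** 4 for p in pixels) / (n * var ** 2) - 3
    return {"mean": mean, "std": std, "min": min(pixels),
            "max": max(pixels), "skewness": skew, "excess_kurtosis": kurt}

# One bright outlier makes the distribution strongly right-skewed:
stats = histogram_stats([10, 12, 12, 14, 200])
print(stats["mean"], stats["skewness"] > 0)
```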

  8. Quantitative analysis of saccadic search strategy

    NARCIS (Netherlands)

    Over, E.A.B.

    2007-01-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye movements.

  10. Combining the least correlation design, wavelet packet transform and correlation coefficient test to reduce the size of calibration set for NIR quantitative analysis in multi-component systems.

    Science.gov (United States)

    Cai, Chen-Bo; Xu, Lu; Han, Qing-Juan; Wu, Hai-Long; Nie, Jin-Fang; Fu, Hai-Yan; Yu, Ru-Qin

    2010-05-15

    The paper focuses on solving a common and important problem of NIR quantitative analysis in multi-component systems: how to significantly reduce the size of the calibration set without impairing predictive precision. To cope with the problem, the orthogonal discrete wavelet packet transform (WPT), the least correlation design and the correlation coefficient test (r-test) have been combined. As three examples, a two-component carbon tetrachloride system with 21 calibration samples, a two-component aqueous system with 21 calibration samples, and a two-component aqueous system with 41 calibration samples were treated with the proposed strategy, respectively. In comparison with some previous methods based on many more calibration samples, the results of the strategy showed that the predictive ability was not obviously decreased for the first system while being clearly strengthened for the second one, and the predictive precision for the third one was satisfactory enough for most cases of quantitative analysis. In addition, all important factors and parameters related to the strategy are discussed in detail.
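
The pipeline described above can be sketched in miniature: a Haar wavelet packet transform re-expresses each spectrum, then the r-test keeps only coefficient positions that correlate strongly with the analyte concentration. All data and the 0.9 cutoff below are illustrative assumptions, not the paper's settings:

```python
def haar_wpt(signal, levels):
    """Full Haar wavelet packet decomposition: at each level every band
    is split into (average, difference) halves, so 'levels' splits yield
    2**levels bands whose concatenation replaces the spectrum."""
    bands = [signal]
    for _ in range(levels):
        nxt = []
        for b in bands:
            lo = [(b[i] + b[i + 1]) / 2 for i in range(0, len(b), 2)]
            hi = [(b[i] - b[i + 1]) / 2 for i in range(0, len(b), 2)]
            nxt += [lo, hi]
        bands = nxt
    return [c for b in bands for c in b]

def r_test_select(coeff_matrix, concentrations, r_min=0.9):
    """Keep coefficient positions whose Pearson correlation with the
    concentration exceeds r_min (the r-test of the abstract)."""
    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x)
               * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den if den else 0.0
    cols = list(zip(*coeff_matrix))
    return [j for j, col in enumerate(cols)
            if abs(pearson(col, concentrations)) >= r_min]

# Toy 4-point "spectra" whose first two channels scale with concentration:
coeffs = [haar_wpt([c * 1.0, c * 0.5, 3.0, 3.0], 1) for c in (1.0, 2.0, 3.0)]
print(r_test_select(coeffs, [1.0, 2.0, 3.0]))  # [0, 2]
```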

  11. An automated system for quantitative analysis of newborns' oral-motor behavior and coordination during bottle feeding.

    Science.gov (United States)

    Tamilia, Eleonora; Formica, Domenico; Visco, Anna Maria; Scaini, Alberto; Taffoni, Fabrizio

    2015-01-01

    In this work a novel unobtrusive technology-aided system is presented and tested for the assessment of newborns' oral-motor behavior and coordination during bottle feeding. A low-cost monitoring device was designed and developed to record Suction (S) and Expression (E) pressures from a typical feeding bottle. A software system was developed to automatically process and analyze the data. A set of measures of motor control and coordination was implemented for the specific application to the analysis of sucking behavior. Experimental data were collected with the developed system on two groups of newborns (Healthy vs. Low Birth Weight) in a clinical setting. We identified the S features most sensitive to group differences, and analyzed their correlation with S/E coordination measures. Principal Component Analysis (PCA) was then used to explore the system's suitability to automatically identify peculiar oral behaviors. Results suggest the suitability of the proposed system to perform an objective technology-aided assessment of the newborn's oral-motor behavior and coordination during the first days of life.
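
The PCA step mentioned above can be sketched without a numerics library using power iteration for the leading direction of variance; the feature rows below are hypothetical stand-ins for per-newborn sucking measures, and this is not the study's implementation:

```python
def first_principal_component(rows, iters=200):
    """Power-iteration sketch of PCA: center the data, iterate
    v <- X^T X v (renormalized) to get the dominant eigenvector of the
    covariance structure, then project each row onto it."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d                          # initial direction
    for _ in range(iters):
        proj = [sum(xi[m] * v[m] for m in range(d)) for xi in x]  # X v
        w = [sum(x[i][j] * proj[i] for i in range(n)) for j in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    scores = [sum(xi[j] * v[j] for j in range(d)) for xi in x]
    return v, scores

# Hypothetical 2-feature rows where the first feature dominates variance:
rows = [[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [3.0, 0.3]]
v, scores = first_principal_component(rows)
```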

  12. Quantitative Analysis of Face Symmetry.

    Science.gov (United States)

    Tamir, Abraham

    2015-06-01

    The major objective of this article was to report quantitatively the degree of human face symmetry for images taken from the Internet. From the original image of a certain person that appears in the center of each triplet, 2 symmetric combinations were constructed, based on the left part of the image and its mirror image (left-left) and on the right part of the image and its mirror image (right-right). By applying computer software that can determine the length, surface area, and perimeter of any geometric shape, the following measurements were obtained for each triplet: face perimeter and area; distance between the pupils; mouth length, perimeter and area; nose length and face length, usually below the ears; as well as the area and perimeter of the pupils. Then, for each of the above measurements, the value C, which characterizes the degree of symmetry of the real image with respect to the right-right and left-left combinations, was calculated. C appears on the right-hand side below each image. A high value of C indicates low symmetry; as the value decreases, the symmetry increases. The magnitude on the left relates to the pupils and compares the difference between the area and perimeter of the 2 pupils. The major conclusion is that the human face is asymmetric to some degree; the degree of asymmetry is reported quantitatively under each portrait.
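
The record does not give C's formula, so the sketch below assumes one plausible definition: the relative difference between the same measurement taken on the left-left and right-right composites, with 0 meaning perfect symmetry and larger values more asymmetry:

```python
def symmetry_index(measure_left_left, measure_right_right):
    """Assumed reading of the article's C value: relative difference
    between a measurement on the left-left composite and the same
    measurement on the right-right composite."""
    a, b = measure_left_left, measure_right_right
    return abs(a - b) / ((a + b) / 2)

# Hypothetical mouth areas (arbitrary units) from the two composites:
print(round(symmetry_index(104.0, 96.0), 3))  # 0.08
```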

  13. Quantitative analysis of qualitative images

    Science.gov (United States)

    Hockney, David; Falco, Charles M.

    2005-03-01

    We show optical evidence demonstrating that artists as early as Jan van Eyck and Robert Campin (c1425) used optical projections as aids for producing their paintings. We have also found optical evidence within works by later artists, including Bermejo (c1475), Lotto (c1525), Caravaggio (c1600), de la Tour (c1650), Chardin (c1750) and Ingres (c1825), demonstrating a continuum in the use of optical projections by artists, along with an evolution in the sophistication of that use. However, even for paintings where we have been able to extract unambiguous, quantitative evidence of the direct use of optical projections for producing certain features, this does not mean that the paintings are effectively photographs. Because the hand and mind of the artist are intimately involved in the creation process, understanding these complex images requires more than can be obtained by applying only the equations of geometrical optics.

  14. Nonlinear dynamics and quantitative EEG analysis.

    Science.gov (United States)

    Jansen, B H

    1996-01-01

    Quantitative, computerized electroencephalogram (EEG) analysis appears to be based on a phenomenological approach to EEG interpretation, and is primarily rooted in linear systems theory. A fundamentally different approach to computerized EEG analysis, however, is making its way into the laboratories. The basic idea, inspired by recent advances in the area of nonlinear dynamics and chaos theory, is to view an EEG as the output of a deterministic system of relatively simple complexity, but containing nonlinearities. This suggests that studying the geometrical dynamics of EEGs, and the development of neurophysiologically realistic models of EEG generation may produce more successful automated EEG analysis techniques than the classical, stochastic methods. A review of the fundamentals of chaos theory is provided. Evidence supporting the nonlinear dynamics paradigm to EEG interpretation is presented, and the kind of new information that can be extracted from the EEG is discussed. A case is made that a nonlinear dynamic systems viewpoint to EEG generation will profoundly affect the way EEG interpretation is currently done.

  15. Electrokinetic gated injection-based microfluidic system for quantitative analysis of hydrogen peroxide in individual HepG2 cells.

    Science.gov (United States)

    Zhang, Xinyuan; Li, Qingling; Chen, Zhenzhen; Li, Hongmin; Xu, Kehua; Zhang, Lisheng; Tang, Bo

    2011-03-21

    A microfluidic system to determine hydrogen peroxide (H₂O₂) in individual HepG2 cells based on electrokinetic gated injection was developed for the first time. A home-synthesized fluorescent probe, bis(p-methylbenzenesulfonate)dichlorofluorescein (FS), was employed to label intracellular H₂O₂ in the intact cells. On a simple cross microchip, multiple single-cell operations, including single-cell injection, cytolysis, electrophoresis separation, and detection of H₂O₂, were automatically carried out within 60 s using electrokinetic gated injection and laser-induced fluorescence detection (LIFD). The performance of the method was evaluated under the optimal conditions. The calibration curve was linear over a range of 4.39-610 amol (R² = 0.9994). The detection limit was 0.55 amol or 9.0×10⁻¹⁰ M (S/N = 3). The relative standard deviations (RSDs, n = 6) of migration time and peak area were 1.4% and 4.8%, respectively. Using this method, the average content of H₂O₂ in single HepG2 cells was found to be 16.09 ± 9.84 amol (n = 15). Separation efficiencies in excess of 17,000 theoretical plates for the cells were achieved. These results demonstrate that the efficient integration and automation of these single-cell operations enable sensitive, reproducible, and quantitative examination of intracellular H₂O₂ at the single-cell level. Owing to the advantages of a simple microchip structure, controllable single-cell manipulation, and ease of construction, this platform provides a universal way to automatically determine other intracellular constituents within single cells.

  16. Quantitative Analysis of Urban Pluvial Flood Alleviation by Open Surface Water Systems in New Towns: Comparing Almere and Tianjin Eco-City

    Directory of Open Access Journals (Sweden)

    Zhengnan Zhou

    2015-09-01

    Increased surface runoff generated in urban areas due to a larger proportion of impervious surfaces has, in many cases, exceeded the capacity of urban drainage systems. In response to this challenge, this paper introduces a quantitative analysis of pluvial flood alleviation by open surface water systems in the case of Almere in the Netherlands and compares it with Tianjin Eco-City in China, with the aim of optimizing land use planning and urban design for new urban districts. The methodology is a combination of quantitative and qualitative analysis. With the analytical tool of ArcGIS, the authors have investigated the influence of the spatial distribution of the surface water system on the reduction of pluvial flood risks. The conclusions include some preliminary principles: (1) a densely distributed surface water network is preferable; (2) areas farther away from a water body require water-sensitive spatial intervention; and (3) optimizing the allocation of different types of ground surface could contribute to pluvial flood alleviation. An alternative design proposal for a typical urban block in Tianjin Eco-City has been put forward to illustrate these principles.

  17. Quantitative estimation of the stability of methicillin-resistant Staphylococcus aureus strain-typing systems by use of Kaplan-Meier survival analysis.

    Science.gov (United States)

    O'Sullivan, Matthew V N; Sintchenko, Vitali; Gilbert, Gwendolyn L

    2013-01-01

    Knowledge concerning stability is important in the development and assessment of microbial molecular typing systems and is critical for the interpretation of their results. Typing system stability is usually measured as the fraction of isolates that change type after several in vitro passages, but this does not necessarily reflect in vivo stability. The aim of this study was to utilize survival analysis to provide an informative quantitative measure of in vivo stability and to compare the stabilities of various techniques employed in typing methicillin-resistant Staphylococcus aureus (MRSA). We identified 100 MRSA pairs (isolated from the same patient ≥ 1 month apart) and typed them using multilocus sequence typing (MLST), phage-derived open reading frame (PDORF) typing, toxin gene profiling (TGP), staphylococcal cassette chromosome mec (SCCmec) subtyping, multilocus variable-number tandem-repeat analysis (MLVA), pulsed-field gel electrophoresis (PFGE), and spa sequence typing. Discordant isolate pairs, belonging to different MLST clonal complexes, were excluded, leaving 81 pairs for analysis. The stabilities of these methods were examined using Kaplan-Meier survival analysis, and discriminatory power was measured by Simpson's index of diversity. The probabilities (in percent) that the type remained unchanged at 6 months for spa sequence typing, TGP, MLVA, SCCmec subtyping, PDORF typing, and PFGE were 95, 95, 88, 82, 71, and 58, respectively, while the Simpson's indices of diversity were 0.48, 0.47, 0.70, 0.72, 0.89, and 0.88, respectively. Survival analysis using sequential clinical isolates adds an important quantitative dimension to the measurement of the stability of a microbial typing system. Of the methods compared here, PDORF typing provides high discriminatory power, comparable with that of PFGE, and a level of stability suitable for MRSA surveillance and outbreak investigations.
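
    The product-limit (Kaplan-Meier) estimate used here to quantify typing stability can be sketched as follows; treating "type changed" as the event and unchanged pairs as censored. The isolate-pair data in the example are invented for illustration and do not come from the study.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimator.
    times[i]  = follow-up interval for isolate pair i (e.g. months)
    events[i] = True if the type changed (event), False if censored.
    Returns a list of (time, survival probability) steps at event times."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e)  # events at time t
        removed = sum(1 for tt, e in data if tt == t)  # all leaving the risk set
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

# Five hypothetical pairs: type changed at 2 and 4 months, others censored
print(kaplan_meier([1, 2, 3, 4, 5], [False, True, False, True, False]))
# [(2, 0.75), (4, 0.375)]
```

    The estimator uses censored pairs correctly: they contribute to the risk set until their follow-up ends, which is what makes this a more informative stability measure than a simple fraction of changed types.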

  18. Quantitative investigations of aggregate systems.

    Science.gov (United States)

    Rai, D K; Beaucage, G; Jonah, E O; Britton, D T; Sukumaran, S; Chopra, S; Gonfa, G Goro; Härting, M

    2012-07-28

    Nanomaterials with disordered, ramified structure are increasingly being used for applications where low cost and enhanced performance are desired. A particular example is the use in printed electronics of inorganic conducting and semiconducting nanoparticles. The electrical, as well as other physical properties depend on the arrangement and connectivity of the particles in such aggregate systems. Quantification of aggregate structure and development of structure/property relationships is difficult and progress in the application of these materials in electronics has mainly been empirical. In this paper, a scaling model is used to parameterize the structure of printed electronic layers. This model has chiefly been applied to polymers but surprisingly it shows applicability to these nanolayers. Disordered structures of silicon nanoparticles forming aggregates are investigated using small angle x-ray scattering coupled with the scaling model. It is expected that predictions using these structural parameters can be made for electrical properties. The approach may have wide use in understanding and designing nano-aggregates for electronic devices.

  19. Quantitative representation of fiber- and sheet-texture in metals of the cubic system. I. Computer programs for texture analysis. [TXFIB and TXSHT codes]

    Energy Technology Data Exchange (ETDEWEB)

    Kim, H.J.; Kim, S.C.; Chun, B.C. (Korea Advanced Energy Research Inst., Seoul (Republic of Korea)); Lee, C.Y. (Rutgers--the State Univ., New Brunswick, NJ (USA). Dept. of Physics)

    1983-05-01

    This is the first article of a series dealing with studies on the quantitative representation of fiber- and sheet-type textures in metals of the cubic crystal system. Texture measurements by the neutron diffraction method are analyzed using Bunge's series expansion method, and the effect of series truncation is studied for samples of various texture sharpness. The present article describes two computer programs, TXFIB and TXSHT, developed for the analysis of fiber- and sheet-type textures, respectively. Using these computer programs, the orientation distribution function can be expanded in a series of generalized spherical harmonics up to the 58th term from 6 experimental pole figures as input. Estimations of the various errors involved in the texture analysis and of the texture sharpness index are also included in the programs.

  20. Recent trends in social systems quantitative theories and quantitative models

    CERN Document Server

    Hošková-Mayerová, Šárka; Soitu, Daniela-Tatiana; Kacprzyk, Janusz

    2017-01-01

    The papers collected in this volume focus on new perspectives on individuals, society, and science, specifically in the field of socio-economic systems. The book is the result of a scientific collaboration among experts from “Alexandru Ioan Cuza” University of Iaşi (Romania), “G. d’Annunzio” University of Chieti-Pescara (Italy), "University of Defence" of Brno (Czech Republic), and "Pablo de Olavide" University of Sevilla (Spain). The heterogeneity of the contributions presented in this volume reflects the variety and complexity of social phenomena. The book is divided into four sections as follows. The first Section deals with recent trends in social decisions. Specifically, it aims to understand which are the driving forces of social decisions. The second Section focuses on the social and public sphere. Indeed, it is oriented on recent developments in social systems and control. Trends in quantitative theories and models are described in Section 3, where many new formal, mathematical-statistical to...

  1. Quantitative analysis of protein turnover in plants.

    Science.gov (United States)

    Nelson, Clark J; Li, Lei; Millar, A Harvey

    2014-03-01

    Proteins are constantly being synthesised and degraded as plant cells age and as plants grow, develop and adapt the proteome. Given that plants develop through a series of events from germination to fruiting and even undertake whole organ senescence, an understanding of protein turnover as a fundamental part of this process in plants is essential. Both synthesis and degradation processes are spatially separated in a cell across its compartmented structure. The majority of protein synthesis occurs in the cytosol, while synthesis of specific components occurs inside plastids and mitochondria. Degradation of proteins occurs in both the cytosol, through the action of the plant proteasome, and in organelles and lytic structures through different protease classes. Tracking the specific synthesis and degradation rate of individual proteins can be undertaken using stable isotope feeding and the ability of peptide MS to track labelled peptide fractions over time. Mathematical modelling can be used to follow the isotope signature of newly synthesised protein as it accumulates and natural abundance proteins as they are lost through degradation. Different technical and biological constraints govern the potential for the use of ¹³C, ¹⁵N, ²H and ¹⁸O for these experiments in complete labelling and partial labelling strategies. Future development of quantitative protein turnover analysis will involve analysis of protein populations in complexes and subcellular compartments, assessing the effect of PTMs and integrating turnover studies into wider system biology study of plants.

  2. Evolutionary quantitative genetics of nonlinear developmental systems.

    Science.gov (United States)

    Morrissey, Michael B

    2015-08-01

    In quantitative genetics, the effects of developmental relationships among traits on microevolution are generally represented by the contribution of pleiotropy to additive genetic covariances. Pleiotropic additive genetic covariances arise only from the average effects of alleles on multiple traits, and therefore the evolutionary importance of nonlinearities in development is generally neglected in quantitative genetic views on evolution. However, nonlinearities in relationships among traits at the level of whole organisms are undeniably important to biology in general, and therefore critical to understanding evolution. I outline a system for characterizing key quantitative parameters in nonlinear developmental systems, which yields expressions for quantities such as trait means and phenotypic and genetic covariance matrices. I then develop a system for quantitative prediction of evolution in nonlinear developmental systems. I apply the system to generating a new hypothesis for why direct stabilizing selection is rarely observed. Other uses will include separation of purely correlative from direct and indirect causal effects in studying mechanisms of selection, generation of predictions of medium-term evolutionary trajectories rather than immediate predictions of evolutionary change over single generation time-steps, and the development of efficient and biologically motivated models for separating additive from epistatic genetic variances and covariances.

  3. Quantitative MRI analysis of dynamic enhancement of focal liver lesions

    Directory of Open Access Journals (Sweden)

    S. S. Bagnenko

    2012-01-01

    In our study, 45 patients with different focal liver lesions (110 nodules) were examined using a high-field MR system (1.5 T). During this investigation, quantitative MRI analysis of the dynamic enhancement of various hepatic lesions and parenchymatous abdominal organs was performed. It was shown that quantitative evaluation of enhanced MRI improves understanding of the vascular transformation processes in pathologic hepatic foci and in the liver itself, which is important for the differential diagnosis of these diseases.

  4. Christhin: Quantitative Analysis of Thin Layer Chromatography

    CERN Document Server

    Barchiesi, Maximiliano; Renaudo, Carlos; Rossi, Pablo; Pramparo, María de Carmen; Nepote, Valeria; Grosso, Nelson Ruben; Gayol, María Fernanda

    2012-01-01

    Manual for Christhin 0.1.36. Christhin (Chromatography Riser Thin) is software developed for the quantitative analysis of data obtained from thin-layer chromatography (TLC). Once installed on your computer, the program is very easy to use and provides data quickly and accurately. This manual describes the program; reading it should be enough to use the software properly.

  5. Quantitative texture analysis of electrodeposited line patterns

    DEFF Research Database (Denmark)

    Pantleon, Karen; Somers, Marcel A.J.

    2005-01-01

    Free-standing line patterns of Cu and Ni were manufactured by electrochemical deposition into lithographically prepared patterns. Electrodeposition was carried out on top of a highly oriented Au-layer physically vapor deposited on glass. Quantitative texture analysis carried out by means of x...

  7. A quantitative approach to scar analysis.

    Science.gov (United States)

    Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia

    2011-02-01

    Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology.
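
    The fractal dimension of a binarized image is typically estimated by box counting: count the boxes occupied by foreground pixels at several scales and fit the slope of log N(s) against log(1/s). The sketch below is a generic illustration of that idea, not the authors' confocal pipeline; the test pattern is synthetic.

```python
import math

def box_count_dimension(pixels, box_sizes):
    """Estimate the box-counting fractal dimension of a binary image.
    pixels    = set of (x, y) foreground pixel coordinates
    box_sizes = box side lengths to test
    Returns the least-squares slope of log N(s) versus log(1/s)."""
    xs, ys = [], []
    for s in box_sizes:
        # Number of s-by-s boxes containing at least one foreground pixel
        boxes = {(x // s, y // s) for x, y in pixels}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# A completely filled 64x64 patch behaves as a 2-dimensional object
filled = {(x, y) for x in range(64) for y in range(64)}
print(round(box_count_dimension(filled, [1, 2, 4, 8, 16, 32]), 3))  # 2.0
```

    Scar versus unwounded tissue would then be compared by the fitted dimension (and by lacunarity, which measures the gappiness of the occupied boxes at each scale).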

  8. Quantitative system validation in model driven design

    DEFF Research Database (Denmark)

    Hermanns, Hilger; Larsen, Kim Guldstrand; Raskin, Jean-Francois;

    2010-01-01

    The European STREP project Quasimodo1 develops theory, techniques and tool components for handling quantitative constraints in model-driven development of real-time embedded systems, covering in particular real-time, hybrid and stochastic aspects. This tutorial highlights the advances made, focus...

  9. New Quantitative Study for Dissertations Repository System

    CERN Document Server

    Alshammari, Fahad H; Zaidan, M A; Hmood, Ali K; Zaidan, B B; Zaidan, A A

    2010-01-01

    In the age of technology, information communication technology has become very important, especially in the field of education. Students must be allowed to learn anytime, anywhere, and at their own pace. The library facilities of universities should be developed accordingly. In this paper we present a new quantitative study for a dissertations repository system and also recommend future applications of the approach.

  10. Electric Field Quantitative Measurement System and Method

    Science.gov (United States)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
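
    The measurement principle described above reduces to a finite-difference estimate of the field from pairwise voltage differences. A minimal sketch, assuming a one-dimensional antenna array and ideal voltmeter readings (positions and voltages here are hypothetical):

```python
def field_estimates(positions, voltages):
    """Approximate the electric field component along a 1-D antenna array
    as E ≈ -ΔV / Δx between adjacent antennas (one estimate per gap)."""
    return [-(voltages[i + 1] - voltages[i]) / (positions[i + 1] - positions[i])
            for i in range(len(positions) - 1)]

# A uniform field E0 = 50 V/m gives the potential V(x) = -50 * x
xs = [0.0, 0.5, 1.0, 1.5]          # antenna positions in metres
vs = [-50.0 * x for x in xs]       # ideal measured voltages
print(field_estimates(xs, vs))     # [50.0, 50.0, 50.0]
```

    Extending the array in more dimensions gives one field component per array axis, which is how the region-wide quantitative description in the abstract would be assembled.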

  11. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids

    Directory of Open Access Journals (Sweden)

    Wilson Zoe A

    2008-06-01

    Background: Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current real-time PCR methods is the high cost of the labeled probe required for each target molecule. Results: We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair with an attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion: The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP technique can be successfully applied to nucleic acid analysis, and the developed AUDP real-time PCR technique offers an alternative way to perform nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.

  12. The Stability of Currency Systems in East Asia --Quantitative Analysis Using a Multi-Country Macro-Econometric Model--

    OpenAIRE

    Koichiro Kamada

    2009-01-01

    The purpose of this paper is to examine the stability of East Asian financial and currency systems, using the multi-country macro-econometric model constructed by Kamada and Takagawa (2005) to depict economic interdependence in the Asia-Pacific region. The highly developed international production network in the East Asian region was not only the driving force behind the "East Asian miracle" but also, as seen in the "Asian currency crisis," worked as a platform whereby local e...

  13. Quantitative ion beam analysis of M-C-O systems: application to an oxidized uranium carbide sample

    Science.gov (United States)

    Martin, G.; Raveu, G.; Garcia, P.; Carlot, G.; Khodja, H.; Vickridge, I.; Barthe, M. F.; Sauvage, T.

    2014-04-01

    A large variety of materials contain both carbon and oxygen atoms, in particular oxidized carbides, carbon alloys (e.g., ZrC, UC, steels), and oxycarbide compounds (e.g., SiCO glasses, TiCO). Here a new ion beam analysis methodology is described which enables quantification of the elemental composition and the oxygen concentration profile over a few microns. It is based on two procedures. The first, relating to the experimental configuration, relies on a specific detection setup which is original in that it enables the separation of the carbon and oxygen NRA signals. The second concerns the data analysis procedure, i.e., the method for deriving the elemental composition from the particle energy spectrum. It is a generic algorithm and is here successfully applied to characterize an oxidized uranium carbide sample, developed as a potential fuel for Generation IV nuclear reactors. Furthermore, a micro-beam was used to simultaneously determine the local elemental composition and oxygen concentration profiles over the first microns below the sample surface. This method is adapted to the determination of the composition of MxCyOz compounds with a sensitivity on elemental atomic concentrations of around 1000 ppm.

  14. Quantitative image analysis of celiac disease.

    Science.gov (United States)

    Ciaccio, Edward J; Bhagat, Govind; Lewis, Suzanne K; Green, Peter H

    2015-03-07

    We outline the use of quantitative techniques that are currently used for analysis of celiac disease. Image processing techniques can be useful to statistically analyze the pixular data of endoscopic images that is acquired with standard or videocapsule endoscopy. It is shown how current techniques have evolved to become more useful for gastroenterologists who seek to understand celiac disease and to screen for it in suspected patients. New directions for focus in the development of methodology for diagnosis and treatment of this disease are suggested. It is evident that there are yet broad areas where there is potential to expand the use of quantitative techniques for improved analysis in suspected or known celiac disease patients.

  16. Quantitative and phenotypic analysis of bone marrow-derived cells in the intact and inflamed central nervous system.

    Science.gov (United States)

    Short, Martin A; Campanale, Naomi; Litwak, Sara; Bernard, Claude C A

    2011-01-01

    Bone marrow has been proposed as a possible source of cells capable of replacing injured neural cells in diseases such as multiple sclerosis (MS). Previous studies have reported conflicting results regarding the transformation of bone marrow cells into neural cells in vivo. This study is a detailed analysis of the fate of bone marrow-derived cells (BMDC) in the CNS of C57Bl/6 mice, with and without experimental autoimmune encephalomyelitis, using flow cytometry to identify GFP-labeled BMDC that lacked the pan-hematopoietic marker CD45 and co-expressed the neural markers polysialic acid-neural cell adhesion molecule or A2B5. A small number of BMDC displaying neural markers and lacking CD45 expression was identified within both the non-inflamed and inflamed CNS. However, the majority of BMDC exhibited a hematopoietic phenotype.

  17. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    Science.gov (United States)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios to which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
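
    A toy version of such a prioritization might combine the three metrics as follows: rank by a risk score (severity × likelihood), breaking ties in favor of scenarios that are easier to model. The scoring scheme and hazard names below are hypothetical and not taken from the paper.

```python
def prioritize(hazards):
    """Rank hazard scenarios for quantitative analysis.
    Each hazard is (name, severity, likelihood, modeling_difficulty),
    with ordinal scores: higher severity/likelihood means more risk,
    higher difficulty means harder to model quantitatively.
    Highest risk first; lower modeling difficulty breaks ties."""
    return sorted(hazards, key=lambda h: (-(h[1] * h[2]), h[3]))

hazards = [
    ("wake encounter on approach", 4, 3, 2),
    ("runway incursion",           5, 2, 4),
    ("bird strike",                3, 3, 1),
]
for name, *_ in prioritize(hazards):
    print(name)
# wake encounter on approach
# runway incursion
# bird strike
```

    In practice the severity and likelihood categories would come from the qualitative hazard analysis itself, and "modeling difficulty" from expert judgment about data availability and model complexity.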

  19. Quantitative analysis of α-L-iduronidase expression in immunocompetent mice treated with the Sleeping Beauty transposon system.

    Science.gov (United States)

    Aronovich, Elena L; Hall, Bryan C; Bell, Jason B; McIvor, R Scott; Hackett, Perry B

    2013-01-01

    The Sleeping Beauty transposon system, a non-viral, integrating vector that can deliver the alpha-L-iduronidase-encoding gene, is efficient in correcting mucopolysaccharidosis type I in NOD/SCID mice. However, in previous studies we failed to attain reliable long-term alpha-L-iduronidase expression in immunocompetent mice. Here, we focused on achieving sustained high-level expression in immunocompetent C57BL/6 mice. In our standard liver-directed treatment we hydrodynamically infuse mice with plasmids containing a SB transposon-encoding human alpha-L-iduronidase, along with a source of SB transposase. We sought to 1) minimize expression of the therapeutic enzyme in antigen-presenting cells, while avoiding promoter shutdown and gender bias, 2) increase transposition efficiency and 3) improve immunosuppression. By using a liver-specific promoter to drive IDUA expression, the SB100X hyperactive transposase and transient cyclophosphamide immunosuppression we achieved therapeutic-level (>100% wild-type) stabilized expression for 1 year in 50% of C57BL/6 mice. To gain insights into the causes of variability in transgene expression, we quantified the rates of alpha-L-iduronidase activity decay vis-a-vis transposition and transgene maintenance using the data obtained in this and previous studies. Our analyses showed that immune responses are the most important variable to control in order to prevent loss of transgene expression. Cumulatively, our results allow transition to pre-clinical studies of SB-mediated alpha-L-iduronidase expression and correction of mucopolysaccharidosis type I in animal models.

  20. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.
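
The LOD score at the center of this analysis is simply the base-10 logarithm of a likelihood ratio between the QTL model and the null model. A minimal sketch of that underlying quantity (not the paper's influence-function machinery):

```python
import math

def lod_score(log_lik_qtl: float, log_lik_null: float) -> float:
    """LOD score: base-10 log of the likelihood ratio between the QTL
    model and the null (no-QTL) model, given natural-log likelihoods."""
    return (log_lik_qtl - log_lik_null) / math.log(10)

# A likelihood ratio of 1000 corresponds to a LOD score of 3,
# a conventional evidence threshold in interval mapping.
print(lod_score(math.log(1000.0), 0.0))
```

Influence diagnostics then ask how much this score (or the whole score curve) shifts when a single individual is removed from the likelihood.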

  1. Insights into deglaciation of the largest ice-free area in the South Shetland Islands (Antarctica) from quantitative analysis of the drainage system

    Science.gov (United States)

    Mink, Sandra; López-Martínez, Jerónimo; Maestro, Adolfo; Garrote, Julio; Ortega, José A.; Serrano, Enrique; Durán, Juan José; Schmid, Thomas

    2014-11-01

    A quantitative geomorphic analysis of the drainage system on Byers Peninsula, Livingston Island, has been carried out in order to study the relief evolution, glacial history and possible neotectonic influence on the largest ice-free area of the South Shetlands archipelago. Aerial photographs, SAR data from the RADARSAT-2 satellite, field work, a digital elevation model and GIS spatial analysis have been used to identify, map and study the existing drainage basins. A series of morphometric parameters have been studied in 30 selected basins in order to characterize their shape as well as the drainage network. The morphometric parameters reveal elongation trends in the shape of the basins and a limited hierarchical network, typical of a youthful stage in landscape evolution models. Several morphometric indexes (hypsometric integral, hypsometric curves, SL index, and transverse topographical drainage basin asymmetry, or T-Factor) have been used to study possible controls on drainage development. Results have been discussed in relation to relief and drainage evolution linked to the spatial distribution of lithological units and the structural framework. The T-Factor shows an apparently disorganized pattern and an absence of tectonic influence. However, there are local values of second-order basin asymmetry directions and magnitudes, which could reflect a succession of master rills through time, related to changes in water supply during the deglaciation history of Byers Peninsula. Hypsometric values and curves of the basins are also mainly related to a young stage of landscape evolution. Analysis of the hypsometric integrals together with the T-Factor index has allowed us to establish a possible deglaciation model for Byers Peninsula, which successfully explains the results. Areas at different landscape evolution stages are linked in space and support the hypothesis of local glacial centers during the ice cover retreat process. SL index results do not show the same pattern, which could be
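
The hypsometric integral used in this analysis can be approximated by the elevation-relief ratio, (mean − min) / (max − min); values near or above 0.5 are associated with youthful landscapes and low values with mature, eroded ones. A minimal sketch with illustrative elevations (not the Byers Peninsula data):

```python
def hypsometric_integral(elevations):
    """Elevation-relief ratio approximation of the hypsometric integral:
    HI = (mean - min) / (max - min). High values suggest a youthful
    landscape stage; low values suggest an old, eroded one."""
    lo, hi = min(elevations), max(elevations)
    mean = sum(elevations) / len(elevations)
    return (mean - lo) / (hi - lo)

# Symmetric relief: half the mass above mid-elevation, half below.
print(hypsometric_integral([0, 50, 100]))  # -> 0.5
```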

  2. Semi-quantitative analysis of cytokine mRNA expression induced by the herbal medicine Sho-saiko-to (TJ-9) using a Gel Doc system.

    Science.gov (United States)

    Huang, X X; Yamashiki, M; Nakatani, K; Nobori, T; Mase, A

    2001-01-01

    The RT-PCR method was employed to determine the cytokine mRNA expression of human peripheral lymphocytes induced by the Japanese herbal medicine Sho-saiko-to (TJ-9). The results showed that the mRNA expression of IL-12, IL-1beta, IL-10, TNF-alpha, G-CSF, and IFN-gamma increased after 6 hr in culture. This is the first reported finding that TJ-9 is an IFN-gamma inducer. Next, cytokine mRNA expression was semi-quantitatively measured using the Gel Doc system with a CCD camera and then statistically analyzed in order to determine which component of TJ-9 was the true cytokine inducer. The results showed that the scutellaria root is the main component inducing the cytokines, while the glycyrrhiza root is the secondary component. When the cytokine concentrations in the supernatants of cell cultures were measured by ELISA, the levels of IL-12, IL-1beta, IL-10, TNF-alpha, and G-CSF reflected mRNA expression levels in the cell fraction. However, the level of IFN-gamma was below the detectable limit. The effects of various reagents on many different kinds of cytokine mRNA expression could be analyzed objectively in a short time using the Gel Doc system. Many important findings could be demonstrated by this simple, easy, sensitive, and cheap method. After the clinical significance of cytokine analysis is confirmed, this method may become a useful clinical examination tool.

  3. LC-MS systems for quantitative bioanalysis.

    Science.gov (United States)

    van Dongen, William D; Niessen, Wilfried M A

    2012-10-01

    LC-MS has become the method of choice in small-molecule drug bioanalysis. Triple quadrupole MS is the established bioanalytical technique due to its unprecedented selectivity and sensitivity, but high-resolution accurate-mass MS has recently been gaining ground due to its ability to provide simultaneous quantitative and qualitative analysis of drugs and their metabolites. This article discusses current trends in the field of bioanalytical LC-MS (until September 2012), and provides an overview of currently available commercial triple quadrupole MS and high-resolution LC-MS instruments as applied to the bioanalysis of small-molecule and biopharmaceutical drugs.

  4. Inspection, visualisation and analysis of quantitative proteomics data

    OpenAIRE

    Gatto, Laurent

    2016-01-01

    Material from the Quantitative Proteomics and Data Analysis Course, 4-5 April 2016, Queen Hotel, Chester, UK. Table D: Inspection, visualisation and analysis of quantitative proteomics data, Laurent Gatto (University of Cambridge).

  5. Microgravity validation of a novel system for RNA isolation and multiplex quantitative real time PCR analysis of gene expression on the International Space Station.

    Science.gov (United States)

    Parra, Macarena; Jung, Jimmy; Boone, Travis D; Tran, Luan; Blaber, Elizabeth A; Brown, Mark; Chin, Matthew; Chinn, Tori; Cohen, Jacob; Doebler, Robert; Hoang, Dzung; Hyde, Elizabeth; Lera, Matthew; Luzod, Louie T; Mallinson, Mark; Marcu, Oana; Mohamedaly, Youssef; Ricco, Antonio J; Rubins, Kathleen; Sgarlato, Gregory D; Talavera, Rafael O; Tong, Peter; Uribe, Eddie; Williams, Jeffrey; Wu, Diana; Yousuf, Rukhsana; Richey, Charles S; Schonfeld, Julie; Almeida, Eduardo A C

    2017-01-01

    The International Space Station (ISS) National Laboratory is dedicated to studying the effects of space on life and physical systems, and to developing new science and technologies for space exploration. A key aspect of achieving these goals is to operate the ISS National Lab more like an Earth-based laboratory, conducting complex end-to-end experimentation, not limited to simple microgravity exposure. Toward that end, NASA developed a novel suite of molecular biology laboratory tools, reagents, and methods, named WetLab-2, uniquely designed to operate in microgravity and to process biological samples for real-time gene expression analysis on-orbit. This includes a novel fluidic RNA Sample Preparation Module and fluid transfer devices, all-in-one lyophilized PCR assays, a centrifuge, and a real-time PCR thermal cycler. Here we describe the results from the WetLab-2 validation experiments conducted in microgravity during ISS increment 47/SPX-8. Specifically, quantitative PCR was performed on a concentration series of DNA calibration standards, and reverse transcriptase quantitative PCR was conducted on RNA extracted and purified on-orbit from frozen Escherichia coli and mouse liver tissue. Cycle threshold (Ct) values and PCR efficiencies obtained on-orbit from DNA standards were similar to Earth (1 g) controls. Also, on-orbit multiplex analysis of gene expression from bacterial cells and mammalian tissue RNA samples was successfully conducted in about 3 h, with data transmitted within 2 h of experiment completion. Thermal cycling in microgravity resulted in the trapping of gas bubbles inside septa cap assay tubes, causing small but measurable increases in Ct curve noise and variability. Bubble formation was successfully suppressed in a rapid follow-up on-orbit experiment using standard caps to pressurize PCR tubes and reduce gas release during heating cycles. The WetLab-2 facility now provides a novel operational on-orbit research capability for molecular biology and
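
The PCR efficiencies reported for the on-orbit DNA standards come from the standard-curve relationship between Ct and log10 of template concentration: fit a line, then E = 10^(−1/slope) − 1, with E = 1.0 meaning perfect doubling per cycle. A sketch with an idealized dilution series (illustrative numbers, not the flight data):

```python
def pcr_efficiency(log10_conc, ct_values):
    """Amplification efficiency from a qPCR standard curve:
    least-squares fit of Ct = a*log10(conc) + b, then
    E = 10^(-1/slope) - 1 (E = 1.0 means perfect doubling)."""
    n = len(log10_conc)
    mx = sum(log10_conc) / n
    my = sum(ct_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, ct_values))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    return 10 ** (-1.0 / slope) - 1.0

# Ideal 10-fold dilution series: Ct drops by log2(10) ~ 3.32 per decade.
cts = [30.0 - 3.3219280948873623 * x for x in [0, 1, 2, 3]]
print(round(pcr_efficiency([0, 1, 2, 3], cts), 3))  # -> 1.0
```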

  6. Challenges in Quantitative Abstractions for Collective Adaptive Systems

    Directory of Open Access Journals (Sweden)

    Mirco Tribastone

    2016-07-01

    As with most large-scale systems, the evaluation of quantitative properties of collective adaptive systems is an important issue that crosscuts all development stages, from design (in the case of engineered systems) to runtime monitoring and control. Unfortunately, it is a difficult problem to tackle in general, due to the typically high computational cost involved in the analysis. This calls for the development of appropriate quantitative abstraction techniques that preserve most of the system's dynamical behaviour while using a more compact representation. This paper focuses on models based on ordinary differential equations and reviews recent results where abstraction is achieved by aggregation of variables, reflecting on the shortcomings in the state of the art and setting out challenges for future research.

  7. Quantitative analysis of spirality in elliptical galaxies

    CERN Document Server

    Dojcsak, Levente

    2013-01-01

    We use an automated galaxy morphology analysis method to quantitatively measure the spirality of galaxies classified manually as elliptical. The data set used for the analysis consists of 60,518 galaxy images with redshift obtained by the Sloan Digital Sky Survey (SDSS) and classified manually by Galaxy Zoo, as well as the RC3 and NA10 catalogues. We measure the spirality of the galaxies by using the Ganalyzer method, which transforms the galaxy image to its radial intensity plot to detect galaxy spirality that is in many cases difficult to notice by manual observation of the raw galaxy image. Experimental results using manually classified elliptical and S0 galaxies with redshift <0.3 suggest that galaxies classified manually as elliptical and S0 exhibit a nonzero signal for the spirality. These results suggest that the human eye observing the raw galaxy image might not always be the most effective way of detecting spirality and curves in the arms of galaxies.

  8. Quantitative laryngeal electromyography: turns and amplitude analysis.

    Science.gov (United States)

    Statham, Melissa McCarty; Rosen, Clark A; Nandedkar, Sanjeev D; Munin, Michael C

    2010-10-01

    Laryngeal electromyography (LEMG) is primarily a qualitative examination, with no standardized approach to interpretation. The objectives of our study were to establish quantitative norms for motor unit recruitment in controls and to compare them with interference pattern analysis in patients with unilateral vocal fold paralysis (VFP). Retrospective case-control study. We performed LEMG of the thyroarytenoid-lateral cricoarytenoid muscle complex (TA-LCA) in 21 controls and 16 patients with unilateral VFP. Our standardized protocol used a concentric needle electrode with subjects performing variable-force TA-LCA contraction. To quantify the interference pattern density, we measured turns and mean amplitude per turn for ≥10 epochs (each 500 milliseconds). Logarithmic regression analysis between amplitude and turns was used to calculate slope and intercept. The standard deviation was calculated to further define the confidence interval, enabling generation of a linear-scale graphical "cloud" of activity containing ≥90% of data points for controls and patients. Median age of controls and patients was similar (50.7 vs. 48.5 years). In controls, TA-LCA amplitude with variable contraction ranged from 145-1112 μV, and regression analysis comparing mean amplitude per turn to root-mean-square amplitude demonstrated high correlation (R = 0.82). In controls performing variable contraction, median turns per second was significantly higher compared to patients (450 vs. 290, P = .002). We present the first interference pattern analysis of the TA-LCA in healthy adults and patients with unilateral VFP. Our findings indicate that motor unit recruitment can be quantitatively measured within the TA-LCA. Additionally, patients with unilateral VFP had significantly reduced turns when compared with controls.
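
Turns-and-amplitude analysis of the kind described above counts direction reversals in the EMG trace whose amplitude change since the previous turn exceeds a threshold (conventionally 100 µV) and averages the amplitude between successive turns. A simplified sketch of that counting logic (the exact criteria implemented by clinical EMG systems may differ):

```python
def turns_analysis(signal_uv, threshold_uv=100.0):
    """Count 'turns' (direction reversals with an amplitude change of at
    least `threshold_uv` microvolts since the previous turn) and return
    (number_of_turns, mean_amplitude_per_turn)."""
    turns = 0
    amplitudes = []
    last_turn_value = signal_uv[0]
    direction = 0  # +1 rising, -1 falling, 0 unknown
    for prev, cur in zip(signal_uv, signal_uv[1:]):
        step = cur - prev
        if step == 0:
            continue
        new_dir = 1 if step > 0 else -1
        # A turn occurs where the slope reverses and the swing since the
        # last counted turn is large enough to be counted.
        if (direction != 0 and new_dir != direction
                and abs(prev - last_turn_value) >= threshold_uv):
            turns += 1
            amplitudes.append(abs(prev - last_turn_value))
            last_turn_value = prev
        direction = new_dir
    mean_amp = sum(amplitudes) / len(amplitudes) if amplitudes else 0.0
    return turns, mean_amp

# Toy trace in microvolts: three large reversals, one sub-threshold dip.
n, amp = turns_analysis([0, 200, -150, 300, 290])
print(n, round(amp, 1))  # -> 3 333.3
```

Dividing the turn count by the epoch duration gives the turns-per-second figure compared between controls and VFP patients in the abstract.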

  9. Imbalanced Development of the Chinese Compulsory Education System: A Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    刘黎明; 刘博宇; 黄恒君

    2014-01-01

    The imbalanced development of compulsory education in China has caused widespread concern from both government and society. This paper addresses issues concerning this imbalanced development and quantifies the level of imbalance. It introduces the novel concept of a quantified minimal support standard for the compulsory education system, and builds a model of the minimal support standard in different geographic areas of China that takes regional economic disparities into consideration. The model is then used to estimate the net government investment required to achieve the minimal support standard. Its results are further analyzed in a comparison of the 31 provincial-level administrative divisions from the perspective of the investment each division requires. This unbiased, quantitative analysis clarifies the present state of compulsory education development in China and provides the central and local governments with a theoretical and quantitative tool for estimating spending and optimizing the distribution of compulsory education funds.

  10. Evaluation of the extent of ground-glass opacity on high-resolution CT in patients with interstitial pneumonia associated with systemic sclerosis: comparison between quantitative and qualitative analysis.

    Science.gov (United States)

    Yabuuchi, H; Matsuo, Y; Tsukamoto, H; Horiuchi, T; Sunami, S; Kamitani, T; Jinnouchi, M; Nagao, M; Akashi, K; Honda, H

    2014-07-01

    To verify whether quantitative analysis of the extent of ground-glass opacity (GGO) on high-resolution computed tomography (HRCT) could show a stronger correlation with the therapeutic response of interstitial pneumonia (IP) associated with systemic sclerosis (SSc) compared with qualitative analysis. Seventeen patients with IP associated with SSc received autologous peripheral blood stem cell transplantation (auto-PBSCT) and were followed up using HRCT and pulmonary function tests. Two thoracic radiologists assessed the extent of GGO on HRCT using a workstation. Therapeutic effect was assessed using the change of vital capacity (VC) and diffusing capacity of the lung for carbon monoxide (DLco) before and 12 months after PBSCT. Interobserver agreement was assessed using Spearman's rank correlation coefficient and the Bland-Altman method. Correlation with the therapeutic response between quantitative and qualitative analysis was assessed with Pearson's correlation coefficients. Spearman's rank correlation coefficient showed good agreement, but Bland-Altman plots showed that proportional error could be suspected. Quantitative analysis showed stronger correlation than the qualitative analysis based on the relationships between the change in extent of GGO and VC, and change in extent of GGO and DLco. Quantitative analysis of the change in extent of GGO showed stronger correlation with the therapeutic response of IP with SSc after auto-PBSCT than with the qualitative analysis. Copyright © 2014 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
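
The Bland-Altman method used here to check interobserver agreement reduces to the mean of the paired differences (the bias) and its 95% limits of agreement. A minimal sketch with made-up paired readings:

```python
def bland_altman(a, b):
    """Bland-Altman agreement statistics for paired measurements from
    two observers: mean difference (bias) and the 95% limits of
    agreement, bias +/- 1.96 * SD of the differences."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative GGO-extent readings (%) from two observers.
bias, lo, hi = bland_altman([10, 12, 14, 16], [9, 12, 15, 15])
print(round(bias, 2))  # -> 0.25
```

A proportional error of the kind the authors suspected would show up in this plot as differences that grow with the mean of the two readings.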

  11. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods, it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture-level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  12. Quantitative Analysis in Nuclear Medicine Imaging

    CERN Document Server

    2006-01-01

    This book provides a review of image analysis techniques as they are applied in the field of diagnostic and therapeutic nuclear medicine. Driven in part by the remarkable increase in computing power and its ready and inexpensive availability, this is a relatively new yet rapidly expanding field. Likewise, although the use of radionuclides for diagnosis and therapy has origins dating back almost to the discovery of natural radioactivity itself, radionuclide therapy and, in particular, targeted radionuclide therapy has only recently emerged as a promising approach for therapy of cancer and, to a lesser extent, other diseases. An effort has, therefore, been made to place the reviews provided in this book in a broader context. The effort to do this is reflected by the inclusion of introductory chapters that address basic principles of nuclear medicine imaging, followed by an overview of issues that are closely related to quantitative nuclear imaging and its potential role in diagnostic and therapeutic applications. ...

  13. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...

  14. Quantitative multivariate analysis of dynamic multicellular morphogenic trajectories.

    Science.gov (United States)

    White, Douglas E; Sylvester, Jonathan B; Levario, Thomas J; Lu, Hang; Streelman, J Todd; McDevitt, Todd C; Kemp, Melissa L

    2015-07-01

    Interrogating the fundamental cell biology principles that govern tissue morphogenesis is critical to a better understanding of developmental biology and to engineering novel multicellular systems. Recently, functional micro-tissues derived from pluripotent embryonic stem cell (ESC) aggregates have provided novel platforms for experimental investigation; however, elucidating the factors directing emergent spatial phenotypic patterns remains a significant challenge. Computational modelling techniques offer a unique complementary approach to probe mechanisms regulating morphogenic processes and provide a wealth of spatio-temporal data, but quantitative analysis of simulations and comparison to experimental data is extremely difficult. Quantitative descriptions of spatial phenomena across multiple systems and scales would enable unprecedented comparisons of computational simulations with experimental systems, thereby leveraging the inherent power of computational methods to interrogate the mechanisms governing emergent properties of multicellular biology. To address these challenges, we developed a portable pattern recognition pipeline consisting of: the conversion of cellular images into networks, the extraction of novel features via network analysis, and the generation of morphogenic trajectories. This novel methodology enabled the quantitative description of morphogenic pattern trajectories that could be compared across diverse systems: computational modelling of multicellular structures, differentiation of stem cell aggregates, and gastrulation of cichlid fish. Moreover, this method identified novel spatio-temporal features associated with different stages of embryo gastrulation, and elucidated a complex paracrine mechanism capable of explaining spatio-temporal pattern kinetic differences in ESC aggregates of different sizes.

  15. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
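
In the single-cell studies this work extends, the interaction score is commonly defined against a multiplicative null model: ε = W_ab − W_a · W_b, with phenotype values normalized to wild type. A sketch of that baseline definition (the paper's own statistics for developmental traits are more involved):

```python
def epistasis_score(w_ab, w_a, w_b, w_wt=1.0):
    """Multiplicative-model epistasis score used in quantitative
    genetic-interaction mapping: eps = W_ab - W_a * W_b, with
    phenotype/fitness values normalized to the wild type. eps < 0
    suggests an aggravating interaction, eps > 0 an alleviating one,
    and eps ~ 0 no interaction."""
    wa, wb, wab = w_a / w_wt, w_b / w_wt, w_ab / w_wt
    return wab - wa * wb

# Double mutant matches the multiplicative expectation: no interaction.
print(epistasis_score(0.9, 0.9, 1.0))  # -> 0.0
```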

  16. A quantitative assessment of LCOs for operations using system dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Kyung, Min Kang [Department of Nuclear Engineering, Hanyang University, 17 Haengdang-Dong, Sungdong-Gu, Seoul 133-791 (Korea, Republic of); Jae, Moosung [Department of Nuclear Engineering, Hanyang University, 17 Haengdang-Dong, Sungdong-Gu, Seoul 133-791 (Korea, Republic of)]. E-mail: jae@hanyang.ac.kr

    2005-02-01

    Limiting conditions for operations (LCOs) define the allowed outage times (AOTs) and the actions to be taken if the repair cannot be completed within the AOT; typically, plant shutdown is required. In situations where the risk associated with the action, i.e. the risk of plant shutdown given a failure of the safety system, may be substantial, a strategy is needed to control the plant risk. In this study the changing operation modes are evaluated quantitatively and dynamically using the tool of system dynamics, which has been developed to analyze the dynamic reliability of complicated systems. System dynamics, implemented with the Vensim software, has been applied to LCO assessment for an example system, the auxiliary feed water system of a reference nuclear power plant. Analysis results for full-power operation and shutdown operation have been compared using the increase in core damage frequency as the risk measure. The time-dependent framework developed in this study has been shown to be very flexible in that it can be applied to assess LCOs quantitatively under any operational context of the Technical Specifications in the Final Safety Analysis Report of the reference plant.
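
A common quantitative measure in such AOT evaluations is the incremental conditional core-damage probability: the core damage frequency increase while the component is out of service, integrated over the outage duration. A sketch with hypothetical values (not the reference plant's numbers):

```python
def iccdp(cdf_conditional_per_yr, cdf_baseline_per_yr, aot_hours):
    """Incremental conditional core-damage probability for an allowed
    outage time (AOT): the CDF increase with the component out of
    service, integrated over the AOT duration."""
    hours_per_year = 8760.0
    delta_cdf = cdf_conditional_per_yr - cdf_baseline_per_yr
    return delta_cdf * aot_hours / hours_per_year

# Hypothetical: baseline 1e-5/yr, conditional 5e-5/yr, 72-hour AOT.
print(f"{iccdp(5e-5, 1e-5, 72.0):.2e}")  # -> 3.29e-07
```

Comparing this quantity against the risk of the shutdown transition itself is what motivates the risk-informed AOT strategies discussed in the abstract.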

  17. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. Actually, the author attacks historic and current HRA as having failed in informing policy makers who make decisions based on risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given subjective and empirical probabilities.

  18. Quantitative risk analysis preoperational of gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Manfredi, Carlos; Bispo, Gustavo G.; Esteves, Alvaro [Gie S.A., Buenos Aires (Argentina)

    2009-07-01

    The purpose of this analysis is to predict how the operation of a gas pipeline can affect individual risk and the general security of the public. If the individual or societal risks are found to be intolerable compared with international standards, mitigation measures are recommended to reduce the risk associated with the operation to levels compatible with the best practices in the industry. Quantitative risk analysis calculates the probability of occurrence of an event based on its frequency of occurrence, and it requires a complex mathematical treatment. The present work aims to develop a calculation methodology based on the previously mentioned publication. This methodology centers on defining the frequencies of occurrence of events according to a database representative of each case under study. It also establishes the consequences according to the characteristics of each area and the different possible interferences with the gas pipeline under study. For each interference, a typical curve of ignition probabilities is developed as a function of the distance to the pipe. (author)
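
The individual-risk calculation described here multiplies an event frequency by an ignition probability that depends on distance to the pipe and by the probability of fatality given ignition. A sketch with an illustrative, made-up ignition-probability curve (not a published model):

```python
import math

def ignition_probability(distance_m, p0=0.5, decay_m=50.0):
    """Hypothetical ignition-probability curve decaying with distance
    from the pipeline. Illustrative shape only; real QRA studies fit
    such curves per interference type."""
    return p0 * math.exp(-distance_m / decay_m)

def individual_risk(failure_freq_per_km_yr, length_km, distance_m,
                    p_fatality=1.0):
    """Individual risk per year at a given distance:
    event frequency x P(ignition) x P(fatality | ignition)."""
    freq = failure_freq_per_km_yr * length_km
    return freq * ignition_probability(distance_m) * p_fatality

# Hypothetical: 5e-4 failures/km-yr over 1 km, receptor 50 m away.
print(f"{individual_risk(5e-4, 1.0, 50.0):.2e}")
```

Such per-distance values are then compared against individual-risk tolerability criteria from international standards.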

  19. Evaluation of Laser Therapy on Nevus of Ota Using Computer-aided Quantitative System Analysis

    Institute of Scientific and Technical Information of China (English)

    齐向东; 马立敏; 钟世镇

    2011-01-01

    Objective: To investigate the treatment effects of laser therapy on nevus of Ota using a computer-aided quantitative analysis system. Methods: Twelve patients with nevus of Ota were evaluated both by the Angel color analysis system and, in a double-blinded fashion, by three experienced plastic surgeons. Results: The treatment effects were (54 ± 25.33)% as analyzed by the quantitative system and (57.92 ± 25.82)% as evaluated by the plastic surgeons. The quantitative system analysis could accurately calculate the treated area and the percentage of color change of the nevus of Ota. The subjective clinical grades correlated well with the treatment effects obtained by the proposed color analysis system (P < 0.001). Conclusions: The computer-aided quantitative analysis system can replace the clinicians' subjective evaluation, providing an objective, accurate, and quantitative assessment. The two parameters (grey scale and area) are valuable quantitative indexes for evaluating treatment effects.

  20. Towards a quantitative OCT image analysis.

    Directory of Open Access Journals (Sweden)

    Marina Garcia Garrido

    Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Spectral-domain optical coherence tomography (OCT), confocal scanning-laser ophthalmoscopy (SLO), and fluorescein angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpallidus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. the visual streak in gerbils, the fovea in non-human primates) with the respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions. Our results highlight the
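
A reflectivity profile of the kind extracted here with ImageJ is essentially the mean grey value at each depth of the B-scan, averaged laterally across A-scans. A minimal sketch on a toy 8-bit image, with rows standing in for depths (illustrative values, not real OCT data):

```python
def reflectivity_profile(image):
    """Depth reflectivity profile from an 8-bit greyscale OCT B-scan,
    given as a list of rows (lists of pixel values 0-255): the mean
    intensity of each row, i.e. a lateral average across A-scans."""
    return [sum(row) / len(row) for row in image]

# Toy B-scan: a dark band, a bright (plexiform-like) band, a dimmer band.
scan = [
    [10, 12, 14],
    [200, 210, 190],
    [60, 64, 62],
]
print(reflectivity_profile(scan))  # -> [12.0, 200.0, 62.0]
```

Peaks and troughs in such a profile can then be matched against histomorphometric layer boundaries, as done in the cross-species comparison above.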

  1. Quantitative Proteomic Approaches for Analysis of Protein S-Nitrosylation.

    Science.gov (United States)

    Qu, Zhe; Greenlief, C Michael; Gu, Zezong

    2016-01-01

    S-Nitrosylation is a redox-based post-translational modification of a protein in response to nitric oxide (NO) signaling, and it participates in a variety of processes in diverse biological systems. The significance of this type of protein modification in health and diseases is increasingly recognized. In the central nervous system, aberrant S-nitrosylation, due to excessive NO production, is known to cause protein misfolding, mitochondrial dysfunction, transcriptional dysregulation, and neuronal death. This leads to an altered physiological state and consequently contributes to pathogenesis of neurodegenerative disorders. To date, much effort has been made to understand the mechanisms underlying protein S-nitrosylation, and several approaches have been developed to unveil S-nitrosylated proteins from different organisms. Interest in determining the dynamic changes of protein S-nitrosylation under different physiological and pathophysiological conditions has underscored the need for the development of quantitative proteomic approaches. Currently, both gel-based and gel-free mass spectrometry-based quantitative methods are widely used, and they each have advantages and disadvantages but may also be used together to produce complementary data. This review evaluates current available quantitative proteomic techniques for the analysis of protein S-nitrosylation and highlights recent advances, with emphasis on applications in neurodegenerative diseases. An important goal is to provide a comprehensive guide of feasible quantitative proteomic methodologies for examining protein S-nitrosylation in research to yield insights into disease mechanisms, diagnostic biomarkers, and drug discovery.

  2. Study on an Evaluation Method for Information System Projects from Qualitative to Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    徐维祥; 张全寿

    2001-01-01

    According to the theory of meta-synthesis methodology, which proceeds from qualitative to quantitative analysis, and the Wuli-Shili-Renli methodology, the necessity and special character of evaluating information system projects are discussed. A new index-system model is established, and an integrated DHGF method (Delphi, Analytic Hierarchy Process, Grey incidence degree, and Fuzzy comprehensive evaluation) is proposed. The relation between qualitative and quantitative indicators is also examined.
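The final fuzzy-comprehensive-evaluation step of the proposed DHGF chain can be sketched as a criteria weight vector (e.g. from the AHP step) composed with a membership matrix (e.g. from Delphi expert scoring). The dimensions, weights, and membership values below are illustrative assumptions, not figures from the paper.

```python
import numpy as np

# Hypothetical: 3 evaluation criteria, 4 verdict grades (excellent .. poor).
# w: criteria weights, e.g. produced by the AHP step (sums to 1).
w = np.array([0.5, 0.3, 0.2])

# R[i, j]: degree to which criterion i supports grade j,
# e.g. the fraction of Delphi experts assigning that grade.
R = np.array([
    [0.6, 0.3, 0.1, 0.0],
    [0.2, 0.5, 0.2, 0.1],
    [0.1, 0.3, 0.4, 0.2],
])

# Weighted-average composition operator M(·, +): b = w · R.
b = w @ R
b /= b.sum()             # normalise the verdict vector

verdict = int(np.argmax(b))  # index of the winning grade
print(b.round(3), verdict)
```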

  3. Quantitative impact of hydrothermal alteration on electrical resistivity in geothermal systems from a joint analysis of laboratory measurements and borehole data in Krafla area, N-E Iceland

    Science.gov (United States)

    Lévy, Léa; Páll Hersir, Gylfi; Flóvenz, Ólafur; Gibert, Benoit; Pézard, Philippe; Sigmundsson, Freysteinn; Briole, Pierre

    2016-04-01

    Rock permeability and fluid temperature are the two most decisive factors for a successful geothermal drilling. While these parameters can only be measured by drilling, they may be estimated from their impact on electrical resistivity, which can be imaged from surface soundings, for example by TEM (Transient Electro-Magnetic) measurements down to about one km depth. The electrical conductivity of reservoir rocks is the sum of a volume term, which depends on fluid parameters, and a surface term related to rock alteration. Understanding the link between electrical resistivity and key geothermal parameters requires knowledge of hydrothermal alteration and of its petrophysical signature through the Cation Exchange Capacity (CEC). Fluid-rock interactions related to hydrothermal circulation trigger the precipitation of alteration minerals, which are both witnesses of the temperature at the time of reaction and new paths for the electrical current. Alteration minerals include zeolites, smectites, chlorites, epidotes and amphiboles, among which the low-temperature parageneses are often the most conductive. The CEC of these mineral phases accounts for part of the surface conductivity occurring at the water-rock interface. In cooling geothermal systems, these minerals constitute, in petrophysical terms, a memory of past equilibrium conditions that surface electrical conduction reveals through electrical probing at all scales. The qualitative impact of alteration minerals on the resistivity structure has been studied over the years in the Icelandic geothermal context. In this work, the impact of CEC on pore-surface electrical conductivity is studied quantitatively at the borehole scale, where several types of volcanic rocks with various degrees of alteration and porosity are mixed together. Five boreholes located within a few km of each other at the Krafla volcano, Northeast Iceland, constitute the basis for this study.
The deepest hole and reference for this study, KJ-18, provides rock cuttings and logging data down to 2215
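The volume-plus-surface conductivity decomposition in the abstract above can be illustrated with a simple Archie/Waxman-Smits-style sketch. The functional form and every parameter value here are illustrative assumptions, not Krafla calibrations.

```python
# Sketch of the volume + surface conductivity decomposition (Waxman-Smits
# style). All parameter values are illustrative, not calibrated Krafla data.

def bulk_conductivity(sigma_fluid, porosity, m=2.0, cec=0.0, k_surface=0.05):
    """sigma_fluid in S/m; porosity as a fraction; cec in meq/100 g;
    k_surface lumps cation mobility and surface-site geometry."""
    F = porosity ** -m                        # Archie formation factor
    return sigma_fluid / F + k_surface * cec  # volume term + surface term

fresh = bulk_conductivity(sigma_fluid=1.0, porosity=0.1, cec=1.0)     # little alteration
altered = bulk_conductivity(sigma_fluid=1.0, porosity=0.1, cec=20.0)  # smectite-rich
print(fresh, altered)
```

The point of the decomposition: at high alteration (high CEC), the surface term dominates and bulk resistivity becomes insensitive to the pore fluid.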

  4. Applying Knowledge of Quantitative Design and Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  5. Sexual dimorphism of the electrosensory system: a quantitative analysis of nerve axons in the dorsal anterior lateral line nerve of the blue-spotted Fantail Stingray (Taeniura lymma).

    Science.gov (United States)

    Kempster, R M; Garza-Gisholt, E; Egeberg, C A; Hart, N S; O'Shea, O R; Collin, S P

    2013-01-01

    Quantitative studies of sensory axons provide invaluable insights into the functional significance and relative importance of a particular sensory modality. Despite the important role electroreception plays in the behaviour of elasmobranchs, to date, there have been no studies that have assessed the number of electrosensory axons that project from the peripheral ampullae to the central nervous system (CNS). The complex arrangement and morphology of the peripheral electrosensory system has a significant influence on its function. However, it is not sufficient to base conclusions about function on the peripheral system alone. To fully appreciate the function of the electrosensory system, it is essential to also assess the neural network that connects the peripheral system to the CNS. Using stereological techniques, unbiased estimates of the total number of axons were obtained for both the electrosensory bundles exiting individual ampullary organs and those entering the CNS (via the dorsal root of the anterior lateral line nerve, ALLN) in males and females of different sizes. The dorsal root of the ALLN consists solely of myelinated electrosensory axons and shows both ontogenetic and sexual dimorphism. In particular, females exhibit a greater abundance of electrosensory axons, which may result in improved sensitivity of the electrosensory system and may facilitate mate identification for reproduction. Also presented are detailed morphological data on the peripheral electrosensory system to allow a complete interpretation of the functional significance of the sexual dimorphism found in the ALLN. © 2013 S. Karger AG, Basel.

  6. Quantitative color analysis for capillaroscopy image segmentation.

    Science.gov (United States)

    Goffredo, Michela; Schmid, Maurizio; Conforto, Silvia; Amorosi, Beatrice; D'Alessio, Tommaso; Palma, Claudio

    2012-06-01

    This communication introduces a novel approach for quantitatively evaluating the role of color space decomposition in digital nailfold capillaroscopy analysis. It is clinically recognized that alterations of the capillary pattern at the periungual skin region are directly related to dermatologic and rheumatic diseases. The proposed algorithm for the segmentation of digital capillaroscopy images is optimized with respect to the choice of the color space and the contrast variation. Since the color space is a critical factor for segmenting low-contrast images, an exhaustive comparison between different color channels is conducted and a novel color channel combination is presented. Results from images of 15 healthy subjects are compared with annotated data, i.e. selected images approved by clinicians. From this comparison, a set of figures of merit is extracted which highlights the algorithm's capability to correctly segment capillaries, their shape and their number. Experimental tests show that the optimized segmentation procedure, based on a novel color channel combination, attains average accuracy values higher than 0.8 and extracts capillaries whose shape and granularity are acceptable. The obtained results are particularly encouraging for future developments on the classification of capillary patterns with respect to dermatologic and rheumatic diseases.
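The idea of a color-channel combination for low-contrast segmentation can be sketched as a weighted sum of channels followed by contrast stretching and a threshold. The weights, threshold, and synthetic image below are hypothetical, not the authors' optimized values.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic RGB nailfold patch: capillaries appear darker in the green channel.
img = np.full((64, 64, 3), 180.0)
img[:, 30:34, 1] -= 60                  # a vertical "capillary", low green values
img += rng.normal(0, 5, img.shape)      # sensor noise

# Hypothetical channel combination emphasising the green channel,
# then min-max contrast stretching to [0, 1].
combo = 0.1 * img[..., 0] + 0.8 * img[..., 1] + 0.1 * img[..., 2]
combo = (combo - combo.min()) / (combo.max() - combo.min())

mask = combo < 0.5                      # simple global threshold
coverage = mask.mean()                  # fraction of pixels labelled capillary
print(round(float(coverage), 3))
```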

  7. Quantitative gold nanoparticle analysis methods: A review.

    Science.gov (United States)

    Yu, Lei; Andriola, Angelo

    2010-08-15

    Research and development in the area of gold nanoparticle (AuNP) preparation, characterization, and applications have been burgeoning in recent years. Many of the techniques and protocols are very mature, but two major concerns come with mass production and consumption of AuNP-based products. First, how many AuNPs exist in a dispersion? Second, where are the AuNPs after digestion by the environment, and how many are there? To answer these two questions, reliable and reproducible methods are needed to analyze the existence and the population of AuNPs in samples. This review summarizes the most recent chemical and particle quantitative analysis methods that have been used to characterize the concentration (in moles of gold per liter) or population (in particles per mL) of AuNPs. The methods summarized in this review include mass spectrometry, electroanalytical methods, spectroscopic methods, and particle counting methods. These methods may count the number of AuNPs directly or analyze the total concentration of elemental gold in an AuNP dispersion.
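The two units mentioned above (moles of gold per liter vs. particles per mL) are related through the per-particle gold mass; for spherical particles the conversion is straightforward. A sketch, assuming monodisperse spheres of bulk gold density:

```python
import math

def aunp_number_to_molar_gold(n_per_mL, diameter_nm, rho=19.3, m_au=196.97):
    """Convert a number concentration (particles/mL) of spherical AuNPs of a
    given diameter (nm) into total gold molarity (mol/L).
    rho: bulk gold density in g/cm^3; m_au: molar mass of gold in g/mol."""
    r_cm = diameter_nm * 1e-7 / 2                       # nm -> cm, radius
    mass_per_particle = rho * (4 / 3) * math.pi * r_cm ** 3  # g per particle
    mol_per_particle = mass_per_particle / m_au
    return n_per_mL * mol_per_particle * 1000           # per mL -> per L

c = aunp_number_to_molar_gold(1e12, 20.0)  # 1e12 particles/mL of 20 nm AuNPs
print(f"{c:.2e} M gold")
```

Doubling the diameter multiplies the gold content per particle by eight, which is why a number concentration alone says little about total gold.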

  8. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity, in finance for instance, developed quite in parallel with its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2) = 8.181; p = 0.300, which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (method and process).
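The Hosmer-Lemeshow statistic reported above is computed by grouping predicted risks into quantile groups (typically deciles) and comparing observed with expected events per group. A sketch on synthetic, well-calibrated data (so the statistic should stay small relative to a chi-square with g-2 degrees of freedom):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic fitted probabilities and binary outcomes drawn from them,
# i.e. a perfectly calibrated model.
p = rng.uniform(0.05, 0.95, 500)
y = (rng.uniform(size=500) < p).astype(int)

# Hosmer-Lemeshow: g groups by quantile of predicted risk.
g = 10
edges = np.quantile(p, np.linspace(0, 1, g + 1))
idx = np.clip(np.searchsorted(edges, p, side="right") - 1, 0, g - 1)

hl = 0.0
for k in range(g):
    grp = idx == k
    n_k = grp.sum()          # group size
    o_k = y[grp].sum()       # observed events
    e_k = p[grp].sum()       # expected events
    hl += (o_k - e_k) ** 2 / (e_k * (1 - e_k / n_k))
print(round(hl, 2))          # compare against chi-square with g-2 dof
```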

  9. Quantitative risk analysis of a petrochemical maritime terminal

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Leandro Silveira; Leal, Cesar A. [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica (PROMEC)]. E-mail: leandro19889900@yahoo.com.br

    2008-07-01

    This work consists of the application of a computer program (RISKAN), developed for industrial risk quantification studies, together with a review of the models used in the program. As part of the evaluation, a test was performed in which the computer program was applied to estimate the risks of a marine terminal for the storage of petrochemical products in the city of Rio Grande, Brazil. Thus, as part of the work, a Quantitative Risk Analysis of the terminal was performed, both for the workers and for the nearby population, with acceptability verified against the tolerability limits established by the State Licensing Agency (FEPAM-RS). In the risk analysis methodology used internationally, the most common way of presenting social risk results is graphically, with FN curves, while for individual risk it is common to use iso-risk curves traced on a map of the area around the plant. At the beginning of the study, both a historical analysis of accidents and the technique of Preliminary Risk Analysis were used to aid in identifying possible accident scenarios related to the activities of the terminal. After the initiating events were identified, their frequencies or probabilities of occurrence were estimated, followed by calculation of the physical effects and fatalities, using, within the computer program, published models of the Prins Maurits Laboratory and of the American Institute of Chemical Engineers. The average social risk obtained for the external population was 8.7x10^-7 fatality/year and for the internal population (people working inside the terminal) 3.2x10^-4 fatality/year. The accident scenario that contributed most to the social risk was death due to exposure to thermal radiation from a pool fire, with 84.3% of the total estimated for the external population and 82.9% for the people inside the terminal. The
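The societal-risk quantities used in such QRA studies (average societal risk and points on an FN curve) reduce to simple sums over scenario frequency-fatality pairs. The scenario numbers below are hypothetical illustrations, not the terminal's data:

```python
# Hypothetical accident scenarios: (frequency per year, expected fatalities).
scenarios = [
    (1e-5, 12),   # pool fire, thermal radiation
    (5e-6, 3),    # flash fire
    (2e-7, 40),   # vapour cloud explosion
]

# Average societal risk: expected fatalities per year, summed over scenarios.
avg_risk = sum(f * n for f, n in scenarios)

# FN-curve point: cumulative frequency F of events causing N or more fatalities.
def F_of_N(n_threshold):
    return sum(f for f, n in scenarios if n >= n_threshold)

print(f"{avg_risk:.2e} fatalities/yr, F(N>=10) = {F_of_N(10):.2e}/yr")
```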

  10. Novel method for ANA quantitation using IIF imaging system.

    Science.gov (United States)

    Peng, Xiaodong; Tang, Jiangtao; Wu, Yongkang; Yang, Bin; Hu, Jing

    2014-02-01

    A variety of antinuclear antibodies (ANAs) are found in the serum of patients with autoimmune diseases. The detection of abnormal ANA titers is a critical criterion for the diagnosis of systemic lupus erythematosus (SLE) and other connective tissue diseases. Indirect immunofluorescence assay (IIF) on HEp-2 cells is the gold-standard method to determine the presence of ANA, and it provides information about the localization of the autoantigens that is useful for diagnosis. However, its utility in prognosis and in monitoring disease activity is limited by the lack of standardization in performing the technique, subjectivity in interpreting the results, and the fact that it is only semi-quantitative. On the other hand, ELISA can quantitate ANA but cannot provide further information about the localization of the autoantigens. It would be ideal to integrate the quantitative and qualitative methods. To address this issue, this study quantitatively detected ANAs using an IIF imaging analysis system. Serum samples from ANA-positive patients (including speckled, homogeneous, nuclear-mixture and cytoplasmic-mixture patterns) and ANA-negative patients were titrated for ANA by classical IIF and analyzed with an imaging system: the image of each sample was acquired by the digital imaging system and the green fluorescence intensity was quantified with the Image-Pro Plus software. A good correlation was found between the two methods, and the correlation coefficients (R^2) of the various ANA patterns were 0.942 (speckled), 0.942 (homogeneous), 0.923 (nuclear mixture) and 0.760 (cytoplasmic mixture), respectively. The fluorescence density was linearly correlated with the log of the ANA titer for the various ANA patterns (R^2 > 0.95). Moreover, the novel ANA quantitation method showed good reproducibility (F = 0.091, p > 0.05), with mean±SD and CV% of the positive and negative quality controls equal to 126.4±9.6 and 7.6%, and 10.4±1.25 and 12
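The reported linear relation between fluorescence density and the log of the ANA titer suggests a simple calibration: fit intensity against log2(titer) over a doubling-dilution series, then invert the fit to read a titer from a measured density. The calibration values below are hypothetical, not the study's data:

```python
import numpy as np

# Hypothetical calibration: doubling-dilution titers and measured mean
# fluorescence densities (arbitrary units) for one ANA pattern.
titers = np.array([80, 160, 320, 640, 1280])
density = np.array([40.0, 55.0, 71.0, 86.0, 100.0])

# Fit density = a*log2(titer) + b, mirroring the reported linear
# intensity-vs-log-titer relationship.
a, b = np.polyfit(np.log2(titers), density, 1)

def titer_from_density(d):
    """Invert the calibration to estimate a titer from a measured density."""
    return float(2 ** ((d - b) / a))

est = titer_from_density(78.0)
print(round(float(a), 2), round(est))
```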

  11. Reproducible quantitative proteotype data matrices for systems biology.

    Science.gov (United States)

    Röst, Hannes L; Malmström, Lars; Aebersold, Ruedi

    2015-11-05

    Historically, many mass spectrometry-based proteomic studies have aimed at compiling an inventory of protein compounds present in a biological sample, with the long-term objective of creating a proteome map of a species. However, to answer fundamental questions about the behavior of biological systems at the protein level, accurate and unbiased quantitative data are required in addition to a list of all protein components. Fueled by advances in mass spectrometry, the proteomics field has thus recently shifted focus toward the reproducible quantification of proteins across a large number of biological samples. This provides the foundation to move away from pure enumeration of identified proteins toward quantitative matrices of many proteins measured across multiple samples. It is argued here that data matrices consisting of highly reproducible, quantitative, and unbiased proteomic measurements across a high number of conditions, referred to here as quantitative proteotype maps, will become the fundamental currency in the field and provide the starting point for downstream biological analysis. Such proteotype data matrices, for example, are generated by the measurement of large patient cohorts, time series, or multiple experimental perturbations. They are expected to have a large effect on systems biology and personalized medicine approaches that investigate the dynamic behavior of biological systems across multiple perturbations, time points, and individuals. © 2015 Röst et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  12. Chromatic Image Analysis For Quantitative Thermal Mapping

    Science.gov (United States)

    Buck, Gregory M.

    1995-01-01

    Chromatic image analysis system (CIAS) developed for use in noncontact measurements of temperatures on aerothermodynamic models in hypersonic wind tunnels. Based on concept of temperature coupled to shift in color spectrum for optical measurement. Video camera images fluorescence emitted by phosphor-coated model at two wavelengths. Temperature map of model then computed from relative brightnesses in video images of model at those wavelengths. Eliminates need for intrusive, time-consuming, contact temperature measurements by gauges, making it possible to map temperatures on complex surfaces in timely manner and at reduced cost.
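The two-wavelength principle can be sketched as a ratio-based lookup: divide the two co-registered intensity images and map each ratio to temperature through a monotonic calibration curve. The calibration table and image values below are hypothetical, not CIAS data:

```python
import numpy as np

# Hypothetical calibration: intensity ratio I1/I2 of the two phosphor
# emission bands versus surface temperature (K), assumed monotonic.
cal_ratio = np.array([0.40, 0.65, 1.00, 1.45, 2.00])
cal_T = np.array([300.0, 350.0, 400.0, 450.0, 500.0])

def temperature_from_ratio(r):
    # Interpolate along the inverse calibration curve.
    return float(np.interp(r, cal_ratio, cal_T))

# Two co-registered "video images" of the model at the two wavelengths.
I1 = np.array([[0.8, 1.2], [1.6, 2.0]])
I2 = np.array([[2.0, 1.2], [1.1, 1.0]])

T_map = np.vectorize(temperature_from_ratio)(I1 / I2)  # per-pixel temperature map
print(T_map)
```

Using the ratio rather than a single brightness removes the dependence on illumination and viewing geometry, which both bands share.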

  13. Quantitative Data Analysis--In the Graduate Curriculum

    Science.gov (United States)

    Albers, Michael J.

    2017-01-01

    A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…

  14. The Curriculum in Quantitative Analysis: Results of a Survey.

    Science.gov (United States)

    Locke, David C.; Grossman, William E. L.

    1987-01-01

    Reports on the results of a survey of college level instructors of quantitative analysis courses. Discusses what topics are taught in such courses, how much weight is given to these topics, and which experiments are used in the laboratory. Poses some basic questions about the curriculum in quantitative analysis. (TW)

  15. Chromatin immunoprecipitation: optimization, quantitative analysis and data normalization

    Directory of Open Access Journals (Sweden)

    Peterhansel Christoph

    2007-09-01

    Full Text Available Abstract Background Chromatin remodeling, histone modifications and other chromatin-related processes play a crucial role in gene regulation. A very useful technique to study these processes is chromatin immunoprecipitation (ChIP). ChIP is widely used for a few model systems, including Arabidopsis, but establishment of the technique for other organisms is still remarkably challenging. Furthermore, quantitative analysis of the precipitated material and normalization of the data are often underestimated, negatively affecting data quality. Results We developed a robust ChIP protocol, using maize (Zea mays) as a model system, and present a general strategy to systematically optimize this protocol for any type of tissue. We propose endogenous controls for active and for repressed chromatin, and discuss various other controls that are essential for successful ChIP experiments. We found that the use of quantitative PCR (QPCR) is crucial for obtaining high-quality ChIP data and we explain why. The method of data normalization has a major impact on the quality of ChIP analyses. Therefore, we analyzed different normalization strategies, resulting in a thorough discussion of the advantages and drawbacks of the various approaches. Conclusion Here we provide a robust ChIP protocol and a strategy to optimize the protocol for any type of tissue; we argue that quantitative real-time PCR (QPCR) is the best method to analyze the precipitates, and present comprehensive insights into data normalization.
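One widely used normalization strategy for ChIP-QPCR data, of the kind such a comparison would cover, is percent-of-input. A sketch, assuming perfect amplification efficiency (factor 2 per cycle); the Ct values and input fraction are hypothetical, and this is not necessarily the strategy the authors recommend:

```python
import math

def percent_input(ct_ip, ct_input, input_fraction=0.01, efficiency=2.0):
    """Percent-of-input normalisation for ChIP-QPCR.
    The input Ct was measured on `input_fraction` of the chromatin, so it is
    first shifted to the Ct that a 100% input would have given."""
    adjusted_input_ct = ct_input - math.log(1 / input_fraction, efficiency)
    return 100.0 * efficiency ** (adjusted_input_ct - ct_ip)

# Hypothetical Cts: an enriched target locus vs. a background control locus.
active = percent_input(ct_ip=24.0, ct_input=26.0)
control = percent_input(ct_ip=31.0, ct_input=26.0)
print(round(active, 3), round(control, 5))
```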

  16. Modular System Modeling for Quantitative Reliability Evaluation of Technical Systems

    Directory of Open Access Journals (Sweden)

    Stephan Neumann

    2016-01-01

    Full Text Available In modern times, it is necessary to offer reliable products to match the statutory directives concerning product liability and the high expectations of customers for durable devices. Furthermore, to maintain a high competitiveness, engineers need to know as accurately as possible how long their product will last and how to influence the life expectancy without expensive and time-consuming testing. As the components of a system are responsible for the system reliability, this paper introduces and evaluates calculation methods for life expectancy of common machine elements in technical systems. Subsequently, a method for the quantitative evaluation of the reliability of technical systems is proposed and applied to a heavy-duty power shift transmission.
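For independent components arranged in series, quantitative system reliability is the product of the component survival probabilities; with Weibull life models for the machine elements this becomes a short calculation. All component names and parameters below are hypothetical, not the paper's transmission data:

```python
import math

def weibull_reliability(t, eta, beta):
    """Probability that a component survives to time t (hours);
    eta: characteristic life, beta: shape parameter."""
    return math.exp(-((t / eta) ** beta))

# Hypothetical machine elements of a transmission: (eta, beta), eta in hours.
components = [
    (20_000, 1.5),  # bearing
    (35_000, 2.0),  # gear stage
    (50_000, 1.2),  # shaft-hub connection
]

def system_reliability(t):
    # Series structure: the system survives iff every element survives.
    r = 1.0
    for eta, beta in components:
        r *= weibull_reliability(t, eta, beta)
    return r

print(round(system_reliability(5_000), 4))
```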

  17. Combination and Integration of Qualitative and Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Philipp Mayring

    2001-02-01

    Full Text Available In this paper, I am going to outline ways of combining qualitative and quantitative steps of analysis on five levels. On the technical level, programs for the computer-aided analysis of qualitative data offer various combinations. Where the data are concerned, the employment of categories (for instance by using qualitative content analysis allows for combining qualitative and quantitative forms of data analysis. On the individual level, the creation of types and the inductive generalisation of cases allow for proceeding from individual case material to quantitative generalisations. As for research design, different models can be distinguished (preliminary study, generalisation, elaboration, triangulation which combine qualitative and quantitative steps of analysis. Where the logic of research is concerned, it can be shown that an extended process model which combined qualitative and quantitative research can be appropriate and thus lead to an integration of the two approaches. URN: urn:nbn:de:0114-fqs010162

  18. Quantitative simulation of the hydrothermal systems of crystallizing magmas on the basis of transport theory and oxygen isotope data: an analysis of the Skaergaard intrusion

    Energy Technology Data Exchange (ETDEWEB)

    Norton, D. (Univ. of Arizona, Tucson); Taylor, H.P. Jr.

    1979-08-01

    Application of the principles of transport theory to studies of magma-hydrothermal systems permits quantitative predictions to be made of the consequences of magma intruding into permeable rocks. Transport processes which redistribute energy, mass, and momentum in these environments can be represented by a set of partial differential equations involving the rate of change of extensive properties in the system. Numerical approximation and computer evaluation of the transport equations effectively simulate the crystallization of magma, cooling of the igneous rocks, advection of chemical components, and chemical and isotopic mass transfer between minerals and aqueous solution. Numerical modeling of the deep portions of the Skaergaard magma-hydrothermal system has produced detailed maps of the temperature, pressure, fluid velocity, integrated fluid flux, δ¹⁸O values in rock and fluid, and extent of nonequilibrium exchange reactions between fluid and rock as a function of time for a two-dimensional cross-section through the pluton. An excellent match was made between calculated δ¹⁸O values and the measured δ¹⁸O values in the three principal rock units, basalt, gabbro, and gneiss, as well as in xenoliths of roof rocks that are now embedded in the Layered Series; the latter were evidently depleted in ¹⁸O early in the system's cooling history, prior to falling to the bottom of the magma chamber. The best match was realized for a system in which the bulk rock permeabilities were 10⁻¹³ cm² for the intrusion, 10⁻¹¹ cm² for the basalt, and 10⁻¹⁶ cm² for the gneiss; reaction domain sizes were 0.2 cm in the intrusion and gneiss and 0.01 cm in the basalts, and the activation energy for the isotope exchange reaction between fluid and plagioclase was 30 kcal/mole.

  19. Some Epistemological Considerations Concerning Quantitative Analysis

    Science.gov (United States)

    Dobrescu, Emilian

    2008-01-01

    This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…

  20. Structural model analysis of multiple quantitative traits.

    Directory of Open Access Journals (Sweden)

    Renhua Li

    2006-07-01

    Full Text Available We introduce a method for the analysis of multilocus, multitrait genetic data that provides an intuitive and precise characterization of genetic architecture. We show that it is possible to infer the magnitude and direction of causal relationships among multiple correlated phenotypes and illustrate the technique using body composition and bone density data from mouse intercross populations. Using these techniques we are able to distinguish genetic loci that affect adiposity from those that affect overall body size and thus reveal a shortcoming of standardized measures such as body mass index that are widely used in obesity research. The identification of causal networks sheds light on the nature of genetic heterogeneity and pleiotropy in complex genetic systems.

  1. Quantitative Analysis of Seismicity in Iran

    Science.gov (United States)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute the seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution, and the geodetic strain rate in the Iranian Plateau, in order to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults are located primarily in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.
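One of the magnitude-frequency parameters mentioned above, the Gutenberg-Richter b-value, is commonly estimated with Aki's maximum-likelihood formula. A sketch on a simulated catalogue (the estimator is standard; the catalogue, completeness magnitude, and b-value are synthetic):

```python
import math
import random

def b_value(mags, m_c):
    """Aki's maximum-likelihood b-value for magnitudes >= m_c.
    Assumes continuous magnitudes; catalogues binned in magnitude
    need an additional half-bin correction on m_c."""
    m = [x for x in mags if x >= m_c]
    return math.log10(math.e) / (sum(m) / len(m) - m_c)

# Synthetic catalogue drawn from Gutenberg-Richter with b = 1.0 above
# a completeness magnitude Mc = 4.0: magnitude exceedances above Mc are
# exponential with rate b*ln(10).
random.seed(0)
rate = 1.0 * math.log(10)
mags = [4.0 + random.expovariate(rate) for _ in range(5000)]

print(round(b_value(mags, m_c=4.0), 2))  # should recover roughly b = 1
```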

  3. Head-to-head comparison of quantitative and semi-quantitative ultrasound scoring systems for rheumatoid arthritis

    DEFF Research Database (Denmark)

    Terslev, Lene; Ellegaard, Karen; Christensen, Robin;

    2012-01-01

    To evaluate the reliability and agreement of semi-quantitative scoring (SQS) and quantitative scoring (QS) systems. To compare the two types of scoring system and investigate the construct validity for both scoring systems.

  4. A quantitative analysis of IRAS maps of molecular clouds

    Science.gov (United States)

    Wiseman, Jennifer J.; Adams, Fred C.

    1994-01-01

    We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In the present study we use the IRAS continuum maps at 100 and 60 micrometers to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
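The pseudometric idea can be illustrated with a toy distance function that compares the column-density distributions of two maps rather than their pixels: geometrically different maps with identical density statistics then sit at distance zero, which is exactly why it is a pseudometric rather than a metric. The study's actual output functions are more elaborate; this is only an illustrative sketch:

```python
import numpy as np

def density_distance(map_a, map_b, bins=32):
    """Total-variation distance between the normalised column-density
    histograms of two maps (a pseudometric: 0 for maps with identical
    density statistics, even if their geometry differs)."""
    lo = min(map_a.min(), map_b.min())
    hi = max(map_a.max(), map_b.max())
    ha, _ = np.histogram(map_a, bins=bins, range=(lo, hi), density=True)
    hb, _ = np.histogram(map_b, bins=bins, range=(lo, hi), density=True)
    return 0.5 * np.abs(ha - hb).sum() * (hi - lo) / bins

rng = np.random.default_rng(3)
cloud_a = rng.lognormal(0.0, 0.5, (64, 64))
cloud_b = rng.lognormal(0.0, 0.5, (64, 64))  # same density statistics
cloud_c = rng.lognormal(0.5, 1.0, (64, 64))  # different "complexity"

d_ab = density_distance(cloud_a, cloud_b)
d_ac = density_distance(cloud_a, cloud_c)
print(round(float(d_ab), 3), round(float(d_ac), 3))
```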

  5. QUANTITATIVE MEASUREMENT OF THE SIMILARITIES AND DIFFERENCES OF CLONES OF GRAPES USING CONTOURS OF LEAVES WITH THE USE OF ASC-ANALYSIS AND "EIDOS" SYSTEM

    Directory of Open Access Journals (Sweden)

    Lutsenko Y. V.

    2016-02-01

    Full Text Available The article discusses the application of automated system-cognitive analysis (ASC-analysis), whose mathematical model is a system of information theory, and its software toolkit, the intellectual system "Eidos", to one of the important tasks of ampelography: quantifying the similarities and differences of different clones of grapes using the contours of their leaves. To solve this task we perform the following steps: (1) digitization of scanned images of the leaves and creation of their mathematical models; (2) formation of mathematical models of specific leaves with the application of information theory; (3) modeling of the generalized images of leaves of different clones on the basis of specific leaves (multiparameter typing); (4) verification of the model by identifying specific leaf images with generalized clones, i.e., classes (system identification); (5) quantification of the similarities and differences of the clones, i.e., cluster-constructive analysis of generalized images of leaves of various clones. The specific shape of the contour of a leaf is regarded as noisy information on the clone to which it relates, comprising information about the true shape of a leaf of this clone (the clean signal) and noise that distorts the real shape due to the random influence of the environment. The software toolkit of ASC-analysis, the intellectual system "Eidos", provides noise suppression and detection of the signal about the true shape of a leaf of each clone on the basis of a number of noisy concrete examples of the leaves of this clone. This creates a unified image of the leaf shape of each clone, independent of its specific implementations, i.e., the "Eidos" of these images: the prototype (in the sense of Plato) or archetype (in the Jungian sense) of the images.

  6. Quantitative Analysis of Polarimetric Model-Based Decomposition Methods

    Directory of Open Access Journals (Sweden)

    Qinghua Xie

    2016-11-01

    Full Text Available In this paper, we analyze the robustness of the parameter inversion provided by general polarimetric model-based decomposition methods from the perspective of a quantitative application. The general model and algorithm we have studied is the method proposed recently by Chen et al., which makes use of the complete polarimetric information and outperforms traditional decomposition methods in terms of feature extraction from land covers. Nevertheless, a quantitative analysis of the retrieved parameters from that approach suggests that further investigations are required in order to fully confirm the links between a physically-based model (i.e., approaches derived from the Freeman–Durden concept) and its outputs as intermediate products before any biophysical parameter retrieval is addressed. To this aim, we propose some modifications of the optimization algorithm employed for model inversion, including redefined boundary conditions, transformation of variables, and a different strategy for value initialization. A number of Monte Carlo simulation tests for typical scenarios are carried out and show that the parameter estimation accuracy of the proposed method is significantly increased with respect to the original implementation. Fully polarimetric airborne datasets at L-band acquired by the German Aerospace Center's (DLR's) experimental synthetic aperture radar (E-SAR) system were also used for testing purposes. The results show different qualitative descriptions of the same cover from six different model-based methods. According to the Bragg coefficient ratio (i.e., β), they are prone to provide wrong numerical inversion results, which could prevent any subsequent quantitative characterization of specific areas in the scene. Besides the particular improvements proposed over an existing polarimetric inversion method, this paper is aimed at pointing out the necessity of checking quantitatively the accuracy of model-based PolSAR techniques for a ...

  7. Local Data Libraries for Quantitative Political Analysis.

    Science.gov (United States)

    Tanenbaum, Eric

    1979-01-01

    Considers several systems automating local departmental-based data facilities. Describes design and implementation of a simple system. Concludes by reflecting on changes needed in the system. (Author/CK)

  8. [Quantitative analysis of transformer oil dissolved gases using FTIR].

    Science.gov (United States)

    Zhao, An-xin; Tang, Xiao-jun; Wang, Er-zhen; Zhang, Zhong-hua; Liu, Jun-hua

    2013-09-01

    Gas chromatography, commonly used for the on-line monitoring of gases dissolved in transformer oil, requires a carrier gas and regular calibration and offers limited safety, so we attempted to establish a dissolved gas analysis (DGA) system based on Fourier transform infrared (FTIR) spectroscopy. Taking into account the small amounts of the characteristic gases, the many components, the detection limit and safety requirements, and the difficulty for the degasser of eliminating interfering gases, a quantitative analysis model was established based on sparse partial least squares, piecewise section correction, and feature-variable extraction with an improved TR regularization algorithm. For the characteristic gases CH4, C2H6, and CO2, the results show that FTIR meets DGA requirements with a spectral wavenumber resolution of 1 cm^-1 and an optical path of 10 cm.
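
    As an illustrative sketch only (the abstract's sparse-PLS model is not reproduced here), regularized least-squares inversion of a multi-component spectrum can be demonstrated with synthetic Gaussian absorption bands; every band position, concentration, and the regularization weight below are invented for the example.

```python
import numpy as np

# Synthetic pure-component spectra for three gases (hypothetical band shapes)
wn = np.linspace(2800, 3200, 400)               # wavenumber axis, cm^-1
def band(center, width):
    return np.exp(-0.5 * ((wn - center) / width) ** 2)

A = np.column_stack([band(2900, 30), band(3000, 40), band(3100, 25)])
true_conc = np.array([2.0, 0.5, 1.0])           # assumed concentrations
rng = np.random.default_rng(1)
spectrum = A @ true_conc + rng.normal(0, 0.01, wn.size)   # noisy mixture

# Tikhonov-regularized least squares: minimize ||A c - y||^2 + lam ||c||^2
lam = 1e-3
c_hat = np.linalg.solve(A.T @ A + lam * np.eye(3), A.T @ spectrum)
```

    The regularization term stabilizes the inversion when bands overlap strongly, at the cost of a small bias in the recovered concentrations.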

  9. Structural and quantitative analysis of Equisetum alkaloids.

    Science.gov (United States)

    Cramer, Luise; Ernst, Ludger; Lubienski, Marcus; Papke, Uli; Schiebel, Hans-Martin; Jerz, Gerold; Beuerle, Till

    2015-08-01

    Equisetum palustre L. is known for its toxicity to livestock. Several studies in the past addressed the isolation and identification of the responsible alkaloids. So far, palustrine (1) and N(5)-formylpalustrine (2) are the known alkaloids of E. palustre. An HPLC-ESI-MS/MS method in combination with a simple sample work-up was developed to identify and quantitate Equisetum alkaloids. Besides the two known alkaloids, six related alkaloids were detected in different Equisetum samples. The structure of the alkaloid palustridiene (3) was derived by comprehensive 1D and 2D NMR experiments. N(5)-Acetylpalustrine (4) was also thoroughly characterized by NMR for the first time. The structure of N(5)-formylpalustridiene (5) is proposed based on mass spectrometry results. Twenty-two E. palustre samples were screened with this method, and in most cases the set of all eight alkaloids was detected in all parts of the plant. A high variability of the alkaloid content and distribution was found depending on plant organ, plant origin and season, ranging from 88 to 597 mg/kg dry weight. However, palustrine (1) and palustridiene (3) always represented the main alkaloids. For the first time, a comprehensive identification, quantitation and distribution analysis of Equisetum alkaloids was achieved.

  10. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    Science.gov (United States)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  11. A Quantitative Systems Pharmacology Perspective on Cancer Immunology

    Directory of Open Access Journals (Sweden)

    Christina Byrne-Hoffman

    2015-04-01

    Full Text Available The return on investment within the pharmaceutical industry has exhibited an exponential decline over the last several decades. Contemporary analysis suggests that the rate-limiting step in drug discovery and development is our limited understanding of the human disease pathophysiology targeted by a drug. As in other industries, mechanistic modeling and simulation has been proposed as an enabling quantitative tool to help address this problem. Moreover, immunotherapies are transforming the clinical treatment of cancer and are becoming a major segment of the pharmaceutical research and development pipeline. As the clinical benefit of these immunotherapies seems to be limited to a subset of the patient population, identifying the specific defect in the complex network of interactions associated with host immunity to a malignancy is a major challenge for expanding the clinical benefit. Understanding the interaction between malignant and immune cells is inherently a systems problem, where an engineering perspective may be helpful. The objective of this manuscript is to summarize this quantitative systems perspective, particularly with respect to developing immunotherapies for the treatment of cancer.

  12. Quantitative performance assessments for neuromagnetic imaging systems.

    Science.gov (United States)

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance is improved, depending on the number of sensors. We also compute these metrics for existing whole-head MEG systems, MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan) that uses axial-gradiometer sensors, and TRIUX™ (Elekta Corporate, Stockholm, Sweden) that uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems.

  13. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with an ASON/GMPLS simulator, a subnetwork protection scheme achieved the best balanced performance in resource utilization and restoration time.

  14. China ASON Network Migration Scenarios and Their Quantitative Analysis

    Institute of Scientific and Technical Information of China (English)

    Guoying Zhang; Soichiro Araki; Itaru Nishioka; Yoshihiko Suemura

    2003-01-01

    This paper proposes two migration scenarios from China ring networks to ASON mesh networks. In our quantitative analysis with an ASON/GMPLS simulator, a subnetwork protection scheme achieved the best balanced performance in resource utilization and restoration time.

  15. Quantitative and qualitative analysis of sterols/sterolins and ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-06-03

    Jun 3, 2008 ... Quantitative and qualitative analysis of sterols/sterolins ... method was developed to identify and quantify sterols (especially β-sitosterol) in chloroform extracts of ... Studies with phytosterols, especially β-sitosterol, have.

  16. Quantitative transcript analysis of the inducible expression system pSIP: comparison of the overexpression of Lactobacillus spp. β-galactosidases in Lactobacillus plantarum

    Directory of Open Access Journals (Sweden)

    Eijsink Vincent GH

    2011-06-01

    Full Text Available Abstract Background Two sets of overlapping genes, lacLMReu and lacLMAci, encoding heterodimeric β-galactosidases from Lactobacillus reuteri and Lactobacillus acidophilus, respectively, have previously been cloned and expressed using the pSIP vector system and Lactobacillus plantarum WCFS1 as host. Despite the high similarity between these lacLM genes and the use of identical cloning and expression strategies, strains harboring lacLMReu produced about twenty-fold more β-galactosidase than strains containing lacLMAci. Results In this study, the plasmid copy numbers (PCN) of the expression vectors pEH9R (lacLMReu) and pEH9A (lacLMAci) as well as the transcription levels of both lacLM genes were compared using quantitative PCR methods. Analyses of parallel fermentations of L. plantarum harboring either pEH9R or pEH9A showed that the expression plasmids were present in similar copy numbers. However, transcript levels of lacLM from L. reuteri (pEH9R) were up to 18 times higher than those of lacLM from L. acidophilus (pEH9A). As a control, it was shown that the expression levels of regulatory genes involved in pheromone-induced promoter activation were similar in both strains. Conclusion The use of identical expression strategies for highly similar genes led to very different mRNA levels. The data indicate that this difference is primarily caused by translational effects that are likely to affect both mRNA synthesis rates and mRNA stability. These translational effects thus seem to be a dominant determinant for the success of gene expression efforts in lactobacilli.
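
    The transcript-level comparison described above relies on relative quantitative PCR. A minimal sketch of Livak-style 2^-ΔΔCt quantification follows; the Ct values are hypothetical and merely chosen to reproduce an ~18-fold difference, and the function name is invented.

```python
def relative_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal,
                        efficiency=2.0):
    """Livak-style 2^-ddCt relative quantification of a target transcript
    against a reference gene, comparing a sample to a calibrator strain.
    Assumes equal, near-ideal amplification efficiency for both assays."""
    dct_sample = ct_target - ct_ref            # normalize sample to reference
    dct_cal = ct_target_cal - ct_ref_cal       # normalize calibrator likewise
    ddct = dct_sample - dct_cal
    return efficiency ** (-ddct)

# Hypothetical Ct values: the target crosses threshold ~4.2 cycles earlier
# (relative to the reference gene) in one strain than in the calibrator,
# corresponding to roughly an 18-fold transcript difference.
fold = relative_expression(ct_target=18.0, ct_ref=16.0,
                           ct_target_cal=22.2, ct_ref_cal=16.0)
```

    In practice the amplification efficiency should be measured per assay (e.g., from a dilution series) rather than assumed to be exactly 2.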

  17. Quantitative Thin-Layer Chromatography/Mass Spectrometry Analysis of Caffeine Using a Surface Sampling Probe Electrospray Ionization Tandem Mass Spectrometry System

    Energy Technology Data Exchange (ETDEWEB)

    Ford, Michael J [ORNL; Deibel, Michael A. [Earlham College; Tomkins, Bruce A [ORNL; Van Berkel, Gary J [ORNL

    2005-01-01

    Quantitative determination of caffeine on reversed-phase C8 thin-layer chromatography plates using a surface sampling electrospray ionization system with tandem mass spectrometry detection is reported. The thin-layer chromatography/electrospray tandem mass spectrometry method employed a deuterium-labeled caffeine internal standard and selected reaction monitoring detection. Up to nine parallel caffeine bands on a single plate were sampled in a single surface scanning experiment requiring 35 min at a surface scan rate of 44 μm/s. A reversed-phase HPLC/UV caffeine assay was developed in parallel to assess the mass spectrometry method performance. Limits of detection for the HPLC/UV and thin-layer chromatography/electrospray tandem mass spectrometry methods determined from the calibration curve statistics were 0.20 ng injected (0.50 μL) and 1.0 ng spotted on the plate, respectively. Spike recoveries with standards and real samples ranged between 97 and 106% for both methods. The caffeine content of three diet soft drinks (Diet Coke, Diet Cherry Coke, Diet Pepsi) and three diet sport drinks (Diet Turbo Tea, Speed Stack Grape, Speed Stack Fruit Punch) was measured. The HPLC/UV and mass spectrometry determinations were in general agreement, and these values were consistent with the quoted values for two of the three diet colas. In the case of Diet Cherry Coke and the diet sports drinks, the determined caffeine amounts using both methods were consistently higher (by 8% or more) than the literature values.
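
    The abstract's detection limits are "determined from the calibration curve statistics." A common way to do this (sketched here with invented calibration data, not the paper's measurements) is to fit the internal-standard area ratio against the spotted amount and take 3.3·s/slope as the LOD estimate.

```python
import numpy as np

# Hypothetical calibration: caffeine amount (ng) vs. SRM peak-area ratio
# to a deuterated internal standard.
amount = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
ratio = np.array([0.051, 0.099, 0.252, 0.498, 1.003, 2.487])

slope, intercept = np.polyfit(amount, ratio, 1)
pred = slope * amount + intercept
# Residual standard deviation of the calibration (n - 2 degrees of freedom)
s_y = np.sqrt(np.sum((ratio - pred) ** 2) / (amount.size - 2))
# ICH-style detection limit estimate from calibration statistics
lod = 3.3 * s_y / slope

def quantify(sample_ratio):
    """Back-calculate caffeine amount (ng) from an area ratio."""
    return (sample_ratio - intercept) / slope
```

    The internal-standard ratio cancels run-to-run variation in spray and sampling efficiency, which is why the calibration is fit on ratios rather than raw areas.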

  18. Quantitative data analysis in education a critical introduction using SPSS

    CERN Document Server

    Connolly, Paul

    2007-01-01

    This book provides a refreshing and user-friendly guide to quantitative data analysis in education for students and researchers. It assumes absolutely no prior knowledge of quantitative methods or statistics. Beginning with the very basics, it provides the reader with the knowledge and skills necessary to be able to undertake routine quantitative data analysis to a level expected of published research. Rather than focusing on teaching statistics through mathematical formulae, the book places an emphasis on using SPSS to gain a real feel for the data and an intuitive grasp of t

  19. Joint association analysis of bivariate quantitative and qualitative traits.

    Science.gov (United States)

    Yuan, Mengdie; Diao, Guoqing

    2011-11-29

    Univariate genome-wide association analysis of quantitative and qualitative traits has been investigated extensively in the literature. In the presence of correlated phenotypes, it is more intuitive to analyze all phenotypes simultaneously. We describe an efficient likelihood-based approach for the joint association analysis of quantitative and qualitative traits in unrelated individuals. We assume a probit model for the qualitative trait, under which an unobserved latent variable and a prespecified threshold determine the value of the qualitative trait. To jointly model the quantitative and qualitative traits, we assume that the quantitative trait and the latent variable follow a bivariate normal distribution. The latent variable is allowed to be correlated with the quantitative phenotype. Simultaneous modeling of the quantitative and qualitative traits allows us to make more precise inference on the pleiotropic genetic effects. We derive likelihood ratio tests for the testing of genetic effects. An application to the Genetic Analysis Workshop 17 data is provided. The new method yields reasonable power and meaningful results for the joint association analysis of the quantitative trait Q1 and the qualitative trait disease status at SNPs with not too small minor allele frequencies (MAF).
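
    The latent-variable model described above can be sketched as a data-generating simulation (assumed parameter values, not those of the workshop data): a SNP with a pleiotropic effect shifts both the observed quantitative trait and an unobserved liability, which is thresholded to give the qualitative trait.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20000
g = rng.binomial(2, 0.3, n)                 # SNP genotypes (MAF 0.3, assumed)

beta_q, beta_l = 0.2, 0.2                   # assumed pleiotropic effects
rho = 0.5                                   # residual correlation (assumed)
e = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], n)

q = beta_q * g + e[:, 0]                    # observed quantitative trait
liability = beta_l * g + e[:, 1]            # unobserved latent variable
d = (liability > 1.0).astype(int)           # qualitative trait via threshold
```

    A likelihood ratio test for the SNP effect would compare the bivariate likelihood with and without (beta_q, beta_l); the simulation only illustrates the probit/bivariate-normal structure.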

  20. Research on Petroleum Reservoir Diagenesis and Damage Using EDS Quantitative Analysis Method With Standard Samples

    Institute of Scientific and Technical Information of China (English)

    Bao Shujing; Chen Wenxue; et al.

    2000-01-01

    In recent years, the X-ray spectrometer has been developed not only to enhance resolution, but also towards dynamic analysis, computer modeling, standard-sample-based quantitative analysis, and ultra-light element analysis. With the gradual sophistication of quantitative analysis system software, the rationality and accuracy of the established standard sample reference documents have become the most important guarantee of the reliability of sample quantitative analysis. This work is an important technical subject in China's petroleum reservoir research. Through two years of research and experimental work, an EDS quantitative analysis method for petroleum geology and reservoir research has been established, and reference documents for five mineral (silicate, etc.) specimen standards have been compiled. By closely combining the morphological and compositional characteristics of the minerals and applying them to reservoir diagenesis research and the prevention of damage to oil formations, clear geological results have been obtained.

  1. A quantitative assessment of LCOs using system dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Kang, K. M.; Jey, M. S. [Hanyang Univ., Seoul (Korea, Republic of); Seong, C. K. [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    2002-10-01

    Research on technical specifications improvements for overly conservative technical specifications (TS) has recently been conducted using PSA techniques. In this study, Vensim, one of the System Dynamics software products, is applied to evaluate the Limiting Conditions for Operation (LCOs) quantitatively. The core damage frequency from PSA is used as the risk measure. Full power operation and shutdown operation have been compared in terms of the value of the CDF. The time-dependent framework developed in this study has been applied to an example problem drawn from the operation of the Wolsong Nuclear Power plants, and it is shown to be flexible in that it can be applied to any operational context of the technical specifications.
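
    Not the paper's Vensim model, but a common screening calculation in this setting is the incremental core damage probability of an allowed outage time: the conditional CDF increase while the equipment is unavailable, integrated over the outage. All numbers below are hypothetical.

```python
# Risk increment of an LCO's allowed outage time (AOT), sketched with
# invented values: (conditional CDF - baseline CDF) x outage duration.
BASE_CDF = 2.0e-5          # baseline core damage frequency, /yr (hypothetical)
CONDITIONAL_CDF = 8.0e-5   # CDF with the component out of service, /yr
AOT_HOURS = 72.0           # allowed outage time (hypothetical)

HOURS_PER_YEAR = 8760.0
icdp = (CONDITIONAL_CDF - BASE_CDF) * AOT_HOURS / HOURS_PER_YEAR
```

    A time-dependent framework like the one in the paper generalizes this constant-rate screening by letting the CDF vary over the operational sequence.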

  2. Quantitative Analysis on the Energy and Environmental Impact of the Korean National Energy R&D Roadmap Using a Bottom-Up Energy System Model

    Directory of Open Access Journals (Sweden)

    Sang Jin Choi

    2017-03-01

    Full Text Available According to the Paris Agreement at the 21st Conference of the Parties, 196 member states are obliged to submit their Intended Nationally Determined Contributions (INDCs) every 5 years. As a member, South Korea has already proposed its reduction target and will need to submit its achievements as a result of the relevant policies and endeavors in the near future. In this paper, a Korean bottom-up energy system model to support the low-carbon national energy R&D roadmap is introduced and, through the modeling of various scenarios, the mid- to long-term impact on energy consumption and CO2 emissions is analyzed as well. The results of the analysis showed that, assuming R&D investments in the 11 types of technologies, savings of 13.7% in final energy consumption compared to the baseline scenario would be feasible by 2050. Furthermore, in the field of power generation, the generation share of new and renewable energy is expected to increase from 3.0% as of 2011 to 19.4% by 2050. This research also suggests that analysis of the Energy Technology R&D Roadmap based on the model can be used not only for overall impact analysis and R&D portfolio establishment, but also for the development of detailed R&D strategies.

  3. Multiple quantitative trait analysis using bayesian networks.

    Science.gov (United States)

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness.
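
    The GBLUP equivalence mentioned above can be sketched for a single trait (synthetic data; the MAGIC population and the Bayesian-network machinery are not reproduced): predictions come from the genomic kinship matrix with ridge-type shrinkage.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m = 200, 500                           # individuals, markers (assumed)
Z = rng.binomial(2, 0.5, (n, m)).astype(float)
Z -= Z.mean(axis=0)                       # center marker codes

u = rng.normal(0, 0.05, m)                # true marker effects (simulated)
y = Z @ u + rng.normal(0, 0.5, n)         # phenotype = genetic value + noise

# GBLUP: genomic kinship K = ZZ'/m; predicted genetic values from the
# mixed-model equations g_hat = K (K + lam I)^{-1} y, lam = s_e^2 / s_g^2
K = Z @ Z.T / m
lam = 0.5                                 # assumed variance ratio
g_hat = K @ np.linalg.solve(K + lam * np.eye(n), y)
```

    The multi-trait Bayesian-network formulation of the article extends this by modeling the dependence structure among several phenotypes jointly.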

  4. Highly hydrogen-sensitive thermal desorption spectroscopy system for quantitative analysis of low hydrogen concentration (∼1 × 10^16 atoms/cm^3) in thin-film samples.

    Science.gov (United States)

    Hanna, Taku; Hiramatsu, Hidenori; Sakaguchi, Isao; Hosono, Hideo

    2017-05-01

    We developed a highly hydrogen-sensitive thermal desorption spectroscopy (HHS-TDS) system to detect and quantitatively analyze low hydrogen concentrations in thin films. The system was connected to an in situ sample-transfer chamber system, manipulators, and an rf magnetron sputtering thin-film deposition chamber under an ultra-high-vacuum (UHV) atmosphere of ∼10^-8 Pa. The following key requirements were proposed in developing the HHS-TDS: (i) a low hydrogen residual partial pressure, (ii) a low hydrogen exhaust velocity, and (iii) minimization of hydrogen thermal desorption except from the bulk region of the thin films. To satisfy these requirements, appropriate materials and components were selected, and the system was constructed to extract the maximum performance from each component. Consequently, ∼2000 times higher sensitivity to hydrogen than that of a commercially available UHV-TDS system was achieved using H+-implanted Si samples. Quantitative analysis of an amorphous oxide semiconductor InGaZnO4 thin film (1 cm × 1 cm × 1 μm thickness, hydrogen concentration of 4.5 × 10^17 atoms/cm^3) was demonstrated using the HHS-TDS system. This concentration level cannot be detected using UHV-TDS or secondary ion mass spectroscopy (SIMS) systems. The hydrogen detection limit of the HHS-TDS system was estimated to be ∼1 × 10^16 atoms/cm^3, which implies ∼2 orders of magnitude higher sensitivity than that of SIMS and resonance nuclear reaction systems (∼10^18 atoms/cm^3).
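
    Quantitation in TDS ultimately reduces to integrating the calibrated desorption-rate signal over the ramp and dividing by the sample volume. A sketch with an invented Gaussian desorption peak (not the instrument's data):

```python
import numpy as np

# Hypothetical TDS ramp: desorption-rate signal (H atoms/s) vs. time,
# assumed already converted from the calibrated QMS ion current.
time_s = np.linspace(0.0, 600.0, 601)
rate = 1.0e12 * np.exp(-0.5 * ((time_s - 300.0) / 40.0) ** 2)   # atoms/s

# Trapezoidal integration of the desorption peak -> total desorbed H atoms
total_atoms = np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(time_s))

# Film volume: 1 cm x 1 cm x 1 um (= 1e-4 cm), as in the abstract's sample
volume_cm3 = 1.0 * 1.0 * 1.0e-4
concentration = total_atoms / volume_cm3        # H atoms/cm^3
```

    The detection limit is then set by how small a peak can be distinguished from the residual hydrogen background, which is why requirements (i) and (ii) above dominate the design.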

  5. Quantitative and Qualitative Analysis of Aconitum Alkaloids in Raw and Processed Chuanwu and Caowu by HPLC in Combination with Automated Analytical System and ESI/MS/MS

    Directory of Open Access Journals (Sweden)

    Aimin Sun

    2012-01-01

    Full Text Available HPLC in combination with automated analytical system and ESI/MS/MS was used to analyze aconitine (A, mesaconitine (MA, hypaconitine (HA, and their benzoyl analogs in the Chinese herbs Caowu and Chuanwu. First, an HPLC method was developed and validated to determine A, MA, and HA in raw and processed Caowu and Chuanwu. Then an automated analytical system and ESI/MS/MS were applied to analyze these alkaloids and their semihydrolyzed products. The results obtained from automated analytical system are identical to those from ESI/MS/MS, which indicated that the method is a convenient and rapid tool for the qualitative analysis of herbal preparations. Furthermore, HA was little hydrolyzed by heating processes and thus it might account more for the toxicity of processed aconites. Hence, HA could be used as an indicator when one alkaloid is required as a reference to monitor the quality of raw and processed Chuanwu and Caowu. In addition, the raw and processed Chuanwu and Caowu can be distinguished by monitoring the ratio of A and MA to HA.

  6. Is having quality as an item on the executive board associated with the implementation of quality management systems in European hospitals: a quantitative analysis.

    NARCIS (Netherlands)

    Botje, D.; Klazinga, N.S.; Suñol, R.; Groene, O.; Pfaff, H.; Mannion, R.; Depaigne-Loth, A.; Arah, O.A.; DerSarkissian, M.; Wagner, C.

    2014-01-01

    Objective: To assess whether there is a relationship between having quality as an item on the board's agenda, perceived external pressure (PEP) and the implementation of quality management in European hospitals. Design: A quantitative, mixed-method, cross-sectional study in seven European countries i

  7. Quantitative image analysis of microstructure development during pressure sintering of CoO

    Energy Technology Data Exchange (ETDEWEB)

    Miro, A; Notis, M R

    1979-01-01

    An automatic system for quantitative image analysis was developed to study the transition from intermediate to final stage pore structure in pressure-sintered CoO. One of the significant results from this study indicates that the projected length is a good parameter to observe the transition from open cylindrical to closed porosity. Quantitative image analysis indicates that the Zener relationship (r/G ∝ P) is obeyed through the entire sintering process.

  8. An approach for quantitative image quality analysis for CT

    Science.gov (United States)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurement of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that generates PCA components with sparse loadings, used in conjunction with Hotelling's T2 statistical analysis to compare, qualify, and detect faults in the tested systems.
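
    One of the metrics listed, the modulation transfer function, can be sketched from a line spread function: the MTF is the magnitude of its Fourier transform, normalized to unity at zero frequency. The Gaussian blur width below is an assumption for illustration, not a property of any scanner in the study.

```python
import numpy as np

# Presampled MTF estimated from a hypothetical line spread function (LSF)
pixel_mm = 0.1
x = (np.arange(256) - 128) * pixel_mm
sigma = 0.3                                   # mm, assumed system blur
lsf = np.exp(-0.5 * (x / sigma) ** 2)
lsf /= lsf.sum()                              # unit area -> MTF(0) = 1

mtf = np.abs(np.fft.rfft(lsf))                # magnitude of the FT of the LSF
freq = np.fft.rfftfreq(x.size, d=pixel_mm)    # spatial frequency, cycles/mm
```

    In practice the LSF is derived from a phantom edge or slit scan, and the frequency at which the MTF falls to 50% or 10% is reported as a resolution figure of merit.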

  9. Quantitative Safety Analysis of Train Control System Based on Markov Decision Process

    Institute of Scientific and Technical Information of China (English)

    Zhou Guo; Zhao Huibing

    2016-01-01

    In order to manage hazards in the design and safety assessment phases of train control systems, quantitative safety analysis is crucial. The results of the quantitative analysis can be used to judge and compare the pros and cons of prototype designs, to evaluate the probabilistic risks of hazards, and to determine whether hidden dangers can be controlled within an acceptable range once risk mitigation measures are taken. In this paper, a Markov Decision Process based modelling method is proposed to build a system behavior model of two consecutive trains in the train control system, integrating the normal behaviours and failure behaviours of the system into a Comprehensive Behaviour Model (CBM). The dangerous failure probability of the hazard is calculated with the probabilistic model checking tool PRISM, and a methodology of quantitative safety analysis for train control systems is presented.
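
    The quantity a model checker like PRISM computes for a reachability query reduces, for a finite Markov chain, to a linear system over the transient states. A toy chain with invented transition probabilities (not the paper's CBM):

```python
import numpy as np

# Toy discrete-time Markov chain for a hazard analysis: states
# 0 = normal, 1 = degraded, 2 = safe shutdown (absorbing),
# 3 = dangerous failure (absorbing). All probabilities are hypothetical.
P = np.array([
    [0.990, 0.010, 0.000, 0.000],
    [0.000, 0.900, 0.095, 0.005],
    [0.000, 0.000, 1.000, 0.000],
    [0.000, 0.000, 0.000, 1.000],
])

# Probability of eventually reaching the dangerous-failure state from each
# transient state: solve x = P_tt x + P_ta, the fixed point a probabilistic
# model checker evaluates for a "P=? [ F fail ]"-style query.
transient, fail = [0, 1], 3
P_tt = P[np.ix_(transient, transient)]
b = P[transient, fail]
x = np.linalg.solve(np.eye(len(transient)) - P_tt, b)
p_fail_from_normal = x[0]
```

    An MDP adds nondeterministic choices on top of this, and the model checker then reports the minimum or maximum such probability over all resolutions of the nondeterminism.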

  10. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model.

  11. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
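
    The reward-annotation analysis can be sketched on a toy workflow: attach an expected drug usage to each step and solve the standard expected-total-reward equations of the underlying Markov chain (a stand-in for a PRISM-style reward query; all numbers are invented):

```python
import numpy as np

# Toy workflow chain: states 0 = triage, 1 = treat, 2 = re-treat,
# 3 = done (absorbing). Transition probabilities are hypothetical.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 0.3, 0.7],
    [0.0, 0.0, 0.0, 1.0],
    [0.0, 0.0, 0.0, 1.0],
])
reward = np.array([0.0, 1.0, 2.0, 0.0])   # doses dispensed on leaving a state

# Expected total reward before absorption: v = r + P_tt v over the
# transient states (the quantity an "R=? [ F done ]"-style query reports).
transient = [0, 1, 2]
P_tt = P[np.ix_(transient, transient)]
v = np.linalg.solve(np.eye(len(transient)) - P_tt, reward[transient])
expected_doses = v[0]
```

    Provisioning stock at the expected usage per patient (plus a safety margin from the distribution's spread) is the kind of conclusion the workflow analysis supports.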

  12. Quantitative analysis of multiple sclerosis: a feasibility study

    Science.gov (United States)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebrospinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posteriori (MAP) segmentation scheme; 3) Noise artifacts are minimized by an a priori Markov random field (MRF) penalty encoding neighborhood correlation of the tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
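    The MAP-with-MRF idea above can be illustrated in miniature: label each voxel by a Gaussian data term plus a penalty for disagreeing with its neighbours. The sketch below uses a synthetic 1-D "image", two classes and a single ICM sweep; the class parameters and penalty weight are invented, and the paper's partial-volume and bias-field modelling is omitted.

```python
# Two-class MAP labelling with an MRF smoothness penalty, in the spirit of
# the mixture-based scheme above (synthetic 1-D "image"; no partial-volume
# or bias-field modelling; class parameters and beta are invented).
import numpy as np

rng = np.random.default_rng(0)
means, sd, beta = np.array([30.0, 80.0]), 15.0, 1.5

truth = np.repeat([0, 1], 50)                  # left half: class 0, right: class 1
image = means[truth] + rng.normal(0.0, sd, truth.size)

# Initial labels: maximum likelihood (nearest class mean).
labels = np.argmin((image[:, None] - means[None, :]) ** 2, axis=1)

# One ICM sweep: minimise Gaussian data term + disagreement with neighbours.
for i in range(1, image.size - 1):
    cost = [(image[i] - means[k]) ** 2 / (2 * sd ** 2)
            + beta * ((labels[i - 1] != k) + (labels[i + 1] != k))
            for k in (0, 1)]
    labels[i] = int(np.argmin(cost))

print(float((labels == truth).mean()))  # fraction of voxels labelled correctly
```

    The MRF term cleans up isolated misclassifications that a purely per-voxel maximum-likelihood assignment leaves behind.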

  13. Quantitative Systems Pharmacology Approaches Applied to Microphysiological Systems (MPS): Data Interpretation and Multi-MPS Integration.

    Science.gov (United States)

    Yu, J; Cilfone, N A; Large, E M; Sarkar, U; Wishnok, J S; Tannenbaum, S R; Hughes, D J; Lauffenburger, D A; Griffith, L G; Stokes, C L; Cirit, M

    2015-10-01

    Our goal in developing Microphysiological Systems (MPS) technology is to provide an improved approach for more predictive preclinical drug discovery via a highly integrated experimental/computational paradigm. Success will require quantitative characterization of MPSs and mechanistic analysis of experimental findings sufficient to translate resulting insights from in vitro to in vivo. We describe herein a systems pharmacology approach to MPS development and utilization that incorporates more mechanistic detail than traditional pharmacokinetic/pharmacodynamic (PK/PD) models. A series of studies illustrates diverse facets of our approach. First, we demonstrate two case studies: a PK data analysis and an inflammation response--focused on a single MPS, the liver/immune MPS. Building on the single MPS modeling, a theoretical investigation of a four-MPS interactome then provides a quantitative way to consider several pharmacological concepts such as absorption, distribution, metabolism, and excretion in the design of multi-MPS interactome operation and experiments.

  14. Quantitative Microscopic Analysis of Myelinated Nerve Fibers

    NARCIS (Netherlands)

    Prodanov, D.P.; Feierabend, Hans K.P.; Marani, Enrico; Flynn, Cian E.; Callaghan, Brandon R.

    2010-01-01

    Neuroanatomy is the study of the anatomical organization of the brain. Reciprocal communication between the brain and the cardiovascular system is important in sustaining neurobehavioral states that allow organisms to cope with their environment. Furthermore, in vertebrate animals, the routes that

  15. Quantitative analysis of cascade impactor samples - revisited

    Science.gov (United States)

    Orlić, I.; Chiam, S. Y.; Sanchez, J. L.; Tang, S. M.

    1999-04-01

    Concentrations of aerosols collected in Singapore during the three-month-long haze period that affected the whole South-East Asian region in 1997 are reported. Aerosol samples were continuously collected using a fine aerosol sampler (PM2.5) and occasionally with a single-orifice cascade impactor (CI) sampler. Our results show that in the fine fraction (<2.5 μm) the concentrations of two well-known biomass burning products, i.e. K and S, were generally increased by a factor of 2-3 compared to the non-hazy periods. However, a discrepancy was noticed, at least for elements with lower atomic number (Ti and below), between the results obtained by the fine aerosol sampler and the cascade impactor. Careful analysis by means of nuclear microscopy, in particular by the Scanning Transmission Ion Microscopy (STIM) technique, revealed that the thicknesses of the lower CI stages exceeded thick-target limits for 2 MeV protons. Detailed depth profiles of all CI stages were therefore measured using the STIM technique and concentrations corrected for absorption and proton energy loss. After correcting results for the actual sample thickness, concentrations of all major elements (S, Cl, K, Ca) agreed much better with the PM2.5 results. The importance of implementing thick-target corrections in the analysis of CI samples, especially those collected in urban environments, is emphasized. The broad-beam PIXE analysis approach is certainly not adequate in these cases.

  16. Quantitative analysis of Li by PIGE technique

    Science.gov (United States)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ - 478 keV) was measured over the proton energy range 2.0-4.2 MeV. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on a code - Emitted Radiation Yield Analysis (ERYA) - which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
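    The ERYA approach described above rests on the thick-target yield integral, Y(E0) ∝ ∫ σ(E)/S(E) dE, taken from the beam energy down the proton slowing-down path. A numerical sketch follows; the σ(E) and S(E) shapes are made-up placeholders in arbitrary units, not the measured 7Li data.

```python
# Thick-target PIGE yield as integrated by ERYA-style codes:
# Y(E0) ~ integral of sigma(E)/S(E) dE over the proton slowing-down path.
# sigma(E) and S(E) below are toy shapes in arbitrary units, not the
# measured 7Li(p,p gamma) data.
import numpy as np

E = np.linspace(2.0, 4.2, 221)                       # proton energy grid, MeV
sigma = 1.0 + 4.0 * np.exp(-((E - 3.0) / 0.3) ** 2)  # toy excitation function
S = 2.0 * E ** -0.8                                  # toy stopping power

def trapezoid(y, x):
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def thick_target_yield(E0):
    """Relative gamma yield for incident energy E0 (arbitrary units)."""
    mask = E <= E0
    return trapezoid((sigma / S)[mask], E[mask])

print(thick_target_yield(4.2) > thick_target_yield(3.0) > 0.0)  # → True
```

    With the measured excitation function and tabulated stopping powers in place of the toy curves, the same integral gives absolute yields for standard-free quantification.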

  17. Analysis of generalized interictal discharges using quantitative EEG.

    Science.gov (United States)

    da Silva Braga, Aline Marques; Fujisao, Elaine Keiko; Betting, Luiz Eduardo

    2014-12-01

    Experimental evidence from animal models of absence seizures suggests a focal source for the initiation of generalized spike-and-wave (GSW) discharges. Furthermore, clinical studies indicate that patients diagnosed with idiopathic generalized epilepsy (IGE) exhibit focal electroencephalographic abnormalities, which involve the thalamo-cortical circuitry. This circuitry is a key network that has been implicated in the initiation of generalized discharges and may contribute to the pathophysiology of GSW discharges. Quantitative electroencephalogram (qEEG) analysis may be able to detect abnormalities associated with the initiation of GSW discharges. The objective of this study was to determine whether interictal GSW discharges exhibit focal characteristics using qEEG analysis. In this study, 75 EEG recordings from 64 patients were analyzed. All EEG recordings analyzed contained at least one GSW discharge. EEG recordings were obtained by a 22-channel recorder with electrodes positioned according to the international 10-20 system of electrode placement. EEG activity was recorded for 20 min, including photic stimulation and hyperventilation. The EEG recordings were visually inspected, and the first unequivocally confirmed generalized spike was marked for each discharge. Three methods of source imaging analysis were applied: dipole source imaging (DSI), classical LORETA analysis recursively applied (CLARA), and equivalent dipole of independent components with cluster analysis. A total of 753 GSW discharges were identified and spatiotemporally analyzed. Source evaluation analysis using all three techniques revealed that the frontal lobe was the principal source of GSW discharges (70%), followed by the parietal and occipital lobes (14%), and the basal ganglia (12%). The main anatomical sources of GSW discharges were the anterior cingulate cortex (36%) and the medial frontal gyrus (23%). Source analysis did not reveal a common focal source of GSW discharges. However

  18. Financial indicators for municipalities: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    Srečko Devjak

    2009-12-01

    From the characterization of Local Authority financing models and structures in Portugal and Slovenia, a set of financial and generic budget indicators has been established. These indicators may be used in a comparative analysis considering the Bragança District in Portugal and municipalities of similar population size in Slovenia. The research identified significant differences in terms of financing sources, due to discrepancies between the financial models and competences of municipalities in each country. The results show that Portuguese and Slovenian municipalities had similar ranking behaviour for the economy indicator in 2003, but changed this behaviour in 2004.

  19. QUANTITATIVE ANALYSIS OF DRAWING TUBES MICROSTRUCTURE

    Directory of Open Access Journals (Sweden)

    Maroš Martinkovič

    2009-05-01

    Final properties of formed pieces are affected by production, first of all by the conditions of mechanical working. Application of stereology methods to the statistical reconstruction of the three-dimensional structure of material plastically deformed by bulk forming allows a detailed analysis of changes in material structure. The microstructure of cold-drawn tubes of STN 411353 steel was analyzed. Grain boundary orientation was measured on perpendicular and parallel sections of tubes with different degrees of deformation. Macroscopic deformation leads to deformation of grain boundaries, and these were compared.

  20. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...... time-to-event characteristic of interest. Real genetic longevity studies based on female animals of different species (sows, dairy cows, and sheep) exemplify the use of the methods. Moreover, these studies allow us to understand some genetic mechanisms related to the length of the productive life...

  1. Segmentation and Quantitative Analysis of Epithelial Tissues.

    Science.gov (United States)

    Aigouy, Benoit; Umetsu, Daiki; Eaton, Suzanne

    2016-01-01

    Epithelia are tissues that regulate exchanges with the environment. They are very dynamic and can acquire virtually any shape; at the cellular level, they are composed of cells tightly connected by junctions. Most often epithelia are amenable to live imaging; however, the large number of cells composing an epithelium and the absence of informatics tools dedicated to epithelial analysis largely prevented tissue scale studies. Here we present Tissue Analyzer, a free tool that can be used to segment and analyze epithelial cells and monitor tissue dynamics.

  2. Quantum integrable systems. Quantitative methods in biology

    CERN Document Server

    Feverati, Giovanni

    2011-01-01

    Quantum integrable systems have very strong mathematical properties that allow an exact description of their energetic spectrum. From the Bethe equations, I formulate the Baxter "T-Q" relation, which is the starting point of two complementary approaches based on nonlinear integral equations. The first one is known as the thermodynamic Bethe ansatz, the second one as Klümper-Batchelor-Pearce-Destri-de Vega. I show the steps toward the derivation of the equations for some of the models concerned. I study the infrared and ultraviolet limits and discuss the numerical approach. Higher rank integrals of motion can be obtained, so gaining some control on the eigenvectors. Afterwards, I discuss the Hubbard model in relation to the N = 4 supersymmetric gauge theory. The Hubbard model describes hopping electrons on a lattice. In the second part, I present an evolutionary model based on Turing machines. The goal is to describe aspects of real biological evolution, or Darwinism, by letting populations of algorithms evolve. ...

  3. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    Science.gov (United States)

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  5. Quantitative DNA Analysis Using Droplet Digital PCR.

    Science.gov (United States)

    Vossen, Rolf H A M; White, Stefan J

    2017-01-01

    Droplet digital PCR (ddPCR) is based on the isolated amplification of thousands of individual DNA molecules simultaneously, with each molecule compartmentalized in a droplet. The presence of amplified product in each droplet is indicated by a fluorescent signal, and the proportion of positive droplets allows the precise quantification of a given sequence. In this chapter we briefly outline the basis of ddPCR, and describe two different applications using the Bio-Rad QX200 system: genotyping copy number variation and quantification of Illumina sequencing libraries.
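    The "proportion of positive droplets" becomes an absolute concentration through a Poisson correction, since a droplet can hold more than one template molecule. A minimal computation follows; the ~0.85 nL droplet volume is a nominal figure assumed here for the QX200, not a value from the chapter.

```python
# Poisson correction behind ddPCR quantification: a droplet can hold more
# than one template, so mean copies per droplet is lambda = -ln(1 - p) for
# a positive-droplet fraction p. The ~0.85 nL droplet volume is a nominal
# figure assumed for the QX200, not a value from the chapter.
import math

def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
    """Template concentration in copies per microliter of reaction."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL

# e.g. 4,000 positive droplets out of 18,000 accepted:
print(round(ddpcr_concentration(4000, 18000), 1))  # → 295.7
```

    Without the logarithm, i.e. treating each positive droplet as exactly one copy, the concentration would be systematically underestimated at high positive fractions.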

  6. A computational tool for quantitative analysis of vascular networks.

    Directory of Open Access Journals (Sweden)

    Enrique Zudaire

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are widely used and are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed a lightweight, user-friendly software tool, AngioTool, which allows quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points per unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge.
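    The branching index (branch points per unit area) can be illustrated on a toy skeleton: flag pixels with three or more 4-connected skeleton neighbours as junctions. AngioTool's actual implementation is more elaborate; this sketch only shows the underlying idea.

```python
# Toy version of the "branching index": on a one-pixel-wide binary skeleton,
# flag pixels with >= 3 four-connected skeleton neighbours as junctions,
# then divide by area. AngioTool's real implementation is more elaborate.
import numpy as np

skeleton = np.zeros((7, 7), dtype=int)
skeleton[3, :] = 1   # horizontal vessel
skeleton[:, 3] = 1   # vertical vessel crossing it at (3, 3)

# 4-neighbour counts via shifted copies of the zero-padded image.
p = np.pad(skeleton, 1)
neighbours = sum(np.roll(np.roll(p, dy, 0), dx, 1)
                 for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)))[1:-1, 1:-1]

branch_points = (skeleton == 1) & (neighbours >= 3)
branching_index = branch_points.sum() / skeleton.size  # per pixel; use mm^2 when calibrated
print(int(branch_points.sum()))  # → 1 (the single crossing)
```

    Four-connectivity is used here because, on this cross-shaped skeleton, diagonal (8-connected) counting would also flag the pixels adjacent to the junction.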

  7. Three-dimensional quantitative analysis of the proximal femur and the pelvis in children and adolescents using an upright biplanar slot-scanning X-ray system

    Energy Technology Data Exchange (ETDEWEB)

    Szuper, Kinga; Schlegl, Adam Tibor; Vermes, Csaba; Somoskeoey, Szabolcs; Than, Peter [University of Pecs, Department of Orthopaedics, Institute of Musculoskeletal Surgery, Clinical Centre, Pecs (Hungary); Leidecker, Eleonora [University of Pecs, Institute of Physiotherapy and Nutritional Sciences, Faculty of Health Sciences, Pecs (Hungary)

    2015-03-01

    The anatomy and biomechanics of the pelvis and lower limbs play a key role in the development of orthopaedic disorders. This study aimed to establish normal reference standards for the measurement of gender-specific pelvic and femoral parameters in children and adolescents with the EOS 2-D/3-D system. EOS 2-D images of 508 individuals (ages 4-16 years) were obtained as part of routine diagnostics. Patients with lower limb abnormalities were excluded. Pelvic and femoral surface 3-D models were generated and clinical parameters calculated by sterEOS 3-D reconstruction software. Data were evaluated using Spearman correlation, paired-samples and independent-samples t-test and linear regression analysis. Changes in anatomical parameters were found to correlate with age and gender in (1) femoral mechanical axis length: 27.3-43.7 cm (males), 25.5-41.2 cm (females), (2) femoral head diameter: 29.4-46.1 mm (males), 27.7-41.3 mm (females), (3) femoral offset: 26.8-42.4 mm (males), 25.5-37.9 mm (females) and (4) femoral neck length: 35.1-52.9 mm (males), 32.8-48.1 mm (females). There was no gender-specific correlation for the neck shaft angle, with values from 130.4° to 129.3°, for femoral torsion (22.5°-19.4°), for sacral slope (39.0°-44.4°) and for lateral pelvic tilt (5.1 mm-6.2 mm). Sagittal pelvic tilt exhibited no significant correlation with age, showing average values of 6.5°. The EOS 2-D/3-D system proved to be a valuable method in the evaluation of female and male developmental changes in pelvic and lower limb anatomical parameters in normal individuals younger than 16 years of age. (orig.)

  8. Quantitative Fissile Assay In Used Fuel Using LSDS System

    Directory of Open Access Journals (Sweden)

    Lee YongDeok

    2017-01-01

    A quantitative assay of isotopic fissile materials (U235, Pu239, Pu241) was done at the Korea Atomic Energy Research Institute (KAERI), using a lead slowing-down spectrometer (LSDS). The optimum design of the LSDS was based on economics, easy maintenance and assay effectiveness. The LSDS system consists of the spectrometer, a neutron source, detection and control. The LSDS induces fissile fission, and fast neutrons are collected at the fission chamber. The detected signal has a direct relation to the mass of the fissile isotopes present. Many current commercial assay technologies have limitations in direct application to the isotopic fissile assay of spent fuel, except chemical analysis. In the designed system, the fissile assay model was set up and the correction factor for self-shielding was obtained. The isotopic fissile content assay was performed by changing the content of Pu239. Based on the fuel rod, the isotopic content was consistent within ~2% uncertainty for Pu239. By applying the covering (neutron absorber), effective shielding was obtained and the activation on the target was calculated. From the assay evaluation, the LSDS technique is a powerful and direct way to analyze the isotopic fissile content. LSDS is applicable to the nuclear fuel cycle and to spent fuel management for safety and economics. Additionally, an accurate fissile content will contribute to international transparency and credibility regarding spent fuel.

  9. Quantitative Fissile Assay In Used Fuel Using LSDS System

    Science.gov (United States)

    Lee, YongDeok; Jeon, Ju Young; Park, Chang-Je

    2017-09-01

    A quantitative assay of isotopic fissile materials (U235, Pu239, Pu241) was done at the Korea Atomic Energy Research Institute (KAERI), using a lead slowing-down spectrometer (LSDS). The optimum design of the LSDS was based on economics, easy maintenance and assay effectiveness. The LSDS system consists of the spectrometer, a neutron source, detection and control. The LSDS induces fissile fission, and fast neutrons are collected at the fission chamber. The detected signal has a direct relation to the mass of the fissile isotopes present. Many current commercial assay technologies have limitations in direct application to the isotopic fissile assay of spent fuel, except chemical analysis. In the designed system, the fissile assay model was set up and the correction factor for self-shielding was obtained. The isotopic fissile content assay was performed by changing the content of Pu239. Based on the fuel rod, the isotopic content was consistent within 2% uncertainty for Pu239. By applying the covering (neutron absorber), effective shielding was obtained and the activation on the target was calculated. From the assay evaluation, the LSDS technique is a powerful and direct way to analyze the isotopic fissile content. LSDS is applicable to the nuclear fuel cycle and to spent fuel management for safety and economics. Additionally, an accurate fissile content will contribute to international transparency and credibility regarding spent fuel.

  10. Quantitative analysis of forest fire extinction efficiency

    Directory of Open Access Journals (Sweden)

    Miguel E. Castillo-Soto

    2015-08-01

    Aim of study: To evaluate the economic extinction efficiency of forest fires, based on the study of fire combat undertaken by aerial and terrestrial means. Area of study, materials and methods: Approximately 112,000 hectares in Chile. Records of 5,876 forest fires that occurred between 1998 and 2009 were analyzed. The area further provides a validation sector for results, by incorporating databases for the years 2010 and 2012. The criteria used for measuring extinction efficiency were the economic value of forestry resources, Contraction Factor analysis and the definition of the extinction cost function. Main results: It is possible to establish a relationship between burnt area, extinction costs and economic losses. The proposed method may be used and adapted to other fire situations, requiring unit costs for aerial and terrestrial operations, the economic value of the property to be protected and speed attributes of fire spread in free advance. Research highlights: The determination of extinction efficiency in forest fire containment work and the potential projection of losses, different types of plant fuel and local conditions favoring the spread of fire broaden the admissible ranges of a, φ and Ce considerably.

  11. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    Science.gov (United States)

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. The analysis of the most important sources of variability of quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods.
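    Combining independent uncertainty components "mathematically" here amounts to a GUM-style root-sum-of-squares of the relative standard uncertainties. The component values below are illustrative, not the paper's figures.

```python
# GUM-style combination of independent relative uncertainty components by
# root-sum-of-squares. Component values are illustrative, not the paper's.
import math

components = {                          # relative standard uncertainties
    "type of microorganism": 0.20,
    "pharmaceutical product": 0.18,
    "reading/interpreting error": 0.15,
}

u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty: {u_combined:.1%}")  # → 30.8%
```

    With components of this size, the combined value stays below the 35% bound quoted in the abstract.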

  12. Quantitative methods for the analysis of electron microscope images

    DEFF Research Database (Denmark)

    Skands, Peter Ulrik Vallø

    1996-01-01

    The topic of this thesis is a general introduction to quantitative methods for the analysis of digital microscope images. The images presented have primarily been acquired from scanning electron microscopes (SEM) and interferometer microscopes (IFM). The topic is approached through several examples...... foundation of the thesis falls in the areas of: 1) mathematical morphology; 2) distance transforms and applications; and 3) fractal geometry. Image analysis opens, in general, the possibility of a quantitative and statistically well-founded measurement of digital microscope images. Herein lie also the conditions...

  13. Assessment of Clostridium difficile infections by quantitative detection of tcdB toxin by use of a real-time cell analysis system.

    Science.gov (United States)

    Ryder, Alex B; Huang, Ying; Li, Haijing; Zheng, Min; Wang, Xiaobo; Stratton, Charles W; Xu, Xiao; Tang, Yi-Wei

    2010-11-01

    We explored the use of a real-time cell analysis (RTCA) system for the assessment of Clostridium difficile toxins in human stool specimens by monitoring the dynamic responses of the HS27 cells to tcdB toxins. The C. difficile toxin caused cytotoxic effects on the cells, which resulted in a dose-dependent and time-dependent decrease in cell impedance. The RTCA assay possessed an analytical sensitivity of 0.2 ng/ml for C. difficile toxin B with no cross-reactions with other enterotoxins, nontoxigenic C. difficile, or other Clostridum species. Clinical validation was performed on 300 consecutively collected stool specimens from patients with suspected C. difficile infection (CDI). Each stool specimen was tested in parallel by a real-time PCR assay (PCR), a dual glutamate dehydrogenase and toxin A/B enzyme immunoassay (EIA), and the RTCA assay. In comparison to a reference standard in a combination of the three assays, the RTCA had a specificity of 99.6% and a sensitivity of 87.5% (28 of 32), which was higher than the EIA result (P = 0.005) but lower than the PCR result (P = 0.057). In addition, the RTCA assay allowed for quantification of toxin protein concentration in a given specimen. Among RTCA-positive specimens collected prior to treatment with metronidazole and/or vancomycin, a significant correlation between toxin protein concentrations and clinical CDI severities was observed (R(2) = 0.732, P = 0.0004). Toxin concentrations after treatment (0.89 ng/ml) were significantly lower than those prior to the treatment (15.68 ng/ml, Wilcoxon P = 0.01). The study demonstrates that the RTCA assay provides a functional tool for the potential assessment of C. difficile infections.
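    The reported severity correlation (R² = 0.732) is an ordinary coefficient of determination from a straight-line fit; a minimal computation on invented (log-concentration, severity) pairs:

```python
# Coefficient of determination (R^2) for a straight-line fit, the statistic
# behind the severity correlation quoted above. Data points are invented.
import numpy as np

log_conc = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 2.5])  # log10 toxin conc., toy
severity = np.array([1.0, 1.2, 2.1, 2.4, 3.3, 3.1])  # clinical score, toy

slope, intercept = np.polyfit(log_conc, severity, 1)
pred = slope * log_conc + intercept
r2 = 1.0 - np.sum((severity - pred) ** 2) / np.sum((severity - severity.mean()) ** 2)
print(round(float(r2), 3))
```

    R² is the fraction of variance in the severity scores explained by the fitted line; values near 0.73, as in the study, indicate a strong but imperfect linear relationship.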

  14. Quantitative assessment of the effectiveness of a rockfall warning system

    Science.gov (United States)

    Bründl, Michael; Sättele, Martina; Krautblatter, Michael; Straub, Daniel

    2016-04-01

    Rockslides and rockfalls can pose a high risk to human settlements and traffic infrastructure. In addition to structural mitigation measures like rockfall nets, warning systems are increasingly installed to reduce rockfall risks. Whereas a structured evaluation method exists for structural mitigation measures that reduce the spatial extent of events, few or no approaches are known for assessing the effectiveness of warning systems. Especially for higher-magnitude rockfalls, structural mitigation measures are not effective, and reliable early warning systems will be essential in future. In response to that, we developed a classification and a framework to assess the reliability and effectiveness of early warning systems (Sättele et al., 2015a; 2016). Here, we demonstrate an application to the rockfall warning system installed in Preonzo prior to a major rockfall in May 2012 (Sättele et al., 2015b). We show that it is necessary to design such a warning system as a fail-safe construction, which has to incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. With a hypothetical probabilistic analysis, we investigate the effect of the risk attitude of decision makers and of the number of sensors on the probability of detecting an event and initiating a timely evacuation, as well as on related intervention costs. We conclude that it is possible to quantitatively assess the effectiveness of warning systems, which helps to optimize mitigation strategies against rockfall events. References: Sättele, M., Bründl, M., and Straub, D.: Reliability and effectiveness of warning systems for natural hazards: concept and application to debris flow warning, Rel. Eng. Syst. Safety, 142, 192-202, 2015a. Sättele, M., Krautblatter, M., Bründl, M., and Straub, D.: Forecasting rock slope failure: How reliable and effective are warning systems?, Landslides, 605, 1-14, 2015b. Sättele, M., Bründl, M., and
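    One way to see why redundancy is central to a fail-safe design: with n independent sensors that each detect an event with probability p, the system misses the event only if all n sensors miss. The numbers below are illustrative, not from the Preonzo analysis.

```python
# Detection probability under sensor redundancy: n independent sensors,
# each detecting an event with probability p, miss only if all n miss.
# Numbers are illustrative, not from the Preonzo system.
def detection_probability(p_single, n_sensors):
    return 1.0 - (1.0 - p_single) ** n_sensors

for n in (1, 2, 4):
    print(n, round(detection_probability(0.9, n), 4))  # 0.9, 0.99, 0.9999
```

    Each added sensor multiplies the miss probability by (1 − p), which is why redundancy dominates the reliability of such systems far more than marginal improvements to a single sensor.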

  15. Can Patient Safety Incident Reports Be Used to Compare Hospital Safety? Results from a Quantitative Analysis of the English National Reporting and Learning System Data.

    Science.gov (United States)

    Howell, Ann-Marie; Burns, Elaine M; Bouras, George; Donaldson, Liam J; Athanasiou, Thanos; Darzi, Ara

    2015-01-01

    The National Reporting and Learning System (NRLS) collects reports about patient safety incidents in England. Government regulators use NRLS data to assess the safety of hospitals. This study aims to examine whether annual hospital incident reporting rates can be used as a surrogate indicator of individual hospital safety. Secondly, it assesses which hospital characteristics are correlated with high incident reporting rates and whether a high-reporting hospital is safer than lower-reporting hospitals. Finally, it assesses which health-care professionals report more incidents of patient harm, which report more near-miss incidents, and what hospital factors encourage reporting. These findings may suggest methods for increasing the utility of reporting systems. This study used a mixed-methods approach for assessing NRLS data. The data were investigated using Pareto analysis and regression models to establish which patients are most vulnerable to reported harm. Hospital factors were correlated with institutional reporting rates over one year to examine what factors influenced reporting. Staff survey findings regarding hospital safety culture were correlated with reported rates of incidents causing harm, no harm and death to understand what barriers influence error disclosure. 5,879,954 incident reports were collected from acute hospitals over the decade. 70.3% of incidents produced no harm to the patient and 0.9% were judged by the reporter to have caused severe harm or death. Obstetrics and Gynaecology reported the most no-harm events [OR 1.61 (95%CI: 1.12 to 2.27), p<0.01] and pharmacy was the hospital location where most near-misses were captured [OR 3.03 (95%CI: 2.04 to 4.55), p<0.01]. Clinicians were significantly more likely to report death than other staff [OR 3.04 (95%CI: 2.43 to 3.80), p<0.01]. A higher ratio of clinicians to beds correlated with a reduced rate of harm reported [RR = -1.78 (95%CI: -3.33 to -0.23), p = 0.03]. Patient satisfaction and mortality outcomes were not significantly associated with reporting rates. Staff survey responses revealed that keeping reports confidential, keeping staff informed about incidents and giving feedback on safety initiatives increased reporting rates [r = 0.26 (p…)]… patient safety

  16. Can Patient Safety Incident Reports Be Used to Compare Hospital Safety? Results from a Quantitative Analysis of the English National Reporting and Learning System Data.

    Directory of Open Access Journals (Sweden)

    Ann-Marie Howell

    The National Reporting and Learning System (NRLS) collects reports about patient safety incidents in England. Government regulators use NRLS data to assess the safety of hospitals. This study aims to examine whether annual hospital incident reporting rates can be used as a surrogate indicator of individual hospital safety. Secondly, it assesses which hospital characteristics are correlated with high incident reporting rates and whether a high-reporting hospital is safer than lower-reporting hospitals. Finally, it assesses which health-care professionals report more incidents of patient harm, which report more near-miss incidents, and what hospital factors encourage reporting. These findings may suggest methods for increasing the utility of reporting systems. This study used a mixed-methods approach for assessing NRLS data. The data were investigated using Pareto analysis and regression models to establish which patients are most vulnerable to reported harm. Hospital factors were correlated with institutional reporting rates over one year to examine what factors influenced reporting. Staff survey findings regarding hospital safety culture were correlated with reported rates of incidents causing harm, no harm and death to understand what barriers influence error disclosure. 5,879,954 incident reports were collected from acute hospitals over the decade. 70.3% of incidents produced no harm to the patient and 0.9% were judged by the reporter to have caused severe harm or death. Obstetrics and Gynaecology reported the most no-harm events [OR 1.61 (95%CI: 1.12 to 2.27), p<0.01] and pharmacy was the hospital location where most near-misses were captured [OR 3.03 (95%CI: 2.04 to 4.55), p<0.01]. Clinicians were significantly more likely to report death than other staff [OR 3.04 (95%CI: 2.43 to 3.80), p<0.01]. A higher ratio of clinicians to beds correlated with a reduced rate of harm reported [RR = -1.78 (95%CI: -3.33 to -0.23), p = 0.03]. Litigation claims per bed were

  17. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    Science.gov (United States)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  19. A Quantitative System for Studying Metastasis Using Transparent Zebrafish

    DEFF Research Database (Denmark)

    Heilmann, Silja; Ratnakumar, Kajan; Langdon, Erin M

    2015-01-01

    of transgenic mitfa-BRAF(V600E);p53(-/-) fish. We then transplanted the melanoma cells into the transparent casper strain to enable highly quantitative measurement of the metastatic process at single-cell resolution. Using computational image analysis of the resulting metastases, we generated a metastasis score...

  20. Accuracy of Image Analysis in Quantitative Study of Cement Paste

    Directory of Open Access Journals (Sweden)

    Feng Shu-Xia

    2016-01-01

Full Text Available Quantitative study of cement paste, especially blended cement paste, has been a hot and difficult issue over the years, and the technique of backscattered electron image analysis has shown unique advantages in this field. This paper compared the test results for cement hydration degree, Ca(OH)2 content and pore size distribution in pure pastes obtained by image analysis and by other methods. The accuracy of quantitative study by image analysis was then analyzed. The results showed that image analysis displayed higher accuracy in quantifying cement hydration degree and Ca(OH)2 content than the non-evaporable water test and thermal analysis, respectively.

  1. Quantitative analysis of microtubule transport in growing nerve processes

    DEFF Research Database (Denmark)

    Ma*, Ytao; Shakiryanova*, Dinara; Vardya, Irina;

    2004-01-01

    the translocation of MT plus ends in the axonal shaft by expressing GFP-EB1 in Xenopus embryo neurons in culture. Formal quantitative analysis of MT assembly/disassembly indicated that none of the MTs in the axonal shaft were rapidly transported. Our results suggest that transport of axonal MTs is not required...

  2. Quantitative Analysis of Complex Tropical Forest Stands: A Review ...

    African Journals Online (AJOL)

    FIRST LADY

    The importance of data analysis in quantitative assessment of natural resources ... 350 km long extends from the eastern border of Sierra Leone all the way to. Ghana. .... consider whether data will likely fit the assumptions of a selected model. ... These tests are not alternatives to parametric tests, but rather are a means of.

  3. Analysis of Forecasting Sales By Using Quantitative And Qualitative Methods

    Directory of Open Access Journals (Sweden)

    B. Rama Sanjeeva Sresta,

    2016-09-01

Full Text Available This paper focuses on the analysis of sales forecasting using quantitative and qualitative methods. The forecast should help create a model for measuring success and setting goals from financial and operational viewpoints. The resulting model should tell whether we have met our goals with respect to measures, targets, and initiatives.

  4. Insights Into Quantitative Biology: analysis of cellular adaptation

    OpenAIRE

    Agoni, Valentina

    2013-01-01

In recent years, many powerful techniques have emerged to measure protein interactions as well as gene expression. Much progress has been made since the introduction of these techniques, but little of it toward quantitative analysis of the data. In this paper we show how to study cellular adaptation and how to detect cellular subpopulations. Moreover, we analyze the dynamics of signal transduction pathways in greater depth.

  5. Influence of hydrodynamic conditions on quantitative cellular assays in microfluidic systems.

    Science.gov (United States)

    Yin, Huabing; Zhang, Xunli; Pattrick, Nicola; Klauke, Norbert; Cordingley, Hayley C; Haswell, Stephen J; Cooper, Jonathan M

    2007-09-15

    This study demonstrates the importance of the hydrodynamic environment in microfluidic systems in quantitative cellular assays using live cells. Commonly applied flow conditions used in microfluidics were evaluated using the quantitative intracellular Ca2+ analysis of Chinese hamster ovary (CHO) cells as a model system. Above certain thresholds of shear stress, hydrodynamically induced intracellular Ca2+ fluxes were observed which mimic the responses induced by chemical stimuli, such as the agonist uridine 5'-triphosphate tris salt (UTP). This effect is of significance given the increasing application of microfluidic devices in high-throughput cellular analysis for biophysical applications and pharmacological screening.
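The shear-stress thresholds at issue depend on flow rate and channel geometry; for a wide, shallow rectangular microchannel the wall shear stress is commonly approximated as τ = 6μQ/(wh²). A minimal sketch of that estimate (function name and example numbers are illustrative, not taken from the study):

```python
def wall_shear_stress(flow_rate, width, height, viscosity=1e-3):
    """Wall shear stress in a wide rectangular microchannel,
    tau = 6 * mu * Q / (w * h**2), a standard approximation used to
    judge whether flow conditions stay below cell-activation thresholds.
    All quantities in SI units; defaults assume a water-like fluid."""
    return 6 * viscosity * flow_rate / (width * height**2)

# Example: 1 uL/min through a 500 um x 50 um channel.
tau = wall_shear_stress(flow_rate=1e-9 / 60, width=500e-6, height=50e-6)
```

This gives tau on the order of 0.1 Pa, well within the range where the choice of flow condition matters for live-cell assays.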

  6. Quantitating the subtleties of microglial morphology with fractal analysis.

    Science.gov (United States)

    Karperien, Audrey; Ahammer, Helmut; Jelinek, Herbert F

    2013-01-01

    It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.

  7. Quantitating the Subtleties of Microglial Morphology with Fractal Analysis

    Directory of Open Access Journals (Sweden)

    Audrey eKarperien

    2013-01-01

Full Text Available It is well established that microglial form and function are inextricably linked. In recent years, the traditional view that microglial form ranges between "ramified resting" and "activated amoeboid" has been emphasized through advancing imaging techniques that point to microglial form being highly dynamic even within the currently accepted morphological categories. Moreover, microglia adopt meaningful intermediate forms between categories, with considerable crossover in function and varying morphologies as they cycle, migrate, wave, phagocytose, and extend and retract fine and gross processes. From a quantitative perspective, it is problematic to measure such variability using traditional methods, but one way of quantitating such detail is through fractal analysis. The techniques of fractal analysis have been used for quantitating microglial morphology, to categorize gross differences but also to differentiate subtle differences (e.g., amongst ramified cells). Multifractal analysis in particular is one technique of fractal analysis that may be useful for identifying intermediate forms. Here we review current trends and methods of fractal analysis, focusing on box counting analysis, including lacunarity and multifractal analysis, as applied to microglial morphology.
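The box-counting procedure at the core of these fractal analyses can be sketched in a few lines: cover the binary image with boxes of side s, count the non-empty boxes N(s), and estimate the dimension D from the slope of log N(s) versus log s. A minimal illustration (not the authors' code; a filled square recovers D ≈ 2):

```python
import numpy as np

def box_counting_dimension(img, scales=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary image.

    For each box size s, count the boxes containing at least one
    foreground pixel, then fit the slope of log N(s) vs log s,
    using N(s) ~ s**(-D)."""
    counts = []
    h, w = img.shape
    for s in scales:
        n = 0
        for i in range(0, h, s):
            for j in range(0, w, s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(scales), np.log(counts), 1)
    return -slope

# Sanity check: a filled square is a trivial 2-D set, so D should be ~2.
square = np.ones((64, 64), dtype=bool)
d = box_counting_dimension(square)
```

Real microglial analyses apply this to thresholded cell silhouettes, often with many more scales and grid offsets.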

  8. Quantitative modelling in design and operation of food supply systems

    NARCIS (Netherlands)

    Beek, van P.

    2004-01-01

During the last two decades food supply systems have drawn interest not only from food technologists but also from the field of Operations Research and Management Science. Operations Research (OR) is concerned with quantitative modelling and can be used to gain insight into the optimal configuration and operation

  9. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    Science.gov (United States)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in in vivo animal models and performed speckle analysis to assess blood flow.
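At its simplest, inter-A-scan speckle decorrelation is one minus the normalized cross-correlation of successive A-scans: identical scans decorrelate to ~0, while fully refreshed speckle decorrelates to ~1. A toy sketch on synthetic data (not the authors' processing chain):

```python
import numpy as np

def ascan_decorrelation(a1, a2):
    """Decorrelation between two successive OCT A-scans:
    1 minus the normalized zero-lag cross-correlation."""
    a1 = a1 - a1.mean()
    a2 = a2 - a2.mean()
    rho = np.dot(a1, a2) / (np.linalg.norm(a1) * np.linalg.norm(a2))
    return 1.0 - rho

rng = np.random.default_rng(0)
scan = rng.normal(size=512)
# Identical A-scans (no transverse motion): decorrelation ~ 0.
static = ascan_decorrelation(scan, scan.copy())
# Independent speckle (fast transverse flow): decorrelation ~ 1.
moving = ascan_decorrelation(scan, rng.normal(size=512))
```

Mapping intermediate decorrelation values to flow speed requires the kind of calibration against known speeds described in the abstract.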

  10. Application of Quantitative Analysis in Brain SPECT Imaging of Neuropsychiatric Systemic Lupus Erythematosus

    Institute of Scientific and Technical Information of China (English)

    许守林; 冯雪凤; 施鸣

    2013-01-01

Objective: To elucidate a method for quantitative analysis of 99mTc-ECD brain SPECT imaging in neuropsychiatric systemic lupus erythematosus (SLE) and the correlation between visual and quantitative analysis. Methods: 99mTc-ECD SPECT imaging was performed in 33 SLE patients and 29 age-matched healthy controls. The results were assessed by visual and quantitative comparison, and the images were analyzed with brain search (BS) software. Results: Changes in cerebral blood flow, especially decreases in regional cerebral blood flow, were associated with serious neuropsychiatric SLE presentations. The cingulate gyrus and temporo-parietal regions were the most commonly involved areas under the unpaired t-test. The positive rate of SPECT imaging was 51.51% by visual analysis and 57.57% by quantitative analysis; the two methods were highly correlated. Conclusions: 99mTc-ECD SPECT can readily demonstrate metabolic and functional lesions and cerebral blood flow changes in the absence of structural abnormalities. The sensitivity of cerebral blood flow perfusion SPECT imaging was high, but it lacked specificity. The combination of brain perfusion SPECT and brain search (BS) was a convenient way to localize the dysfunctional areas of the brain. 99mTc-ECD SPECT was a useful and objective method for detecting perfusion abnormalities in SLE patients.

  11. Quantitative numerical analysis of transient IR-experiments on buildings

    Science.gov (United States)

    Maierhofer, Ch.; Wiggenhauser, H.; Brink, A.; Röllig, M.

    2004-12-01

Impulse-thermography has been established as a fast and reliable tool in many areas of non-destructive testing. In recent years several investigations have been carried out to apply active thermography to civil engineering. For quantitative investigations in this area of application, finite difference calculations have been performed for systematic studies on the influence of environmental conditions, heating power and time, defect depth and size, and the thermal properties of the bulk material (concrete). The comparison of simulated and experimental data enables the quantitative analysis of defects.
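The finite difference calculations referred to can be illustrated with a 1-D explicit (FTCS) scheme for transient heat conduction after a surface pulse; the cited work uses more realistic 2-D/3-D models of concrete, so every parameter below is purely illustrative:

```python
import numpy as np

def heat_pulse_1d(alpha=1e-6, L=0.05, nx=51, dt=0.1, steps=2000, q=10.0):
    """Explicit finite-difference (FTCS) solution of 1-D transient heat
    conduction in a slab after an instantaneous surface heat pulse:
    dT/dt = alpha * d2T/dx2. Returns the surface temperature history."""
    dx = L / (nx - 1)
    r = alpha * dt / dx**2
    assert r <= 0.5          # FTCS stability criterion
    T = np.zeros(nx)
    T[0] = q                 # instantaneous surface heating
    surface = []
    for _ in range(steps):
        Tn = T.copy()
        Tn[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
        Tn[0] = Tn[1]        # adiabatic front face after the pulse
        Tn[-1] = Tn[-2]      # adiabatic back face
        T = Tn
        surface.append(T[0])
    return np.array(surface)

curve = heat_pulse_1d()
```

A subsurface defect would perturb this cooling curve, which is the contrast that impulse-thermography exploits.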

  12. Quantitative analysis of culture using millions of digitized books.

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva Presser; Veres, Adrian; Gray, Matthew K; Pickett, Joseph P; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A; Aiden, Erez Lieberman

    2011-01-14

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of 'culturomics,' focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. Culturomics extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities.

  13. Quantitative analysis of culture using millions of digitized books

    Science.gov (United States)

    Michel, Jean-Baptiste; Shen, Yuan Kui; Aiden, Aviva P.; Veres, Adrian; Gray, Matthew K.; Pickett, Joseph P.; Hoiberg, Dale; Clancy, Dan; Norvig, Peter; Orwant, Jon; Pinker, Steven; Nowak, Martin A.; Aiden, Erez Lieberman

    2011-01-01

    We constructed a corpus of digitized texts containing about 4% of all books ever printed. Analysis of this corpus enables us to investigate cultural trends quantitatively. We survey the vast terrain of ‘culturomics’, focusing on linguistic and cultural phenomena that were reflected in the English language between 1800 and 2000. We show how this approach can provide insights about fields as diverse as lexicography, the evolution of grammar, collective memory, the adoption of technology, the pursuit of fame, censorship, and historical epidemiology. ‘Culturomics’ extends the boundaries of rigorous quantitative inquiry to a wide array of new phenomena spanning the social sciences and the humanities. PMID:21163965
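The underlying culturomics measurement is a relative n-gram frequency per year; on a toy corpus it reduces to a few lines (function name assumed, 1-grams only):

```python
from collections import Counter

def ngram_frequency(corpus_by_year, word):
    """Relative frequency of a 1-gram per year, culturomics-style.

    Toy version: the real study tokenized ~4% of all books ever
    printed; here corpus_by_year maps a year to a plain-text string."""
    trend = {}
    for year, text in corpus_by_year.items():
        tokens = text.lower().split()
        trend[year] = Counter(tokens)[word] / len(tokens)
    return trend

corpus = {
    1900: "the telegraph the telegraph and the post",
    2000: "the internet and the web and the internet",
}
t = ngram_frequency(corpus, "telegraph")
```

Plotting such per-year frequencies over two centuries is what produces the lexical and cultural trend curves the paper surveys.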

  14. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    Science.gov (United States)

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  15. Communication about vaccinations in Italian websites: a quantitative analysis.

    Science.gov (United States)

    Tafuri, Silvio; Gallone, Maria S; Gallone, Maria F; Zorico, Ivan; Aiello, Valeria; Germinario, Cinzia

    2014-01-01

Babies' parents and people who look for information about vaccination often visit anti-vaccine movements' websites, blogs by naturopathic physicians, or natural and alternative medicine practitioners. The aim of this work is to provide a quantitative analysis of the type of information available to Italian people regarding vaccination and a quality analysis of the websites retrieved through our searches. A quality score was created to evaluate the technical level of websites. A search was performed through Yahoo, Google, and MSN using the keywords "vaccine" and "vaccination," combined with the function "OR," in order to identify the most frequently used websites. The 2 keywords were input in Italian, and the first 15 pages retrieved by each search engine were analyzed. 149 websites were selected through this methodology. Fifty-three percent of the websites belonged to associations, groups, or scientific companies, 32.2% (n = 48) consisted of personal blogs, and 14.8% (n = 22) belonged to National Health System offices. Among all analyzed websites, 15.4% (n = 23) came from anti-vaccine movement groups. 37.6% reported the webmaster's name, 67.8% a webmaster e-mail address, 28.6% indicated the date of the last update, and 46.6% the author's name. The quality score for government sites was on average higher than for anti-vaccine websites, although government sites do not use Web 2.0 functions, such as forums. National Health System institutions that have to promote vaccination cannot avoid investing in web communication, because it cannot be managed by private efforts alone but must be the result of synergy among Public Health bodies, private and scientific associations, and social movements.

  16. Foundational and Systems Support for Quantitative Trust Management (QTM)

    Science.gov (United States)

    2016-06-21

Slide fragments recovered from the report: an example says-logic policy, P_pat says O_hosp says P_pat access(pat, record(pat)), expressing that patients are allowed to ask a hospital for their medical records. Application areas listed include service compositions in the GIG (reusing components/subsystems in complex DoD systems), social networks, medical systems, and cloud computing. On Quantitative Trust Management (QTM): effective for delegated credentials and access enforcement; cannot handle uncertainty and partial information.

  17. Quantitative assessments of distributed systems methodologies and techniques

    CERN Document Server

    Bruneo, Dario

    2015-01-01

Distributed systems employed in critical infrastructures must fulfill dependability, timeliness, and performance specifications. Since these systems most often operate in an unpredictable environment, their design and maintenance require quantitative evaluation of deterministic and probabilistic timed models. This need gave birth to an abundant literature devoted to formal modeling languages combined with analytical and simulative solution techniques. The aim of the book is to provide an overview of techniques and methodologies dealing with such specific issues in the context of distributed systems.

  18. A GPGPU accelerated modeling environment for quantitatively characterizing karst systems

    Science.gov (United States)

    Myre, J. M.; Covington, M. D.; Luhmann, A. J.; Saar, M. O.

    2011-12-01

The ability to derive quantitative information on the geometry of karst aquifer systems is highly desirable. Knowing the geometric makeup of a karst aquifer system enables quantitative characterization of the system's response to hydraulic events. However, the relationship between flow path geometry and karst aquifer response is not well understood. One method to improve this understanding is the use of high-speed modeling environments, which offer great potential in this regard as they allow researchers to improve their understanding of the modeled karst aquifer through fast quantitative characterization. To that end, we have implemented a finite difference model using General Purpose Graphics Processing Units (GPGPUs). GPGPUs are special-purpose accelerators capable of high-speed and highly parallel computation. The GPGPU architecture is a grid-like structure, making it a natural fit for structured systems like finite difference models. To characterize the highly complex nature of karst aquifer systems, our modeling environment is designed to use an inverse method for parameter tuning. Using an inverse method reduces the total parameter space that must be searched to produce a set of parameters describing a system of good fit. Systems of good fit are determined by comparison to reference storm responses. To obtain reference storm responses we have collected data from a series of data-loggers measuring water depth, temperature, and conductivity at locations along a cave stream with a known geometry in southeastern Minnesota. By comparing the modeled response to the reference responses, the model parameters can be tuned to quantitatively characterize the geometry, and thus the response, of the karst system.
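The inverse method described amounts to searching parameter space for the simulated response that best matches a reference storm response. A schematic brute-force version, with a toy exponential-recession "model" standing in for the GPGPU finite difference simulation (all names and values illustrative):

```python
import numpy as np

def inverse_fit(model, observed, param_grid):
    """Brute-force inverse method: pick the parameter whose simulated
    response best matches the reference (observed) storm response,
    by sum-of-squares misfit."""
    best_p, best_err = None, float("inf")
    for p in param_grid:
        err = float(np.sum((model(p) - observed) ** 2))
        if err < best_err:
            best_p, best_err = p, err
    return best_p

t = np.linspace(0, 1, 50)
# Toy "karst storm response": exponential recession with decay rate k.
model = lambda k: np.exp(-k * t)
observed = np.exp(-3.0 * t)
k_hat = inverse_fit(model, observed, param_grid=np.linspace(0.5, 5, 10))
```

A real application would tune several geometric parameters at once, which is exactly where the GPGPU speedup pays off.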

  19. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models of Parkinson's disease (PD), Huntington's disease (HD) and stroke using the CatWalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod test for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine into the striatum or the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotically induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the CatWalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  20. Spotsizer: High-throughput quantitative analysis of microbial growth

    Science.gov (United States)

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  1. Quantitative produced water analysis using mobile 1H NMR

    Science.gov (United States)

    Wagner, Lisabeth; Kalli, Chris; Fridjonsson, Einar O.; May, Eric F.; Stanwix, Paul L.; Graham, Brendan F.; Carroll, Matthew R. J.; Johns, Michael L.

    2016-10-01

Measurement of oil contamination of produced water is required in the oil and gas industry to the parts-per-million (ppm) level prior to discharge in order to meet typical environmental legislative requirements. Here we present the use of compact, mobile 1H nuclear magnetic resonance (NMR) spectroscopy, in combination with solid phase extraction (SPE), to meet this metrology need. The NMR hardware employed featured a sufficiently homogeneous magnetic field that chemical shift differences could be used to unambiguously differentiate, and hence quantitatively detect, the required oil and solvent NMR signals. A solvent system consisting of 1% v/v chloroform in tetrachloroethylene was deployed; this provided a comparable 1H NMR signal intensity for the oil and the solvent (chloroform), and hence an internal reference 1H signal from the chloroform, making the measurement effectively self-calibrating. The measurement process was applied to water contaminated with hexane or crude oil over the range 1-30 ppm. The results were validated against known solubility limits as well as infrared analysis and gas chromatography.
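Because the chloroform provides an internal reference signal of known equivalent concentration, quantification reduces to a peak-area ratio; schematically (the function name and calibration constant are assumptions, not values from the paper):

```python
def oil_ppm_from_nmr(area_oil, area_ref, ref_ppm_equivalent):
    """Self-calibrating NMR quantification: the internal chloroform
    reference contributes a 1H peak of known equivalent concentration,
    so the oil concentration follows from the peak-area ratio."""
    return ref_ppm_equivalent * area_oil / area_ref

# If the oil peak integrates to twice the reference peak, and the
# reference corresponds to 10 ppm-equivalent, the oil reads as 20 ppm.
ppm = oil_ppm_from_nmr(area_oil=2.0, area_ref=1.0, ref_ppm_equivalent=10.0)
```

The self-calibration is what lets the compact instrument work without a separate external standard for every measurement.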

  2. Quantitative analysis of brain magnetic resonance imaging for hepatic encephalopathy

    Science.gov (United States)

    Syh, Hon-Wei; Chu, Wei-Kom; Ong, Chin-Sing

    1992-06-01

    High intensity lesions around ventricles have recently been observed in T1-weighted brain magnetic resonance images for patients suffering hepatic encephalopathy. The exact etiology that causes magnetic resonance imaging (MRI) gray scale changes has not been totally understood. The objective of our study was to investigate, through quantitative means, (1) the amount of changes to brain white matter due to the disease process, and (2) the extent and distribution of these high intensity lesions, since it is believed that the abnormality may not be entirely limited to the white matter only. Eleven patients with proven haptic encephalopathy and three normal persons without any evidence of liver abnormality constituted our current data base. Trans-axial, sagittal, and coronal brain MRI were obtained on a 1.5 Tesla scanner. All processing was carried out on a microcomputer-based image analysis system in an off-line manner. Histograms were decomposed into regular brain tissues and lesions. Gray scale ranges coded as lesion were then brought back to original images to identify distribution of abnormality. Our results indicated the disease process involved pallidus, mesencephalon, and subthalamic regions.

  3. The Image Quality of a Digital Chest X-Ray Radiography System: Comparison of Quantitative Image Quality Analysis and Radiologists' Visual Scoring

    Energy Technology Data Exchange (ETDEWEB)

Nam, Ji Ho [Dept. of Radiation Oncology, Yongsan Hospital, Pusan National University College of Medicine, Yongsan (Korea, Republic of); Chung, Myung Jin [Dept. of Radiology, Samsung Medical Center, Seoul (Korea, Republic of); Park, Darl; Kim, Won Taek; Kim, Yong Ho; Ki, Yong Kan; Kim, Dong Hyun; Lee, Ju Hee; Kim, Dong Won [Dept. of Radiation Oncology, Yongsan Hospital, Pusan National University College of Medicine, Yongsan (Korea, Republic of); Jeon, Ho Sang [Research Institute for Convergence of Biomedical Science and Technology, Yongsan Hospital, Pusan National University College of Medicine, Yongsan (Korea, Republic of)

    2011-11-15

To maintain high-quality images for radiologists, and to protect patients from radiation over-exposure, the performance of imaging devices should be periodically monitored. This study determined the most suitable engineering standard for evaluating the imaging performance of digital thoracic X-ray images. IEC 62220-1 standards were used to evaluate the performance of the images. In addition, radiologists scored the visibility of the overall image, a pneumothorax, and the humeral head in anthropomorphic thoracic phantom images. The rank correlation coefficient (ρ) of the radiologists' visual scoring with system spatial resolution was not significant (p = 0.295), but the correlation with image noise was (ρ = -0.9267). Finally, the noise equivalent quanta (NEQ) showed a high rank correlation with the radiologists' visual scoring (ρ = 0.9320). The radiologists' image quality evaluations were mainly affected by image noise; hence, the engineering standard for evaluating image noise is the most important index for effectively monitoring the performance of X-ray imaging. Additionally, the NEQ can be used to evaluate the performance of radiographic systems, because it theoretically corresponds to the overall image quality of the system.
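The rank correlations reported are Spearman coefficients; for data without ties the statistic is ρ = 1 − 6Σd²/(n(n² − 1)), where d is the per-observation rank difference. A compact sketch (toy data, not the study's scores):

```python
def spearman_rho(x, y):
    """Spearman rank correlation for tie-free data, as used to compare
    visual quality scores against physical metrics such as NEQ."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n**2 - 1))

# Perfectly monotone data gives rho = 1; reversed data gives rho = -1.
rho_up = spearman_rho([1, 2, 3, 4], [10, 20, 30, 40])
rho_down = spearman_rho([1, 2, 3, 4], [40, 30, 20, 10])
```

For real scoring data with ties, a tie-corrected implementation (e.g. from a statistics library) should be used instead.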

  4. The importance of quantitative systemic thinking in medicine.

    Science.gov (United States)

    West, Geoffrey B

    2012-04-21

    The study and practice of medicine could benefit from an enhanced engagement with the new perspectives provided by the emerging areas of complexity science and systems biology. A more integrated, systemic approach is needed to fully understand the processes of health, disease, and dysfunction, and the many challenges in medical research and education. Integral to this approach is the search for a quantitative, predictive, multilevel, theoretical conceptual framework that both complements the present approaches and stimulates a more integrated research agenda that will lead to novel questions and experimental programmes. As examples, the importance of network structures and scaling laws are discussed for the development of a broad, quantitative, mathematical understanding of issues that are important in health, including ageing and mortality, sleep, growth, circulatory systems, and drug doses. A common theme is the importance of understanding the quantifiable determinants of the baseline scale of life, and developing corresponding parameters that define the average, idealised, healthy individual.
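One of the scaling laws mentioned, the 3/4-power dependence of metabolic rate on body mass, underlies interspecies dose scaling; a one-line illustration (whether the exponent applies to any particular drug is, of course, an empirical question):

```python
def allometric_dose(dose_ref, mass_ref, mass, exponent=0.75):
    """Scale a reference dose across body masses using the 3/4-power
    metabolic scaling law: dose = dose_ref * (mass/mass_ref)**exponent.
    The exponent is the classic allometric value, not drug-specific."""
    return dose_ref * (mass / mass_ref) ** exponent

# A 16x heavier animal needs only 8x the dose under 3/4-power scaling,
# not 16x as naive per-kilogram scaling would suggest.
d = allometric_dose(dose_ref=1.0, mass_ref=1.0, mass=16.0)
```

This is exactly the kind of quantifiable baseline the author argues should anchor a theoretical framework for medicine.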

  5. Optimization Research of Mine Skip Quantitative Loading System

    Science.gov (United States)

    Wang, Shuang; Hu, Kun; Cheng, Gang; Li, De-yong

    2016-06-01

The magnitude and variation of the impact load that coal material applies to the skip are studied for the quantitative loading system of a mine skip. Based on the impulse-momentum theorem and reasonable assumptions, a formula for the impact force of the coal material is derived, and the characteristics of this impact force on the quantitative loading system are analyzed. The process of coal falling from the quantitative conveyor into the skip is analyzed with a discrete element simulation, yielding the distributed impact load of the coal material at the skip bottom. The results show that the coal material produces a large impact force (687 N) on the skip bottom at the moment it falls into the skip; the force then decreases rapidly to about 245 N and increases gradually while fluctuating. The impact force applied to the skip bottom increases with the coal transportation speed and the size of the chute's discharge port, but the relationship is not directly proportional. The simulation results essentially agree with the experimental results. Finally, optimized values for the conveyor speed and the discharge-port size are determined, so as to improve the capacity of the conveyor and the impact load borne by the skip bottom.
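The impulse-momentum estimate behind the derivation can be illustrated for a steady falling stream: the skip bottom must absorb momentum at rate (dm/dt)·v, with v = √(2gh) for free fall from height h. A toy version (the paper's actual formula includes further assumptions about the chute and material behaviour):

```python
import math

def stream_impact_force(mass_flow_rate, drop_height, g=9.81):
    """Steady impact force of a falling granular stream on a surface
    via the impulse-momentum theorem: F = (dm/dt) * v, where
    v = sqrt(2 * g * h) is the free-fall impact speed."""
    v = math.sqrt(2 * g * drop_height)
    return mass_flow_rate * v

# Illustrative numbers: 50 kg/s of coal falling 5 m.
F = stream_impact_force(50.0, 5.0)
```

The discrete element simulation in the paper refines this by resolving the initial surge when the first material strikes the empty skip.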

  6. Quantitative nanoscale analysis in 3D using electron tomography

    Energy Technology Data Exchange (ETDEWEB)

    Kuebel, Christian [Karlsruhe Institute of Technology, INT, 76344 Eggenstein-Leopoldshafen (Germany)

    2011-07-01

    State-of-the-art electron tomography has been established as a powerful tool to image complex structures with nanometer resolution in 3D. STEM tomography in particular is used extensively in materials science in areas as diverse as catalysis, semiconductor materials, and polymer composites, mainly providing qualitative information on the morphology, shape, and distribution of materials. However, an increasing number of studies need quantitative information, e.g. surface area, fractal dimensions, particle distribution or porosity. A quantitative analysis is typically performed after segmenting the tomographic data, and this segmentation is one of the main sources of error for the quantification. In addition to noise, systematic errors due to the missing wedge and artifacts from the reconstruction algorithm itself are responsible for these segmentation errors, and improved algorithms are needed. This presentation will provide an overview of the possibilities and limitations of quantitative nanoscale analysis by electron tomography. Using catalysts and nanocomposites as application examples, the intensities and intensity variations observed in 3D volumes reconstructed by WBP and SIRT will be quantitatively compared with those from alternative reconstruction algorithms; implications for the quantification of electron (or X-ray) tomographic data will be discussed and illustrated for the quantification of particle size distributions, particle correlations, surface area, and fractal dimensions in 3D.
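
    Once a reconstructed volume has been segmented by a threshold, quantities such as porosity and particle size distributions reduce to voxel counting; a dependency-free sketch using 6-connectivity (the tiny demo volume is invented, and real workflows use far more careful segmentation, as the abstract stresses):

```python
from collections import deque

def porosity(volume, threshold):
    """Fraction of voxels below threshold (treated as pore space)."""
    total = solid = 0
    for plane in volume:
        for row in plane:
            for v in row:
                total += 1
                solid += v >= threshold
    return 1.0 - solid / total

def particle_sizes(volume, threshold):
    """Sizes (voxel counts) of 6-connected components at or above threshold."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    seen, sizes = set(), []
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if volume[z][y][x] >= threshold and (z, y, x) not in seen:
                    queue = deque([(z, y, x)])
                    seen.add((z, y, x))
                    count = 0
                    while queue:
                        cz, cy, cx = queue.popleft()
                        count += 1
                        for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0),
                                           (0,-1,0), (0,0,1), (0,0,-1)):
                            p = (cz + dz, cy + dy, cx + dx)
                            if (0 <= p[0] < nz and 0 <= p[1] < ny and 0 <= p[2] < nx
                                    and p not in seen
                                    and volume[p[0]][p[1]][p[2]] >= threshold):
                                seen.add(p)
                                queue.append(p)
                    sizes.append(count)
    return sorted(sizes)

# Demo: a 3x3x3 grey-value volume containing two bright "particles".
demo = [[[0] * 3 for _ in range(3)] for _ in range(3)]
demo[0][0][0] = demo[0][0][1] = 9
demo[2][2][2] = 9
sizes = particle_sizes(demo, threshold=5)
poros = porosity(demo, threshold=5)
```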

  7. Universal platform for quantitative analysis of DNA transposition

    Directory of Open Access Journals (Sweden)

    Pajunen Maria I

    2010-11-01

    Full Text Available Abstract Background Completed genome projects have revealed an astonishing diversity of transposable genetic elements, implying the existence of novel element families yet to be discovered from diverse life forms. Concurrently, several better understood transposon systems have been exploited as efficient tools in molecular biology and genomics applications. Characterization of new mobile elements and improvement of the existing transposition technology platforms warrant easy-to-use assays for the quantitative analysis of DNA transposition. Results Here we developed a universal in vivo platform for the analysis of transposition frequency with class II mobile elements, i.e., DNA transposons. For each particular transposon system, cloning of the transposon ends and the cognate transposase gene, in three consecutive steps, generates a multifunctional plasmid, which drives inducible expression of the transposase gene and includes a mobilisable lacZ-containing reporter transposon. The assay scores transposition events as blue microcolonies, papillae, growing within otherwise whitish Escherichia coli colonies on indicator plates. We developed the assay using phage Mu transposition as a test model and validated the platform using various MuA transposase mutants. For further validation and to illustrate universality, we introduced IS903 transposition system components into the assay. The developed assay is adjustable to a desired level of initial transposition via the control of a plasmid-borne E. coli arabinose promoter. In practice, the transposition frequency is modulated by varying the concentration of arabinose or glucose in the growth medium. We show that variable levels of transpositional activity can be analysed, thus enabling straightforward screens for hyper- or hypoactive transposase mutants, regardless of the original wild-type activity level. Conclusions The established universal papillation assay platform should be widely applicable to a

  8. Fuzzy Logic as a Computational Tool for Quantitative Modelling of Biological Systems with Uncertain Kinetic Data.

    Science.gov (United States)

    Bordon, Jure; Moskon, Miha; Zimic, Nikolaj; Mraz, Miha

    2015-01-01

    Quantitative modelling of biological systems has become an indispensable computational approach in the design of novel and the analysis of existing biological systems. However, kinetic data that describe the system's dynamics need to be known in order to obtain relevant results with conventional modelling techniques. These data are often hard or even impossible to obtain. Here, we present a quantitative fuzzy logic modelling approach that is able to cope with unknown kinetic data and thus produce relevant results even though kinetic data are incomplete or only vaguely defined. Moreover, the approach can be used in combination with existing state-of-the-art quantitative modelling techniques, applying fuzzy logic only in certain parts of the system, i.e., where kinetic data are missing. The case study of the approach proposed here is performed on the model of a three-gene repressilator.
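
    A zero-order (Sugeno-style) fuzzy rule base for a single repressed gene can be sketched as follows; the membership bounds and rule outputs are invented for illustration and are not taken from the paper's repressilator model:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_production_rate(repressor):
    """Zero-order Sugeno inference: high repressor level -> low production.

    Encodes two rules with unknown kinetics replaced by linguistic terms:
      IF repressor LOW  THEN production = 1.0
      IF repressor HIGH THEN production = 0.1
    Defuzzified as a weight-normalised average of the rule outputs.
    """
    low = tri(repressor, -1.0, 0.0, 0.6)
    high = tri(repressor, 0.4, 1.0, 2.0)
    rules = [(low, 1.0), (high, 0.1)]
    den = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / den if den else 0.0
```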

  9. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. The algorithm enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin area of a human foot and of a face. The full source code of the developed application is provided as an attachment. The main window of the program during dynamic analysis of the foot thermal image.

  10. Data from quantitative label free proteomics analysis of rat spleen

    Directory of Open Access Journals (Sweden)

    Khadar Dudekula

    2016-09-01

    Full Text Available The dataset presented in this work was obtained by a label-free quantitative proteomic analysis of rat spleen. A robust method for extracting proteins from rat spleen tissue, using a urea- and SDS-based buffer, was developed for LC-MS/MS analysis, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The list of proteins generated under the different fractionation regimes allows the nature of the identified proteins to be assessed, together with the variability in quantitative analysis associated with the different sampling strategies, and allows a proper number of replicates to be defined for future quantitative analysis.

  11. Data from quantitative label free proteomics analysis of rat spleen.

    Science.gov (United States)

    Dudekula, Khadar; Le Bihan, Thierry

    2016-09-01

    The dataset presented in this work was obtained by a label-free quantitative proteomic analysis of rat spleen. A robust method for extracting proteins from rat spleen tissue, using a urea- and SDS-based buffer, was developed for LC-MS/MS analysis, and different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The list of proteins generated under the different fractionation regimes allows the nature of the identified proteins to be assessed, together with the variability in quantitative analysis associated with the different sampling strategies, and allows a proper number of replicates to be defined for future quantitative analysis.

  12. An improved quantitative analysis method for plant cortical microtubules.

    Science.gov (United States)

    Lu, Yi; Huang, Chenyang; Wang, Jia; Shang, Peng

    2014-01-01

    The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid so far to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image, and the Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.
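
    The GLCM feature-extraction stage can be sketched in a dependency-free form (one pixel offset and a toy two-level image; the paper applies this kind of analysis to IMF1 images after BEMD preprocessing):

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Normalised grey-level co-occurrence matrix for one pixel offset.

    `image` is a 2D list of integer grey levels in [0, levels).
    """
    h, w = len(image), len(image[0])
    P = [[0.0] * levels for _ in range(levels)]
    n = 0
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[image[y][x]][image[y2][x2]] += 1
                n += 1
    return [[v / n for v in row] for row in P]

def texture_features(P):
    """Contrast, energy and homogeneity from a normalised GLCM."""
    contrast = energy = homogeneity = 0.0
    for i, row in enumerate(P):
        for j, p in enumerate(row):
            contrast += p * (i - j) ** 2
            energy += p * p
            homogeneity += p / (1.0 + abs(i - j))
    return {"contrast": contrast, "energy": energy, "homogeneity": homogeneity}

# Toy image: vertical stripes give maximal contrast at a horizontal offset.
stripes = [[0, 1], [0, 1]]
feats = texture_features(glcm(stripes, levels=2))
```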

  13. An Improved Quantitative Analysis Method for Plant Cortical Microtubules

    Directory of Open Access Journals (Sweden)

    Yi Lu

    2014-01-01

    Full Text Available The arrangement of plant cortical microtubules can reflect the physiological state of cells. However, little attention has been paid so far to quantitative image analysis of plant cortical microtubules. In this paper, the Bidimensional Empirical Mode Decomposition (BEMD) algorithm was applied in the preprocessing of the original microtubule image, and the Intrinsic Mode Function 1 (IMF1) image obtained by the decomposition was selected for texture analysis based on the Grey-Level Co-occurrence Matrix (GLCM) algorithm. To further verify its reliability, the proposed texture analysis method was used to distinguish different images of Arabidopsis microtubules. The results showed that the BEMD algorithm preserved edges well while reducing noise, and that the geometrical characteristics of the texture were distinct. Four texture parameters extracted by GLCM clearly reflected the different arrangements in the two images of cortical microtubules. In summary, the results indicate that this method is feasible and effective for quantitative image analysis of plant cortical microtubules. It not only provides a new quantitative approach for the comprehensive study of the role played by microtubules in cell life activities but also supplies a reference for other similar studies.

  14. Quantitative risk analysis of oil storage facilities in seismic areas.

    Science.gov (United States)

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities were quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
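
    The crossing of fragility curves with hazard results described above can be sketched generically: a lognormal fragility gives P(failure | ground motion), which is convolved with a discretised hazard curve. The median, dispersion and hazard increments below are hypothetical, not the paper's values.

```python
import math

def lognormal_fragility(pga, median, beta):
    """P(failure | PGA) for a lognormal fragility curve (standard CDF via erf)."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

def annual_failure_rate(hazard, median, beta):
    """Convolve a discretised hazard (list of (pga, annual-rate increment))
    with the fragility curve: sum_i P(fail | pga_i) * d_lambda_i."""
    return sum(lognormal_fragility(pga, median, beta) * dlam for pga, dlam in hazard)

# Hypothetical hazard discretisation: (PGA in g, annual rate increment).
hazard = [(0.1, 1e-2), (0.3, 1e-3), (0.6, 1e-4)]
rate = annual_failure_rate(hazard, median=0.3, beta=0.5)
```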

  15. Prevention of malaria in pregnancy with intermittent preventive treatment and insecticide treated nets in Mali: a quantitative health systems effectiveness analysis.

    Directory of Open Access Journals (Sweden)

    Jayne Webster

    Full Text Available INTRODUCTION: The objectives of the study were to evaluate the health system effectiveness of ANC for the delivery of a dose of IPTp and an ITN to women attending ANC during eligible gestation, and to identify the predictors of systems effectiveness. METHODS: A cross sectional study was undertaken in 10 health facilities including structured non-participant observations of the ANC process for 780 pregnant women followed by exit interviews. The proportion of pregnant women receiving a dose of IPTp-SP and an ITN was assessed. Predictors of each ineffective intermediate process were identified using multivariable logistic regression. RESULTS: Overall, 0% and 24.5% of pregnant women of eligible gestation on the first visit to ANC received a dose of IPTp-SP by DOT at the district and community levels respectively. Ineffective intermediate processes were 'given IPTp-SP at the ANC' 63.9% and 74.0% (95% CI 62.0, 83.3, and 'given IPTp-SP by DOT' 0% and 34.3% (95% CI 10.5, 69.8, at district and community levels, respectively. Delivery of ITNs was effective where they were in stock; however stock-outs were a problem. Predictors of receiving IPTp-SP at the district level were 4 to 6 months gestation, not reporting symptoms of malaria at ANC visit and the amount of money spent during the visit. At the community level, the predictors were 4 to 6 months gestation, maternal education below primary level, routine ANC visit (not for an illness, palpation of the abdomen, and expenditure of money in ANC. CONCLUSION: In Segou District, the delivery of IPTp-SP was ineffective; whilst ITN delivery was effective if ITNs were in stock. Predictors of receiving IPTp-SP at the district and community levels included gestational age, the amount of expenditure during the ANC visit and no illness.

  16. Quantitative Analysis of the Effective Functional Structure in Yeast Glycolysis

    Science.gov (United States)

    De la Fuente, Ildefonso M.; Cortes, Jesus M.

    2012-01-01

    The understanding of the effective functionality that governs the enzymatic self-organized processes in cellular conditions is a crucial topic in the post-genomic era. In recent studies, Transfer Entropy has been proposed as a rigorous, robust and self-consistent method for the causal quantification of the functional information flow among nonlinear processes. Here, in order to quantify the functional connectivity for the glycolytic enzymes in dissipative conditions we have analyzed different catalytic patterns using the technique of Transfer Entropy. The data were obtained by means of a yeast glycolytic model formed by three delay differential equations where the enzymatic rate equations of the irreversible stages have been explicitly considered. These enzymatic activity functions were previously modeled and tested experimentally by other different groups. The results show the emergence of a new kind of dynamical functional structure, characterized by changing connectivity flows and a metabolic invariant that constrains the activity of the irreversible enzymes. In addition to the classical topological structure characterized by the specific location of enzymes, substrates, products and feedback-regulatory metabolites, an effective functional structure emerges in the modeled glycolytic system, which is dynamical and characterized by notable variations of the functional interactions. The dynamical structure also exhibits a metabolic invariant which constrains the functional attributes of the enzymes. Finally, in accordance with the classical biochemical studies, our numerical analysis reveals in a quantitative manner that the enzyme phosphofructokinase is the key-core of the metabolic system, behaving for all conditions as the main source of the effective causal flows in yeast glycolysis. PMID:22393350
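
    A plug-in transfer entropy estimator for short symbolised series might look like this; the equal-width binning and history length of 1 are simplifying assumptions, and the paper itself works with trajectories of delay differential equations rather than the toy series below.

```python
import math
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """Transfer entropy X -> Y (history length 1), plug-in estimate in bits.

    Both series are symbolised into `bins` equal-width levels, then
    TE = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ].
    """
    def symbolise(s):
        lo, hi = min(s), max(s)
        width = (hi - lo) / bins or 1.0
        return [min(int((v - lo) / width), bins - 1) for v in s]
    xs, ys = symbolise(x), symbolise(y)
    triples, pairs_yx, pairs_yy, singles = Counter(), Counter(), Counter(), Counter()
    n = len(ys) - 1
    for t in range(n):
        triples[(ys[t + 1], ys[t], xs[t])] += 1
        pairs_yx[(ys[t], xs[t])] += 1
        pairs_yy[(ys[t + 1], ys[t])] += 1
        singles[ys[t]] += 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * math.log2(p_cond_full / p_cond_hist)
    return te

# Toy example: y copies x with a one-step lag, so information flows X -> Y.
x = [0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0]
y = [0.0] + x[:-1]
te_xy = transfer_entropy(x, y)
```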

  17. Application of quantitative signal detection in the Dutch spontaneous reporting system for adverse drug reactions

    NARCIS (Netherlands)

    van Puijenbroek, Eugène; Diemont, Willem; van Grootheest, Kees

    2003-01-01

    The primary aim of spontaneous reporting systems (SRSs) is the timely detection of unknown adverse drug reactions (ADRs), or signal detection. Generally this is carried out by a systematic manual review of every report sent to an SRS. Statistical analysis of the data sets of an SRS, or quantitative
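
    The abstract does not name a specific statistic, but a disproportionality measure commonly used in quantitative SRS signal detection is the reporting odds ratio; a minimal sketch with invented report counts:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Reporting odds ratio with a 95% CI from a 2x2 table of report counts.

    a: reports with the drug and the reaction,  b: drug, other reactions,
    c: other drugs, the reaction,               d: other drugs, other reactions.
    """
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Hypothetical counts: the drug-reaction pair is reported disproportionally often.
ror_demo, lo95, hi95 = reporting_odds_ratio(20, 80, 100, 900)
```

A lower CI bound above 1 is a typical (though not sufficient) criterion for flagging a potential signal for manual review.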

  18. Qualitative and quantitative stability analysis of penta-rhythmic circuits

    Science.gov (United States)

    Schwabedal, Justus T. C.; Knapper, Drake E.; Shilnikov, Andrey L.

    2016-12-01

    Inhibitory circuits of relaxation oscillators are often-used models for dynamics of biological networks. We present a qualitative and quantitative stability analysis of such a circuit constituted by three generic oscillators (of a Fitzhugh-Nagumo type) as its nodes coupled reciprocally. Depending on inhibitory strengths, and parameters of individual oscillators, the circuit exhibits polyrhythmicity of up to five simultaneously stable rhythms. With methods of bifurcation analysis and phase reduction, we investigate qualitative changes in stability of these circuit rhythms for a wide range of parameters. Furthermore, we quantify robustness of the rhythms maintained under random perturbations by monitoring phase diffusion in the circuit. Our findings allow us to describe how circuit dynamics relate to dynamics of individual nodes. We also find that quantitative and qualitative stability properties of polyrhythmicity do not always align.
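
    A minimal sketch of three reciprocally inhibiting relaxation oscillators of FitzHugh-Nagumo type, integrated with the Euler method; the parameter values and the simple rectified-linear inhibition term are illustrative assumptions, not the paper's model or coupling.

```python
def simulate_fhn_ring(steps=2000, dt=0.05, g=0.1):
    """Three FitzHugh-Nagumo nodes with all-to-all inhibitory coupling.

    dv_i/dt = v_i - v_i^3/3 - w_i + I - g * sum_{j != i} max(v_j, 0)
    dw_i/dt = eps * (v_i + a - b * w_i)
    Returns the list of (v1, v2, v3) states at each time step.
    """
    eps, a, b, I = 0.08, 0.7, 0.8, 0.5
    v = [0.1, 0.2, -0.1]
    w = [0.0, 0.0, 0.0]
    trace = []
    for _ in range(steps):
        inh = [g * sum(max(v[j], 0.0) for j in range(3) if j != i) for i in range(3)]
        dv = [v[i] - v[i] ** 3 / 3 - w[i] + I - inh[i] for i in range(3)]
        dw = [eps * (v[i] + a - b * w[i]) for i in range(3)]
        v = [v[i] + dt * dv[i] for i in range(3)]
        w = [w[i] + dt * dw[i] for i in range(3)]
        trace.append(tuple(v))
    return trace

trace = simulate_fhn_ring()
```

Varying the inhibitory strength `g` and the initial conditions is the natural way to probe which phase-locked rhythms such a sketch settles into.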

  19. Quantitative analysis of myocardial tissue with digital autofluorescence microscopy

    DEFF Research Database (Denmark)

    Jensen, Thomas; Holten-Rossing, Henrik; Svendsen, Ida M H;

    2016-01-01

    BACKGROUND: The opportunity offered by whole slide scanners of automated histological analysis implies an ever increasing importance of digital pathology. To go beyond the importance of conventional pathology, however, digital pathology may need a basic histological starting point similar to that of hematoxylin and eosin staining in conventional pathology. This study presents an automated fluorescence-based microscopy approach providing highly detailed morphological data from unstained microsections. This data may provide a basic histological starting point from which further digital analysis including staining may benefit. METHODS: This study explores the inherent tissue fluorescence, also known as autofluorescence, as a means to quantitate cardiac tissue components in histological microsections. Data acquisition using a commercially available whole slide scanner and an image-based quantitation algorithm...

  20. Quantitative and qualitative analysis and interpretation of CT perfusion imaging.

    Science.gov (United States)

    Valdiviezo, Carolina; Ambrose, Marietta; Mehra, Vishal; Lardo, Albert C; Lima, Joao A C; George, Richard T

    2010-12-01

    Coronary artery disease (CAD) remains the leading cause of death in the United States. Rest and stress myocardial perfusion imaging has an important role in the non-invasive risk stratification of patients with CAD. However, diagnostic accuracies have been limited, which has led to the development of several myocardial perfusion imaging techniques. Among them, myocardial computed tomography perfusion imaging (CTP) is especially interesting as it has the unique capability of providing anatomic- as well as coronary stenosis-related functional data when combined with computed tomography angiography (CTA). The primary aim of this article is to review the qualitative, semi-quantitative, and quantitative analysis approaches to CTP imaging. In doing so, we will describe the image data required for each analysis and discuss the advantages and disadvantages of each approach.

  1. Country Risk Analysis: A Survey of the Quantitative Methods

    OpenAIRE

    Hiranya K Nath

    2008-01-01

    With globalization and financial integration, there has been rapid growth of international lending and foreign direct investment (FDI). In view of this emerging trend, country risk analysis has become extremely important for international creditors and investors. This paper briefly discusses the concepts and definitions, and presents a survey of the quantitative methods that are used to address various issues related to country risk. It also gives a summary review of selected empirical studies.

  2. The Quantitative Analysis of Green Degree in Wuhan Traffic System

    Institute of Scientific and Technical Information of China (English)

    陈升平; 贺伟; 李冠; 周倩

    2012-01-01

    Three pressure indicators for Wuhan were analysed statistically: the energy consumption of social development, the green coverage of the built-up area, and the noise along main traffic arteries. After dimensionless processing of the data, the GM(1,1) grey model was used to predict each pressure indicator for 2003-2008, and the analytic hierarchy process (AHP) was then used to determine the comprehensive green-degree weight of each indicator. Finally, the weights and the corresponding pressure indicators were combined in a weighted sum to obtain a comprehensive green-degree value describing the sustainable development and construction of green traffic in Wuhan, with the "green degree" concept quantitatively defined by the difference in each pressure indicator before and after the implementation of "green traffic". The results show that the average annual comprehensive green-degree value for 2003-2008 is 0.3452, indicating that the construction of a green traffic system in Wuhan has been effective since 2002.
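
    The GM(1,1) prediction step can be sketched as follows, assuming the standard accumulated-generating-operation formulation of the grey model; the demo series is invented, not one of the Wuhan indicators.

```python
import math

def gm11_forecast(x0, horizon):
    """GM(1,1) grey model: fit the series x0 and forecast `horizon` steps ahead.

    Fits dx1/dt + a*x1 = b on the accumulated series x1 by least squares
    (the 2x2 normal equations are written out explicitly), then restores
    forecasts of the original series by differencing.
    """
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]                  # accumulation (AGO)
    z1 = [0.5 * (x1[i] + x1[i + 1]) for i in range(n - 1)]    # background values
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    szy = sum(z1[i] * x0[i + 1] for i in range(m))
    sy = sum(x0[1:])
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det        # develop coefficient
    b = (szz * sy - sz * szy) / det      # grey input
    c = x0[0] - b / a
    x1_hat = lambda k: c * math.exp(-a * k) + b / a
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + horizon)]

# Demo: a roughly geometric series (ratio 1.2) forecast two steps ahead.
demo_forecast = gm11_forecast([2.0, 2.4, 2.88, 3.456], 2)
```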

  3. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies.

    Science.gov (United States)

    Chau, Siu-Leung; Huang, Zhi-Bing; Song, Yan-Gang; Yue, Rui-Qi; Ho, Alan; Lin, Chao-Zhan; Huang, Wen-Hua; Han, Quan-Bin

    2016-08-19

    Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  4. Comprehensive Quantitative Analysis of SQ Injection Using Multiple Chromatographic Technologies

    Directory of Open Access Journals (Sweden)

    Siu-Leung Chau

    2016-08-01

    Full Text Available Quality control of Chinese medicine injections remains a challenge due to our poor knowledge of their complex chemical profile. This study aims to investigate the chemical composition of one of the best-selling injections, Shenqi Fuzheng (SQ) injection (SQI), via a full component quantitative analysis. A total of 15 representative small molecular components of SQI were simultaneously determined using ultra-high performance liquid chromatography (UHPLC) coupled with quadrupole tandem time-of-flight mass spectrometry (Q-TOF-MS); the saccharide composition of SQI was also quantitatively determined by high performance liquid chromatography (HPLC) with an evaporative light scattering detector (ELSD) on an amino column before and after acid hydrolysis. The existence of polysaccharides was also examined on a gel permeation chromatography column. The method was well validated in terms of linearity, sensitivity, precision, accuracy and stability, and was successfully applied to analyze 13 SQI samples. The results demonstrate that up to 94.69% (w/w) of this injection product is quantitatively determined, in which small molecules and monosaccharide/sucrose account for 0.18%-0.21% and 53.49%-58.2%, respectively. The quantitative information contributes to accumulating scientific evidence to better understand the therapeutic efficacy and safety of complex Chinese medicine injections.

  5. Quantitative analysis for nonlinear fluorescent spectra based on edges matching

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    A novel spectra-edge-matching approach is proposed for the quantitative analysis of the nonlinear fluorescence spectra of air impurities excited by a femtosecond laser. The fluorescence spectra are first denoised and compressed, both by wavelet transform, and several peak groups are then picked from each spectrum according to an intensity threshold and used to extract the spectral features through principal component analysis. The first two principal components cover up to 98% of the total information and are sufficient for the final concentration analysis. The analysis reveals a monotone relationship between the spectral intensity and the concentration of the air impurities, suggesting that femtosecond laser induced fluorescence spectroscopy, together with the proposed spectra analysis method, can become a powerful tool for monitoring environmental pollutants.
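
    The principal component step can be sketched with a dependency-free power-iteration estimate of the first component; the two-feature toy data below stand in for real peak-group features and are invented.

```python
def first_principal_component(data, iters=200):
    """First principal component of row-observations `data`.

    Power iteration on the sample covariance matrix; returns the unit
    eigenvector and the fraction of total variance it explains.
    """
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # Sample covariance matrix (d x d).
    C = [[sum(X[k][i] * X[k][j] for k in range(n)) / (n - 1) for j in range(d)]
         for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(comp * comp for comp in w) ** 0.5
        v = [comp / norm for comp in w]
    eigval = sum(v[i] * sum(C[i][j] * v[j] for j in range(d)) for i in range(d))
    total = sum(C[i][i] for i in range(d))
    return v, eigval / total

# Toy "spectral features": nearly all variance lies along the (1, 1) direction.
toy = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [4.0, 4.1]]
pc, explained = first_principal_component(toy)
```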

  6. An iterative approach to case study analysis: insights from qualitative analysis of quantitative inconsistencies

    Directory of Open Access Journals (Sweden)

    Allain J Barnett

    2016-09-01

    Full Text Available Large-N comparative studies have helped common pool resource scholars gain general insights into the factors that influence collective action and governance outcomes. However, these studies are often limited by missing data, and suffer from the methodological limitation that important information is lost when we reduce textual information to quantitative data. This study was motivated by nine case studies that appeared to be inconsistent with the expectation that the presence of Ostrom's Design Principles increases the likelihood of successful common pool resource governance. These cases highlight the limitations of coding and analysing Large-N case studies. We examine two issues: 1) the challenge of missing data, and 2) potential approaches that rely on context (which is often lost in the coding process) to address inconsistencies between empirical observations and theoretical predictions. For the latter, we conduct a post-hoc qualitative analysis of a large-N comparative study to explore two types of inconsistencies: 1) cases where evidence for nearly all design principles was found, but the available evidence led to the assessment that the CPR system was unsuccessful, and 2) cases where the CPR system was deemed successful despite finding limited or no evidence for design principles. We describe the challenges inherent in large-N comparative analysis of coding complex and dynamically changing common pool resource systems for the presence or absence of design principles, and of determining "success". Finally, we illustrate how, in some cases, our qualitative analysis revealed that the identity of absent design principles explained inconsistencies, hence de-facto reconciling such apparent inconsistencies with theoretical predictions. This analysis demonstrates the value of combining quantitative and qualitative analysis, and of using mixed-methods approaches iteratively to build comprehensive methodological and theoretical approaches to understanding

  7. Quantitative gait analysis following hemispherotomy for Rasmussen's encephalitis

    Directory of Open Access Journals (Sweden)

    Santhosh George Thomas

    2007-01-01

    Full Text Available Peri-insular hemispherotomy is a form of disconnective hemispherectomy involving complete disconnection of all ascending/descending and commissural connections of one hemisphere. We report the case of a seven-and-a-half-year-old child with intractable epilepsy due to Rasmussen's encephalitis who underwent peri-insular hemispherotomy and achieved complete freedom from seizures. Quantitative gait analysis with surface electromyography was used to describe the changes in the kinematic and kinetic parameters of gait 18 months after surgery. The focus of this paper is to highlight the utility of gait analysis following hemispherotomy with a view to directing postsurgical motor training and rehabilitation.

  8. Quantitative microleakage of some dentinal bonding restorative systems.

    Science.gov (United States)

    Hasegawa, T; Retief, D H

    1993-03-01

    The quantitative microleakage of class V cementum (dentin) cavities restored with six dentinal bonding restorative systems was determined in vitro. Ninety extracted human permanent first and second mandibular and maxillary premolars were used in this study. Class V preparations were made in cementum (dentin) at the root facial surfaces. The preparations were restored with 1) a dentin bonding system containing 2% HEMA and BisGMA and a light-cured microfilled composite; 2) the same materials, only substituting META/MMA base and TBB catalyst monomers for the BisGMA sealer; 3) a dentin bonding system containing 35% HEMA with META/MMA base and TBB catalyst, and a light-cured hybrid composite; 5) the same dentin bonding system, only substituting 35% glyceryl methacrylate for the 35% HEMA and using the microfilled composite; and 6) the previously described system with a substitution of 0.5 mol EDTA for the 10% citric acid - 3% FeCl3. Fifteen teeth were restored with each procedure. The restorations were finished with 12-bladed carbide burs 15 min after placement, the teeth were stored in saline at 37 degrees C for 24 h, finished with Sof-Lex discs, and then thermocycled in 2% methylene blue solution 500 times between 50 degrees C and 8 degrees C with a dwell time of 15 s. Quantitative microleakage was determined by a spectrophotometric dye-recovery method and expressed in micrograms of dye per restoration. The data were analyzed by ANOVA, Student-Newman-Keuls and Kruskal-Wallis tests. The quantitative microleakage of the teeth restored with the adhesive systems containing 35% glyceryl methacrylate was significantly reduced. The bonding mechanism of glyceryl methacrylate is not known.

  9. Quantitative Adaptation Analytics for Assessing Dynamic Systems of Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Gauthier, John H.; Miner, Nadine E.; Wilson, Michael L.; Le, Hai D.; Kao, Gio K.; Melander, Darryl J.; Longsine, Dennis Earl; Vander Meer, Robert Charles [Sandia National Laboratories]

    2015-01-01

    Our society is increasingly reliant on systems and interoperating collections of systems, known as systems of systems (SoS). These SoS are often subject to changing missions (e.g., nation-building, arms-control treaties), threats (e.g., asymmetric warfare, terrorism), natural environments (e.g., climate, weather, natural disasters) and budgets. How well can SoS adapt to these types of dynamic conditions? This report details the results of a three-year Laboratory Directed Research and Development (LDRD) project aimed at developing metrics and methodologies for quantifying the adaptability of systems and SoS. Work products include: derivation of a set of adaptability metrics; a method for combining the metrics into a system of systems adaptability index (SoSAI) used to compare the adaptability of SoS designs; development of a prototype dynamic SoS (proto-dSoS) simulation environment which provides the ability to investigate the validity of the adaptability metric set; and two test cases that evaluate the usefulness of a subset of the adaptability metrics and the SoSAI for distinguishing good from poor adaptability in an SoS. Intellectual property results include three patents pending: A Method For Quantifying Relative System Adaptability, Method for Evaluating System Performance, and A Method for Determining Systems Re-Tasking.
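The report does not disclose how the SoSAI combination is performed (the method is patent-pending), but the idea of folding several normalized adaptability metrics into a single comparable index can be sketched as a hypothetical weighted sum; the metric names, weights, and design scores below are all invented for illustration:

```python
import numpy as np

# Hypothetical adaptability metrics for two SoS designs, each scored on
# [0, 1]; the metric names and weights are illustrative only.
metrics = ["flexibility", "robustness", "reconfigurability", "scalability"]
weights = np.array([0.3, 0.3, 0.2, 0.2])      # must sum to 1

design_a = np.array([0.8, 0.6, 0.7, 0.5])
design_b = np.array([0.4, 0.9, 0.3, 0.6])

def sosai(scores: np.ndarray, weights: np.ndarray) -> float:
    """Combine normalized adaptability metrics into a single index."""
    assert np.isclose(weights.sum(), 1.0)
    return float(scores @ weights)

# A higher index suggests a more adaptable design under these weights
ranked = sorted([("A", sosai(design_a, weights)),
                 ("B", sosai(design_b, weights))],
                key=lambda kv: kv[1], reverse=True)
```

Any such index is only as meaningful as its weights; sensitivity analysis over the weight vector would be a natural companion step.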

  10. Scientific production on surgical nursing: analysis of the quantitative studies carried out between 2005 and 2009

    Directory of Open Access Journals (Sweden)

    Keila Maria de Azevedo Ponte

    2012-04-01

    Full Text Available This study aimed to analyze the characteristics of quantitative studies in the nursing scientific literature on the surgical area. The research took a quantitative approach and was carried out in the Virtual Health Library, covering articles of a quantitative nature published from 2005 to 2009 in journals rated A1, A2 or B1 by the 2008 Capes Nursing Qualis system, with full text available online and in Portuguese. Twenty-eight articles were analyzed, and most were found to be descriptive, exploratory, cross-sectional and retrospective. The most frequently adopted data-collection instrument was the form; for data presentation and analysis, tables with simple and relative frequencies; as database software, Excel; and as statistical tests, the chi-square and Fisher tests. Research with a quantitative approach in the surgical nursing area has been published in journals of national impact, displaying the main characteristics of this type of approach.

  11. A Quantitative Method for Microtubule Analysis in Fluorescence Images.

    Science.gov (United States)

    Lan, Xiaodong; Li, Lingfei; Hu, Jiongyu; Zhang, Qiong; Dang, Yongming; Huang, Yuesheng

    2015-12-01

    Microtubule analysis is of significant value for a better understanding of normal and pathological cellular processes. Although immunofluorescence microscopic techniques have proven useful in the study of microtubules, comparative results commonly rely on a descriptive and subjective visual analysis. We developed an objective and quantitative method based on image processing and analysis of fluorescently labeled microtubular patterns in cultured cells. We used a multi-parameter approach by analyzing four quantifiable characteristics to compose our quantitative feature set. Then we interpreted specific changes in the parameters and revealed the contribution of each feature set using principal component analysis. In addition, we verified that different treatment groups could be clearly discriminated using principal components of the multi-parameter model. High predictive accuracy of four commonly used multi-classification methods confirmed our method. These results demonstrated the effectiveness and efficiency of our method in the analysis of microtubules in fluorescence images. Application of the analytical methods presented here provides information concerning the organization and modification of microtubules, and could aid in the further understanding of structural and functional aspects of microtubules under normal and pathological conditions.
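The core of the multi-parameter approach described above (quantifiable image features reduced by principal component analysis so that treatment groups separate) can be sketched in a few lines of NumPy; the feature data here are synthetic and the four-feature setup is only an assumption mirroring the abstract, not the authors' actual pipeline:

```python
import numpy as np

def pca_features(X: np.ndarray, n_components: int = 2):
    """Project a (samples x features) matrix onto its leading
    principal components via the covariance eigendecomposition."""
    Xc = X - X.mean(axis=0)               # center each feature
    cov = np.cov(Xc, rowvar=False)        # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]        # sort descending by variance
    components = vecs[:, order[:n_components]]
    scores = Xc @ components              # per-sample PC scores
    explained = vals[order] / vals.sum()  # variance ratio per PC
    return scores, explained

# Toy data: two "treatment groups", each with four image-derived features;
# group 2 is shifted along feature 0 to mimic a real treatment effect
rng = np.random.default_rng(0)
g1 = rng.normal(0.0, 0.1, size=(20, 4))
g2 = rng.normal(0.0, 0.1, size=(20, 4))
g2[:, 0] += 1.0
X = np.vstack([g1, g2])
scores, explained = pca_features(X)
```

With the treatment effect dominating the variance, the two groups separate along the first principal component, which is the behavior the abstract reports for its real feature set.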

  12. Quantitative Phosphoproteomic Analysis of T-Cell Receptor Signaling.

    Science.gov (United States)

    Ahsan, Nagib; Salomon, Arthur R

    2017-01-01

    TCR signaling critically depends on protein phosphorylation across many proteins. Localization of each phosphorylation event relative to the T-cell receptor (TCR) and canonical T-cell signaling proteins will provide clues about the structure of TCR signaling networks. Quantitative phosphoproteomic analysis by mass spectrometry provides a wide-scale view of cellular phosphorylation networks. However, analysis of phosphorylation by mass spectrometry is still challenging due to the relatively low abundance of phosphorylated proteins relative to all proteins and the extraordinary diversity of phosphorylation sites across the proteome. Highly selective enrichment of phosphorylated peptides is essential to provide the most comprehensive view of the phosphoproteome. Optimization of phosphopeptide enrichment methods coupled with highly sensitive mass spectrometry workflows significantly improves the sequencing depth of the phosphoproteome to over 10,000 unique phosphorylation sites from complex cell lysates. Here we describe a step-by-step method for phosphoproteomic analysis that has achieved widespread success for identification of serine, threonine, and tyrosine phosphorylation. Reproducible quantification of relative phosphopeptide abundance is provided by intensity-based label-free quantitation. An ideal set of mass spectrometry analysis parameters is also provided that optimizes the yield of identified sites. We also provide guidelines for the bioinformatic analysis of this type of data to assess the quality of the data and to comply with proteomic data reporting requirements.

  13. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    Science.gov (United States)

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  14. Hydrocarbons on Phoebe, Iapetus, and Hyperion: Quantitative Analysis

    Science.gov (United States)

    Cruikshank, Dale P.; Morea Dalle Ore, Cristina; Pendleton, Yvonne J.; Clark, Roger Nelson

    2012-01-01

    We present a quantitative analysis of the hydrocarbon spectral bands measured on three of Saturn's satellites: Phoebe, Iapetus, and Hyperion. These bands, measured with the Cassini Visible-Infrared Mapping Spectrometer on close flybys of these satellites, are the C-H stretching modes of aromatic hydrocarbons at approximately 3.28 micrometers (approximately 3050 per centimeter), and four blended bands of aliphatic -CH2- and -CH3 in the range approximately 3.36-3.52 micrometers (approximately 2980-2840 per centimeter). The aromatic band, probably indicating the presence of polycyclic aromatic hydrocarbons (PAH), is unusually strong in comparison to the aliphatic bands, resulting in a unique signature among Solar System bodies measured so far, and as such offers a means of comparison among the three satellites. The ratio of the C-H bands in aromatic molecules to those in aliphatic molecules in the surface materials of Phoebe is NAro:NAliph approximately 24; for Hyperion the value is approximately 12, while Iapetus shows an intermediate value. In view of the trend of the evolution (dehydrogenation by heat and radiation) of aliphatic complexes toward more compact molecules and eventually to aromatics, the relative abundances of aliphatic -CH2- and -CH3- are an indication of the lengths of the molecular chain structures, and hence the degree of modification of the original material. We derive CH2:CH3 approximately 2.2 in the spectrum of low-albedo material on Iapetus; this value is the same, within measurement errors, as the ratio in the diffuse interstellar medium. The similarity in the spectral signatures of the three satellites, plus the apparent weak trend of aromatic/aliphatic abundance from Phoebe to Hyperion, is consistent with, and effectively confirms, that the source of the hydrocarbon-bearing material is Phoebe, and that the appearance of that material on the other two satellites arises from the deposition of the inward-spiraling dust that populates the Phoebe ring.

  15. Multivariate analysis of quantitative traits can effectively classify rapeseed germplasm

    Directory of Open Access Journals (Sweden)

    Jankulovska Mirjana

    2014-01-01

    Full Text Available In this study, the use of different multivariate approaches to classify rapeseed genotypes based on quantitative traits is presented. Tree regression analysis, PCA and two-way cluster analysis were applied in order to describe and understand the extent of genetic variability in spring rapeseed genotype-by-trait data. The traits that highly influenced seed and oil yield in rapeseed were successfully identified by the tree regression analysis. The principal predictor for both response variables was the number of pods per plant (NP). NP and 1000-seed weight could help in the selection of high-yielding genotypes. High values for both traits together with high oil content could lead to high oil-yielding genotypes. These traits may serve as indirect selection criteria and can lead to improvement of seed and oil yield in rapeseed. Quantitative traits that explained most of the variability in the studied germplasm were classified using principal component analysis. In this data set, five PCs were identified, of which the first three explained 63% of the total variance. This facilitated the choice of variables on which the genotypes' clustering could be based. The two-way cluster analysis simultaneously clustered genotypes and quantitative traits. The final number of clusters was determined using a bootstrapping technique. This approach provided a clear overview of the variability of the analyzed genotypes. Genotypes that have similar performance regarding the traits included in this study can be easily detected on the heatmap. Genotypes grouped in clusters 1 and 8 had high values for seed and oil yield and a relatively short vegetative growth period, while those in cluster 9 combined moderate to low values for vegetative growth duration with moderate to high seed and oil yield. These genotypes should be further exploited and implemented in the rapeseed breeding program. The combined application of these multivariate methods

  16. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    Science.gov (United States)

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  17. Complex Politics: A Quantitative Semantic and Topological Analysis of UK House of Commons Debates

    CERN Document Server

    Gurciullo, Stefano; Pereda, María; Battiston, Federico; Patania, Alice; Poledna, Sebastian; Hedblom, Daniel; Oztan, Bahattin Tolga; Herzog, Alexander; John, Peter; Mikhaylov, Slava

    2015-01-01

    This study is a first, exploratory attempt to use quantitative semantics techniques and topological analysis to analyze systemic patterns arising in a complex political system. In particular, we use a rich data set covering all speeches and debates in the UK House of Commons between 1975 and 2014. By the use of dynamic topic modeling (DTM) and topological data analysis (TDA) we show that both members and parties feature specific roles within the system, consistent over time, and extract global patterns indicating levels of political cohesion. Our results provide a wide array of novel hypotheses about the complex dynamics of political systems, with valuable policy applications.

  18. Analysis of quantitative pore features based on mathematical morphology

    Institute of Scientific and Technical Information of China (English)

    QI Heng-nian; CHEN Feng-nong; WANG Hang-jun

    2008-01-01

    Wood identification is a basic technique of wood science and industry. Pore features are among the most important identification features for hardwoods. We have used a method based on analysis of quantitative pore features, which differs from traditional qualitative methods. We apply mathematical morphology methods such as dilation and erosion, opening and closing transformations, image repair, noise filtering and edge detection to wood cross-sections to segment the pores from their background. Then the mean square errors (MSE) of the pores were computed to describe the distribution of pores. Our experiment shows that it is easy to classify the pore features into three basic types, just as in traditional qualitative methods, but with the use of the MSE of pores. This quantitative method improves wood identification considerably.
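The morphological pipeline described above (threshold, opening/closing to clean the mask, then measure the segmented pores) can be sketched with SciPy's `ndimage` module; the synthetic image and size statistics below are illustrative, not the authors' actual data or exact MSE definition:

```python
import numpy as np
from scipy import ndimage

# Synthetic grayscale cross-section: dark pores on a lighter background
img = np.full((64, 64), 200, dtype=np.uint8)
img[10:20, 10:20] = 30     # a large pore (10 x 10 pixels)
img[40:44, 40:44] = 30     # a small pore (4 x 4 pixels)
img[5, 50] = 30            # single-pixel speckle noise

# Threshold: pores are darker than the surrounding tissue
mask = img < 100

# Opening (erosion then dilation) removes the speckle;
# closing (dilation then erosion) repairs small gaps inside pores
clean = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
clean = ndimage.binary_closing(clean, structure=np.ones((3, 3)))

# Label connected components as individual pores and measure their areas
labels, n_pores = ndimage.label(clean)
areas = ndimage.sum(clean, labels, index=range(1, n_pores + 1))

# The mean square deviation of pore areas summarizes the distribution
mse = np.mean((areas - areas.mean()) ** 2)
```

Here the opening step deletes the isolated noise pixel while both genuine pores survive, leaving two labeled components whose area spread plays the role of the MSE descriptor.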

  19. Qualitative and Quantitative Proteome Analysis of Oral Fluids in Health and Periodontal Disease by Mass Spectrometry.

    Science.gov (United States)

    Salih, Erdjan

    2017-01-01

    The significance of protein identification and characterization by classical protein chemistry approaches is clearly highlighted by our detailed understanding of biological systems assembled over a period of almost a century. The advent of state-of-the-art mass spectrometry (MS), with its sensitivity, speed, and capacity for global protein analysis without individual protein purification, has transformed classical protein chemistry, with the promise of accelerated discovery. These capabilities, combined with the ability of oral fluids such as whole saliva (WS) and gingival crevicular fluid (GCF) to reflect both systemic and locally derived proteins, have generated significant interest in characterizing these fluids more extensively by MS technology. This chapter deals with the experimental details of the preanalytical steps, using multidimensional protein separation combined with MS analysis of WS and GCF to achieve detailed protein composition at the qualitative and quantitative levels. These approaches are interfaced with gold-standard stable-isotope labeling technologies for large-scale quantitative MS analysis, a prerequisite for determining accurate alterations in protein levels as a function of disease progression. The latter incorporates two stable-isotope chemistries, one specific for cysteine-containing proteins and the other a universal amine-specific reagent, applied to oral fluids in health and periodontal disease for quantitative MS analysis. In addition, specific preanalytical sample-preparation steps demanded by oral fluids such as GCF and WS to overcome limitations and uncertainties are elaborated for reliable large-scale quantitative MS analysis.

  20. A Quantitative Analysis Method for the Power Plant Thermal System Based on Graph Theory

    Institute of Scientific and Technical Information of China (English)

    冉鹏; 李庚生; 张树芳; 王松龄

    2012-01-01

    Based on an analysis of the structural features of coal-fired power unit thermal systems, graph theory is introduced into the field of power unit thermal system energy-saving analysis. The division principles for the thermal system and its graph-based representation are stipulated, and the rules for filling the directed-graph weighted adjacency matrix of the coal-fired power unit thermal system are determined. Combining the laws of energy and mass conservation with the weighted adjacency matrix, the directed-graph weighted adjacency equation, the fuel consumption rate, and the fuel differential-variation equation of the thermal system are derived. The proposed method studies the power unit thermal system in graph form, describing its energy and mass flows as a binary-relation graph composed of points and lines. The formulation is standardized and concise, its physical meaning is clearly expressed, and it constitutes a novel thermoeconomic analysis method for coal-fired power unit thermal systems. An example is given to illustrate the validity of the method.

  1. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculational engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
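The report's definition of risk (probability of a successful attack times the resulting loss) and its attack-tree calculational engine can be illustrated with a deliberately simplified sketch; the node names, probabilities, loss figure, and independence assumption are all invented for illustration and are not drawn from the report:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class AttackNode:
    """A node in a toy attack tree (hypothetical structure)."""
    name: str
    p_success: float = 0.0                  # leaf: probability the step succeeds
    children: Optional[List["AttackNode"]] = None
    gate: str = "OR"                        # how child successes combine

def success_probability(node: AttackNode) -> float:
    """Fold leaf probabilities up the tree, assuming independent events."""
    if not node.children:
        return node.p_success
    probs = [success_probability(c) for c in node.children]
    if node.gate == "AND":                  # every child step must succeed
        p = 1.0
        for q in probs:
            p *= q
        return p
    p_fail = 1.0                            # OR gate: any branch suffices
    for q in probs:
        p_fail *= (1.0 - q)
    return 1.0 - p_fail

root = AttackNode("compromise HMI", gate="OR", children=[
    AttackNode("phish operator", p_success=0.2),
    AttackNode("exploit VPN", gate="AND", children=[
        AttackNode("find exposed endpoint", p_success=0.5),
        AttackNode("exploit known CVE", p_success=0.3),
    ]),
])

loss = 2_000_000.0                          # assumed loss of a successful attack ($)
risk = success_probability(root) * loss     # risk = probability x loss
```

Comparing `risk` before and after a proposed mitigation (e.g. lowering a leaf probability) gives the kind of risk reduction estimate the tool is meant to support.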

  2. Quantitative risk analysis as a basis for emergency planning

    Energy Technology Data Exchange (ETDEWEB)

    Yogui, Regiane Tiemi Teruya [Bureau Veritas do Brasil, Rio de Janeiro, RJ (Brazil); Macedo, Eduardo Soares de [Instituto de Pesquisas Tecnologicas (IPT), Sao Paulo, SP (Brazil)

    2009-07-01

    Several environmental accidents happened in Brazil and around the world during the 1970s and 1980s, strongly motivating preparation for emergencies in the chemical and petrochemical industries. Environmental accidents affect the environment and the communities neighboring the industrial facilities. The present study aims to support and guide the development of emergency planning using data obtained from Quantitative Risk Analysis, elaborated according to Technical Standard P4.261/03 from CETESB (Sao Paulo Environmental Agency). It was observed during the research that the data generated in these studies need complementation and deeper analysis before they can be used in emergency plans. The main issues analyzed and discussed in this study were the reevaluation of hazard identification for the emergency plans, the consequence and vulnerability analyses for response planning, risk communication, and the preparation of communities exposed to manageable risks to respond to emergencies. As a result, the study intends to improve the interpretation and use of the data deriving from Quantitative Risk Analysis to develop emergency plans. (author)

  3. Quantitative analysis of in vivo confocal microscopy images: a review.

    Science.gov (United States)

    Patel, Dipika V; McGhee, Charles N

    2013-01-01

    In vivo confocal microscopy (IVCM) is a non-invasive method of examining the living human cornea. The recent trend towards quantitative studies using IVCM has led to the development of a variety of methods for quantifying image parameters. When selecting IVCM images for quantitative analysis, it is important to be consistent regarding the location, depth, and quality of images. All images should be de-identified, randomized, and calibrated prior to analysis. Numerous image analysis software packages are available, each with its own advantages and disadvantages. Criteria for analyzing corneal epithelium, sub-basal nerves, keratocytes, endothelium, and immune/inflammatory cells have been developed, although there is inconsistency among research groups regarding parameter definition. The quantification of stromal nerve parameters, however, remains a challenge. Most studies report lower inter-observer repeatability compared with intra-observer repeatability, and observer experience is known to be an important factor. Standardization of IVCM image analysis through the use of a reading center would be crucial for any future large, multi-centre clinical trials using IVCM.

  4. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    OpenAIRE

    2016-01-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequen...

  5. Thermopower and Nernst coefficient in the Y0.85Ca0.15Ba2-x La x Cu3O y system: experimental results and joint quantitative analysis

    Science.gov (United States)

    Gasumyants, V. E.; Martynova, O. A.

    2017-09-01

    The temperature dependences of the thermopower, S, and Nernst coefficient, Q, at temperatures up to 300-350 K for a series of ceramic Y0.85Ca0.15Ba2-x La x Cu3O y samples with a varied doping level have been measured. The peculiarities of these dependences as well as the variation of the room-temperature thermopower and the critical temperature values are revealed and discussed in comparison with the case of single lanthanum doping. It is shown that the narrow-band model can be successfully used to describe all the specific features of the experimental S(T) and Q(T) dependences. Their joint quantitative analysis allows one to unambiguously determine the values of all the model parameters characterizing the energy spectrum structure and the charge-carrier system properties including the band-averaged electron mobility and the degree of the dispersion law asymmetry. The observed changes in these parameters with increasing lanthanum content are discussed, taking into account the earlier revealed specific impact of calcium ions on the energy spectrum structure in Y-based high-temperature superconductors.

  6. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    Science.gov (United States)

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    Direct analysis in real time (DART) is a powerful ionising technique for the quick and easy detection of various organic molecules without any sample preparation steps, but its lack of quantitation capacity limits its extensive use in the field of phytochemical analysis. The objective was to devise a new system that utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate, and three major lignan compounds were quantitated by three different methods (UV densitometry, TLC-DART-MS and HPLC-UV) to compare the efficiency of each method. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed. The DART-MS total ion current chromatogram was recorded for the entire TLC plate. The concentration of each lignan compound was calculated from a calibration curve established with the standard compound. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate, and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. The TLC-DART-MS system for the quantitation of natural products showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was successfully established. This system was shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.
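The calibration-curve step above (concentration computed from a least-squares line fitted to standards) is the generic workhorse of this kind of quantitation; a minimal sketch follows, with entirely invented concentrations and peak areas rather than the paper's actual calibration data:

```python
import numpy as np

# Hypothetical calibration standards: concentration (ug per spot) vs.
# integrated ion-current peak area from the TLC-DART-MS chromatogram
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
area = np.array([1.1e4, 2.0e4, 4.2e4, 8.1e4, 16.3e4])

# Least-squares line: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Linearity check: coefficient of determination R^2
pred = slope * conc + intercept
ss_res = np.sum((area - pred) ** 2)
ss_tot = np.sum((area - area.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# Quantify an unknown sample spot from its measured peak area
unknown_area = 6.0e4
unknown_conc = (unknown_area - intercept) / slope
```

An R^2 close to 1 over the working range is what "better linearity" refers to when the abstract compares TLC-DART-MS against TLC densitometry.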

  7. Quantitative HAZOP analysis of an ethylene/ethane hybrid distillation-membrane separation system

    Institute of Scientific and Technical Information of China (English)

    冉慧丽; 肖武; 王明洋; 贺高红

    2012-01-01

    Hybrid distillation-membrane separation processes can be used to separate many industrially important systems that are difficult or impossible to separate by simple continuous distillation because the phase behavior contains an azeotrope, a tangent pinch, or an overall low relative volatility. So far, however, HAZOP analysis of such hybrid separation processes has rarely been reported. A simulation model of an ethylene/ethane hybrid distillation-membrane separation system was built using the Membrane Extension module developed by our research group based on the UniSim Design software. Quantitative HAZOP analysis of the hybrid distillation-membrane separation system was achieved by combining HAZOP with process simulation. The effects of different grades of deviation in parameters, including feed temperature, feed flow rate and side-draw flow rate of the rectifying column, were then investigated. On the basis of the simulated results, the deviations used in the HAZOP analysis were first quantified according to the safe operating ranges of the equipment, yielding safety thresholds. Second, the results of the HAZOP analysis were quantified: the causes of the deviations, the consequences of an accident and their severity were analyzed according to the deviation levels, and hazard levels were established. Finally, an analysis report for the ethylene/ethane hybrid distillation-membrane separation system was provided based on the deviations and the results of the HAZOP analysis.

  8. Quantitative and qualitative research on service quality evaluation system in NGN

    Institute of Scientific and Technical Information of China (English)

    LIU Lu; ZHOU Wen-an; SONG Jun-de

    2009-01-01

    With the development of next generation network (NGN), reasonable service quality evaluation is essential in network management. Based on NGN service characteristics, this article presents a comprehensive service quality evaluation system from two perspectives: quantitative and qualitative. From the quantitative point of view, this article brings forward the normalized service level achievement function (NSLA function) at the technical layer. Also, with the mean opinion score (MOS) mode, it proposes customer satisfaction assessment methods at the customer perception layer. From the qualitative perspective, a hierarchical model is established that maps the upper customer perception layer to the lower service quality parameters, so that the influence of different service parameters on the customer satisfaction degree can be derived by the fuzzy analytic hierarchy process (FAHP) algorithm. Quantitative and qualitative evaluations together form a comprehensive solution which is universal, customer-oriented and flexible. Demonstrated on a representative voice service, the proposed system proves reliable and applicable to service evaluation in NGN.

  9. Optimal climate policy is a utopia. From quantitative to qualitative cost-benefit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Van den Bergh, Jeroen C.J.M. [Department of Spatial Economics, Faculty of Economics and Business Administration, and Institute for Environmental Studies, Free University, De Boelelaan 1105, 1081 HV, Amsterdam (Netherlands)

    2004-04-20

    The dominance of quantitative cost-benefit analysis (CBA) and optimality concepts in the economic analysis of climate policy is criticised. Among other things, it is argued to be based on a misplaced interpretation of policy for a complex climate-economy system as being analogous to individual inter-temporal welfare optimisation. The transfer of quantitative CBA and optimality concepts reflects an overly ambitious approach that does more harm than good. An alternative approach is to focus attention on extreme events, structural change and complexity. It is argued that a qualitative rather than a quantitative CBA that takes account of these aspects can support the adoption of a minimax regret approach or precautionary principle in climate policy. This means: implement stringent GHG reduction policies as soon as possible.

  10. Quantitative phosphoproteomic analysis using iTRAQ method.

    Science.gov (United States)

    Asano, Tomoya; Nishiuchi, Takumi

    2014-01-01

    The MAPK (mitogen-activated kinase) cascade plays important roles in plant perception of and reaction to developmental and environmental cues. Phosphoproteomics is useful to identify target proteins regulated by MAPK-dependent signaling pathways. Here, we introduce quantitative phosphoproteomic analysis using a chemical labeling method. The isobaric tag for relative and absolute quantitation (iTRAQ) method is an MS-based technique to quantify protein expression among up to eight different samples in one experiment. In this technique, peptides are labeled with stable isotope-coded covalent tags. We perform quantitative phosphoproteomics comparing Arabidopsis wild type and a stress-responsive mapkk mutant after phytotoxin treatment. To comprehensively identify the downstream phosphoproteins of MAPKK, total proteins were extracted from phytotoxin-treated wild-type and mapkk mutant plants. The phosphoproteins were purified by the Pro-Q(®) Diamond Phosphoprotein Enrichment Kit and were digested with trypsin. The resulting peptides were labeled with iTRAQ reagents and were quantified and identified by a MALDI TOF/TOF analyzer. We identified many phosphoproteins whose levels were decreased in the mapkk mutant compared with wild type.

  11. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
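The pseudo-predator construction described above (bootstrap-resample prey signatures, mix them according to a known diet, renormalize) can be sketched directly; the prey names, fatty acid dimensions, diet proportions, and bootstrap sample size below are invented for illustration and do not reproduce the paper's objective sample-size algorithm:

```python
import numpy as np

rng = np.random.default_rng(42)

def pseudo_predator(prey_sigs: dict, diet: dict, n_boot: int) -> np.ndarray:
    """Build one pseudo-predator fatty acid signature with a known diet.

    prey_sigs : maps prey type -> (n_samples x n_fa) signature matrix,
                each row a proportion vector summing to 1
    diet      : maps prey type -> diet proportion (proportions sum to 1)
    n_boot    : bootstrap sample size drawn per prey type
    """
    n_fa = next(iter(prey_sigs.values())).shape[1]
    sig = np.zeros(n_fa)
    for prey, prop in diet.items():
        sigs = prey_sigs[prey]
        idx = rng.integers(0, len(sigs), size=n_boot)  # bootstrap rows
        sig += prop * sigs[idx].mean(axis=0)           # diet-weighted mean
    return sig / sig.sum()                             # renormalize to 1

# Toy prey library: 3 fatty acids, 2 prey types with distinct signatures
prey_library = {
    "cod":     rng.dirichlet([8, 1, 1], size=30),
    "herring": rng.dirichlet([1, 8, 1], size=30),
}
true_diet = {"cod": 0.7, "herring": 0.3}
sig = pseudo_predator(prey_library, true_diet, n_boot=25)
```

Feeding such pseudo-predators, whose true diet is known by construction, into a QFASA estimator is what allows its bias and variance to be evaluated by simulation; the paper's contribution is choosing `n_boot` objectively rather than arbitrarily, which this sketch does not attempt.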

  12. Fluorescent foci quantitation for high-throughput analysis

    Science.gov (United States)

    Ledesma-Fernández, Elena; Thorpe, Peter H.

    2015-01-01

    A number of cellular proteins localize to discrete foci within cells, for example DNA repair proteins, microtubule organizing centers, P bodies or kinetochores. It is often possible to measure the fluorescence emission from tagged proteins within these foci as a surrogate for the concentration of that specific protein. We wished to develop tools that would allow quantitation of fluorescence foci intensities in high-throughput studies. As proof of principle we have examined the kinetochore, a large multi-subunit complex that is critical for the accurate segregation of chromosomes during cell division. Kinetochore perturbations lead to aneuploidy, which is a hallmark of cancer cells. Hence, understanding kinetochore homeostasis and regulation are important for a global understanding of cell division and genome integrity. The 16 budding yeast kinetochores colocalize within the nucleus to form a single focus. Here we have created a set of freely-available tools to allow high-throughput quantitation of kinetochore foci fluorescence. We use this ‘FociQuant’ tool to compare methods of kinetochore quantitation and we show proof of principle that FociQuant can be used to identify changes in kinetochore protein levels in a mutant that affects kinetochore function. This analysis can be applied to any protein that forms discrete foci in cells. PMID:26290880
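    A minimal sketch of this kind of focus quantitation (not the FociQuant implementation; the window size and median-background estimator are illustrative assumptions):

```python
import numpy as np

def focus_intensity(img, radius=2):
    """Background-subtracted integrated intensity of the brightest focus.

    img: 2-D array of pixel intensities. The focus is taken as a square
    window of half-width `radius` around the brightest pixel; background
    is estimated as the median of all remaining pixels.
    """
    img = np.asarray(img, dtype=float)
    r, c = np.unravel_index(np.argmax(img), img.shape)
    r0, r1 = max(r - radius, 0), r + radius + 1
    c0, c1 = max(c - radius, 0), c + radius + 1
    window = img[r0:r1, c0:c1]
    mask = np.ones_like(img, dtype=bool)
    mask[r0:r1, c0:c1] = False
    background = np.median(img[mask])
    return window.sum() - background * window.size

# Synthetic image: flat background of 10 with a 3x3 focus peaking at 100
img = np.full((20, 20), 10.0)
img[9:12, 9:12] = 50.0
img[10, 10] = 100.0
print(focus_intensity(img, radius=1))  # → 410.0
```

Comparing this integrated intensity between wild-type and mutant cells is the kind of readout used to detect changes in kinetochore protein levels.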

  13. Dependence of quantitative accuracy of CT perfusion imaging on system parameters

    Science.gov (United States)

    Li, Ke; Chen, Guang-Hong

    2017-03-01

    Deconvolution is a popular method to calculate parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress noise associated with the process. These additional complexities confound the understanding of deconvolution-based CTP imaging systems and of how their quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly in emergent clinical situations (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters to CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
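    The deconvolution step can be illustrated with a common truncated-SVD formulation, where the truncation threshold plays the role of the regularization strength discussed above. This is a noiseless toy sketch, not the authors' cascaded-systems model; the arterial input function, residue function and threshold are invented:

```python
import numpy as np

def svd_deconvolve(aif, tissue, dt, rel_thresh=0.1):
    """Recover k(t) = CBF * R(t) from tissue(t) = dt * (AIF conv k)(t).

    The AIF convolution matrix is inverted by truncated SVD; singular
    values below rel_thresh * s_max are discarded. rel_thresh is the
    regularization strength trading noise suppression against accuracy.
    """
    n = len(aif)
    # Lower-triangular Toeplitz convolution matrix of the AIF
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > rel_thresh * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))

# Simulated ground truth: CBF-scaled exponential residue function
dt = 0.5
t = np.arange(0, 30, dt)
aif = np.exp(-t / 3.0)            # toy arterial input function
k_true = 0.6 * np.exp(-t / 8.0)   # CBF = 0.6 at t = 0
tissue = dt * np.convolve(aif, k_true)[:len(t)]
k_est = svd_deconvolve(aif, tissue, dt, rel_thresh=1e-6)
print(round(float(k_est[0]), 3))  # → 0.6
```

With noisy source images a much larger `rel_thresh` would be needed, and the resulting bias in the recovered CBF is exactly the kind of dependence the cascaded systems analysis quantifies.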

  14. Lessons Learned from Quantitative Dynamical Modeling in Systems Biology

    Science.gov (United States)

    Bachmann, Julie; Matteson, Andrew; Schelke, Max; Kaschek, Daniel; Hug, Sabine; Kreutz, Clemens; Harms, Brian D.; Theis, Fabian J.; Klingmüller, Ursula; Timmer, Jens

    2013-01-01

    Due to the high complexity of biological data it is difficult to disentangle cellular processes relying only on intuitive interpretation of measurements. A Systems Biology approach that combines quantitative experimental data with dynamic mathematical modeling promises to yield deeper insights into these processes. Nevertheless, with growing complexity and increasing amounts of quantitative experimental data, building realistic and reliable mathematical models can become a challenging task: the quality of experimental data has to be assessed objectively, unknown model parameters need to be estimated from the experimental data, and numerical calculations need to be precise and efficient. Here, we discuss, compare and characterize the performance of computational methods throughout the process of quantitative dynamic modeling using two previously established examples, for which quantitative, dose- and time-resolved experimental data are available. In particular, we present an approach that allows one to determine the quality of experimental data in an efficient, objective and automated manner. Using this approach, data generated by different measurement techniques, and even in single replicates, can be reliably used for mathematical modeling. For the estimation of unknown model parameters, the performance of different optimization algorithms was compared systematically. Our results show that deterministic derivative-based optimization employing the sensitivity equations in combination with a multi-start strategy based on Latin hypercube sampling outperforms the other methods by orders of magnitude in accuracy and speed. Finally, we investigated transformations that yield a more efficient parameterization of the model and therefore lead to a further enhancement in optimization performance. We provide a freely available open source software package that implements the algorithms and examples compared here. PMID:24098642
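    The sampling step of the Latin hypercube multi-start strategy can be sketched as follows. Only the generation of stratified start points is shown, not the paper's software package or the derivative-based local optimizer; the parameter bounds are hypothetical:

```python
import numpy as np

def latin_hypercube(n_starts, bounds, rng):
    """n_starts stratified samples within per-parameter (low, high) bounds.

    Each parameter's range is split into n_starts equal bins; every bin
    is hit exactly once, and bins are paired randomly across parameters.
    Each sample then seeds one run of a local optimizer.
    """
    bounds = np.asarray(bounds, dtype=float)
    d = len(bounds)
    # One uniform point per bin, then an independent bin permutation per dimension
    u = (rng.random((n_starts, d)) + np.arange(n_starts)[:, None]) / n_starts
    for j in range(d):
        rng.shuffle(u[:, j])
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

rng = np.random.default_rng(0)
starts = latin_hypercube(10, bounds=[(0.0, 1.0), (-5.0, 5.0)], rng=rng)
print(starts.shape)  # → (10, 2)
```

Running a deterministic local fit from each of these starts and keeping the best objective value is what makes the multi-start scheme robust to local optima.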

  15. Lessons learned from quantitative dynamical modeling in systems biology.

    Directory of Open Access Journals (Sweden)

    Andreas Raue

    Full Text Available Due to the high complexity of biological data it is difficult to disentangle cellular processes relying only on intuitive interpretation of measurements. A Systems Biology approach that combines quantitative experimental data with dynamic mathematical modeling promises to yield deeper insights into these processes. Nevertheless, with growing complexity and increasing amounts of quantitative experimental data, building realistic and reliable mathematical models can become a challenging task: the quality of experimental data has to be assessed objectively, unknown model parameters need to be estimated from the experimental data, and numerical calculations need to be precise and efficient. Here, we discuss, compare and characterize the performance of computational methods throughout the process of quantitative dynamic modeling using two previously established examples, for which quantitative, dose- and time-resolved experimental data are available. In particular, we present an approach that allows one to determine the quality of experimental data in an efficient, objective and automated manner. Using this approach, data generated by different measurement techniques, and even in single replicates, can be reliably used for mathematical modeling. For the estimation of unknown model parameters, the performance of different optimization algorithms was compared systematically. Our results show that deterministic derivative-based optimization employing the sensitivity equations in combination with a multi-start strategy based on Latin hypercube sampling outperforms the other methods by orders of magnitude in accuracy and speed. Finally, we investigated transformations that yield a more efficient parameterization of the model and therefore lead to a further enhancement in optimization performance. We provide a freely available open source software package that implements the algorithms and examples compared here.

  16. Quantitative multiphase analysis of archaeological bronzes by neutron diffraction

    CERN Document Server

    Siano, S; Celli, M; Pini, R; Salimbeni, R; Zoppi, M; Kockelmann, W A; Iozzo, M; Miccio, M; Moze, O

    2002-01-01

    In this paper, we report the first investigation of the potential of neutron diffraction to characterize archaeological bronze artifacts. The preliminary feasibility of phase and structural analysis was demonstrated on standardised specimens with a typical bronze alloy composition. These were realised through different hardening and annealing cycles, simulating possible ancient working techniques. The resulting Bragg peak widths depended strongly on the working treatment, thus providing an important analytical element for investigating ancient making techniques. The diagnostic criteria developed on the standardised specimens were then applied to study two Etruscan museum pieces. Quantitative multiphase analysis by Rietveld refinement of the diffraction patterns was successfully demonstrated. Furthermore, the analysis of patterns associated with different artifact elements also highlighted some distinctive prospects for neutron diffraction diagnostics in archaeometric applications. (orig.)

  17. Interchangeability between Placido disc and Scheimpflug system: quantitative and qualitative analysis Permutabilidade entre o disco de Plácido e o sistema Scheimpflug: análise quantitativa e qualitativa

    Directory of Open Access Journals (Sweden)

    Vinícius Silbiger de Stefano

    2010-08-01

    Full Text Available PURPOSE: Many systems, such as those based on the Scheimpflug principle, try to replace Placido disc-based topographers. The purpose of this study was to check whether they are interchangeable. METHODS: Quantitative analysis evaluated data obtained from EyeSys and Pentacam, i.e. simulated keratometric values, in addition to the flattest and steepest keratometric values. Sixty-three maps from each device (EyeSys scale = 0.5 D; Pentacam scale = 0.25 D) were used for the comparison. Qualitative analysis selected 10 EyeSys and 15 Pentacam topographies used in the quantitative evaluation. Aspheric, keratoconus-suspect (KS) and established keratoconus corneas were included. Four groups (children [CH], non-physician adults [AD], residents in ophthalmology [OP] and refractive surgeons [RS]) were asked to match the topographies belonging to the same eye. RESULTS: Analysis showed that the parameters are correlated; however, they are not clinically similar. In the qualitative analysis, the percentage of correct matches increased when KS was removed. The CH group was statistically different from every other group in these comparisons. When only KS was considered, CH vs. OP, CH vs. RS and AD vs. RS remained statistically different. AD vs. OP showed no relevant difference in any comparison. CONCLUSIONS: The systems are not fully interchangeable, yet they are correlated. Practitioners who are adapting to Pentacam should use the 0.25 D scale maps and transform formulas that use EyeSys parameters. Only with persistent training may the topographies be properly matched; KS corneas are more difficult to pair correctly.

  18. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    OpenAIRE

    A. O. Domanov

    2014-01-01

    The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St-Petersburg, Kronstadt and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial chara...

  19. Quantitative analysis of sideband coupling in photoinduced force microscopy

    Science.gov (United States)

    Jahng, Junghoon; Kim, Bongsu; Lee, Eun Seong; Potma, Eric Olaf

    2016-11-01

    We present a theoretical and experimental analysis of the cantilever motions detected in photoinduced force microscopy (PiFM) using the sideband coupling detection scheme. In sideband coupling, the cantilever dynamics are probed at a combination frequency of a fundamental mechanical eigenmode and the modulation frequency of the laser beam. Using this detection mode, we develop a method for reconstructing the modulated photoinduced force gradient from experimental parameters in a quantitative manner. We show evidence, both theoretically and experimentally, that the sideband coupling detection mode provides PiFM images with superior contrast compared to images obtained when detecting the cantilever motions directly at the laser modulation frequency.

  20. Quantitative and comparative analysis of hyperspectral data fusion performance

    Institute of Scientific and Technical Information of China (English)

    王强; 张晔; 李硕; 沈毅

    2002-01-01

    Hyperspectral data fusion technique is the key to hyperspectral data processing in recent years. Many fusion methods have been proposed, but little research has been done to evaluate the performances of different data fusion methods. In order to meet the urgent need, quantitative correlation analysis (QCA) is proposed to analyse and compare the performances of different fusion methods directly from data before and after fusion. Experiment results show that the new method is effective and the results of comparison are in agreement with the results of application.

  1. Quantitative approach to small-scale nonequilibrium systems

    DEFF Research Database (Denmark)

    Dreyer, Jakob K; Berg-Sørensen, Kirstine; Oddershede, Lene B

    2006-01-01

    fluctuations are ignored, misinterpretation of measured quantities such as interaction forces, potentials, and constants may result. Here, we consider a particle moving in a time-dependent landscape, as, e.g., in an optical tweezers or atomic force nanoscopic measurement. Based on the Kramers equations, we...... propose an approximate but quantitative way of dealing with such an out-of-equilibrium system. The limits of this approximate description of the escape process are determined through optical tweezers experiments and comparison to simulations. Also, this serves as a recipe for how to use the proposed...

  2. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  3. Oxidized fatty acid analysis by charge-switch derivatization, selected reaction monitoring, and accurate mass quantitation.

    Science.gov (United States)

    Liu, Xinping; Moon, Sung Ho; Mancuso, David J; Jenkins, Christopher M; Guan, Shaoping; Sims, Harold F; Gross, Richard W

    2013-11-01

    A highly sensitive, specific, and robust method for the analysis of oxidized metabolites of linoleic acid (LA), arachidonic acid (AA), and docosahexaenoic acid (DHA) was developed using charge-switch derivatization, liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI MS/MS) with selected reaction monitoring (SRM) and quantitation by high mass accuracy analysis of product ions, thereby minimizing interferences from contaminating ions. Charge-switch derivatization of LA, AA, and DHA metabolites with N-(4-aminomethylphenyl)-pyridinium resulted in a 10- to 30-fold increase in ionization efficiency. Improved quantitation was accompanied by decreased false positive interferences through accurate mass measurements of diagnostic product ions during SRM transitions by ratiometric comparisons with stable isotope internal standards. The limits of quantitation were between 0.05 and 6.0 pg, with a dynamic range of 3 to 4 orders of magnitude (correlation coefficient r² > 0.99). This approach was used to quantitate the levels of representative fatty acid metabolites from wild-type (WT) and iPLA2γ(-/-) mouse liver, identifying the role of iPLA2γ in hepatic lipid second messenger production. Collectively, these results demonstrate the utility of high mass accuracy product ion analysis in conjunction with charge-switch derivatization for the highly specific quantitation of diminutive amounts of LA, AA, and DHA metabolites in biologic systems. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Quantitative analysis of impurities in aluminum alloys by laser-induced breakdown spectroscopy without internal calibration

    Institute of Scientific and Technical Information of China (English)

    LI Hong-kun; LIU Ming; CHEN Zhi-jiang; LI Run-hua

    2008-01-01

    To develop a fast and sensitive alloy elemental analysis method, a laser-induced breakdown spectroscopy (LIBS) system was established and used to carry out quantitative analysis of impurities in aluminum alloys in air at atmospheric pressure. A digital storage oscilloscope was used as the signal-recording instrument, instead of a traditional gate integrator or Boxcar averager, to reduce the cost of the whole system. Linear calibration curves in the concentration range of 4×10⁻⁵ to 10⁻² were built for Mg, Cr, Mn, Cu and Zn using absolute line intensity without internal calibration. Limits of detection for these five elements in aluminum alloy were determined to be (2-90)×10⁻⁶. It is demonstrated that LIBS can provide quantitative trace elemental analysis in alloys even without internal calibration. This approach is easy to use in the metallurgy industry and related research fields.
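    Building such a calibration curve and the associated limit of detection can be sketched as a least-squares fit plus the common 3-sigma criterion. The concentrations and intensities below are invented, and using the fit residuals in place of blank measurements is a simplifying assumption:

```python
import numpy as np

def calibration(conc, intensity):
    """Least-squares line I = a*C + b and a 3-sigma limit of detection.

    LOD = 3 * s_blank / slope, with s_blank approximated here by the
    standard deviation of the fit residuals (2 fitted parameters).
    """
    conc = np.asarray(conc, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    slope, intercept = np.polyfit(conc, intensity, 1)
    resid = intensity - (slope * conc + intercept)
    lod = 3.0 * resid.std(ddof=2) / slope
    return slope, intercept, lod

# Hypothetical Mg calibration standards (mass fraction vs. line intensity)
conc = [4e-5, 1e-4, 1e-3, 5e-3, 1e-2]
signal = [0.9, 2.1, 20.4, 100.8, 199.5]
slope, intercept, lod = calibration(conc, signal)
print(f"slope = {slope:.3g}, LOD = {lod:.1e}")
```

Unknown samples are then quantified by inverting the line, C = (I - b)/a, with no internal-standard normalisation.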

  5. The Quantitative Basis of the Arabidopsis Innate Immune System to Endemic Pathogens Depends on Pathogen Genetics.

    Directory of Open Access Journals (Sweden)

    Jason A Corwin

    2016-02-01

    Full Text Available The most established model of the eukaryotic innate immune system is derived from examples of large-effect monogenic quantitative resistance to pathogens. However, many host-pathogen interactions involve many genes of small to medium effect and exhibit quantitative resistance. We used the Arabidopsis-Botrytis pathosystem to explore the quantitative genetic architecture underlying the host innate immune system in a population of Arabidopsis thaliana. By infecting a diverse panel of Arabidopsis accessions with four phenotypically and genotypically distinct isolates of the fungal necrotroph B. cinerea, we identified a total of 2,982 genes associated with quantitative resistance using lesion area and 3,354 genes associated with camalexin production as measures of the interaction. Most genes were associated with resistance to a specific Botrytis isolate, which demonstrates the influence of pathogen genetic variation in analyzing host quantitative resistance. While known resistance genes, such as receptor-like kinases (RLKs) and nucleotide-binding site leucine-rich repeat proteins (NLRs), were found to be enriched among associated genes, they only account for a small fraction of the total genes associated with quantitative resistance. Using publicly available co-expression data, we condensed the quantitative resistance associated genes into co-expressed gene networks. GO analysis of these networks implicated several biological processes commonly connected to disease resistance, including defense hormone signaling and ROS production, as well as novel processes, such as leaf development. Validation of single-gene T-DNA knockouts in a Col-0 background demonstrates a high success rate (60%) when accounting for differences in environmental and Botrytis genetic variation. This study shows that the genetic architecture underlying the host innate immune system is extremely complex and is likely able to sense and respond to differential virulence among pathogen

  6. Quantitative chemical analysis of ocular melanosomes in the TEM.

    Science.gov (United States)

    Eibl, O; Schultheiss, S; Blitgen-Heinecke, P; Schraermeyer, U

    2006-01-01

    Melanosomes in retinal tissues of a human, monkey and rat were analyzed by EDX in the TEM. Samples were prepared by ultramicrotomy at different thicknesses. The material was mounted on Al grids and samples were analyzed in a Zeiss 912 TEM equipped with an Omega filter and an EDX detector with ultrathin window. Melanosomes consist of C and O as main components, with mole fractions of about 90 and 3-10 at.%, respectively, and small mole fractions, between 2 and 0.1 at.%, of Na, Mg, K, Si, P, S, Cl and Ca. All elements were measured quantitatively by standardless EDX with high precision. Mole fractions of the transition metals Fe, Cu and Zn were also measured. For Fe a mole fraction of less than 0.1 at.% was found, which gives the melanin its paramagnetic properties. Its mole fraction is, however, close to or below the minimum detectable mass fraction of the used equipment. Only in the human eye, and only in the retinal pigment epithelium (rpe), were the mole fractions of Zn (0.1 at.% or 5000 microg/g) and Cu clearly beyond the minimum detectable mass fraction. In the rat and monkey eye the mole fraction of Zn was at or below the minimum detectable mass fraction and could not be measured quantitatively. The obtained results yielded the chemical composition of the melanosomes in the choroidal tissue and the retinal pigment epithelium (rpe) of the three different species. The results of the chemical analysis are discussed by means of mole fraction correlation diagrams. Similarities and differences between the species are outlined. Correlation behavior was found to hold across species, e.g. the Ca-O correlation, indicating that Ca is bound to oxygen-rich sites in the melanin. These are the first quantitative analyses of melanosomes by EDX reported so far. The quantitative chemical analysis should open a deeper understanding of the metabolic processes in the eye that are of central importance for the understanding of a large number of eye-related diseases.

  7. Quantitative analysis of intermolecular interactions in orthorhombic rubrene

    Directory of Open Access Journals (Sweden)

    Venkatesha R. Hathwar

    2015-09-01

    Full Text Available Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first-principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and the energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H—H interactions. The electron density features of H—H bonding, and the interaction energy of molecular dimers connected by H—H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. The quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.

  8. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    Science.gov (United States)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement, including the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among the available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods which can capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology in these disorders.

  9. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    Science.gov (United States)

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the interest of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantifying vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters revealed adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis seemed to occur in 4% (2) of cases. In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  10. Quantitative analysis of sensor for pressure waveform measurement

    Directory of Open Access Journals (Sweden)

    Tyan Chu-Chang

    2010-01-01

    Full Text Available Abstract Background Arterial pressure waveforms contain important diagnostic and physiological information, since their contour depends on a healthy cardiovascular system [1]. A sensor was placed at the measured artery and some contact pressure was used to measure the pressure waveform. However, where should the sensor be located to detect a complete pressure waveform for diagnosis? How much contact pressure is needed over the pulse point? These two problems remain unresolved. Method In this study, we propose a quantitative analysis to evaluate the pressure waveform for locating the position and applying the appropriate force between the sensor and the radial artery. A two-axis mechanism and a modified sensor were designed to estimate the radial arterial width and detect the contact pressure. The template matching method was used to analyze the pressure waveform. In the X-axis scan, we found that the arterial diameter change waveform (ADCW) and the pressure waveform would change from small to large and then back to small again when the sensor was moved across the radial artery. In the Z-axis scan, we also found that the ADCW and the pressure waveform would change from small to large and then back to small again when the applied contact pressure continuously increased. Results In the X-axis scan, the template correlation coefficients of the left and right boundaries of the radial arterial width were 0.987 ± 0.016 and 0.978 ± 0.028, respectively. In the Z-axis scan, when the excessive contact pressure was more than 100 mm Hg, the template correlation was below 0.983. In applying force, when using the maximum amplitude as the criterion level, the lower contact pressure (r = 0.988 ± 0.004) was better than the higher contact pressure (r = 0.976 ± 0.012). Conclusions Although the optimal detection position has to be close to the middle of the radial artery, the pressure waveform also has a good completeness with
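    The template matching criterion used in this record can be sketched as a sliding Pearson correlation between a reference pulse template and a measured trace. This is an illustration on a synthetic waveform, not the authors' implementation:

```python
import numpy as np

def template_match(template, signal):
    """Best Pearson correlation between a template and every
    equal-length window of the measured waveform; returns (r, offset)."""
    t = np.asarray(template, dtype=float)
    s = np.asarray(signal, dtype=float)
    n = len(t)
    best_r, best_at = -1.0, 0
    for i in range(len(s) - n + 1):
        r = np.corrcoef(t, s[i:i + n])[0, 1]  # scale/offset invariant
        if r > best_r:
            best_r, best_at = r, i
    return best_r, best_at

# Synthetic pulse template hidden in a longer trace with an offset and gain
t_axis = np.linspace(0, 1, 50)
template = np.sin(np.pi * t_axis) ** 2
signal = np.concatenate([np.linspace(0, 1, 30),   # baseline drift
                         5 * template + 2,        # scaled, shifted pulse
                         np.linspace(1, 0, 20)])
r, at = template_match(template, signal)
print(round(r, 3), at)  # → 1.0 30
```

A correlation threshold (the record uses values around 0.98) then decides whether the detected waveform is complete enough for diagnosis.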

  11. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlate, increasingly co-occur, and have an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.
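    A "hub role" network indicator of the kind mentioned above can be illustrated with a plain degree-centrality computation; the discipline co-occurrence edges below are hypothetical, not data from the PNAS corpus:

```python
def degree_centrality(edges):
    """Degree centrality of each node in an undirected discipline
    network: degree / (n - 1). The hub is the highest-centrality node."""
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {v: d / (n - 1) for v, d in deg.items()}

# Hypothetical co-occurrence edges between disciplines
edges = [("applied math", "biology"), ("applied math", "physics"),
         ("applied math", "economics"), ("biology", "physics")]
c = degree_centrality(edges)
print(max(c, key=c.get))  # → applied math
```

Richer indicators (betweenness, eigenvector centrality) follow the same pattern of reading the hub off a discipline-level graph.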

  12. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    Science.gov (United States)

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) helps determine the type attribute of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither the measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with the Material Analysis Using Diffraction (MAUD) program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is also introduced. This method allows forensic investigators to acquire detailed information about the material evidence, which can point out directions for case detection and court proceedings.
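Once a Rietveld refinement has produced a scale factor for each phase, weight fractions follow the standard relation W_i = S_i(ZMV)_i / Σ_j S_j(ZMV)_j, where Z is the number of formula units per cell, M the formula mass and V the unit-cell volume. The refinement values below are invented for illustration, not taken from the case study.

```python
# Convert Rietveld scale factors into phase weight fractions via the
# standard W_i = S_i (Z M V)_i / sum_j S_j (Z M V)_j relation.
def weight_fractions(phases):
    # phases: name -> (scale factor S, Z, M in g/mol, V in cubic angstroms)
    zmv = {name: s * z * m * v for name, (s, z, m, v) in phases.items()}
    total = sum(zmv.values())
    return {name: val / total for name, val in zmv.items()}

phases = {
    "KNO3":   (1.2e-4, 4, 101.10, 741.0),    # invented refinement values
    "sulfur": (0.4e-5, 128, 32.07, 3299.0),  # invented refinement values
}
w = weight_fractions(phases)
assert abs(sum(w.values()) - 1.0) < 1e-9  # fractions must sum to 1
```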

  13. [Quantitative analysis of butachlor, oxadiazon and simetryn by gas chromatography].

    Science.gov (United States)

    Liu, F; Mu, W; Wang, J

    1999-03-01

    The quantitative analysis of the ingredients in 26% B-O-S (butachlor, oxadiazon and simetryn) emulsion by gas chromatography was carried out on a glass column (2 m × 3 mm i.d.) packed with 5% SE-30 on Chromosorb AW DMCS, at a column temperature of 210 degrees C and a detector temperature of 230 degrees C. The internal standard was di-n-butyl sebacate. The retention times of simetryn, the internal standard, butachlor and oxadiazon were 6.5, 8.3, 9.9 and 11.9 min, respectively. The method has a recovery of 98.62%-100.77%, and the coefficients of variation of the analysis of butachlor, oxadiazon and simetryn were 0.46%, 0.32% and 0.57%, respectively. All linear correlation coefficients were higher than 0.999.
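Internal-standard quantitation of the kind used here reduces to peak-area ratios and a response factor determined from a standard of known composition. The areas and masses below are invented for illustration.

```python
# Internal-standard GC quantitation: calibrate a response factor from a
# known standard, then convert a sample's area ratio into analyte mass.
def response_factor(area_analyte, area_is, mass_analyte, mass_is):
    return (area_analyte / area_is) / (mass_analyte / mass_is)

def quantify(area_analyte, area_is, mass_is, rf):
    return (area_analyte / area_is) * mass_is / rf

# Calibration injection with known masses (invented numbers):
rf = response_factor(area_analyte=5200, area_is=5000,
                     mass_analyte=1.04, mass_is=1.00)
# Sample injection spiked with 1.00 unit of internal standard:
m = quantify(area_analyte=2600, area_is=5000, mass_is=1.00, rf=rf)
# m is the analyte mass in the injection, in the same units as mass_is
```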

  14. Application of quantitative signal detection in the Dutch spontaneous reporting system for adverse drug reactions.

    Science.gov (United States)

    van Puijenbroek, Eugène; Diemont, Willem; van Grootheest, Kees

    2003-01-01

    The primary aim of spontaneous reporting systems (SRSs) is the timely detection of unknown adverse drug reactions (ADRs), or signal detection. Generally this is carried out by a systematic manual review of every report sent to an SRS. Statistical analysis of the data sets of an SRS, or quantitative signal detection, can provide additional information concerning a possible relationship between a drug and an ADR. We describe the role of quantitative signal detection and the way it is applied at the Netherlands Pharmacovigilance Centre Lareb. Results of the statistical analysis are incorporated into the traditional case-by-case analysis. In addition, for data-mining purposes, a list of associations of ADRs and suspected drugs that are disproportionately present in the database is periodically generated. Finally, quantitative signal detection can be used to study more complex relationships, such as drug-drug interactions and syndromes. The results of quantitative signal detection should be considered an additional source of information, complementary to the traditional analysis. Techniques for the detection of drug interactions and syndromes offer a new challenge for pharmacovigilance in the near future.
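Disproportionality screening of the sort described can be sketched with the reporting odds ratio (ROR) and its 95% confidence interval computed from a 2×2 contingency table of reports. The counts below are invented, and this is a generic disproportionality sketch rather than Lareb's exact procedure.

```python
# Reporting odds ratio (ROR) with 95% CI from a 2x2 report table.
from math import exp, log, sqrt

def ror_with_ci(a, b, c, d):
    """a: drug + ADR, b: drug + other ADRs,
    c: other drugs + ADR, d: other drugs + other ADRs."""
    ror = (a / b) / (c / d)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(ror) - 1.96 * se)
    hi = exp(log(ror) + 1.96 * se)
    return ror, lo, hi

ror, lo, hi = ror_with_ci(a=20, b=180, c=100, d=9700)  # invented counts
# A signal is conventionally flagged when the lower CI bound exceeds 1.
```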

  15. Quantitative Vulnerability Assessment of Cyber Security for Distribution Automation Systems

    Directory of Open Access Journals (Sweden)

    Xiaming Ye

    2015-06-01

    Full Text Available The distribution automation system (DAS is vulnerable to cyber-attacks due to the widespread use of terminal devices and standard communication protocols. On account of the cost of defense, it is impossible to ensure the security of every device in the DAS. Given this background, a novel quantitative vulnerability assessment model of cyber security for DAS is developed in this paper. In the assessment model, the potential physical consequences of cyber-attacks are analyzed from two levels: terminal device level and control center server level. Then, the attack process is modeled based on game theory and the relationships among different vulnerabilities are analyzed by introducing a vulnerability adjacency matrix. Finally, the application process of the proposed methodology is illustrated through a case study based on bus 2 of the Roy Billinton Test System (RBTS. The results demonstrate the reasonability and effectiveness of the proposed methodology.

  16. Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.

    Science.gov (United States)

    Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed

    2016-04-01

    Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have had catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied with a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. Geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After obtaining the current active tectonic framework of Tunisia, we discuss our results in the western Mediterranean context, trying to contribute to the understanding of western Mediterranean tectonics. With our results, we suggest that the main reason explaining the sparse and scarce seismicity of the area, in contrast with the adjacent parts of the Nubia-Eurasia boundary, is due to its extended
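One of the morphometric tools mentioned, the hypsometric integral, can be approximated directly from catchment elevations as HI = (mean − min) / (max − min); high values suggest a youthful, potentially tectonically active landscape. The elevation sample below is invented.

```python
# Approximate the hypsometric integral of a catchment from a sample of
# elevations: HI = (mean - min) / (max - min), bounded between 0 and 1.
def hypsometric_integral(elevations):
    lo, hi = min(elevations), max(elevations)
    mean = sum(elevations) / len(elevations)
    return (mean - lo) / (hi - lo)

hi_value = hypsometric_integral([220, 260, 300, 380, 460, 540, 600])  # meters
```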

  17. Glioblastoma multiforme: exploratory radiogenomic analysis by using quantitative image features.

    Science.gov (United States)

    Gevaert, Olivier; Mitchell, Lex A; Achrol, Achal S; Xu, Jiajing; Echegaray, Sebastian; Steinberg, Gary K; Cheshier, Samuel H; Napel, Sandy; Zaharchuk, Greg; Plevritis, Sylvia K

    2014-10-01

    To derive quantitative image features from magnetic resonance (MR) images that characterize the radiographic phenotype of glioblastoma multiforme (GBM) lesions and to create radiogenomic maps associating these features with various molecular data. Clinical, molecular, and MR imaging data for GBMs in 55 patients were obtained from the Cancer Genome Atlas and the Cancer Imaging Archive after local ethics committee and institutional review board approval. Regions of interest (ROIs) corresponding to enhancing and necrotic portions of tumor and to peritumoral edema were drawn, and quantitative image features were derived from these ROIs. Robust quantitative image features were defined on the basis of an intraclass correlation coefficient of at least 0.6 for a digital algorithmic modification and a test-retest analysis. The robust features were visualized by using hierarchic clustering and were correlated with survival by using Cox proportional hazards modeling. Next, these robust image features were correlated with manual radiologist annotations from the Visually Accessible Rembrandt Images (VASARI) feature set and with GBM molecular subgroups by using nonparametric statistical tests. A bioinformatic algorithm was used to create gene expression modules, defined as a set of coexpressed genes together with a multivariate model of cancer driver genes predictive of the module's expression pattern. Modules were correlated with robust image features by using the Spearman correlation test to create radiogenomic maps and to link robust image features with molecular pathways. Eighteen image features passed the robustness analysis and were further analyzed for the three types of ROIs, for a total of 54 image features. Three enhancement features were significantly correlated with survival, 77 significant correlations were found between robust quantitative features and the VASARI feature set, and seven image features were correlated with molecular subgroups (P < .05 for all). A radiogenomics map was

  18. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    Science.gov (United States)

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings, paying close attention to methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation about the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
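As one concrete instance of the kind of quantitative bias analysis advocated here, observed case and control counts can be corrected for nondifferential exposure misclassification given assumed sensitivity and specificity. The counts and bias parameters below are invented; this is a minimal textbook-style sketch, not the commentary's own worked example.

```python
# Simple bias analysis: back-correct observed exposed counts for
# exposure misclassification, then recompute the odds ratio.
def corrected_exposed(observed_exposed, total, sens, spec):
    # Solve observed = sens*true + (1 - spec)*(total - true) for true.
    return (observed_exposed - (1 - spec) * total) / (sens - (1 - spec))

# Invented data: 120/300 cases and 80/300 controls classified exposed,
# assuming sensitivity 0.85 and specificity 0.95 of the exposure measure.
a = corrected_exposed(120, 300, sens=0.85, spec=0.95)  # true exposed cases
b = corrected_exposed(80, 300, sens=0.85, spec=0.95)   # true exposed controls
corrected_or = (a / (300 - a)) / (b / (300 - b))
# Nondifferential misclassification biased the observed OR toward the null.
```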

  19. Multiparent intercross populations in analysis of quantitative traits

    Indian Academy of Sciences (India)

    Sujay Rakshit; Arunita Rakshit; J. V. Patil

    2011-04-01

    Most traits of interest to medical, agricultural and animal scientists show continuous variation and a complex mode of inheritance. DNA-based markers are being deployed to analyse such complex traits by mapping quantitative trait loci (QTL). In conventional QTL analysis, F2 and backcross populations, recombinant inbred lines, backcross inbred lines and double haploids from biparental crosses are commonly used. Introgression lines and near-isogenic lines are also being used for QTL analysis. However, such populations have major limitations: they rely predominantly on the recombination events taking place in the F1 generation, and they map only the allelic pairs present in the two parents. Second-generation mapping resources, like association mapping, nested association mapping and multiparent intercross populations, potentially address these major limitations. The potential of multiparent intercross populations in gene mapping is discussed here. In such populations, both linkage and association analysis can be conducted without encountering the limitations of structured populations. In such populations, larger genetic variation in the germplasm is accessed and various allelic and cytoplasmic interactions are assessed. For all practical purposes, across crop species, use of eight founders and a fixed population of 1000 individuals is most appropriate. Limitations of multiparent intercross populations are that they require more time and resources to generate, and they are likely to show extensive segregation for developmental traits, limiting their use in the analysis of complex traits. However, multiparent intercross population resources are likely to bring a paradigm shift in QTL analysis in plant species.

  20. Phenotypic analysis of Arabidopsis mutants: quantitative analysis of root growth.

    Science.gov (United States)

    Doerner, Peter

    2008-03-01

    INTRODUCTION The growth of plant roots is very easy to measure and is particularly straightforward in Arabidopsis thaliana, because the increase in organ size is essentially restricted to one dimension. The precise measurement of root apical growth can be used to accurately determine growth activity (the rate of growth at a given time) during development in mutants, transgenic backgrounds, or in response to experimental treatments. Root growth is measured in a number of ways, the simplest of which is to grow the seedlings in a Petri dish and record the position of the advancing root tip at appropriate time points. The increase in root length is measured with a ruler and the data are entered into Microsoft Excel for analysis. When dealing with large numbers of seedlings, however, this procedure can be tedious, as well as inaccurate. An alternative approach, described in this protocol, uses "snapshots" of the growing plants, which are taken using gel-documentation equipment (i.e., a video camera with a frame-grabber unit, now commonly used to capture images from ethidium-bromide-stained electrophoresis gels). The images are analyzed using publicly available software (NIH-Image), which allows the user simply to cut and paste data into Microsoft Excel.

  1. Epistasis analysis for quantitative traits by functional regression model.

    Science.gov (United States)

    Zhang, Futao; Boerwinkle, Eric; Xiong, Momiao

    2014-06-01

    The critical barrier in interaction analysis for rare variants is that most traditional statistical methods for testing interactions were originally designed for testing the interaction between common variants, and they are difficult to apply to rare variants because of their prohibitive computational time and low statistical power. The great challenges for successful detection of interactions with next-generation sequencing (NGS) data are (1) the lack of methods for interaction analysis with rare variants, (2) severe multiple testing, and (3) time-consuming computations. To meet these challenges, we shift the paradigm of interaction analysis between two loci to interaction analysis between two sets of loci or genomic regions, and collectively test interactions between all possible pairs of SNPs within two genomic regions. In other words, we take a genome region as the basic unit of interaction analysis and use high-dimensional data reduction and functional data analysis techniques to develop a novel functional regression model to collectively test interactions between all possible pairs of single nucleotide polymorphisms (SNPs) within two genome regions. By intensive simulations, we demonstrate that the functional regression models for interaction analysis of a quantitative trait have the correct type I error rates and much higher power to detect interactions than current pairwise interaction analysis. The proposed method was applied to exome sequence data from the NHLBI's Exome Sequencing Project (ESP) and the CHARGE-S study. We discovered 27 pairs of genes showing significant interactions after applying the Bonferroni correction (P-values < 4.58 × 10(-10)) in the ESP, and 11 were replicated in the CHARGE-S study.
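The multiple-testing burden that motivates the region-based approach can be made concrete: a Bonferroni threshold scales with the number of tests, so collapsing SNP pairs into region (gene) pairs relaxes the per-test significance level by orders of magnitude. The gene and SNP counts below are round illustrative figures, not taken from the ESP data.

```python
# Bonferroni thresholds for pairwise interaction testing at two scales.
def bonferroni_threshold(alpha, n_tests):
    return alpha / n_tests

n_genes = 20_000
gene_pairs = n_genes * (n_genes - 1) // 2           # ~2.0e8 region pairs
per_gene_pair = bonferroni_threshold(0.05, gene_pairs)

n_snps = 1_000_000
snp_pairs = n_snps * (n_snps - 1) // 2              # ~5.0e11 SNP pairs
per_snp_pair = bonferroni_threshold(0.05, snp_pairs)
# Testing region pairs instead of SNP pairs leaves a far less stringent
# per-test threshold, one reason collective region-level tests gain power.
```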

  2. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Highlights: • Quantitative image analysis shows potential to monitor activated sludge systems. • Staining techniques increase the potential for detection of operational problems. • Chemometrics combined with quantitative image analysis is valuable for process monitoring. Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and the proliferation of specific microorganisms. In fact, identification of bacterial communities and protozoa by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used over the years for the assessment of aggregate and filamentous bacteria properties. These procedures are able to provide an ever-growing amount of data for wastewater treatment processes, in which chemometric techniques can be a valuable tool. However, the determination of microbial communities' properties remains a challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed, highlighting aggregate structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  3. anNET: a tool for network-embedded thermodynamic analysis of quantitative metabolome data

    Directory of Open Access Journals (Sweden)

    Zamboni Nicola

    2008-04-01

    Full Text Available Abstract Background Compared to other omics techniques, quantitative metabolomics is still in its infancy. Complex sample preparation and analytical procedures render exact quantification extremely difficult. Furthermore, not only the actual measurement but also the subsequent interpretation of quantitative metabolome data to obtain mechanistic insights is still lagging behind current expectations. Recently, the method of network-embedded thermodynamic (NET) analysis was introduced to address some of these open issues. Building upon principles of thermodynamics, this method allows a quality check of measured metabolite concentrations and makes it possible to spot metabolic reactions where active regulation potentially controls metabolic flux. So far, however, widespread application of NET analysis in metabolomics labs was hindered by the absence of suitable software. Results We have developed in Matlab a generalized software tool called 'anNET' that affords a user-friendly implementation of the NET analysis algorithm. anNET supports the analysis of any metabolic network for which a stoichiometric model can be compiled. The model size can span from a single reaction to a complete genome-wide network reconstruction including compartments. anNET can (i) test quantitative data sets for thermodynamic consistency, (ii) predict metabolite concentrations beyond the actually measured data, (iii) identify putative sites of active regulation in the metabolic reaction network, and (iv) help in localizing errors in data sets that were found to be thermodynamically infeasible. We demonstrate the application of anNET with three published Escherichia coli metabolome data sets. Conclusion Our user-friendly and generalized implementation of the NET analysis method in the software anNET allows users to rapidly integrate quantitative metabolome data obtained from virtually any organism.
We envision that use of anNET in labs working on quantitative metabolomics will provide the
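The thermodynamic consistency check at the heart of NET analysis can be sketched for a single reaction: a flux-carrying reaction is feasible only if ΔG = ΔG°' + RT ln Q < 0, where Q is the reaction quotient formed from measured metabolite concentrations. This is a one-reaction toy under invented numbers, not the network-level constraint analysis anNET actually performs.

```python
# Single-reaction feasibility check: dG = dG0' + R*T*ln(Q) must be
# negative for the reaction to carry flux in the forward direction.
from math import log

R = 8.314e-3   # kJ/(mol*K)
T = 298.15     # K

def delta_g(dg0, substrates, products):
    """dg0 in kJ/mol; concentrations in mol/L (activities assumed equal)."""
    q = 1.0
    for c in products:
        q *= c
    for c in substrates:
        q /= c
    return dg0 + R * T * log(q)

# Invented example: dG0' = -5 kJ/mol, substrate at 1 mM, product at 5 mM.
dg = delta_g(-5.0, substrates=[1e-3], products=[5e-3])
feasible = dg < 0   # these concentrations are thermodynamically consistent
```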

  4. Analysis of hybrid solar systems

    Science.gov (United States)

    Swisher, J.

    1980-10-01

    The TRNSYS simulation program was used to evaluate the performance of active charge/passive discharge solar systems with water as the working fluid. TRNSYS simulations are used to evaluate the heating performance and cooling augmentation provided by systems in several climates. The results of the simulations are used to develop a simplified analysis tool similar to the F-chart and Phi-bar procedures used for active systems. This tool, currently in a preliminary stage, should provide the designer with quantitative performance estimates for comparison with other passive, active, and nonsolar heating and cooling designs.

  5. Quantitative CT analysis of small pulmonary vessels in lymphangioleiomyomatosis

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Katsutoshi, E-mail: kando@juntendo.ac.jp [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Tobino, Kazunori [Department of Respiratory Medicine, Iizuka Hospital, 3-83 Yoshio-Machi, Iizuka-City, Fukuoka 820-8505 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Kurihara, Masatoshi; Kataoka, Hideyuki [Pneumothorax Center, Nissan Tamagawa Hospital, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Doi, Tokuhide [Fukuoka Clinic, 7-18-11 Umeda, Adachi-Ku, Tokyo 123-0851 (Japan); Hoshika, Yoshito [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan); Takahashi, Kazuhisa [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); Seyama, Kuniaki [Department of Internal Medicine, Division of Respiratory Medicine, Juntendo University Graduate School of Medicine, 2-1-1 Hongo, Bunkyo-Ku, Tokyo 113-8421 (Japan); The Study Group of Pneumothorax and Cystic Lung Diseases, 4-8-1 Seta, Setagaya-Ku, Tokyo 158-0095 (Japan)

    2012-12-15

    Background: Lymphangioleiomyomatosis (LAM) is a destructive lung disease that shares clinical, physiologic, and radiologic features with chronic obstructive pulmonary disease (COPD). This study aims to identify those features that are unique to LAM by using quantitative CT analysis. Methods: We measured total cross-sectional areas of small pulmonary vessels (CSA) less than 5 mm² and 5–10 mm² and calculated the percentages of those lung areas (%CSA), respectively, in 50 LAM and 42 COPD patients. The extent of cystic destruction (LAA%) and mean parenchymal CT value were also calculated and correlated with pulmonary function. Results: The diffusing capacity for carbon monoxide/alveolar volume (DLCO/VA %predicted) was similar for both groups (LAM, 44.4 ± 19.8% vs. COPD, 45.7 ± 16.0%, p = 0.763), but less tissue damage occurred in LAM than COPD (LAA% 21.7 ± 16.3% vs. 29.3 ± 17.0%; p < 0.05). Pulmonary function correlated negatively with LAA% (p < 0.001) in both groups, yet the correlation with %CSA was significant only in COPD (p < 0.001). When the same analysis was conducted in two groups with equal levels of LAA% and DLCO/VA %predicted, %CSA and mean parenchymal CT value were still greater for LAM than COPD (p < 0.05). Conclusions: Quantitative CT analysis revealing a correlation between cystic destruction and CSA in COPD but not in LAM indicates that this approach successfully reflects the different mechanisms governing the two pathologic courses. Such determination of small pulmonary vessel density may serve to differentiate LAM from COPD even in patients with severe lung destruction.
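The %CSA metric can be sketched as the summed cross-sectional area of vessels below a size threshold, divided by the lung area on the slice and expressed as a percentage. The vessel and lung areas below are invented.

```python
# Percentage of lung area occupied by small pulmonary vessels (%CSA):
# sum sub-threshold vessel cross-sections and normalize by lung area.
def percent_csa(vessel_areas_mm2, lung_area_mm2, upper_mm2=5.0):
    small = sum(a for a in vessel_areas_mm2 if a < upper_mm2)
    return 100.0 * small / lung_area_mm2

# Invented segmentation output: one 7.5 mm^2 vessel exceeds the 5 mm^2 cut.
pcsa = percent_csa([0.8, 1.2, 3.9, 4.6, 7.5], lung_area_mm2=15000.0)
```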

  6. The Quantitative Analysis of Chennai Automotive Industry Cluster

    Science.gov (United States)

    Bhaskaran, Ethirajan

    2016-07-01

    Chennai, also called the Detroit of India, hosts an automotive industry producing over 40% of India's vehicles and components. During 2001-2002, the Automotive Component Industries (ACI) in the Ambattur, Thirumalizai and Thirumudivakkam Industrial Estates, Chennai faced problems in infrastructure, technology, procurement, production and marketing. The objective is to study the quantitative performance of the Chennai automotive industry cluster before (2001-2002) and after (2008-2009) the Cluster Development Approach (CDA). The methodology adopted is collection of primary data from 100 ACI using a quantitative questionnaire and analysis using Correlation Analysis (CA), Regression Analysis (RA), the Friedman Test (FMT) and the Kruskal-Wallis Test (KWT). The CA computed for the different sets of variables reveals a high degree of relationship between the variables studied. The RA models constructed establish a strong relationship between the dependent variable and a host of independent variables. The models proposed here reveal the approximate relationship in a closer form. The KWT shows no significant difference between the three location clusters with respect to net profit, production cost, marketing costs, procurement costs and gross output, supporting the view that each location has contributed uniformly to the development of the automobile component cluster. The FMT shows no significant difference between industrial units in respect of costs such as production, infrastructure, technology, marketing and net profit. To conclude, the automotive industries have fully utilized the physical infrastructure and centralised facilities by adopting the CDA and now export their products to North America, South America, Europe, Australia, Africa and Asia. The value chain analysis models have been implemented in all the cluster units. This Cluster Development Approach (CDA) model can be implemented in industries of underdeveloped and developing countries for cost reduction and productivity

  7. QuASAR: quantitative allele-specific analysis of reads.

    Science.gov (United States)

    Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger

    2015-04-15

    Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
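The null hypothesis of a 1:1 allelic ratio mentioned above is classically tested with a two-sided exact binomial test on the ref/alt read counts at a heterozygous site. This minimal sketch omits the genotype-uncertainty, base-call error and over-dispersion modeling that QuASAR adds on top.

```python
# Two-sided exact binomial test of a 1:1 allelic ratio: sum the
# probabilities of all outcomes no more likely than the observed count.
from math import comb

def binom_two_sided(k, n, p=0.5):
    def pmf(i):
        return comb(n, i) * p ** i * (1 - p) ** (n - i)
    pk = pmf(k)
    return min(1.0, sum(pmf(i) for i in range(n + 1)
                        if pmf(i) <= pk * (1 + 1e-9)))

p_balanced = binom_two_sided(48, 100)  # near 1:1 -> no evidence of ASE
p_skewed = binom_two_sided(80, 100)    # strong imbalance -> ASE candidate
```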

  8. Quantitative evaluation of esophageal scintigraphy in systemic sclerosis

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, Masaya; Nakajima, Kenichi; Konishi, Shouta; Sato, Shinichi; Takehara, Kazuhiko; Tonami, Norihisa [Kanazawa Univ. (Japan). School of Medicine

    2001-11-01

    Esophageal involvement by systemic sclerosis (SSc) is frequent. The purpose of this study was to evaluate esophageal motility disorders quantitatively. We investigated esophageal scintigraphy in 22 patients with SSc. Esophageal scintigraphy was performed with swallows of physiological saline in the supine position, and swallows of soup in the supine and sitting positions. Data were acquired at 0.5 sec per frame for 192 frames in the anterior view. We employed a condensed image for visual evaluation, and half-life and retention rate for quantitative evaluation, obtained from time-activity curves generated from regions of interest over the whole esophagus. The half-life and retention rate were compared with the classification of Barnett, the stage of SSc, and the modified Rodnan total skin score (TSS). No significant differences were seen across the classification of Barnett or the stages of SSc. No significant difference was seen between swallowing water and soup in the supine position. The retention rate was significantly higher in the supine position than in the sitting position. The retention rate of the soup study in the sitting position correlated with TSS (r=0.61). Esophageal scintigraphy in the sitting position is useful for evaluation of esophageal motility in SSc. (author)
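Deriving a half-life from a time-activity curve of this kind can be sketched as a log-linear least-squares fit: regress ln(counts) on time and convert the slope to t1/2. The synthetic curve below decays with a known half-life of 5 s, which the fit recovers; it is an illustration, not the study's processing pipeline.

```python
# Estimate a transit half-life from a monoexponential time-activity curve
# via least squares on the log-transformed counts.
from math import exp, log

times = [0, 2, 4, 6, 8, 10]                         # seconds
counts = [1000 * exp(-log(2) * t / 5) for t in times]  # synthetic decay

n = len(times)
mx = sum(times) / n
my = sum(log(c) for c in counts) / n
slope = ((sum(t * log(c) for t, c in zip(times, counts)) - n * mx * my)
         / (sum(t * t for t in times) - n * mx * mx))
half_life = -log(2) / slope   # recovers the 5 s used to generate the data
```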

  9. Quantitative modeling and data analysis of SELEX experiments

    Science.gov (United States)

    Djordjevic, Marko; Sengupta, Anirvan M.

    2006-03-01

    SELEX (systematic evolution of ligands by exponential enrichment) is an experimental procedure that allows the extraction, from an initially random pool of DNA, of those oligomers with high affinity for a given DNA-binding protein. We address what is a suitable experimental and computational procedure to infer parameters of transcription factor-DNA interaction from SELEX experiments. To answer this, we use a biophysical model of transcription factor-DNA interactions to quantitatively model SELEX. We show that a standard procedure is unsuitable for obtaining accurate interaction parameters. However, we theoretically show that a modified experiment in which chemical potential is fixed through different rounds of the experiment allows robust generation of an appropriate dataset. Based on our quantitative model, we propose a novel bioinformatic method of data analysis for such a modified experiment and apply it to extract the interaction parameters for a mammalian transcription factor CTF/NFI. From a practical point of view, our method results in a significantly improved false positive/false negative trade-off, as compared to both the standard information theory based method and a widely used empirically formulated procedure.
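The fixed-chemical-potential selection step the authors propose can be sketched with the Fermi-Dirac style binding probability common to such biophysical models, p(E) = 1 / (1 + exp((E − μ)/kT)): each round multiplies a sequence's abundance by its binding probability and renormalizes. The sequence energies and pool fractions below are invented, with energies in units of kT.

```python
# One SELEX round at fixed chemical potential mu: weight each sequence
# by its binding probability, then renormalize the pool.
from math import exp

def binding_prob(energy, mu):
    return 1.0 / (1.0 + exp(energy - mu))   # energies in units of kT

def selex_round(pool, mu):
    # pool: sequence -> (fraction, binding energy)
    selected = {s: f * binding_prob(e, mu) for s, (f, e) in pool.items()}
    total = sum(selected.values())
    return {s: (v / total, pool[s][1]) for s, v in selected.items()}

pool = {"high_affinity": (0.01, -4.0), "low_affinity": (0.99, 0.0)}
after = selex_round(pool, mu=-2.0)
# The rare high-affinity sequence is enriched after the round.
```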

  10. Quantitative colorimetric-imaging analysis of nickel in iron meteorites.

    Science.gov (United States)

    Zamora, L Lahuerta; López, P Alemán; Fos, G M Antón; Algarra, R Martín; Romero, A M Mellado; Calatayud, J Martínez

    2011-02-15

    A quantitative analytical imaging approach for determining the nickel content of metallic meteorites is proposed. The approach uses a digital image of a series of standard solutions of the nickel-dimethylglyoxime coloured chelate and a meteorite sample solution subjected to the same treatment as the nickel standards for quantitation. The image is processed with suitable software to assign a colour-dependent numerical value (analytical signal) to each standard. This value is directly proportional to the analyte concentration, which facilitates construction of a calibration graph on which the value for the unknown sample can be interpolated to calculate the nickel content of the meteorite. The results thus obtained were validated by comparison with the official, ISO-endorsed spectrophotometric method for nickel. The proposed method is fairly simple and inexpensive; in fact, it uses a commercially available digital camera as the measuring instrument, and the images it provides are processed with highly user-friendly public domain software (specifically, ImageJ, developed by the National Institutes of Health and freely available for download on the Internet). In a scenario dominated by increasingly sophisticated and expensive equipment, the proposed method provides a cost-effective alternative based on simple, robust hardware that is affordable and can be readily accessed worldwide. This can be especially advantageous for countries where available resources for analytical equipment investments are scant. The proposed method is essentially an adaptation of classical chemical analysis to current, straightforward, robust, cost-effective instrumentation. Copyright © 2010 Elsevier B.V. All rights reserved.
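
    The calibration-and-interpolation step can be sketched in a few lines. The signal values and concentrations below are invented for illustration; in practice the analytical signal would be extracted from the photographed standards with software such as ImageJ.

```python
import numpy as np

# Hypothetical calibration data: image-derived analytical signal for
# nickel-dimethylglyoxime standards of known concentration (mg/L).
# Values are illustrative, not taken from the paper.
conc_std = np.array([0.0, 2.0, 4.0, 6.0, 8.0])          # mg/L Ni
signal_std = np.array([5.0, 41.0, 79.0, 118.0, 155.0])  # colour-dependent signal

# Least-squares calibration line: signal = slope * conc + intercept
slope, intercept = np.polyfit(conc_std, signal_std, 1)

def nickel_concentration(sample_signal):
    """Interpolate a sample's signal on the calibration line (mg/L)."""
    return (sample_signal - intercept) / slope

# Hypothetical meteorite sample solution signal
sample_ni = nickel_concentration(98.0)
```

    The same inversion applies to any linear colorimetric calibration; only the source of the signal (here, a digital photograph) is unusual.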

  11. Quantitative analysis of tumor burden in mouse lung via MRI.

    Science.gov (United States)

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.
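
    The intensity-based burden metric rests on a simple observation: air-filled lung is dark on MRI while tumor tissue is bright, so the mean intensity inside a lung mask rises with tumor burden. A toy sketch on synthetic arrays (not the authors' pipeline, which adds registration and the parametric segmentation model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "lung" image: dark, noisy background
image = rng.normal(loc=10.0, scale=2.0, size=(64, 64))

# Stand-in for the segmented lung region
lung_mask = np.zeros((64, 64), dtype=bool)
lung_mask[8:56, 8:56] = True

tumor_free_burden = image[lung_mask].mean()

# Add a bright tumor-like region inside the lung and re-measure
image_with_tumor = image.copy()
image_with_tumor[20:30, 20:30] = 60.0

tumor_burden = image_with_tumor[lung_mask].mean()  # increases with tumor load
```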

  12. The Impact of Arithmetic Skills on Mastery of Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Bruce K. Blaylock

    2012-01-01

    Full Text Available Over the past several years math education has moved from a period in which all calculations were done by hand to an era in which most are done with a calculator or computer. There are certainly benefits to this approach, but when one also notes the declining scores on national standardized mathematics exams, a question arises: could weak by-hand arithmetic skills carry over to understanding higher-level mathematical concepts, or is it just a spurious correlation? Eighty-seven students were tested for their ability to do simple arithmetic and algebra by hand. These scores were then regressed on three important areas of quantitative analysis: recognizing the appropriate tool to use in an analysis, creating a model to carry out the analysis, and interpreting the results of the analysis. The study revealed a significant relationship between the ability to do arithmetic calculations accurately and the abilities to recognize the appropriate tool and to create a model. It found no significant relationship between arithmetic skills and results interpretation.

  13. Quantitative analysis of complex casein hydrolysates based on chromatography and membrane

    Institute of Scientific and Technical Information of China (English)

    Qi Wei; Yu Yanjun; He Zhimin

    2006-01-01

    The enzymatic hydrolysates of casein are so complex that no effective method of quantitative analysis has been available. Common techniques, such as high-performance chromatography and SDS-PAGE, support only qualitative analysis. On the basis of membrane separation and high-performance size-exclusion chromatography (HPSEC), standard peptides in different molecular-mass ranges were prepared, and linear relationships were established between the mass concentration of the standard peptides and the ultraviolet absorption of the corresponding peak areas. Consequently, the mass concentrations of the different hydrolysates at different reaction times could be accurately calculated. The combination of chromatography and membrane separation is of great importance for the quantitative analysis of complex hydrolysates, and can also be applied to other macromolecular systems, such as carbohydrates.

  14. Formal modeling and quantitative evaluation for information system survivability based on PEPA

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Hui-qiang; ZHAO Guo-sheng

    2008-01-01

    For information systems, survivability should be considered in addition to security. To assess and improve system survivability accurately, this article proposes a formal modeling and analysis method based on stochastic process algebra. By abstracting the interactive behaviors between intruders and the information system, a survivability-oriented graph of system-state transitions is constructed. On that basis, parameters are defined and system behaviors are characterized precisely with performance evaluation process algebra (PEPA), simultaneously considering the influence of different attack modes. Ultimately the formal model for survivability is established and quantitative analysis results are obtained with the PEPA Workbench tool. Simulation experiments show the effectiveness and feasibility of the method, which can help guide the design of survivable systems.
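
    The quantitative step behind such evaluations is, at bottom, steady-state analysis of a continuous-time Markov chain derived from the process-algebra model. A minimal sketch with an invented three-state survivability chain (illustrative rates, not the authors' PEPA model):

```python
import numpy as np

# Toy survivability chain: 0 = normal, 1 = under attack, 2 = recovering.
# Q is the CTMC generator; off-diagonal entries are transition rates,
# rows sum to zero. All rates are illustrative.
Q = np.array([
    [-0.2,  0.2,  0.0],   # normal -> attacked
    [ 0.0, -1.0,  1.0],   # attacked -> recovering
    [ 0.5,  0.0, -0.5],   # recovering -> normal
])

# Steady-state distribution pi solves pi @ Q = 0 subject to sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Long-run fraction of time the system is in the normal state:
availability = pi[0]
```

    A survivability metric such as long-run availability then falls out directly from the stationary distribution.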

  15. Quantitative analysis of a scar's pliability, perfusion and metrology

    Science.gov (United States)

    Gonzalez, Mariacarla; Sevilla, Nicole; Chue-Sang, Joseph; Ramella-Roman, Jessica C.

    2017-02-01

    The primary effect of scarring is the loss of function in the affected area. Scarring also leads to physical and psychological problems that can be devastating to the patient's life. Currently, scar assessment is highly subjective and physician dependent. The examination relies on the expertise of the physician, who determines the characteristics of the scar by touch and visual inspection using the Vancouver scar scale (VSS), which categorizes scars by pigmentation, pliability, height and vascularity. In order to establish diagnostic guidelines for scar formation, a quantitative, accurate assessment method needs to be developed. An instrument capable of measuring all categories was developed; three of the aforementioned parameters are explored here. Pliability is measured with a durometer, chosen for its simplicity and quantitative output, which quantifies the resistance a surface exerts against permanent indentation. Height is measured with a profilometry system that records the location of the scar in three dimensions, and vascularity with laser speckle imaging (LSI), which shows dynamic changes in perfusion. Gelatin phantoms were utilized to measure pliability. Finally, dynamic changes in skin perfusion of volunteers' forearms undergoing pressure-cuff occlusion were measured, along with incisional scars.

  16. A Quantitative Proteomic Analysis of In Vitro Assembled Chromatin.

    Science.gov (United States)

    Völker-Albert, Moritz Carl; Pusch, Miriam Caroline; Fedisch, Andreas; Schilcher, Pierre; Schmidt, Andreas; Imhof, Axel

    2016-03-01

    The structure of chromatin is critical for many aspects of cellular physiology and is considered to be the primary medium to store epigenetic information. It is defined by the histone molecules that constitute the nucleosome, the positioning of the nucleosomes along the DNA and the non-histone proteins that associate with it. These factors help to establish and maintain a largely DNA sequence-independent but surprisingly stable structure. Chromatin is extensively disassembled and reassembled during DNA replication, repair, recombination or transcription in order to allow the necessary factors to gain access to their substrate. Despite such constant interference with chromatin structure, the epigenetic information is generally well maintained. Surprisingly, the mechanisms that coordinate chromatin assembly and ensure proper assembly are not particularly well understood. Here, we use label free quantitative mass spectrometry to describe the kinetics of in vitro assembled chromatin supported by an embryo extract prepared from preblastoderm Drosophila melanogaster embryos. The use of a data independent acquisition method for proteome wide quantitation allows a time resolved comparison of in vitro chromatin assembly. A comparison of our in vitro data with proteomic studies of replicative chromatin assembly in vivo reveals an extensive overlap showing that the in vitro system can be used for investigating the kinetics of chromatin assembly in a proteome-wide manner. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  17. Quantitative systems pharmacology: a promising approach for translational pharmacology.

    Science.gov (United States)

    Gadkar, K; Kirouac, D; Parrott, N; Ramanujan, S

    Biopharmaceutical companies have increasingly been exploring Quantitative Systems Pharmacology (QSP) as a potential avenue to address current challenges in drug development. In this paper, we discuss the application of QSP modeling approaches to challenges in the translation of preclinical findings to the clinic, a high-risk area of drug development. Three case studies highlight QSP models used to inform different questions in translational pharmacology. In the first, a mechanism-based asthma model is used to evaluate efficacy and inform biomarker strategy for a novel bispecific antibody. In the second, a mitogen-activated protein kinase (MAPK) pathway signaling model is used to make translational predictions on clinical response and to evaluate novel combination therapies. In the third, a physiologically based pharmacokinetic (PBPK) model is used to guide administration of oseltamivir in pediatric patients.

  18. Quantitative microstructure analysis of polymer-modified mortars.

    Science.gov (United States)

    Jenni, A; Herwegh, M; Zurbriggen, R; Aberle, T; Holzer, L

    2003-11-01

    Digital light, fluorescence and electron microscopy in combination with wavelength-dispersive spectroscopy were used to visualize individual polymers, air voids, cement phases and filler minerals in a polymer-modified cementitious tile adhesive. In order to investigate the evolution and processes involved in the formation of the mortar microstructure, the phase distribution in the mortar was quantified using phase-specific imaging and digital image analysis. The required sample-preparation techniques and imaging-related topics are discussed. As a case study, the different techniques were applied to obtain a quantitative characterization of a specific mortar mixture. The results indicate that the mortar fractionates during different stages, from the early fresh mortar to the final hardened mortar. This induces process-dependent enrichments of the phases at specific locations in the mortar. The approach presented provides important information for a comprehensive understanding of the functionality of polymer-modified mortars.

  19. Ozone Determination: A Comparison of Quantitative Analysis Methods

    Directory of Open Access Journals (Sweden)

    Rachmat Triandi Tjahjanto

    2012-10-01

    Full Text Available A comparison of spectrophotometric and volumetric methods for the quantitative analysis of ozone has been studied. The aim of this research was to determine the better method by considering the effect of reagent concentration and volume on the measured ozone concentration. The ozone analyzed in this research was synthesized from air and then used to ozonize methyl orange and potassium iodide solutions at different concentrations and volumes. Ozonation was carried out for 20 minutes at an air flow rate of 363 mL/min. The concentrations of the ozonized methyl orange and potassium iodide solutions were analyzed by the spectrophotometric and volumetric methods, respectively. The results show that reagent concentration and volume affect the measured ozone concentration. Based on the results of both methods, it can be concluded that the volumetric method is better than the spectrophotometric method.

  20. Quantitative genetic analysis of injury liability in infants and toddlers

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, K.; Matheny, A.P. Jr. [Univ. of Louisville Medical School, KY (United States)

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
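
    The reasoning behind "dominance but no additive variance" can be illustrated with the classical Falconer-style decomposition of twin correlations; the correlations used below are hypothetical, not the study's estimates:

```python
# Under a simple ADE-style model, the expected twin correlations are
#   rMZ = a2 + d2         (monozygotic twins share all genetic effects)
#   rDZ = a2/2 + d2/4     (dizygotic twins share half additive, quarter dominance)
# Solving the two equations gives:
#   additive variance   a2 = 4*rDZ - rMZ
#   dominance variance  d2 = 2*rMZ - 4*rDZ
# A pattern with rMZ well above twice rDZ therefore points to dominance.
def ade_components(r_mz, r_dz):
    a2 = 4 * r_dz - r_mz
    d2 = 2 * r_mz - 4 * r_dz
    return a2, d2

# Hypothetical correlations chosen to mimic a dominance-only pattern
a2, d2 = ade_components(r_mz=0.50, r_dz=0.10)
```

    With these inputs the additive estimate is slightly negative (interpreted as zero additive variance) while the dominance estimate is substantial, mirroring the pattern the abstract reports.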

  1. Quantitative analysis of gallstones using laser-induced breakdown spectroscopy.

    Science.gov (United States)

    Singh, Vivek K; Singh, Vinita; Rai, Awadhesh K; Thakur, Surya N; Rai, Pradeep K; Singh, Jagdish P

    2008-11-01

    The utility of laser-induced breakdown spectroscopy (LIBS) for categorizing different types of gallbladder stone has been demonstrated by analyzing their major and minor constituents. LIBS spectra of three types of gallstone have been recorded in the 200-900 nm spectral region. Calcium is found to be the major element in all types of gallbladder stone. The spectrophotometric method has been used to classify the stones. A calibration-free LIBS method has been used for the quantitative analysis of metal elements, and the results have been compared with those obtained from inductively coupled plasma atomic emission spectroscopy (ICP-AES) measurements. The single-shot LIBS spectra from different points on the cross section (in steps of 0.5 mm from one end to the other) of gallstones have also been recorded to study the variation of constituents from the center to the surface. The presence of different metal elements and their possible role in gallstone formation is discussed.

  2. Quantitative image analysis of WE43-T6 cracking behavior

    Science.gov (United States)

    Ahmad, A.; Yahya, Z.

    2013-06-01

    Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries and predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of intermetallic particles and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.

  3. qfasar: quantitative fatty acid signature analysis with R

    Science.gov (United States)

    Bromaghin, Jeffrey

    2017-01-01

    Knowledge of predator diets provides essential insights into their ecology, yet diet estimation is challenging and remains an active area of research. Quantitative fatty acid signature analysis (QFASA) is a popular method of estimating diet composition that continues to be investigated and extended. However, software to implement QFASA has only recently become publicly available. I summarize a new R package, qfasar, for diet estimation using QFASA methods. The package also provides functionality to evaluate and potentially improve the performance of a library of prey signature data, compute goodness-of-fit diagnostics, and support simulation-based research. Several procedures in the package have not previously been published. qfasar makes traditional and recently published QFASA diet estimation methods accessible to ecologists for the first time. Use of the package is illustrated with signature data from Chukchi Sea polar bears and potential prey species.

  4. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting art has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of paintings to build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
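
    One plausible way to quantify "variety of colors" (the abstract does not spell out its exact definition) is the Shannon entropy of an image's color histogram; the toy grayscale images below are synthetic stand-ins:

```python
import numpy as np

def color_variety(image, bins=256):
    """Shannon entropy (bits) of an 8-bit image's intensity histogram.

    One possible operationalization of 'variety of colors', not
    necessarily the measure used in the paper.
    """
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins before taking logs
    return -(p * np.log2(p)).sum()

rng = np.random.default_rng(1)
flat_image = np.full((32, 32), 128)                  # single color
varied_image = rng.integers(0, 256, size=(32, 32))   # many colors

low = color_variety(flat_image)      # zero entropy: one color only
high = color_variety(varied_image)   # near-maximal entropy
```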

  5. Quantitative analysis of forest island pattern in selected Ohio landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Bowen, G.W.; Burgess, R.L.

    1981-07-01

    The purpose of this study was to quantitatively describe the various aspects of regional distribution patterns of forest islands and relate those patterns to other landscape features. Several maps showing the forest cover of various counties in Ohio were selected as representative examples of forest patterns to be quantified. Ten thousand hectare study areas (landscapes) were delineated on each map. A total of 15 landscapes representing a wide variety of forest island patterns was chosen. Data were converted into a series of continuous variables which contained information pertinent to the sizes, shape, numbers, and spacing of woodlots within a landscape. The continuous variables were used in a factor analysis to describe the variation among landscapes in terms of forest island pattern. The results showed that forest island patterns are related to topography and other environmental features correlated with topography.

  6. Quantitative analysis of secretome from adipocytes regulated by insulin

    Institute of Scientific and Technical Information of China (English)

    Hu Zhou; Yuanyuan Xiao; Rongxia Li; Shangyu Hong; Sujun Li; Lianshui Wang; Rong Zeng; Kan Liao

    2009-01-01

    The adipocyte is not only a central player in the storage and release of energy, but also in the regulation of energy metabolism in other organs via secretion of peptides and proteins. During the pathogenesis of insulin resistance and type 2 diabetes, adipocytes are subjected to increased levels of insulin, which may have a major impact on the secretion of adipokines. We have undertaken cleavable isotope-coded affinity tag (cICAT) and label-free quantitation approaches to identify and quantify secretory factors that are differentially secreted by 3T3-L1 adipocytes with or without insulin treatment. Combining the cICAT and label-free results, 317 proteins were predicted or annotated as secretory proteins. Among these secretory proteins, 179 were significantly up-regulated and 53 significantly down-regulated. A total of 77 reported adipokines were quantified in our study, such as adiponectin, cathepsin D, cystatin C, resistin, and transferrin. Western blot analysis of these adipokines confirmed the quantitative results from mass spectrometry, and revealed individualized secretion patterns of these proteins with increasing insulin dose. In addition, 240 proteins were newly identified and quantified as secreted proteins from 3T3-L1 adipocytes in our study, most of which were up-regulated upon insulin treatment. Further comprehensive bioinformatics analysis revealed that secretory proteins in the extracellular matrix-receptor interaction pathway and the glycan structure degradation pathway were significantly up-regulated by insulin stimulation.

  7. Quantitative study of FORC diagrams in thermally corrected Stoner-Wohlfarth nanoparticles systems

    Science.gov (United States)

    De Biasi, E.; Curiale, J.; Zysler, R. D.

    2016-12-01

    The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to achieve quantitative information, requires an appropriate model of the studied system; for that reason most FORC studies are limited to qualitative analysis. In magnetic systems, thermal fluctuations "blur" the signatures of the anisotropy, volume and particle-interaction distributions, so thermal effects in nanoparticle systems conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner-Wohlfarth (easy axes along the external field orientation) nanoparticle systems. In this work, the starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature and measurement time. In order to study the quantitative degree of accuracy of our model, we built FORC diagrams for different archetypical cases of magnetic nanoparticles. Our results show that from the quantitative information obtained from the diagrams, under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy improves at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive-field profile. Indeed, our simulations predict that the volume distribution plays a secondary role, its mean value and deviation being the only important parameters. It is therefore possible to obtain an accurate result for the inversion and interaction fields despite the features of the volume distribution.

  8. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates pept

  9. Quantitative Analysis of Matrine in Liquid Crystalline Nanoparticles by HPLC

    Directory of Open Access Journals (Sweden)

    Xinsheng Peng

    2014-01-01

    Full Text Available A reversed-phase high-performance liquid chromatographic method has been developed to quantitatively determine matrine in liquid crystal nanoparticles. The chromatographic method uses an isocratic system. The mobile phase was composed of methanol-PBS (pH 6.8)-triethylamine (50:50:0.1%) at a flow rate of 1 mL/min, with an SPD-20A UV/vis detector and a detection wavelength of 220 nm. The linearity of matrine is in the range of 1.6 to 200.0 μg/mL. The regression equation is y=10706x-2959 (R2=1.0). The average recovery is 101.7% (RSD=2.22%, n=9). This method provides a simple and accurate strategy for determining matrine in liquid crystalline nanoparticles.
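
    The abstract's own calibration line can be inverted directly to turn a measured peak area into a concentration; the 50 μg/mL round-trip below is illustrative (units of y are instrument-specific, and x is assumed to be in μg/mL as stated for the linear range):

```python
# Calibration line reported in the abstract: y = 10706*x - 2959,
# with y the HPLC peak area and x the matrine concentration (ug/mL).
SLOPE, INTERCEPT = 10706.0, -2959.0

def matrine_conc(peak_area):
    """Invert the reported calibration line to get concentration (ug/mL)."""
    return (peak_area - INTERCEPT) / SLOPE

# Round-trip check with a nominal 50 ug/mL standard (hypothetical sample):
area_50 = SLOPE * 50.0 + INTERCEPT
recovered = matrine_conc(area_50)
```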

  10. Comparison of qualitative and quantitative analysis of capillaroscopic findings in patients with rheumatic diseases.

    Science.gov (United States)

    Lambova, Sevdalina Nikolova; Hermann, Walter; Müller-Ladner, Ulf

    2012-12-01

    No guidelines exist for the application of qualitative and quantitative analysis of the capillaroscopic examination in rheumatologic practice. The aims of the study were to compare qualitative and quantitative analysis of key capillaroscopic parameters in patients with common rheumatic diseases and to assess the reproducibility of the qualitative evaluation of the capillaroscopic parameters performed by two different investigators. Two hundred capillaroscopic images from 93 patients with different rheumatic diseases were analysed quantitatively and qualitatively by two different investigators. The distribution of the images according to diagnosis and microvascular abnormalities was as follows: group 1, 73 images from systemic sclerosis patients ("scleroderma" type pattern); group 2, 10 images from dermatomyositis ("scleroderma-like" pattern); group 3, 25 images from undifferentiated connective tissue disease and different forms of overlap (24 "scleroderma-like"); group 4, 26 images from systemic lupus erythematosus patients; group 5, 46 images from rheumatoid arthritis; and group 6, 20 images from primary Raynaud's phenomenon patients. All the images were mixed and blindly presented to both investigators. For comparison of the quantitative and qualitative methods, investigator 1 assessed the presence of dilated capillaries, giant capillaries and avascular areas quantitatively with the available software programme, and his estimates were compared with the results of investigator 2, who assessed the parameters qualitatively. In addition, the capillaroscopic images were evaluated qualitatively by investigators 1 and 2 for the presence of dilated capillaries, giant capillaries, avascular areas and haemorrhages. The comparison of the quantitative and qualitative assessments of the two investigators demonstrated a statistically significant difference between the two methods for the detection of dilated and giant capillaries (P < 0.05). As we further analysed the results for the capillaroscopic

  11. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
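
    Typical first-pass parameters (peak enhancement, time to peak, maximum upslope) can be read off a time-intensity profile in a few lines. The Gaussian bolus curve below is synthetic, and the paper's exact parameter set is not specified here:

```python
import numpy as np

# Synthetic first-pass time-intensity profile for one myocardial position:
# a Gaussian-shaped bolus passage peaking at t = 12 s (illustrative only).
t = np.arange(0, 30, 1.0)                             # seconds, 1 frame/s
intensity = 100 * np.exp(-0.5 * ((t - 12) / 4) ** 2)  # arbitrary units

peak = intensity.max()                 # peak enhancement
time_to_peak = t[intensity.argmax()]   # seconds from start of first pass
max_upslope = np.diff(intensity).max() # steepest frame-to-frame rise
```

    In a full analysis such parameters would be computed per position within the myocardium and rendered as the color overlays the abstract describes.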

  12. QTL analysis for some quantitative traits in bread wheat

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Quantitative trait loci (QTL) analysis was conducted in bread wheat for 14 important traits, utilizing data from four different mapping populations and different approaches of QTL analysis. Analysis of grain protein content (GPC) suggested that the major part of genetic variation for this trait is due to environmental interactions. In contrast, pre-harvest sprouting tolerance (PHST) was controlled mainly by main-effect QTL (M-QTL), with very little genetic variation due to environmental interactions; a major QTL for PHST was detected on chromosome arm 3AL. For grain weight, one QTL each was detected on chromosome arms 1AS, 2BS and 7AS. The number of QTL detected by the different methods for four growth-related traits taken together ranged from 37 to 40; the nine QTL detected by both single-locus and two-locus analyses were all M-QTL. Similarly, single-locus and two-locus QTL analyses for seven yield and yield-contributing traits in two populations allowed detection of 25 and 50 QTL, respectively, by composite interval mapping (CIM), 16 and 25 QTL by multiple-trait composite interval mapping (MCIM), and 38 and 37 QTL by two-locus analyses. These studies should prove useful in QTL cloning and wheat improvement through marker-aided selection.

  13. Quantitative polymerase chain reaction analysis by deconvolution of internal standard.

    Science.gov (United States)

    Hirakawa, Yasuko; Medh, Rheem D; Metzenberg, Stan

    2010-04-29

    Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.
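
    The compounding the abstract criticizes is easy to demonstrate: under the standard model N0 = N / (1 + E)^Ct, a modest error in the assumed amplification efficiency E inflates exponentially with the number of cycles. Numbers below are illustrative:

```python
# Standard qPCR back-calculation: initial copies from copies at threshold.
#   N0 = N_threshold / (1 + E)^Ct
# where E is the per-cycle amplification efficiency (1.0 = perfect doubling).
def initial_copies(threshold_copies, efficiency, ct):
    return threshold_copies / (1.0 + efficiency) ** ct

N_THRESHOLD, CT = 1e10, 30   # illustrative threshold copy number and cycle count

true_n0 = initial_copies(N_THRESHOLD, 0.95, CT)    # true efficiency 95%
biased_n0 = initial_copies(N_THRESHOLD, 0.90, CT)  # assumed efficiency 90%

# A 5-percentage-point error in E, compounded over 30 cycles:
fold_error = biased_n0 / true_n0
```

    This exponential sensitivity is precisely why an efficiency-independent analysis, as the authors propose, is attractive.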

  14. Quantitative polymerase chain reaction analysis by deconvolution of internal standard

    Directory of Open Access Journals (Sweden)

    Metzenberg Stan

    2010-04-01

    Full Text Available Abstract Background Quantitative Polymerase Chain Reaction (qPCR) is a collection of methods for estimating the number of copies of a specific DNA template in a sample, but one that is not universally accepted because it can lead to highly inaccurate (albeit precise) results. The fundamental problem is that qPCR methods use mathematical models that explicitly or implicitly apply an estimate of amplification efficiency, the error of which is compounded in the analysis to unacceptable levels. Results We present a new method of qPCR analysis that is efficiency-independent and yields accurate and precise results in controlled experiments. The method depends on a computer-assisted deconvolution that finds the point of concordant amplification behavior between the "unknown" template and an admixed amplicon standard. We apply the method to demonstrate dexamethasone-induced changes in gene expression in lymphoblastic leukemia cell lines. Conclusions This method of qPCR analysis does not use any explicit or implicit measure of efficiency, and may therefore be immune to problems inherent in other qPCR approaches. It yields an estimate of absolute initial copy number of template, and controlled tests show it generates accurate results.
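The error-compounding problem both records describe can be made concrete with a short calculation: classical qPCR back-calculates the initial copy number from a threshold cycle using an assumed per-cycle amplification efficiency, so any error in that efficiency is raised to the power of the cycle count. The numbers below are illustrative only.

```python
# Sketch of qPCR efficiency-error compounding. A template doubles by a factor
# (1 + E) each cycle; back-calculating the initial copy number with a slightly
# wrong efficiency estimate produces a large systematic error after ~25 cycles.
true_eff = 0.95          # true per-cycle efficiency (95%)
assumed_eff = 0.90       # slightly wrong estimate used in the analysis
n0 = 1000.0              # true initial template copies
threshold = n0 * (1 + true_eff) ** 25   # signal level reached after 25 cycles

# Efficiency-based estimate of the initial copy number:
n0_est = threshold / (1 + assumed_eff) ** 25
fold_error = n0_est / n0
print(round(fold_error, 2))
```

Here a 5-percentage-point error in efficiency compounds to roughly a 1.9-fold error in the estimated copy number, which is why an efficiency-independent analysis such as the deconvolution method above is attractive.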

  15. Quantitative assessment of human motion using video motion analysis

    Science.gov (United States)

    Probe, John D.

    1993-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System (APAS) to develop data on shirtsleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. APAS is a fully integrated system of hardware and software for biomechanics and the analysis of human performance and generalized motion measurement. Major components of the complete system include the video system, an AT-compatible computer, and the proprietary software.

  16. The Quantitative Analysis to Inferior Oil with Electronic Nose Based on Adaptive Multilayer Stochastic Resonance

    Directory of Open Access Journals (Sweden)

    Hong Men

    2011-09-01

    Full Text Available This study takes triacylglycerol polymers, oxidized triacylglycerols, and low-carbon-number fatty acids as feature indices of inferior oil. Quantitative analysis of inferior oil is performed using bistable stochastic resonance signal-to-noise ratio analysis. The paper analyzes stochastic resonance and introduces the principle and structure of a detection system based on an adaptive multilayer stochastic resonance algorithm for quantitative analysis of inferior oil; taking an adaptive double-layer stochastic resonance model and inferior oil as an example, it presents a simulation and numerical analysis of the system. The results show that the system can obtain more accurate information on the proportion of inferior oil. At the same time, the method can effectively solve the baseline-drift problem of semiconductor gas sensors. Stochastic resonance thus has considerable application prospects for improving system performance.
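The bistable stochastic resonance the record relies on can be sketched with a Langevin simulation of a double-well system driven by a weak periodic signal plus noise; the response to the periodic component is typically largest at an intermediate noise level. Parameters below are illustrative, not those of the paper's detection system.

```python
# Minimal sketch of double-well (bistable) stochastic resonance:
# dx/dt = a*x - b*x**3 + A*sin(w*t) + noise, integrated by Euler-Maruyama.
import math
import random

random.seed(0)

a, b = 1.0, 1.0            # double-well potential parameters
A, w = 0.3, 0.05           # weak sub-threshold periodic forcing
dt, steps = 0.01, 50_000

def simulate(noise_d):
    """Integrate the driven bistable system; return the average correlation
    of the state x(t) with the periodic forcing (a crude response measure)."""
    x, corr = -1.0, 0.0
    for k in range(steps):
        s = A * math.sin(w * k * dt)
        x += (a * x - b * x ** 3 + s) * dt + math.sqrt(2 * noise_d * dt) * random.gauss(0, 1)
        corr += x * math.sin(w * k * dt)
    return corr / steps

# Sweep low, intermediate, and high noise intensity.
results = {d: simulate(d) for d in (0.05, 0.3, 2.0)}
for d, r in results.items():
    print(d, round(r, 3))
```

A full adaptive multilayer implementation would tune the noise intensity (and cascade several such units) to maximize the output signal-to-noise ratio; this sketch only shows the underlying single-layer dynamics.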

  17. Quantitative Performance Analysis of the SPEC OMPM2001 Benchmarks

    Directory of Open Access Journals (Sweden)

    Vishal Aslot

    2003-01-01

    Full Text Available The state of modern computer systems has evolved to allow easy access to multiprocessor systems by supporting multiple processors on a single physical package. As the multiprocessor hardware evolves, new ways of programming it are also developed. Some inventions may merely be adopting and standardizing the older paradigms. One such evolving standard for programming shared-memory parallel computers is the OpenMP API. The Standard Performance Evaluation Corporation (SPEC) has created a suite of parallel programs called SPEC OMP to compare and evaluate modern shared-memory multiprocessor systems using the OpenMP standard. We have studied these benchmarks in detail to understand their performance on a modern architecture. In this paper, we present detailed measurements of the benchmarks. We organize, summarize, and display our measurements using a Quantitative Model. We present a detailed discussion and derivation of the model. Also, we discuss the important loops in the SPEC OMPM2001 benchmarks and the reasons for less than ideal speedup on our platform.

  18. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Directory of Open Access Journals (Sweden)

    Erin M Siegel

    Full Text Available Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.

  19. Quantitative analysis and parametric display of regional myocardial mechanics

    Science.gov (United States)

    Eusemann, Christian D.; Bellemann, Matthias E.; Robb, Richard A.

    2000-04-01

    Quantitative assessment of regional heart motion has significant potential for more accurate diagnosis of heart disease and/or cardiac irregularities. Local heart motion may be studied from medical imaging sequences. Using functional parametric mapping, regional myocardial motion during a cardiac cycle can be color mapped onto a deformable heart model to obtain better understanding of the structure-to-function relationships in the myocardium, including regional patterns of akinesis or dyskinesis associated with ischemia or infarction. In this study, 3D reconstructions were obtained from the Dynamic Spatial Reconstructor at 15 time points throughout one cardiac cycle of pre-infarct and post-infarct hearts. Deformable models were created from the 3D images for each time point of the cardiac cycles. From these polygonal models, regional excursions and velocities of each vertex representing a unit of myocardium were calculated for successive time-intervals. The calculated results were visualized through model animations and/or specially formatted static images. The time point of regional maximum velocity and excursion of myocardium through the cardiac cycle was displayed using color mapping. The absolute value of regional maximum velocity and maximum excursion were displayed in a similar manner. Using animations, the local myocardial velocity changes were visualized as color changes on the cardiac surface during the cardiac cycle. Moreover, the magnitude and direction of motion for individual segments of myocardium could be displayed. Comparison of these dynamic parametric displays suggest that the ability to encode quantitative functional information on dynamic cardiac anatomy enhances the diagnostic value of 4D images of the heart. Myocardial mechanics quantified this way adds a new dimension to the analysis of cardiac functional disease, including regional patterns of akinesis and dyskinesis associated with ischemia and infarction. Similarly, disturbances in
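The per-vertex computation the record describes, excursion and velocity of each model vertex between successive time points, reduces to finite differences on 3D positions. The sketch below uses a toy three-vertex model and a hypothetical 1/15-cycle time step matching the 15 reconstructed time points.

```python
# Per-vertex kinematics between two frames of a deformable surface model:
# excursion = distance moved, velocity = excursion / time interval.
# The frames below are synthetic.
import math

def vertex_kinematics(frame_a, frame_b, dt):
    """Return a list of (excursion, velocity) pairs, one per vertex."""
    out = []
    for (x0, y0, z0), (x1, y1, z1) in zip(frame_a, frame_b):
        d = math.dist((x0, y0, z0), (x1, y1, z1))
        out.append((d, d / dt))
    return out

# Two frames of a 3-vertex model, 1/15 of a cardiac cycle apart.
frame_t0 = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
frame_t1 = [(0.0, 0.0, 0.3), (1.0, 0.4, 0.0), (0.0, 2.0, 0.0)]
kin = vertex_kinematics(frame_t0, frame_t1, dt=1 / 15)
for excursion, velocity in kin:
    print(round(excursion, 3), round(velocity, 3))
```

Color mapping then amounts to assigning each vertex a color from its excursion or velocity value, or from the time point at which that value peaks across the cycle.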

  20. Quantitative DNA methylation analysis of candidate genes in cervical cancer.

    Science.gov (United States)

    Siegel, Erin M; Riggs, Bridget M; Delmas, Amber L; Koch, Abby; Hakam, Ardeshir; Brown, Kevin D

    2015-01-01

    Aberrant DNA methylation has been observed in cervical cancer; however, most studies have used non-quantitative approaches to measure DNA methylation. The objective of this study was to quantify methylation within a select panel of genes previously identified as targets for epigenetic silencing in cervical cancer and to identify genes with elevated methylation that can distinguish cancer from normal cervical tissues. We identified 49 women with invasive squamous cell cancer of the cervix and 22 women with normal cytology specimens. Bisulfite-modified genomic DNA was amplified and quantitative pyrosequencing completed for 10 genes (APC, CCNA, CDH1, CDH13, WIF1, TIMP3, DAPK1, RARB, FHIT, and SLIT2). A Methylation Index was calculated as the mean percent methylation across all CpG sites analyzed per gene (~4-9 CpG sites per sequence). A binary cut-point was defined at >15% methylation. Sensitivity, specificity and area under ROC curve (AUC) of methylation in individual genes or a panel was examined. The median methylation index was significantly higher in cases compared to controls in 8 genes, whereas there was no difference in median methylation for 2 genes. Compared to HPV and age, the combination of DNA methylation level of DAPK1, SLIT2, WIF1 and RARB with HPV and age significantly improved the AUC from 0.79 to 0.99 (95% CI: 0.97-1.00, p-value = 0.003). Pyrosequencing analysis confirmed that several genes are common targets for aberrant methylation in cervical cancer and DNA methylation level of four genes appears to increase specificity to identify cancer compared to HPV detection alone. Alterations in DNA methylation of specific genes in cervical cancers, such as DAPK1, RARB, WIF1, and SLIT2, may also occur early in cervical carcinogenesis and should be evaluated.
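The Methylation Index and binary call defined above (mean percent methylation across a gene's CpG sites, dichotomized at >15%) can be sketched directly. The per-CpG percentages below are invented for illustration.

```python
# Methylation Index: mean percent methylation across the CpG sites assayed
# for one gene, with the paper's >15% binary cut-point.
def methylation_index(cpg_percents):
    """Mean percent methylation across the CpG sites of one gene."""
    return sum(cpg_percents) / len(cpg_percents)

def is_methylated(cpg_percents, cutoff=15.0):
    """Binary call using the >15% Methylation Index cut-point."""
    return methylation_index(cpg_percents) > cutoff

case_gene = [42.0, 55.5, 38.0, 61.2]      # e.g. DAPK1 in a tumor sample (synthetic)
control_gene = [2.1, 4.0, 3.5, 1.8, 2.6]  # same gene in normal tissue (synthetic)
print(methylation_index(case_gene), is_methylated(case_gene))
print(methylation_index(control_gene), is_methylated(control_gene))
```

In the study, such per-gene indices (alone or combined with HPV status and age) feed the ROC analysis that yielded the reported AUC improvement.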

  1. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Science.gov (United States)

    Wandinger, Sebastian K; Lahortiga, Idoya; Jacobs, Kris; Klammer, Martin; Jordan, Nicole; Elschenbroich, Sarah; Parade, Marc; Jacoby, Edgar; Linders, Joannes T M; Brehmer, Dirk; Cools, Jan; Daub, Henrik

    2016-01-01

    The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.
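A simplified version of the replicate analysis described above can be sketched as filtering phosphosites whose quantified change is consistent across the three replicate experiments. This is an illustrative stand-in, not the authors' statistical pipeline, and the site names and ratios below are invented.

```python
# Illustrative filter for replicated phosphoproteomics data: flag sites whose
# NRG1-vs-basal log2 ratios have the same sign in every replicate and at least
# a two-fold mean change.
def consistent_sites(site_ratios, min_abs_log2=1.0):
    """site_ratios: {site: [log2 ratio per replicate]} -> list of flagged sites."""
    flagged = []
    for site, ratios in site_ratios.items():
        mean = sum(ratios) / len(ratios)
        same_sign = all(r > 0 for r in ratios) or all(r < 0 for r in ratios)
        if same_sign and abs(mean) >= min_abs_log2:
            flagged.append(site)
    return flagged

data = {
    "MAPK1_Y187": [2.1, 1.8, 2.4],    # consistently up after NRG1 (synthetic)
    "AKT1_S473":  [1.3, 0.9, 1.2],    # up, borderline mean (synthetic)
    "NOISY_S10":  [0.8, -0.6, 0.1],   # inconsistent sign (synthetic)
}
print(consistent_sites(data))
```

A proper analysis would use a significance test with multiple-testing correction across the ~9686 localized sites rather than a fixed fold-change rule; the sketch only conveys the replicate-consistency idea.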

  2. Quantitative Phosphoproteomics Analysis of ERBB3/ERBB4 Signaling.

    Directory of Open Access Journals (Sweden)

    Sebastian K Wandinger

    Full Text Available The four members of the epidermal growth factor receptor (EGFR/ERBB) family form homo- and heterodimers which mediate ligand-specific regulation of many key cellular processes in normal and cancer tissues. While signaling through the EGFR has been extensively studied on the molecular level, signal transduction through ERBB3/ERBB4 heterodimers is less well understood. Here, we generated isogenic mouse Ba/F3 cells that express full-length and functional membrane-integrated ERBB3 and ERBB4 or ERBB4 alone, to serve as a defined cellular model for biological and phosphoproteomics analysis of ERBB3/ERBB4 signaling. ERBB3 co-expression significantly enhanced Ba/F3 cell proliferation upon neuregulin-1 (NRG1) treatment. For comprehensive signaling studies we performed quantitative mass spectrometry (MS) experiments to compare the basal ERBB3/ERBB4 cell phosphoproteome to NRG1 treatment of ERBB3/ERBB4 and ERBB4 cells. We employed a workflow comprising differential isotope labeling with mTRAQ reagents followed by chromatographic peptide separation and final phosphopeptide enrichment prior to MS analysis. Overall, we identified 9686 phosphorylation sites which could be confidently localized to specific residues. Statistical analysis of three replicate experiments revealed 492 phosphorylation sites which were significantly changed in NRG1-treated ERBB3/ERBB4 cells. Bioinformatics data analysis recapitulated regulation of mitogen-activated protein kinase and Akt pathways, but also indicated signaling links to cytoskeletal functions and nuclear biology. Comparative assessment of NRG1-stimulated ERBB4 Ba/F3 cells revealed that ERBB3 did not trigger defined signaling pathways but more broadly enhanced phosphoproteome regulation in cells expressing both receptors. In conclusion, our data provide the first global picture of ERBB3/ERBB4 signaling and provide numerous potential starting points for further mechanistic studies.

  3. Quantitative analysis of cryptic splicing associated with TDP-43 depletion.

    Science.gov (United States)

    Humphrey, Jack; Emmett, Warren; Fratta, Pietro; Isaacs, Adrian M; Plagnol, Vincent

    2017-05-26

    Reliable exon recognition is key to the splicing of pre-mRNAs into mature mRNAs. TDP-43 is an RNA-binding protein whose nuclear loss and cytoplasmic aggregation are a hallmark pathology in amyotrophic lateral sclerosis and frontotemporal dementia (ALS/FTD). TDP-43 depletion causes the aberrant inclusion of cryptic exons into a range of transcripts, but their extent, relevance to disease pathogenesis and whether they are caused by other RNA-binding proteins implicated in ALS/FTD are unknown. We developed an analysis pipeline to discover and quantify cryptic exon inclusion and applied it to publicly available human and murine RNA-sequencing data. We detected widespread cryptic splicing in TDP-43 depletion datasets but almost none in depletion datasets for another ALS/FTD-linked protein, FUS. Sequence motif and iCLIP analysis of cryptic exons demonstrated that they are bound by TDP-43. Unlike the cryptic exons seen in hnRNP C depletion, those repressed by TDP-43 cannot be linked to transposable elements. Cryptic exons are poorly conserved, and their inclusion overwhelmingly leads to nonsense-mediated decay of the host transcript, with reduced transcript levels observed in differential expression analysis. RNA-protein interaction data on 73 different RNA-binding proteins showed that, in addition to TDP-43, 7 specifically bind TDP-43-linked cryptic exons. This suggests that TDP-43 competes with other splicing factors for binding to cryptic exons and can repress cryptic exon inclusion. Our quantitative analysis pipeline confirms the presence of cryptic exons during the depletion of TDP-43 but not FUS, providing new insight into RNA-processing dysfunction as a cause or consequence of ALS/FTD.

  4. Quantitative Anatomic Analysis of the Native Ligamentum Teres

    Science.gov (United States)

    Mikula, Jacob D.; Slette, Erik L.; Chahla, Jorge; Brady, Alex W.; Locks, Renato; Trindade, Christiano A. C.; Rasmussen, Matthew T.; LaPrade, Robert F.; Philippon, Marc J.

    2017-01-01

    Background: While recent studies have addressed the biomechanical function of the ligamentum teres and provided descriptions of ligamentum teres reconstruction techniques, its detailed quantitative anatomy remains relatively undocumented. Moreover, there is a lack of consensus in the literature regarding the number and morphology of the acetabular attachments of the ligamentum teres. Purpose: To provide a clinically relevant quantitative anatomic description of the native human ligamentum teres. Study Design: Descriptive laboratory study. Methods: Ten human cadaveric hemipelvises, complete with femurs (mean age, 59.6 years; range, 47-65 years), were dissected free of all extra-articular soft tissues to isolate the ligamentum teres and its attachments. A coordinate measuring device was used to quantify the attachment areas and their relationships to pertinent open and arthroscopic landmarks on both the acetabulum and the femur. The clock face reference system was utilized to describe acetabular anatomy, and all anatomic relationships were described using the mean and 95% confidence intervals. Results: There were 6 distinct attachments to the acetabulum and 1 to the femur. The areas of the acetabular and femoral attachment footprints of the ligamentum teres were 434 mm2 (95% CI, 320-549 mm2) and 84 mm2 (95% CI, 65-104 mm2), respectively. The 6 acetabular clock face locations were as follows: anterior attachment, 4:53 o’clock (95% CI, 4:45-5:02); posterior attachment, 6:33 o’clock (95% CI, 6:23-6:43); ischial attachment, 8:07 o’clock (95% CI, 7:47-8:26); iliac attachment, 1:49 o’clock (95% CI, 1:04-2:34); and a smaller pubic attachment that was located at 3:50 o’clock (95% CI, 3:41-4:00). The ischial attachment possessed the largest cross-sectional attachment area (127.3 mm2; 95% CI, 103.0-151.7 mm2) of all the acetabular attachments of the ligamentum teres. Conclusion: The most important finding of this study was that the human ligamentum teres had 6 distinct attachments to the acetabulum and 1 to the femur.

  5. Quantitative transformation for implementation of adder circuits in physical systems.

    Science.gov (United States)

    Jones, Jeff; Whiting, James G H; Adamatzky, Andrew

    2015-08-01

    Computing devices are composed of spatial arrangements of simple fundamental logic gates. These gates may be combined to form more complex adding circuits and, ultimately, complete computer systems. Implementing classical adding circuits using unconventional, or even living substrates such as slime mould Physarum polycephalum, is made difficult and often impractical by the challenges of branching fan-out of inputs and regions where circuit lines must cross without interference. In this report we explore whether it is possible to avoid spatial propagation, branching and crossing completely in the design of adding circuits. We analyse the input and output patterns of a single-bit full adder circuit. A simple quantitative transformation of the input patterns which considers the total number of bits in the input string allows us to map the respective input combinations to the correct output patterns of the full adder circuit, reducing the circuit combinations from a 2:1 mapping to a 1:1 mapping. The mapping of inputs to outputs also shows an incremental linear progression, suggesting its implementation in a range of physical systems. We demonstrate an example implementation, first in simulation, inspired by self-oscillatory dynamics of the acellular slime mould P. polycephalum. We then assess the potential implementation using plasmodium of slime mould itself. This simple transformation may enrich the potential for using unconventional computing substrates to implement digital circuits.
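The quantitative transformation described above exploits the fact that a single-bit full adder's outputs depend only on how many of its three inputs (A, B, Cin) are 1, so the eight input combinations collapse into four counts with a 1:1 output mapping:

```python
# Full adder via the count-of-ones transformation: the (Sum, Cout) outputs are
# a function of the number of set input bits alone.
COUNT_TO_OUTPUT = {0: (0, 0), 1: (1, 0), 2: (0, 1), 3: (1, 1)}  # count -> (Sum, Cout)

def full_adder(a, b, cin):
    return COUNT_TO_OUTPUT[a + b + cin]

# Verify against the conventional gate-level definition.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s = a ^ b ^ cin
            cout = (a & b) | (a & cin) | (b & cin)
            assert full_adder(a, b, cin) == (s, cout)
print("all 8 input combinations match")
```

Because the output is a monotone-free function of a single scalar (the bit count), a physical substrate only needs to distinguish four input levels, with no branching or crossing of signal paths.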

  6. Comprehensive Quantitative Analysis of Ovarian and Breast Cancer Tumor Peptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Zhe; Wu, Chaochao; Xie, Fang; Slysz, Gordon W.; Tolic, Nikola; Monroe, Matthew E.; Petyuk, Vladislav A.; Payne, Samuel H.; Fujimoto, Grant M.; Moore, Ronald J.; Fillmore, Thomas L.; Schepmoes, Athena A.; Levine, Douglas; Townsend, Reid; Davies, Sherri; Li, Shunqiang; Ellis, Matthew; Boja, Emily; Rivers, Robert; Rodriguez, Henry; Rodland, Karin D.; Liu, Tao; Smith, Richard D.

    2015-01-02

    Aberrant degradation of proteins is associated with many pathological states, including cancers. Mass spectrometric analysis of tumor peptidomes, the intracellular and intercellular products of protein degradation, has the potential to provide biological insights on proteolytic processing in cancer. However, attempts to use the information on these smaller protein degradation products from tumors for biomarker discovery and cancer biology studies have been fairly limited to date, largely due to the lack of effective approaches for robust peptidomics identification and quantification, and the prevalence of confounding factors and biases associated with sample handling and processing. Herein, we have developed an effective and robust analytical platform for comprehensive analyses of tissue peptidomes, which is suitable for high throughput quantitative studies. The reproducibility and coverage of the platform, as well as the suitability of clinical ovarian tumor and patient-derived breast tumor xenograft samples with post-excision delay of up to 60 min before freezing for peptidomics analysis, have been demonstrated. Moreover, our data also show that the peptidomics profiles can effectively separate breast cancer subtypes, reflecting tumor-associated protease activities. Peptidomics complements results obtainable from conventional bottom-up proteomics, and provides insights not readily obtainable from such approaches.

  7. Correlation between two methods of florbetapir PET quantitative analysis.

    Science.gov (United States)

    Breault, Christopher; Piper, Jonathan; Joshi, Abhinay D; Pirozzi, Sara D; Nelson, Aaron S; Lu, Ming; Pontecorvo, Michael J; Mintun, Mark A; Devous, Michael D

    2017-01-01

    This study evaluated performance of a commercially available standardized software program for calculation of florbetapir PET standard uptake value ratios (SUVr) in comparison with an established research method. Florbetapir PET images for 183 subjects clinically diagnosed as cognitively normal (CN), mild cognitive impairment (MCI) or probable Alzheimer's disease (AD) (45 AD, 60 MCI, and 78 CN) were evaluated using two software processing algorithms. The research method uses a single florbetapir PET template generated by averaging both amyloid positive and amyloid negative registered brains together. The commercial software simultaneously optimizes the registration between the florbetapir PET images and three templates: amyloid negative, amyloid positive, and an average. Cortical average SUVr values were calculated across six predefined anatomic regions with respect to the whole cerebellum reference region. SUVr values were well correlated between the two methods (r2 = 0.98). The relationship between the methods computed from the regression analysis is: Commercial method SUVr = (0.9757*Research SUVr) + 0.0299. A previously defined cutoff SUVr of 1.1 for distinguishing amyloid positivity by the research method corresponded to 1.1 (95% CI = 1.098, 1.11) for the commercial method. This study suggests that the commercial method is comparable to the published research method of SUVr analysis for florbetapir PET images, thus facilitating the potential use of standardized quantitative approaches to PET amyloid imaging.
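The regression relationship reported above can be applied directly to convert between the two pipelines' SUVr scales. The conversion line and the 1.1 cut-point come from the record; the input value below is hypothetical.

```python
# Convert a research-method florbetapir SUVr to the commercial method's scale
# using the paper's regression: Commercial SUVr = 0.9757 * Research SUVr + 0.0299.
def commercial_from_research(research_suvr):
    return 0.9757 * research_suvr + 0.0299

AMYLOID_POSITIVE_CUTOFF = 1.1  # same cut-point reported for both methods

research_value = 1.25  # hypothetical cortical average SUVr
converted = commercial_from_research(research_value)
print(round(converted, 4), converted > AMYLOID_POSITIVE_CUTOFF)
```

Note the consistency check implied by the paper: plugging the research cutoff of 1.1 into the regression gives about 1.103 on the commercial scale, matching the reported correspondence of the two cut-points.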

  8. Quantitative evaluation of midpalatal suture maturation via fractal analysis

    Science.gov (United States)

    Kwak, Kyoung Ho; Kim, Yong-Il; Kim, Yong-Deok

    2016-01-01

    Objective The purpose of this study was to determine whether the results of fractal analysis can be used as criteria for evaluating midpalatal suture maturation. Methods The study included 131 subjects over 18 years of age (range, 18.1–53.4 years) who underwent cone-beam computed tomography. Skeletonized images of the midpalatal suture were obtained via image processing software and used to calculate fractal dimensions. Correlations between maturation stage and fractal dimension were calculated using Spearman's correlation coefficient. Optimal fractal dimension cut-off values were determined using a receiver operating characteristic curve. Results The distribution of maturation stages of the midpalatal suture according to the cervical vertebrae maturation index was highly variable, and there was a strong negative correlation between maturation stage and fractal dimension (Spearman's ρ = −0.623). Fractal dimension was a statistically significant indicator of dichotomous results with regard to maturation stage (area under curve = 0.794). Using fractal dimension to predict the variable that splits maturation stages into A–C versus D or E yielded an optimal fractal dimension cut-off value of 1.0235. Conclusions There was a strong negative correlation between fractal dimension and midpalatal suture maturation. Fractal analysis is an objective quantitative method, and therefore we suggest that it may be useful for the evaluation of midpalatal suture maturation. PMID:27668195
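One standard way to compute a fractal dimension from a skeletonized binary image is box counting; the record does not specify its estimator, so this sketch is an illustrative assumption. A straight line should yield a dimension near 1, while a more convoluted (less mature) suture trace would score higher.

```python
# Box-counting fractal dimension for a set of foreground pixel coordinates:
# count occupied box*box cells at several scales and take the least-squares
# slope of log N(box) vs log (1/box).
import math

def box_count(points, box):
    """Number of box*box cells containing at least one foreground pixel."""
    return len({(x // box, y // box) for x, y in points})

def fractal_dimension(points, boxes=(1, 2, 4, 8, 16)):
    """Least-squares slope of log N(box) against log (1/box)."""
    xs = [math.log(1.0 / b) for b in boxes]
    ys = [math.log(box_count(points, b)) for b in boxes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

line = [(i, 0) for i in range(256)]  # a 1-D structure embedded in the plane
print(round(fractal_dimension(line), 2))
```

For the straight line the count halves at each doubling of box size, so the estimated dimension is exactly 1.0; real skeletonized suture images would give values between 1 and 2.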

  9. Therapeutic electrical stimulation for spasticity: quantitative gait analysis.

    Science.gov (United States)

    Pease, W S

    1998-01-01

    Improvement in motor function following electrical stimulation is related to strengthening of the stimulated spastic muscle and inhibition of the antagonist. A 26-year-old man with familial spastic paraparesis presented with gait dysfunction and bilateral lower limb spastic muscle tone. Clinically, muscle strength and sensation were normal. He was considered appropriate for a trial of therapeutic electrical stimulation following failed trials of physical therapy and baclofen. No other treatment was used concurrent with the electrical stimulation. Before treatment, quantitative gait analysis revealed 63% of normal velocity and a crouched gait pattern, associated with excessive electromyographic activity in the hamstrings and gastrocnemius muscles. Based on these findings, bilateral stimulation of the quadriceps and anterior compartment musculature was performed two to three times per week for three months. Repeat gait analysis was conducted three weeks after the cessation of stimulation treatment. A 27% increase in velocity was noted associated with an increase in both cadence and right step length. Right hip and bilateral knee stance motion returned to normal (rather than "crouched"). No change in the timing of dynamic electromyographic activity was seen. These findings suggest a role for the use of electrical stimulation for rehabilitation of spasticity. The specific mechanism of this improvement remains uncertain.

  10. QUANTITATIVE EEG COMPARATIVE ANALYSIS BETWEEN AUTISM SPECTRUM DISORDER (ASD) AND ATTENTION DEFICIT HYPERACTIVITY DISORDER (ADHD)

    Directory of Open Access Journals (Sweden)

    Plamen D. Dimitrov

    2017-01-01

    Full Text Available Background: Autism is a mental developmental disorder, manifested in early childhood. Attention deficit hyperactivity disorder is another psychiatric condition of the neurodevelopmental type. Both disorders affect information processing in the nervous system, altering the mechanisms which control how neurons and their synapses are connected and organized. Purpose: To examine whether quantitative EEG assessment is sensitive and simple enough to differentiate autism from attention deficit hyperactivity disorder and from neurologically typical children. Material and methods: Quantitative EEG is a type of electrophysiological assessment that uses computerized mathematical analysis to convert the raw waveform data into different frequency ranges. Each frequency range is averaged across a sample of data and quantified into mean amplitude (voltage, in microvolts, mV). We performed quantitative EEG analysis and compared 4 cohorts of children (aged 3 to 7 years): with autism (high-functioning [n=27] and low-functioning [n=52]), with attention deficit hyperactivity disorder [n=34], and with typical behavior [n=75]. Results: Our preliminary results show that there are significant qEEG differences between the groups of patients and the control cohort. The changes affect the levels of the delta-, theta-, alpha-, and beta-frequency spectra. Conclusion: The present study shows some significant quantitative EEG findings in autistic patients. This is a step forward in our efforts aimed at defining specific neurophysiologic changes, in order to develop and refine strategies for early diagnosis of autism spectrum disorders, differentiation from other developmental conditions in childhood, detection of specific biomarkers and early initiation of treatment.
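The qEEG step described above, converting a raw trace into mean amplitude per frequency band, can be sketched with a discrete Fourier transform. The sampling rate, band edges, and synthetic two-tone trace below are illustrative; the trace's dominant 10 Hz component should place the largest mean amplitude in the alpha band.

```python
# Mean amplitude per EEG frequency band (delta/theta/alpha/beta) via a
# straightforward discrete Fourier transform over a short epoch.
import math

FS = 128                      # sampling rate, Hz (illustrative)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_amplitudes(signal):
    n = len(signal)
    amps = {}
    for name, (lo, hi) in BANDS.items():
        vals = []
        for k in range(n // 2):
            f = k * FS / n
            if lo <= f < hi:
                re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                vals.append(2 * math.hypot(re, im) / n)   # per-bin amplitude (microvolts)
        amps[name] = sum(vals) / len(vals)                # mean amplitude in the band
    return amps

t_axis = [t / FS for t in range(FS * 2)]                  # 2 s of data
eeg = [40 * math.sin(2 * math.pi * 10 * t) + 5 * math.sin(2 * math.pi * 20 * t) for t in t_axis]
amps = band_amplitudes(eeg)
print(max(amps, key=amps.get))
```

In practice an FFT with windowing and artifact rejection would be used, and the band amplitudes averaged over many epochs per subject before group comparison.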

  11. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  12. A Classifier Model based on the Features Quantitative Analysis for Facial Expression Recognition

    Directory of Open Access Journals (Sweden)

    Amir Jamshidnezhad

    2011-01-01

    Full Text Available In recent decades computer technology has developed considerably in the use of intelligent systems for classification. The development of HCI systems depends highly on an accurate understanding of emotions. However, facial expressions are difficult to classify with mathematical models because of their natural variability. In this paper, quantitative analysis is used to find the most effective feature movements between the selected facial feature points. The features are therefore extracted not only on the basis of psychological studies, but also with quantitative methods, to raise the accuracy of recognition. In this model, fuzzy logic and a genetic algorithm are used to classify facial expressions; the genetic algorithm is an exclusive attribute of the proposed model, used for tuning membership functions and increasing accuracy.

  13. Quantitative Analysis for Monitoring Formulation of Lubricating Oil Using Terahertz Time-Domain Transmission Spectroscopy

    Institute of Scientific and Technical Information of China (English)

    TIAN Lu; ZHAO Kun; ZHOU Qing-Li; SHI Yu-Lei; ZHANG Cun-Lin

    2012-01-01

    The quantitative analysis of zinc isopropyl-isooctyl-dithiophosphate (T204) mixed with lube base oil from Korea with viscosity index 70 (T204-Korea70) is presented using terahertz time-domain spectroscopy (THz-TDS). Compared with the mid-infrared spectra of zinc n-butyl-isooctyl-dithiophosphate (T202) and T204, the THz spectra of T202 and T204 show weak, broad absorption bands. The absorption coefficients of the T204-Korea70 system follow Beer's law at concentrations from 0.124% to 4.024%. The experimental absorption spectra of T204-Korea70 agree with those calculated from the standard absorption coefficients of T204 and Korea70. The quantitative analysis enables a strategy to monitor the formulation of lubricating oil in real time.
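The Beer's-law calibration underlying such monitoring can be sketched as a linear fit of absorption coefficient against additive concentration. The numbers below are illustrative placeholders, not measurements from the paper:

```python
import numpy as np

# Hypothetical calibration data: concentration of T204 in base oil (%)
# versus THz absorption coefficient (cm^-1). Values are illustrative only.
conc = np.array([0.124, 0.5, 1.0, 2.0, 4.024])
alpha_base = 2.0                      # assumed absorption of the pure base oil
alpha = alpha_base + 1.5 * conc       # Beer's law: linear in concentration

# Fit alpha = k * c + alpha_0, then read back an "unknown" sample.
k, alpha0 = np.polyfit(conc, alpha, 1)
unknown_alpha = 4.25
unknown_conc = (unknown_alpha - alpha0) / k   # -> 1.5 (%)
```

In real-time monitoring, each new absorption measurement would be inverted through the fitted line in the same way.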

  14. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    Science.gov (United States)

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and the ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries, generally between 95% and 105%, were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on the analysis of vitamins and caffeine in energy drinks and of aromatic hydrocarbons in motor fuel, with 10% error. The results demonstrate that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence or inaccessibility of reference materials.
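Once ICA has resolved the pure-component spectra of the calibration system, absolute concentrations in an "unknown" mixture follow from the linearity of the Beer-Lambert-Bouguer law. A minimal sketch of that final quantification step, with synthetic Gaussian "spectra" standing in for ICA outputs, might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
wavelengths = np.linspace(200, 400, 120)

def gauss(center, width):
    return np.exp(-((wavelengths - center) / width) ** 2)

# Stand-ins for ICA-resolved pure-component spectra (e.g. two analytes).
S = np.stack([gauss(250, 15), gauss(320, 25)])        # shape (2, 120)
true_conc = np.array([0.8, 1.7])

# Mixture spectrum obeying the Beer-Lambert-Bouguer law, plus small noise.
mixture = true_conc @ S + rng.normal(0, 1e-3, wavelengths.size)

# Recover concentrations by linear least squares against the resolved spectra.
est_conc, *_ = np.linalg.lstsq(S.T, mixture, rcond=None)
```

The ICA decomposition itself is omitted here; the point is that no reference solutions are needed once the component spectra are available.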

  15. Model exploration and analysis for quantitative safety refinement in probabilistic B

    CERN Document Server

    Ndukwu, Ukachukwu; 10.4204/EPTCS.55.7

    2011-01-01

    The role played by counterexamples in standard system analysis is well known; less common is a notion of counterexample in probabilistic systems refinement. In this paper we extend previous work using counterexamples to inductive invariant properties of probabilistic systems, demonstrating how they can be used to extend the technique of bounded model checking-style analysis for the refinement of quantitative safety specifications in the probabilistic B language. In particular, we show how the method can be adapted to cope with refinements incorporating probabilistic loops. Finally, we demonstrate the technique on two pB models: one summarising a one-step refinement of a randomised algorithm for finding the minimum cut of undirected graphs, and one for the dependability analysis of a controller design.

  16. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    Science.gov (United States)

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, which are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capability of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, so some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on this simple task, it will surely fail on most real metagenomes. We applied the three software tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
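The benchmark construction described above, equal copy numbers of genomes of different lengths, randomly fragmented into ~150 bp reads and pooled, can be sketched as follows; the toy genome sizes and fragmentation details are assumptions, not the paper's exact procedure:

```python
import random

def fragment(genome, read_len=150, jitter=30, seed=0):
    """Break a genome string into consecutive random-length reads
    averaging ~read_len bp, mimicking the benchmark construction."""
    local = random.Random(seed)
    reads, pos = [], 0
    while pos < len(genome):
        n = local.randint(read_len - jitter, read_len + jitter)
        reads.append(genome[pos:pos + n])
        pos += n
    return reads

rng = random.Random(42)
# Toy stand-ins for three bacterial genomes of different lengths.
genomes = ["".join(rng.choice("ACGT") for _ in range(size))
           for size in (5000, 10000, 20000)]

# Same number of copies of each genome, fragmented and pooled.
pool = [r for g in genomes for _ in range(3) for r in fragment(g)]
rng.shuffle(pool)
```

Because the longer genome contributes proportionally more reads to the pool, a taxon counter that is robust to genome length bias should still report the three taxa in equal proportions.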

  17. Quantitative analysis of left ventricular strain using cardiac computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Buss, Sebastian J., E-mail: sebastian.buss@med.uni-heidelberg.de [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Schulz, Felix; Mereles, Derliz [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Hosch, Waldemar [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Galuschky, Christian; Schummers, Georg; Stapf, Daniel [TomTec Imaging Systems GmbH, Munich (Germany); Hofmann, Nina; Giannitsis, Evangelos; Hardt, Stefan E. [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany); Kauczor, Hans-Ulrich [Department of Diagnostic and Interventional Radiology, University of Heidelberg, 69120 Heidelberg (Germany); Katus, Hugo A.; Korosoglou, Grigorios [Department of Cardiology, University of Heidelberg, 69120 Heidelberg (Germany)

    2014-03-15

    Objectives: To investigate whether cardiac computed tomography (CCT) can determine left ventricular (LV) radial, circumferential and longitudinal myocardial deformation in comparison to two-dimensional echocardiography in patients with congestive heart failure. Background: Echocardiography allows for accurate assessment of strain with high temporal resolution. A reduced strain is associated with a poor prognosis in cardiomyopathies. However, strain imaging is limited in patients with poor echogenic windows, so that, in selected cases, tomographic imaging techniques may be preferable for the evaluation of myocardial deformation. Methods: Consecutive patients (n = 27) with congestive heart failure who underwent a clinically indicated ECG-gated contrast-enhanced 64-slice dual-source CCT for the evaluation of the cardiac veins prior to cardiac resynchronization therapy (CRT) were included. All patients underwent additional echocardiography. LV radial, circumferential and longitudinal strain and strain rates were analyzed in identical midventricular short axis, 4-, 2- and 3-chamber views for both modalities using the same prototype software algorithm (feature tracking). Time for analysis was assessed for both modalities. Results: Close correlations were observed for both techniques regarding global strain (r = 0.93, r = 0.87 and r = 0.84 for radial, circumferential and longitudinal strain, respectively, p < 0.001 for all). Similar trends were observed for regional radial, longitudinal and circumferential strain (r = 0.88, r = 0.84 and r = 0.94, respectively, p < 0.001 for all). The number of non-diagnostic myocardial segments was significantly higher with echocardiography than with CCT (9.6% versus 1.9%, p < 0.001). In addition, the required time for complete quantitative strain analysis was significantly shorter for CCT compared to echocardiography (877 ± 119 s per patient versus 1105 ± 258 s per patient, p < 0.001). Conclusion: Quantitative assessment of LV strain

  18. Quantitative analysis of reptation of partially extended DNA in sub-30 nm nanoslits

    CERN Document Server

    Yeh, Jia-Wei; Taloni, Alessandro; Chen, Yeng-Long; Chou, Chia-Fu

    2015-01-01

    We observed reptation of single DNA molecules in fused silica nanoslits of sub-30 nm height. The reptation behavior and the effect of confinement are quantitatively characterized using orientation correlation and transverse fluctuation analysis. We show that tube-like polymer motion arises for a tense polymer under strong quasi-2D confinement and interaction with surface-passivating polyvinylpyrrolidone (PVP) molecules in nanoslits, while etching-induced device surface roughness, chip bonding materials, and DNA-intercalated dye-surface interactions play minor roles. These findings have strong implications for the effect of surface modification in nanofluidic systems, with potential applications in single-molecule DNA analysis.

  19. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    Science.gov (United States)

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

    A non-invasive method was developed to investigate the potential of digital image texture analysis for evaluating the severity of hip osteoarthritis (OA) and for monitoring its progression. 19 textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom-developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS-narrowing was computed considering patients from the unilateral OA-group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p<0.001) was utilized to provide percentages quantifying hip OA-severity. Texture analysis may contribute to the quantitative assessment of OA-severity, the monitoring of OA-progression, and the evaluation of chondroprotective therapy.
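Grey level non-uniformity is commonly defined on a run-length matrix as the sum over grey levels of the squared run counts, divided by the total number of runs; the exact formulation used in the study may differ. A minimal sketch under that common definition:

```python
import numpy as np

def glnu(image, levels=8):
    """Grey Level Non-Uniformity from horizontal run lengths:
    sum_g (runs at grey level g)^2 / (total runs). Assumes image.max() > 0."""
    # Quantize intensities to a small number of grey levels.
    q = np.floor(image / image.max() * (levels - 1e-9)).astype(int)
    runs_per_level = np.zeros(levels)
    for row in q:
        prev = None
        for v in row:
            if v != prev:              # a new run starts here
                runs_per_level[v] += 1
                prev = v
    return float((runs_per_level ** 2).sum() / runs_per_level.sum())

# A flat image yields one run per row, all at a single grey level.
g_uniform = glnu(np.ones((8, 8)))                     # -> 8.0
# A checkerboard yields a run at every pixel, split over two levels.
g_checker = glnu(np.indices((8, 8)).sum(axis=0) % 2)  # -> 32.0
```

In practice such features are computed over the segmented joint-space region rather than the whole radiograph.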

  20. Nanotechnology patents in the automotive industry (a quantitative & qualitative analysis).

    Science.gov (United States)

    Prasad, Raghavendra; Bandyopadhyay, Tapas K

    2014-01-01

    The aim of the article is to present the trend in patent filings for applications of nanotechnology in the automobile sector across the world, using keyword-based patent search. An overview of the patents related to nanotechnology in the automobile industry is provided. The work started from a worldwide patent search to find the patents on nanotechnology in the automobile industry and to classify them according to the various parts of an automobile to which they relate and the solutions which they provide. Various graphs have then been produced to give insight into the trends, and the patents have been analysed within the various classifications. The trends shown in the graphs provide the quantitative analysis, whereas the qualitative analysis has been done in another section. The classification of patents based on the solution they provide has been performed by reading the claims, titles, abstracts, and full texts separately. The patentability of nanotechnology inventions has been discussed with a view to giving an idea of the requirements and statutory bars to the patentability of such inventions. Another objective of the work is to suggest an appropriate framework for companies regarding the use of nanotechnology in the automobile industry and a suggested strategy for patenting the related inventions. For example, the US patent with publication number US2008-019426A1 discusses an invention related to a lubricant composition. This patent was studied and classified as falling under the classification of automobile parts; it was deduced that the patent solves the problem of friction in the engine. One classification is based on the automobile part, while the other is based on the problem being solved; hence two classifications, namely reduction in friction and engine, were created. Similarly, after studying all the patents, a similar matrix was created.

  1. Quantitative uncertainty and sensitivity analysis of a PWR control rod ejection accident

    Energy Technology Data Exchange (ETDEWEB)

    Pasichnyk, I.; Perin, Y.; Velkov, K. [Gesellschaft für Anlagen- und Reaktorsicherheit - GRS mbH, Boltzmannstrasse 14, 85748 Garching bei Muenchen (Germany)]

    2013-07-01

    The paper describes the results of the quantitative Uncertainty and Sensitivity (U/S) Analysis of a Rod Ejection Accident (REA) which is simulated by the coupled system code ATHLET-QUABOX/CUBBOX applying the GRS tool for U/S analysis SUSA/XSUSA. For the present study, a UOX/MOX mixed core loading based on a generic PWR is modeled. A control rod ejection is calculated for two reactor states: Hot Zero Power (HZP) and 30% of nominal power. The worst cases for the rod ejection are determined by steady-state neutronic simulations taking into account the maximum reactivity insertion in the system and the power peaking factor. For the U/S analysis 378 uncertain parameters are identified and quantified (thermal-hydraulic initial and boundary conditions, input parameters and variations of the two-group cross sections). Results for uncertainty and sensitivity analysis are presented for safety important global and local parameters. (authors)

  2. Quantitative comparison of three electrosurgical smoke evacuation systems

    Science.gov (United States)

    de Boorder, Tjeerd; Noordmans, Herke Jan; Grimbergen, Matthijs; Been, Stefan; Verdaasdonk, Rudolf

    2010-02-01

    Electrosurgical equipment used during surgery generates smoke consisting of particles, vapor, aerosols, and potentially harmful biological agents. Smoke evacuation systems are increasingly commonly used and various types are available. A special image enhancement technique was used to study the behavior of surgical smoke and the effectiveness of smoke evacuation systems. Three different smoke evacuation systems were investigated: the Rapid Vac (Valleylab, Boulder, CO), the Buffalo Silent Whisper Turbo (Buffalo, NY), and the ERBE IES 300 (Tübingen, Germany). A back-scatter illumination technique in combination with a high-speed camera was applied to image the dynamics of a smoke plume generated by vaporizing a homogeneous meat paste irradiated with the beam of a 10 W cw CO2 laser moving at a constant speed. The three smoke evacuation systems, with their individual nozzles, were held 2 cm above the surface of the meat paste and were switched on and off at fixed intervals to mimic a clinical situation. For image analysis, software was developed to count 'smoke pixels' in the video frames as a quantification tool. To the observer's eye, there were no differences between the systems. However, image quantification showed significantly less 'smoke' for the Buffalo system. It is expected that the performance in a clinical situation is also influenced by additional conditions like nozzle design, airflow, and noise level. Noise levels were measured at the tip of the nozzle and at 80 cm and 140 cm from the tip; the Buffalo system was the loudest at every distance measured.
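The 'smoke pixel' counting idea, thresholding the background-subtracted frame and counting bright pixels, can be sketched as follows; the frame size, threshold, and synthetic plume are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def count_smoke_pixels(frame, background, threshold=10):
    """Count pixels brighter than the smoke-free background by more than
    `threshold` grey levels (back-scatter illumination renders smoke bright)."""
    diff = frame.astype(int) - background.astype(int)
    return int((diff > threshold).sum())

rng = np.random.default_rng(0)
background = rng.integers(0, 5, size=(120, 160)).astype(np.uint8)

# Synthetic frame: background plus a bright 40x40 'plume' in one corner.
frame = background.copy()
frame[:40, :40] += 50

n_smoke = count_smoke_pixels(frame, background)   # 1600 plume pixels
```

Summing such counts per video frame gives the time course of smoke density under each evacuation system.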

  3. Quantitative analysis of polymorphic mixtures of ranitidine hydrochloride by Raman spectroscopy and principal components analysis.

    Science.gov (United States)

    Pratiwi, Destari; Fawcett, J Paul; Gordon, Keith C; Rades, Thomas

    2002-11-01

    Ranitidine hydrochloride exists as two polymorphs, forms I and II, both of which are used to manufacture commercial tablets. Raman spectroscopy can be used to differentiate the two forms, but univariate methods of quantitative analysis of one polymorph as an impurity in the other lack sensitivity. We have applied principal components analysis (PCA) of Raman spectra to binary mixtures of the two polymorphs and to binary mixtures prepared by adding one polymorph to powdered tablets of the other. Based on absorption measurements of seven spectral regions, it was found that >97% of the spectral variation was accounted for by three principal components. Quantitative calibration models generated by multiple linear regression predicted a detection limit and a quantitation limit for either form I or II in mixtures of the two of 0.6% and 1.8%, respectively. This study demonstrates that PCA of Raman spectroscopic data provides a sensitive method for the quantitative analysis of polymorphic impurities of drugs in commercial tablets, with a quantitation limit of less than 2%.
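The PCA-plus-multiple-linear-regression calibration strategy can be sketched on synthetic binary mixtures; the Gaussian "spectra" and mixture design below are stand-ins, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for Raman spectra of polymorph forms I and II.
axis = np.linspace(0, 1, 200)
form1 = np.exp(-((axis - 0.3) / 0.05) ** 2)
form2 = np.exp(-((axis - 0.6) / 0.05) ** 2)

# Calibration mixtures: fraction of form II ranging from 0 to 1.
fractions = np.linspace(0, 1, 11)
X = np.array([(1 - f) * form1 + f * form2 for f in fractions])
X += rng.normal(0, 1e-3, X.shape)   # small measurement noise

# PCA by SVD of the mean-centred data; keep 3 components as in the study.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T

# Multiple linear regression of composition on the PCA scores.
A = np.column_stack([scores, np.ones(len(fractions))])
coef, *_ = np.linalg.lstsq(A, fractions, rcond=None)
predicted = A @ coef
```

Detection and quantitation limits would then be estimated from the residual scatter of such a calibration, e.g. as 3 and 10 times the standard error.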

  4. Quantitative Assessment of Molecular Dynamics Sampling for Flexible Systems.

    Science.gov (United States)

    Nemec, Mike; Hoffmann, Daniel

    2017-02-14

    Molecular dynamics (MD) simulation is a natural method for the study of flexible molecules but at the same time is limited by the large size of the conformational space of these molecules. We ask by how much the MD sampling quality for flexible molecules can be improved by two means: the use of diverse sets of trajectories starting from different initial conformations to detect deviations between samples and sampling with enhanced methods such as accelerated MD (aMD) or scaled MD (sMD) that distort the energy landscape in controlled ways. To this end, we test the effects of these approaches on MD simulations of two flexible biomolecules in aqueous solution, Met-Enkephalin (5 amino acids) and HIV-1 gp120 V3 (a cycle of 35 amino acids). We assess the convergence of the sampling quantitatively with known, extensive measures of cluster number Nc and cluster distribution entropy Sc and with two new quantities, conformational overlap Oconf and density overlap Odens, both conveniently ranging from 0 to 1. These new overlap measures quantify self-consistency of sampling in multitrajectory MD experiments, a necessary condition for converged sampling. A comprehensive assessment of sampling quality of MD experiments identifies the combination of diverse trajectory sets and aMD as the most efficient approach among those tested. However, analysis of Odens between conventional and aMD trajectories also reveals that we have not completely corrected aMD sampling for the distorted energy landscape. Moreover, for V3, the courses of Nc and Odens indicate that much higher resources than those generally invested today will probably be needed to achieve convergence. The comparative analysis also shows that conventional MD simulations with insufficient sampling can be easily misinterpreted as being converged.
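A density overlap between two trajectory samples that conveniently ranges from 0 to 1 can be realized, for example, as a histogram intersection; the paper's exact definition of Odens may differ. A one-dimensional sketch over an assumed collective coordinate:

```python
import numpy as np

def density_overlap(sample_a, sample_b, bins=30, range_=(-5, 5)):
    """Histogram-intersection overlap of two sampled densities,
    ranging from 0 (disjoint) to 1 (identical)."""
    pa, _ = np.histogram(sample_a, bins=bins, range=range_, density=True)
    pb, _ = np.histogram(sample_b, bins=bins, range=range_, density=True)
    width = (range_[1] - range_[0]) / bins
    return float(np.minimum(pa, pb).sum() * width)

rng = np.random.default_rng(3)
# Two trajectories sampling the same distribution: overlap near 1.
same = density_overlap(rng.normal(0, 1, 20000), rng.normal(0, 1, 20000))
# Two trajectories stuck in different regions: overlap well below 1.
shifted = density_overlap(rng.normal(0, 1, 20000), rng.normal(3, 1, 20000))
```

A high overlap between independently started trajectories is necessary (though not sufficient) evidence of converged sampling, which is the self-consistency idea behind Oconf and Odens.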

  5. PVeStA: A Parallel Statistical Model Checking and Quantitative Analysis Tool

    KAUST Repository

    AlTurki, Musab

    2011-01-01

    Statistical model checking is an attractive formal analysis method for probabilistic systems such as, for example, cyber-physical systems which are often probabilistic in nature. This paper is about drastically increasing the scalability of statistical model checking, and making such scalability of analysis available to tools like Maude, where probabilistic systems can be specified at a high level as probabilistic rewrite theories. It presents PVeStA, an extension and parallelization of the VeStA statistical model checking tool [10]. PVeStA supports statistical model checking of probabilistic real-time systems specified as either: (i) discrete or continuous Markov Chains; or (ii) probabilistic rewrite theories in Maude. Furthermore, the properties that it can model check can be expressed in either: (i) PCTL/CSL, or (ii) the QuaTEx quantitative temporal logic. As our experiments show, the performance gains obtained from parallelization can be very high. © 2011 Springer-Verlag.

  6. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    Science.gov (United States)

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as sample injector. The accurate amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.

  7. Quantitative risk analysis of urban flooding in lowland areas

    NARCIS (Netherlands)

    Ten Veldhuis, J.A.E.

    2010-01-01

    Urban flood risk analyses suffer from a lack of quantitative historical data on flooding incidents. Data collection takes place on an ad hoc basis and is usually restricted to severe events. The resulting data deficiency renders quantitative assessment of urban flood risks uncertain. The study repor

  8. APPLICATION OF NEOTAME IN CATCHUP: QUANTITATIVE DESCRIPTIVE AND PHYSICOCHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    G. C. M. C. BANNWART

    2008-11-01

    Full Text Available

    In this study, five prototypes of catchup were developed by partially or totally replacing the sucrose in the formulation with the sweetener Neotame (NTM). These prototypes were evaluated for their physicochemical characteristics and sensory profile (Quantitative Descriptive Analysis). The main sensory differences observed among the prototypes concerned color, consistency, mouthfeel, sweet taste, and tomato taste, for which lower means were obtained as the sugar level was decreased, and also salty taste, which had higher means as sugar decreased. In terms of bitter and sweetener aftertastes, the prototype 100% sweetened with NTM presented the highest mean score, but with no significant difference from the prototypes containing sucrose; for bitter taste, however, it had the highest mean score, statistically different from all the other prototypes. In terms of physicochemical characteristics, the differences were mainly in consistency, solids, and color. Despite the differences observed among the prototypes as the sugar level was reduced, it was concluded that NTM is a suitable sweetener for catchup, both for use in reduced-calorie and no-sugar versions.

  9. Early child grammars: qualitative and quantitative analysis of morphosyntactic production.

    Science.gov (United States)

    Legendre, Géraldine

    2006-09-10

    This article reports on a series of 5 analyses of spontaneous production of verbal inflection (tense and person-number agreement) by 2-year-olds acquiring French as a native language. A formal analysis of the qualitative and quantitative results is developed using the unique resources of Optimality Theory (OT; Prince & Smolensky, 2004). It is argued that acquisition of morphosyntax proceeds via overlapping grammars (rather than through abrupt changes), which OT formalizes in terms of partial rather than total constraint rankings. Initially, economy of structure constraints take priority over faithfulness constraints that demand faithful expression of a speaker's intent, resulting in child production of tense that is comparable in level to that of child-directed speech. Using the independent Predominant Length of Utterance measure of syntactic development proposed in Vainikka, Legendre, and Todorova (1999), production of agreement is shown first to lag behind tense then to compete with tense at an intermediate stage of development. As the child's development progresses, faithfulness constraints become more dominant, and the overall production of tense and agreement becomes adult-like.

  10. European Identity in Russian Regions Bordering on Finland: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    A. O. Domanov

    2014-01-01

    Full Text Available The quantitative analysis of an opinion poll conducted in October 2013 in three Russian cities located near the Finnish border (St. Petersburg, Kronstadt, and Vyborg) explores the European identity of their citizens. This area was chosen to illustrate the crucial importance of space interpretation in spatial identity formation by using a critical geopolitical approach. The study shows how different images of space on the same territory act as intermediate variables between objective territorial characteristics and citizens' identities. As the geographical position at the border of Russia provides the citizens with geopolitical alternatives to identify their location as a fortress defending the nation (as in the case of Kronstadt) or a bridge between cultures, the given study allows us to compare the reasons for these geopolitical choices of the inhabitants. Furthermore, the research aims at bridging the gap in the studies of European and multiple identity in Russian regions and provides a Northwest Russian perspective on the perpetual discussion about the subjective eastern border of Europe.

  11. Quantitative analysis of plasma interleiukin-6 by immunoassay on microchip

    Science.gov (United States)

    Abe, K.; Hashimoto, Y.; Yatsushiro, S.; Yamamura, S.; Tanaka, M.; Ooie, T.; Baba, Y.; Kataoka, M.

    2012-03-01

    Sandwich enzyme-linked immunosorbent assay (ELISA) is one of the most frequently employed assays for clinical diagnosis, since it enables the investigator to identify specific protein biomarkers. However, the conventional assay using a 96-well microtitration plate is time- and sample-consuming, and therefore is not suitable for rapid diagnosis. To overcome these drawbacks, we performed a sandwich ELISA on a microchip. We employed piezoelectric inkjet printing for deposition and fixation of the first antibody on the microchannel surface (300 μm width and 100 μm depth). The model analyte was interleukin-6 (IL-6), an inflammatory cytokine. After blocking the microchannel, antigen, biotin-labeled second antibody, and avidin-labeled peroxidase were infused into the microchannel and incubated for 20 min, 10 min, and 5 min, respectively. This assay could detect 2 pg/ml and quantitatively measure the range of 0-32 pg/ml. Linear regression analysis of plasma IL-6 concentrations obtained by the microchip and conventional methods exhibited a significant relationship (R2 = 0.9964). This assay reduced the time for the antigen-antibody reaction to 1/6, and the consumption of samples and reagents to 1/50, compared with the conventional method. It enables us to determine plasma IL-6 accurately, with high sensitivity, time savings, and low consumption of sample and reagents, and thus will be applicable to clinical diagnosis.

  12. Quantitative image analysis of HIV-1 infection in lymphoid tissue

    Energy Technology Data Exchange (ETDEWEB)

    Haase, A.T.; Zupancic, M.; Cavert, W. [Univ. of Minnesota Medical School, Minneapolis, MN (United States)] [and others

    1996-11-08

    Tracking human immunodeficiency virus-type 1 (HIV-1) infection at the cellular level in tissue reservoirs provides opportunities to better understand the pathogenesis of infection and to rationally design and monitor therapy. A quantitative technique was developed to determine viral burden in two important cellular compartments in lymphoid tissues. Image analysis and in situ hybridization were combined to show that in the presymptomatic stages of infection there is a large, relatively stable pool of virions on the surfaces of follicular dendritic cells and a smaller pool of productively infected cells. Despite evidence of constraints on HIV-1 replication in the infected cell population in lymphoid tissues, estimates of the numbers of these cells and the virus they could produce are consistent with the quantities of virus that have been detected in the bloodstream. The cellular sources of virus production and storage in lymphoid tissues can now be studied with this approach over the course of infection and treatment. 22 refs., 2 figs., 2 tabs.

  13. Full-Range Public Health Leadership, Part 1: Quantitative Analysis

    Directory of Open Access Journals (Sweden)

    Erik L. Carlton

    2015-04-01

    Full Text Available Background. Workforce and leadership development are central to the future of public health. However, public health has been slow to translate and apply leadership models from other professions and to incorporate local perspectives in understanding public health leadership. Purpose. This study utilized the full-range leadership model to examine public health leadership. Specifically, it sought to measure leadership styles among local health department directors and to understand the context of leadership in local health departments. Methods. Leadership styles among local health department directors (n=13) were examined using survey methodology. Quantitative analysis methods included descriptive statistics, boxplots, and Pearson bivariate correlations using SPSS v18.0. Findings. Self-reported leadership styles were highly correlated with leadership outcomes at the organizational level. However, they were not related to county health rankings. Results suggest the preeminence of leader behaviors and of providing individual consideration to staff, as compared to idealized attributes of leaders, intellectual stimulation, or inspirational motivation. Implications. Holistic leadership assessment instruments, such as the Multifactor Leadership Questionnaire (MLQ), can be useful in assessing public health leaders' approaches and outcomes. Comprehensive, 360-degree reviews may be especially helpful. Further research is needed to examine the effectiveness of public health leadership development models, as well as the extent to which public health leadership impacts public health outcomes.

  14. Quantitative Analysis and Comparisons of EPON Protection Schemes

    Institute of Scientific and Technical Information of China (English)

    CHENHong; JINDepeng; ZENGLieguang; SULi

    2005-01-01

    This paper presents the relationship between the intensity of network damage and network survivability, and then develops a method for quantitatively analyzing the survivability of tree networks. Based on this analysis, the survivability of Ethernet passive optical network (EPON) with three kinds of protection schemes (i.e., Trunk-fiber, Node-fiber, and Bus-fiber protection) is discussed, followed by comparisons of the survivability among these three protection schemes for EPON. The simulation results show that, when the coverage area is the same, the survivability of EPON with the Node-fiber protection scheme is better than that of EPON with the Trunk-fiber protection scheme, and when the number and distribution of optical network units (ONUs) are the same, the survivability of EPON with the Bus-fiber protection scheme is better than that of EPON with the Node-fiber protection scheme. Under the same constraints, the Bus-fiber protection scheme requires the least fiber when there are more than 12 ONU nodes. These results are useful not only for forecasting and evaluating the survivability of EPON access networks, but also for their topology design.

  15. Quantitative analysis of regulatory flexibility under changing environmental conditions

    Science.gov (United States)

    Edwards, Kieron D; Akman, Ozgur E; Knox, Kirsten; Lumsden, Peter J; Thomson, Adrian W; Brown, Paul E; Pokhilko, Alexandra; Kozma-Bognar, Laszlo; Nagy, Ferenc; Rand, David A; Millar, Andrew J

    2010-01-01

    The circadian clock controls 24-h rhythms in many biological processes, allowing appropriate timing of biological rhythms relative to dawn and dusk. Known clock circuits include multiple, interlocked feedback loops. Theory suggested that multiple loops contribute the flexibility for molecular rhythms to track multiple phases of the external cycle. Clear dawn- and dusk-tracking rhythms illustrate the flexibility of timing in Ipomoea nil. Molecular clock components in Arabidopsis thaliana showed complex, photoperiod-dependent regulation, which was analysed by comparison with three contrasting models. A simple, quantitative measure, Dusk Sensitivity, was introduced to compare the behaviour of clock models with varying loop complexity. Evening-expressed clock genes showed photoperiod-dependent dusk sensitivity, as predicted by the three-loop model, whereas the one- and two-loop models tracked dawn and dusk, respectively. Output genes for starch degradation achieved dusk-tracking expression through light regulation, rather than a dusk-tracking rhythm. Model analysis predicted which biochemical processes could be manipulated to extend dusk tracking. Our results reveal how an operating principle of biological regulators applies specifically to the plant circadian clock. PMID:21045818

  16. Quantitative analysis of piperine in ayurvedic formulation by UV Spectrophotometry

    Directory of Open Access Journals (Sweden)

    Gupta Vishvnath

    2011-02-01

    Full Text Available A simple and reproducible UV-spectrophotometric method for the quantitative determination of piperine in Sitopaladi churna (STPLC) was developed and validated in the present work. The parameters linearity, precision, accuracy, and standard error were studied according to the Indian Herbal Pharmacopoeia. In the present study a new, simple, rapid, sensitive, precise, and economic spectrophotometric method in the ultraviolet region was developed for the determination of piperine in market and laboratory herbal formulations of Sitopaladi churna, which were procured from the local market and prepared in the laboratory, respectively, and evaluated as per the Indian Herbal Pharmacopoeia and WHO guidelines. The concentration of piperine present in the raw material (Piper longum fruits) was found to be 1.45±0.014 w/w. Piperine shows an absorption maximum at 342.5 nm, and hence the UV-spectrophotometric measurements were performed at 342.5 nm. The samples were prepared in methanol, and the method obeys Beer's law in the concentration ranges employed for evaluation. The content of piperine in the ayurvedic formulation was determined. The results of the analysis were validated statistically, and recovery studies confirmed the accuracy of the proposed method. Hence the proposed method can be used for the reliable quantification of piperine in the crude drug and its herbal formulation.
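
The quantification described above rests on the Beer-Lambert law, A = ε·c·l: a calibration line of absorbance at 342.5 nm against standard concentration is fitted, and the unknown is then read off the line. A minimal sketch of that arithmetic, with purely illustrative values rather than the paper's data:

```python
# Illustrative Beer-Lambert quantification: fit A = m*c + b from standards,
# then invert to estimate an unknown concentration. Values are hypothetical.

def fit_line(conc, absorb):
    """Least-squares slope and intercept for a calibration line."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(absorb) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(conc, absorb))
         / sum((x - mx) ** 2 for x in conc))
    b = my - m * mx
    return m, b

# Hypothetical piperine standards (ug/mL) and absorbances at 342.5 nm
standards = [2.0, 4.0, 6.0, 8.0, 10.0]
absorbances = [0.12, 0.24, 0.36, 0.48, 0.60]   # linear, i.e. obeys Beer's law

m, b = fit_line(standards, absorbances)
unknown_A = 0.30                                # sample absorbance (invented)
conc = (unknown_A - b) / m
print(round(conc, 2))  # -> 5.0
```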

  17. A Quantitative Analysis of Photovoltaic Modules Using Halved Cells

    Directory of Open Access Journals (Sweden)

    S. Guo

    2013-01-01

    Full Text Available In a silicon wafer-based photovoltaic (PV) module, significant power is lost due to current transport through the ribbons interconnecting neighbouring cells. Using halved cells in PV modules is an effective method to reduce this resistive power loss, one that has already been applied by some major PV manufacturers (Mitsubishi, BP Solar) in their commercially available PV modules. As a consequence, quantitative analysis of PV modules using halved cells is needed. In this paper we investigate theoretically and experimentally the difference between modules made with halved and full-size solar cells. Theoretically, we find an improvement in fill factor of 1.8% absolute and output power of 90 mW for the halved-cell minimodule. Experimentally, we find an improvement in fill factor of 1.3% absolute and output power of 60 mW for the halved-cell module. We also investigate theoretically how this effect carries over to large-size modules, finding that the performance increment of halved-cell PV modules is even higher for high-efficiency solar cells. The resistive loss of large-size modules with different interconnection schemes is then analysed. Finally, factors influencing the performance and cost of industrial halved-cell PV modules are discussed.
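
The resistive argument is simple: ribbon loss scales as P = I²R, so a half-size cell carrying half the current dissipates one quarter the power per ribbon, and even with twice as many cell strings the total interconnect loss is roughly halved. A back-of-envelope sketch with hypothetical current and resistance values:

```python
# Back-of-envelope ribbon (interconnect) loss comparison for full vs halved
# cells. P = I^2 * R per ribbon segment; all numbers are hypothetical.

def ribbon_loss(current_a, resistance_ohm, n_segments):
    """Total I^2*R dissipation across n identical ribbon segments."""
    return current_a ** 2 * resistance_ohm * n_segments

I_full = 9.0   # A, string current of full-size cells (assumed)
R = 0.01       # ohm per ribbon segment (assumed)
N = 60         # ribbon segments in a full-cell module (assumed)

full_loss = ribbon_loss(I_full, R, N)
# Halved cells: half the current, but twice as many segments
half_loss = ribbon_loss(I_full / 2, R, 2 * N)

print(round(half_loss / full_loss, 3))  # -> 0.5
```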

  18. Quantitative analysis of 3-OH oxylipins in fermentation yeast.

    Science.gov (United States)

    Potter, Greg; Xia, Wei; Budge, Suzanne M; Speers, R Alex

    2017-02-01

    Despite the ubiquitous distribution of oxylipins in plants, animals, and microbes, and the application of numerous analytical techniques to study these molecules, 3-OH oxylipins have never been quantitatively assayed in yeasts. The formation of heptafluorobutyrate methyl ester derivatives and subsequent analysis with gas chromatography - negative chemical ionization - mass spectrometry allowed for the first determination of yeast 3-OH oxylipins. The concentration of 3-OH 10:0 (0.68-4.82 ng/mg dry cell mass) in the SMA strain of Saccharomyces pastorianus grown in laboratory-scale beverage fermentations was elevated relative to oxylipin concentrations in plant tissues and macroalgae. In fermenting yeasts, the onset of 3-OH oxylipin formation has been related to fermentation progression and flocculation initiation. When the SMA strain was grown in laboratory-scale fermentations, the maximal sugar consumption rate preceded the lowest concentration of 3-OH 10:0 by ∼4.5 h and a distinct increase in 3-OH 10:0 concentration by ∼16.5 h.

  19. Immunoliposome-PCR: a generic ultrasensitive quantitative antigen detection system

    Directory of Open Access Journals (Sweden)

    He Junkun

    2012-06-01

    Full Text Available Abstract Background The accurate quantification of antigens at low concentrations over a wide dynamic range is needed for identifying biomarkers associated with disease and detecting protein interactions in high-throughput microarrays used in proteomics. Here we report the development of an ultrasensitive quantitative assay format called immunoliposome polymerase chain reaction (ILPCR) that fulfills these requirements. This method uses a liposome, with reporter DNA encapsulated inside and biotin-labeled polyethylene glycol (PEG) phospholipid conjugates incorporated into the outer surface of the liposome, as a detection reagent. The antigenic target is immobilized in the well of a microplate by a capture antibody and the liposome detection reagent is then coupled to a biotin-labeled second antibody through a NeutrAvidin bridge. The liposome is ruptured to release the reporter DNA, which serves as a surrogate to quantify the protein target using real-time PCR. Results A liposome detection reagent was prepared, which consisted of a population of liposomes ~120 nm in diameter with each liposome possessing ~800 accessible biotin receptors and ~220 encapsulated reporters. This liposome detection reagent was used in an assay to quantify the concentration of carcinoembryonic antigen (CEA) in human serum. This ILPCR assay exhibited a linear dose-response curve from 10^-10 M to 10^-16 M CEA. Within this range the assay coefficient of variance was Conclusions The ILPCR assay has several advantages over other immuno-PCR methods. The reporter DNA and biotin-labeled PEG phospholipids spontaneously incorporate into the liposomes as they form, simplifying preparation of the detection reagent. Encapsulation of the reporter inside the liposomes allows nonspecific DNA in the assay medium to be degraded with DNase I prior to quantification of the encapsulated reporter by PCR, which reduces false-positive results and improves quantitative accuracy. The ability to
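
In the final step the released reporter DNA is quantified by real-time PCR, where the starting quantity is typically inferred from the quantification cycle (Cq) through a log-linear standard curve. A generic sketch of that inference; the curve parameters below are hypothetical, not the assay's actual calibration:

```python
import math

# Generic qPCR standard-curve quantification: Cq = slope*log10(N0) + intercept,
# inverted to recover the starting copy number N0. Slope near -3.32 corresponds
# to ~100% amplification efficiency. All parameters are hypothetical.

def copies_from_cq(cq, slope=-3.32, intercept=37.0):
    """Invert a log-linear qPCR standard curve to a starting copy number."""
    return 10 ** ((cq - intercept) / slope)

# A lower Cq means more starting template
n0_high = copies_from_cq(20.36)
n0_low = copies_from_cq(30.32)
print(round(math.log10(n0_high), 2), round(math.log10(n0_low), 2))  # -> 5.01 2.01
```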

  20. Quantitative Modeling of the Alternative Pathway of the Complement System.

    Science.gov (United States)

    Zewde, Nehemiah; Gorham, Ronald D; Dorado, Angel; Morikis, Dimitrios

    2016-01-01

    The complement system is an integral part of innate immunity that detects and eliminates invading pathogens through a cascade of reactions. The destructive effects of complement activation on host cells are inhibited through versatile regulators that are present in plasma and bound to membranes. Impairment in the capacity of these regulators to function properly results in autoimmune diseases. To better understand the delicate balance between complement activation and regulation, we have developed a comprehensive quantitative model of the alternative pathway. Our model incorporates a system of ordinary differential equations that describes the dynamics of the four steps of the alternative pathway under physiological conditions: (i) initiation (fluid phase), (ii) amplification (surfaces), (iii) termination (pathogen), and (iv) regulation (host cell and fluid phase). We have examined complement activation and regulation on different surfaces, using the cellular dimensions of a characteristic bacterium (E. coli) and host cell (human erythrocyte). In addition, we have incorporated neutrophil-secreted properdin into the model, highlighting the cross talk of neutrophils with the alternative pathway in coordinating innate immunity. Our study yields a series of time-dependent response data for all alternative pathway proteins, fragments, and complexes. We demonstrate the robustness of the alternative pathway on pathogen surfaces, where complement components were able to saturate the entire region in about 54 minutes while occupying less than one percent of the host cell surface in the same time period. Our model reveals that tight regulation of complement starts in the fluid phase, in which propagation of the alternative pathway is inhibited through the dismantlement of fluid-phase convertases. Our model also depicts the intricate role that properdin released from neutrophils plays in initiating and propagating the alternative pathway during bacterial infection.
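
The modeling framework, a system of coupled ODEs for cascade dynamics, can be illustrated with a toy one-species activation/regulation balance integrated by forward Euler; the species and rate constants here are invented and stand in for the model's far larger reaction network:

```python
# Toy ODE sketch of activation vs regulation dynamics, forward-Euler integrated:
#   dA/dt = k_act * S - k_reg * A
# where S is a constant activating stimulus and A an activated species.
# Rate constants and species are hypothetical, not the paper's model.

def simulate(k_act=0.5, k_reg=0.1, S=1.0, dt=0.01, t_end=100.0):
    A = 0.0
    for _ in range(int(t_end / dt)):
        A += dt * (k_act * S - k_reg * A)
    return A

# Steady state of dA/dt = 0 is A* = k_act*S/k_reg = 5.0; after 10 time
# constants the trajectory has essentially converged there.
print(round(simulate(), 2))  # -> 5.0
```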

  1. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    Science.gov (United States)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities is often limited to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, they often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow, and such systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
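
The frequency-content analysis described here reduces, per pixel, to transforming an intensity time series into the frequency domain and locating the dominant peak. A minimal discrete Fourier sketch on a synthetic oscillating-shock signal; the frame rate and oscillation frequency are made up:

```python
import math

# Naive DFT to find the dominant frequency in a pixel intensity time series.
# Signal parameters (frame rate, oscillation frequency) are hypothetical.

def dominant_frequency(signal, frame_rate):
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]        # drop the DC component
    best_k, best_power = 0, 0.0
    for k in range(1, n // 2):                   # positive frequency bins only
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(centered))
        im = sum(x * math.sin(2 * math.pi * k * i / n)
                 for i, x in enumerate(centered))
        power = re * re + im * im
        if power > best_power:
            best_k, best_power = k, power
    return best_k * frame_rate / n               # bin index -> Hz

# Synthetic 200 Hz oscillation sampled at 2048 frames/s for 256 frames
fps, f0, n = 2048.0, 200.0, 256
frames = [math.sin(2 * math.pi * f0 * i / fps) for i in range(n)]
print(round(dominant_frequency(frames, fps), 1))  # -> 200.0
```

In practice an FFT (e.g. `numpy.fft.rfft`) replaces this O(n²) loop, but the logic, peak-picking in the power spectrum, is the same.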

  2. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    Science.gov (United States)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

    The use of fluorescence imaging of vascular permeability becomes a golden standard for assessing the inflammation process during experimental immune response in vivo. The use of the optical fluorescence imaging provides a very useful and simple tool to reach this purpose. The motivation comes from the necessity of a robust and simple quantification and data presentation of inflammation based on a vascular permeability. Changes of the fluorescent intensity, as a function of time is a widely accepted method to assess the vascular permeability during inflammation related to the immune response. In the present study we propose to bring a new dimension by applying a more sophisticated approach to the analysis of vascular reaction by using a quantitative analysis based on methods derived from astronomical observations, in particular by using a space-time Fourier filtering analysis followed by a polynomial orthogonal modes decomposition. We demonstrate that temporal evolution of the fluorescent intensity observed at certain pixels correlates quantitatively to the blood flow circulation at normal conditions. The approach allows to determine the regions of permeability and monitor both the fast kinetics related to the contrast material distribution in the circulatory system and slow kinetics associated with extravasation of the contrast material. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.

  3. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Science.gov (United States)

    Hur, Kwang-Ho; Mueller, Joachim D

    2015-01-01

    The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.
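
The Q-value at the heart of MSQ analysis is essentially the Mandel Q parameter of segmented photon counts, Q = variance/mean - 1, which is zero for pure shot noise and grows with molecular brightness. A schematic version of segmenting a count trace and averaging Q; the counts are synthetic, and the authors' full estimator additionally corrects for photobleaching and cell geometry:

```python
# Schematic mean-Q calculation over segmented photon count data.
# Q = var/mean - 1 vanishes for Poisson (shot-noise-only) counts and
# increases with molecular brightness. All data below are synthetic.

def mandel_q(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean - 1.0

def mean_segmented_q(trace, segment_len):
    segments = [trace[i:i + segment_len]
                for i in range(0, len(trace) - segment_len + 1, segment_len)]
    return sum(mandel_q(s) for s in segments) / len(segments)

quiet = [4, 5, 3, 6, 4, 5, 4, 5, 3, 6, 5, 4]     # small fluctuations -> low Q
bunched = [1, 9, 0, 10, 1, 9, 0, 10, 1, 9, 0, 10]  # large fluctuations -> high Q
print(mean_segmented_q(quiet, 6) < mean_segmented_q(bunched, 6))  # -> True
```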

  4. Quantitative Brightness Analysis of Fluorescence Intensity Fluctuations in E. Coli.

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Hur

    Full Text Available The brightness measured by fluorescence fluctuation spectroscopy specifies the average stoichiometry of a labeled protein in a sample. Here we extended brightness analysis, which has been mainly applied in eukaryotic cells, to prokaryotic cells with E. coli serving as a model system. The small size of the E. coli cell introduces unique challenges for applying brightness analysis that are addressed in this work. Photobleaching leads to a depletion of fluorophores and a reduction of the brightness of protein complexes. In addition, the E. coli cell and the point spread function of the instrument only partially overlap, which influences intensity fluctuations. To address these challenges we developed MSQ analysis, which is based on the mean Q-value of segmented photon count data, and combined it with the analysis of axial scans through the E. coli cell. The MSQ method recovers brightness, concentration, and diffusion time of soluble proteins in E. coli. We applied MSQ to measure the brightness of EGFP in E. coli and compared it to solution measurements. We further used MSQ analysis to determine the oligomeric state of nuclear transport factor 2 labeled with EGFP expressed in E. coli cells. The results obtained demonstrate the feasibility of quantifying the stoichiometry of proteins by brightness analysis in a prokaryotic cell.

  5. Content Analysis in Systems Engineering Acquisition Activities

    Science.gov (United States)

    2016-04-30

    quantitative and qualitative methods exist to (1) capture or generate data needed for a particular analysis, (2) reduce the data, (3) evaluate the data to...presented in this report was supported by the Acquisition Research Program of the Graduate School of Business & Public Policy at the Naval...analysis in systems engineering technical evaluation processes. Content analysis is a qualitative data analysis methodology used to discover

  6. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussion, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed systems and identify system development needs. Systems investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  7. Simultaneous quantitative analysis of main components in linderae reflexae radix with one single marker.

    Science.gov (United States)

    Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing

    2016-05-08

    A quantitative analysis of multi-components by single marker (QAMS) method was established for quality evaluation, and its feasibility was validated by the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene, were selected as analytes for quality evaluation by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results of the external standard method and QAMS on different HPLC systems. The results showed no significant differences between the contents of the four components of Linderae Reflexae Radix determined by the external standard method and by QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined using the single marker pinosylvin. Measurements were made on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
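
The QAMS arithmetic amounts to calibrating each analyte's detector response against the single marker once, as a relative correction factor, after which every analyte is quantified from the marker's calibration alone. A schematic sketch in which all peak areas, concentrations, and correction factors are invented:

```python
# Schematic QAMS arithmetic: quantify several analytes from one marker standard.
# The relative correction factor F_i = (A_s/C_s) / (A_i/C_i) is established
# once from standards; afterwards only the marker (here pinosylvin) needs a
# reference in each run. All values below are invented.

def qams_concentration(area_i, factor_i, area_marker, conc_marker):
    """C_i = F_i * A_i * C_marker / A_marker."""
    return factor_i * area_i * conc_marker / area_marker

# Marker calibration in the current run (hypothetical)
A_marker, C_marker = 1200.0, 10.0      # peak area, ug/mL

# Pre-established relative correction factors and measured areas (hypothetical)
factors = {"pinocembrin": 0.95, "pinostrobin": 1.10}
areas = {"pinocembrin": 900.0, "pinostrobin": 1500.0}

for name in factors:
    c = qams_concentration(areas[name], factors[name], A_marker, C_marker)
    print(name, round(c, 2))
```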

  8. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze large sets of clinical samples with quantitative, reproducible analysis in minimal time and without manual experimental error. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate and faster. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples, quantitatively and qualitatively, in a reproducible manner. It compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and is validated on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The study shows that the online system, capable of handling high-throughput samples in 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with better linearity than the off-line method. Hence we suggest that inclusion of this online system in the proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  9. Evidence toward an expanded international civil aviation organization (ICAO) concept of a single unified global communication navigation surveillance air traffic management (CNS/ATM) system: A quantitative analysis of ADS-B technology within a CNS/ATM system

    Science.gov (United States)

    Gardner, Gregory S.

    This dissertation summarizes research on global air traffic control, including technology, governing world organizations, and economic considerations. The International Civil Aviation Organization (ICAO) proposed communication, navigation, surveillance, air traffic management (CNS/ATM) plan is the basis for the single global CNS/ATM system concept discussed within this study. The research evaluates the efficacy of a single technology, Automatic Dependent Surveillance-Broadcast (ADS-B), within the scope of a single global CNS/ATM system concept. ADS-B has been used within the Federal Aviation Administration's (FAA) Capstone program for evaluation since the year 2000. The efficacy of ADS-B was measured solely using National Transportation Safety Board (NTSB) data on accident and incident rates within Alaskan (AK) airspace and the national airspace system (NAS).

  10. From classical genetics to quantitative genetics to systems biology: modeling epistasis.

    Directory of Open Access Journals (Sweden)

    David L Aylor

    2008-03-01

    Full Text Available Gene expression data has been used in lieu of phenotype in both classical and quantitative genetic settings. These two disciplines have separate approaches to measuring and interpreting epistasis, which is the interaction between alleles at different loci. We propose a framework for estimating and interpreting epistasis from a classical experiment that combines the strengths of each approach. A regression analysis step accommodates the quantitative nature of expression measurements by estimating the effect of gene deletions plus any interaction. Effects are selected by significance such that a reduced model describes each expression trait. We show how the resulting models correspond to specific hierarchical relationships between two regulator genes and a target gene. These relationships are the basic units of genetic pathways and genomic system diagrams. Our approach can be extended to analyze data from a variety of experiments, multiple loci, and multiple environments.
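
The regression step described here can be pictured with a toy 2x2 deletion design: expression is modeled as y = b0 + b1*x1 + b2*x2 + b3*x1*x2, where x1 and x2 indicate deletion of each regulator, and the interaction coefficient b3 is the epistasis estimate. A sketch with invented expression values (one observation per cell, so the coefficients follow directly from cell means):

```python
# Toy epistasis regression on a 2x2 deletion design:
#   y = b0 + b1*x1 + b2*x2 + b3*x1*x2, with x = 1 if the gene is deleted.
# With one observation per cell the least-squares fit is exact and the
# coefficients follow from cell means. Expression values are invented.

def interaction_fit(y_wt, y_d1, y_d2, y_d12):
    b0 = y_wt                       # wild-type baseline
    b1 = y_d1 - y_wt                # main effect of deleting gene 1
    b2 = y_d2 - y_wt                # main effect of deleting gene 2
    b3 = y_d12 - y_wt - b1 - b2     # epistasis: deviation from additivity
    return b0, b1, b2, b3

# Here the double deletion looks just like deleting gene 1 alone, a classic
# epistatic (masking) pattern rather than an additive one.
b0, b1, b2, b3 = interaction_fit(y_wt=10.0, y_d1=4.0, y_d2=7.0, y_d12=4.0)
print(b3)  # -> 3.0  (nonzero: the two regulators interact)
```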

  11. Quantitative Transcript Analysis in Plants: Improved First-strand cDNA Synthesis

    Institute of Scientific and Technical Information of China (English)

    Nai-Zhong XIAO; Lei BA; Preben Bach HOLM; Xing-Zhi WANG; Steve BOWRA

    2005-01-01

    The quantity and quality of first-strand cDNA directly influence the accuracy of transcriptional analysis and quantification. Using a plant-derived α-tubulin as a model system, the effects of oligo sequence and DTT on the quality and quantity of first-strand cDNA synthesis were assessed via a combination of semi-quantitative PCR and real-time PCR. The results indicated that an anchored oligo dT significantly improved the quantity and quality of α-tubulin cDNA compared to the conventional oligo dT. Similarly, omitting DTT from the first-strand cDNA synthesis also enhanced transcript levels. This is the first time such a comparative analysis has been undertaken for a plant system, and it shows conclusively that small changes to current protocols can have a very significant impact on transcript analysis.

  12. A quantitative analysis of Salmonella Typhimurium metabolism during infection

    OpenAIRE

    Steeb, Benjamin

    2012-01-01

    In this thesis, Salmonella metabolism during infection was investigated. The goal was to gain a quantitative and comprehensive understanding of Salmonella in vivo nutrient supply, utilization and growth. To achieve this goal, we used a combined experimental / in silico approach. First, we generated a reconstruction of Salmonella metabolism ([1], see 2.1). This reconstruction was then combined with in vivo data from experimental mutant phenotypes to build a comprehensive quantitative in viv...

  13. Fourier transform infrared spectroscopy quantitative analysis of SF6 partial discharge decomposition components.

    Science.gov (United States)

    Zhang, Xiaoxing; Liu, Heng; Ren, Jiangbo; Li, Jian; Li, Xin

    2015-02-05

    Gas-insulated switchgear (GIS) internal SF6 gas produces specific decomposition components under partial discharge (PD). By detecting these characteristic decomposition components, information such as the type and level of GIS internal insulation deterioration can be obtained effectively, and the status of GIS internal insulation can be evaluated. SF6 was selected as the background gas for Fourier transform infrared spectroscopy (FTIR) detection in this study, and SOF2, SO2F2, SO2, and CO were selected as the characteristic decomposition components for system analysis. The standard infrared absorption spectra of the four characteristic components were measured, the optimal absorption peaks were recorded, and the corresponding absorption coefficients were calculated. Quantitative detection experiments on the four characteristic components were conducted. The variation trends of the volume fractions of the four characteristic components over PD time were analyzed, and, for five different PD quantities, the quantitative relationships among gas production rate, PD time, and PD quantity were studied.
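
Quantification from such spectra rests on inverting the Beer-Lambert relation A = α·c·L at each component's optimal absorption peak, with α the absorption coefficient and L the gas cell path length. A sketch of that inversion; the coefficients, path length, and absorbance readings are invented, not the paper's calibration:

```python
# Invert Beer-Lambert (A = alpha * c * L) at each component's absorption peak.
# Absorption coefficients, path length and absorbance readings are invented.

PATH_LENGTH_CM = 10.0

# alpha in absorbance units per (uL/L) per cm (hypothetical)
ALPHA = {"SOF2": 2.0e-4, "SO2F2": 1.5e-4, "SO2": 1.0e-4, "CO": 0.5e-4}

def volume_fraction(component, absorbance, path_cm=PATH_LENGTH_CM):
    """Return gas concentration in uL/L from peak absorbance."""
    return absorbance / (ALPHA[component] * path_cm)

# Hypothetical peak absorbances from one PD decomposition measurement
readings = {"SOF2": 0.040, "SO2F2": 0.015, "SO2": 0.005, "CO": 0.002}
for gas, a in readings.items():
    print(gas, round(volume_fraction(gas, a), 1))
```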

  14. Quantitative analysis of flavanones and chalcones from willow bark.

    Science.gov (United States)

    Freischmidt, A; Untergehrer, M; Ziegler, J; Knuth, S; Okpanyi, S; Müller, J; Kelber, O; Weiser, D; Jürgenliemk, G

    2015-09-01

    Willow bark extracts are used for the treatment of fever, pain and inflammation. Recent clinical and pharmacological research revealed that not only the salicylic alcohol derivatives but also the polyphenols contribute significantly to these effects. Quantitative analysis in the European Pharmacopoeia still focuses on determination of the salicylic alcohol derivatives. The objective of the present study was the development of an effective quantification method for as many flavanone and chalcone glycosides as possible in Salix purpurea and other Salix species, as well as commercial preparations thereof. As Salix species contain a diverse spectrum of the glycosidated flavanones naringenin and eriodictyol and the chalcone chalconaringenin, a sequential acidic and enzymatic hydrolysis was developed to yield naringenin and eriodictyol as aglycones, which were quantified by HPLC. The 5-O-glucosides were cleaved with 11.5% TFA before subsequent hydrolysis of the 7-O-glucosides with an almond β-glucosidase at pH 6-7. The method was validated with regard to LOD, LOQ, intraday and interday precision, accuracy, stability, recovery, time of hydrolysis, robustness, and applicability to extracts. All 5-O- and 7-O-glucosides of naringenin, eriodictyol and chalconaringenin were completely hydrolysed and converted to naringenin and eriodictyol. The LOD of the HPLC method was 0.77 μM for naringenin and 0.45 μM for eriodictyol; the LOQ was 2.34 μM for naringenin and 1.35 μM for eriodictyol. The method is robust with regard to sample weight but sensitive to enzyme deterioration. The developed method is applicable to the determination of flavanone and chalcone glycosides in willow bark and corresponding preparations.
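
LOD and LOQ figures of this kind are conventionally computed from the calibration line as 3.3·σ/S and 10·σ/S, where σ is the residual standard deviation and S the slope (the ICH Q2 convention). A generic sketch of the computation on invented calibration data:

```python
import math

# Generic LOD/LOQ from a calibration line: LOD = 3.3*sigma/S, LOQ = 10*sigma/S,
# with sigma the residual standard deviation and S the slope (ICH Q2
# convention). Calibration data below are invented.

def calibration_lod_loq(conc, response):
    n = len(conc)
    mx, my = sum(conc) / n, sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, response)) / sxx
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept) for x, y in zip(conc, response)]
    sigma = math.sqrt(sum(r * r for r in residuals) / (n - 2))
    return 3.3 * sigma / slope, 10.0 * sigma / slope

conc = [1.0, 2.0, 4.0, 8.0, 16.0]          # uM standards (invented)
resp = [10.2, 19.8, 40.5, 79.6, 160.1]     # peak areas (invented)
lod, loq = calibration_lod_loq(conc, resp)
print(lod < loq)  # LOQ is always the larger threshold -> True
```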

  15. Quantitative Financial Analysis of Alternative Energy Efficiency Shareholder Incentive Mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Cappers, Peter; Goldman, Charles; Chait, Michele; Edgar, George; Schlegel, Jeff; Shirley, Wayne

    2008-08-03

    Rising energy prices and climate change are central issues in the debate about our nation's energy policy. Many are demanding increased energy efficiency as a way to help reduce greenhouse gas emissions and lower the total cost of electricity and energy services for consumers and businesses. Yet, as the National Action Plan for Energy Efficiency (NAPEE) pointed out, many utilities continue to shy away from seriously expanding their energy efficiency program offerings because they claim there is insufficient profit motivation, or even a financial disincentive, when compared to supply-side investments. With the recent introduction of Duke Energy's Save-a-Watt incentive mechanism and ongoing discussions about decoupling, regulators and policymakers are now faced with an expanded and diverse landscape of financial incentive mechanisms. Determining the 'right' way forward to promote deep and sustainable demand-side resource programs is challenging. Due to the renaissance that energy efficiency is currently experiencing, many want to better understand the tradeoffs in stakeholder benefits between these alternative incentive structures before aggressively embarking on a path for which course corrections can be time-consuming and costly. Using a prototypical Southwest utility and a publicly available financial model, we show how various stakeholders (e.g., shareholders, ratepayers, etc.) are affected by these different types of shareholder incentive mechanisms under varying assumptions about program portfolios. This quantitative analysis compares the financial consequences associated with a wide range of alternative incentive structures. The results will help regulators and policymakers better understand the financial implications of DSR program incentive regulation.

  16. Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.

    Science.gov (United States)

    Meyr, Andrew J; Myers, Adam; Pontious, Jane

    2014-01-01

Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and the metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the measurements of the first IMA, HAA, and MSP. The results revealed a mean measurement of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of acceptability in terms of surgical deformity correction.

  17. A qualitative and quantitative analysis of vegetable pricing in supermarket

    Science.gov (United States)

    Miranda, Suci

    2017-06-01

The purpose of this study is to analyze, both qualitatively and quantitatively, the variables affecting the determination of vegetable sale prices that remain constant over time in a supermarket. It focuses on non-organic vegetables with a fixed selling price over time, such as spinach, beet, and parsley. In the qualitative analysis, sale price determination is influenced by two vegetable characteristics: (1) vegetable segmentation (low to high daily consumption) and (2) vegetable age (how long it stays fresh); both characteristics relate to inventory management and ultimately to the sale price in the supermarket. Quantitatively, the vegetables are divided into two categories: the leafy vegetable group, whose leaves are eaten as a vegetable, with product age (a) = 0 and shelf life (t) = 0, and the non-leafy vegetable group with age (a) = a+1 and shelf life (t) = t+1. A vegetable age of (a) = 0 means the produce lasts only one day after it is ordered and must then be discarded, whereas a+1 means a life longer than one day, as for beet, white radish, and string beans. The shelf life refers to how long the produce can be displayed on a supermarket shelf, in line with the vegetable age. Following the cost-plus pricing method with a full costing approach, production costs, non-production costs, and markup are adjusted differently for each category. A holding cost is added to the sale price of non-leafy vegetables, while the holding cost for the leafy vegetable category is assumed to be zero. The expected margin of each category is correlated with the vegetable characteristics.
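A minimal sketch of the cost-plus calculation described above, assuming full costing with a markup applied on top of production, non-production, and (for non-leafy vegetables) holding costs; the function name and all cost figures are hypothetical, not taken from the paper.

```python
def sale_price(production_cost, non_production_cost, markup_rate,
               holding_cost_per_day=0.0, shelf_days=0):
    """Cost-plus price with a full costing approach.

    Leafy vegetables (age a = 0) are assumed to carry no holding cost;
    non-leafy vegetables accrue holding cost over their shelf life.
    """
    holding = holding_cost_per_day * shelf_days
    full_cost = production_cost + non_production_cost + holding
    return full_cost * (1.0 + markup_rate)

# Hypothetical figures, for illustration only:
spinach_price = sale_price(1000, 200, 0.25)  # leafy: no holding cost
beet_price = sale_price(1000, 200, 0.25,
                        holding_cost_per_day=50, shelf_days=3)
```

With identical base costs and markup, the non-leafy item prices higher purely because its holding cost enters the full cost before the markup is applied.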

  18. Immunology by numbers: quantitation of antigen presentation completes the quantitative milieu of systems immunology!

    Science.gov (United States)

    Purcell, Anthony W; Croft, Nathan P; Tscharke, David C

    2016-06-01

    We review approaches to quantitate antigen presentation using a variety of biological and biochemical readouts and highlight the emerging role of mass spectrometry (MS) in defining and quantifying MHC-bound peptides presented at the cell surface. The combination of high mass accuracy in the determination of the molecular weight of the intact peptide of interest and its signature pattern of fragmentation during tandem MS provide an unambiguous and definitive identification. This is in contrast to the potential receptor cross-reactivity towards closely related peptides and variable dose responsiveness seen in biological readouts. In addition, we gaze into the not too distant future where big data approaches in MS can be accommodated to quantify whole immunopeptidomes both in vitro and in vivo.

  19. Quantitative Change and Use Analysis of Agricultural Land in China

    Science.gov (United States)

    Fu, Y.; Chou, J.; Dong, W.

    2013-12-01

Climatic change, economic and scientific development, and policy guidance all drive changes in land use. Using crop sown area as an index, this paper explores agricultural land use in China in recent years. Accumulated temperature and urbanization rate are used to analyze the spatiotemporal differences in crop sown area and the mechanisms behind them, representing the quantitative change of agricultural land, while the cropping index is used to estimate the actual use of cultivated land and its surplus capacity. The main results are as follows: (1) From 1949 to 2010, crop sown area grew slowly overall in China, but with marked spatial diversity: the fastest increase and decrease occurred in Xinjiang and North China, respectively, and the size of agricultural land ranks from the midland to the east and then to the west of China. (2) Based on the relationship between accumulated temperature and cropping systems, the aggregate effects of climatic change, urbanization, and other factors on the increase in crop sown area were assessed. Warming contributes only slightly; urbanization restrains the increase mainly in South China, northeast China, Xinjiang, and southwest China; and the other factors combined accelerate agricultural land growth in the rest of China. (3) From 1980 to 2009, the degree of agricultural land use deepened continuously. Under the combined influence of decreased cultivated area and a potential cropping index lower than the actual cropping index, the surplus capacity of cultivated area declined from 6.27 × 10⁷ hm² in 1980 to 3.85 × 10⁷ hm² in 2009. However, this still accounts for about 20 percent of agricultural land relative to its full potential, confirming the need for fuller and more rational use in the future.

  20. Quantitative PCR analysis of salivary pathogen burden in periodontitis.

    Science.gov (United States)

    Salminen, Aino; Kopra, K A Elisa; Hyvärinen, Kati; Paju, Susanna; Mäntylä, Päivi; Buhlin, Kåre; Nieminen, Markku S; Sinisalo, Juha; Pussinen, Pirkko J

    2015-01-01

Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease (CAD) diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR (qPCR). Median salivary concentrations of Porphyromonas gingivalis, Tannerella forsythia, and Prevotella intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary Aggregatibacter actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39-4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51-4.52). The highest OR 3.59 (95% CI 1.94-6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used.
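As an illustration of how the reported odds ratios relate to the underlying logistic-regression coefficients, the sketch below back-calculates a coefficient and its standard error from the published OR of 2.40 (95% CI 1.39-4.13) and recomputes the interval; this is a generic textbook calculation, not the authors' code.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression
    coefficient beta and its standard error se."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Back out beta and SE from the reported OR 2.40 (95% CI 1.39-4.13):
beta = math.log(2.40)
se = (math.log(4.13) - math.log(1.39)) / (2 * 1.96)
or_, lo, hi = odds_ratio_ci(beta, se)
```

The round trip reproduces the published interval, which is why logistic-regression results are conventionally reported as exponentiated coefficients with Wald confidence limits.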

  1. Quantitative PCR analysis of salivary pathogen burden in periodontitis

    Directory of Open Access Journals (Sweden)

    Aino eSalminen

    2015-10-01

Full Text Available Our aim was to investigate the value of salivary concentrations of four major periodontal pathogens and their combination in diagnostics of periodontitis. The Parogene study included 462 dentate subjects (mean age 62.9 ± 9.2 years) with coronary artery disease diagnosis who underwent an extensive clinical and radiographic oral examination. Salivary levels of four major periodontal bacteria were measured by quantitative real-time PCR. Median salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia, as well as the sum of the concentrations of the four bacteria, were higher in subjects with moderate to severe periodontitis compared to subjects with no to mild periodontitis. Median salivary A. actinomycetemcomitans concentrations did not differ significantly between the subjects with no to mild periodontitis and subjects with moderate to severe periodontitis. In logistic regression analysis adjusted for age, gender, diabetes, and the number of teeth and implants, high salivary concentrations of P. gingivalis, T. forsythia, and P. intermedia were significantly associated with moderate to severe periodontitis. When looking at different clinical and radiographic parameters of periodontitis, high concentrations of P. gingivalis and T. forsythia were significantly associated with the number of 4-5 mm periodontal pockets, ≥ 6 mm pockets, and alveolar bone loss (ABL). High level of T. forsythia was associated also with bleeding on probing (BOP). The combination of the four bacteria, i.e., the bacterial burden index, was associated with moderate to severe periodontitis with an odds ratio (OR) of 2.40 (95% CI 1.39–4.13). When A. actinomycetemcomitans was excluded from the combination of the bacteria, the OR was improved to 2.61 (95% CI 1.51–4.52). The highest odds ratio 3.59 (95% CI 1.94–6.63) was achieved when P. intermedia was further excluded from the combination and only the levels of P. gingivalis and T. forsythia were used. Salivary

  2. Quantitative Safety: Linking Proof-Based Verification with Model Checking for Probabilistic Systems

    CERN Document Server

    Ndukwu, Ukachukwu

    2009-01-01

This paper presents a novel approach for augmenting proof-based verification with performance-style analysis of the kind employed in state-of-the-art model checking tools for probabilistic systems. Quantitative safety properties, usually specified as probabilistic system invariants and modeled in proof-based environments, are evaluated using bounded model checking techniques. Our specific contributions include the statement of a theorem that is central to model checking safety properties of proof-based systems, the establishment of a procedure, and its full implementation in a prototype system (YAGA) which readily transforms a probabilistic model specified in a proof-based environment into its equivalent verifiable PRISM model equipped with reward structures. The reward structures capture the exact interpretation of the probabilistic invariants and can reveal succinct information about the model during experimental investigations. Finally, we demonstrate the novelty of the technique on a probabilistic library cas...

  3. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of NoCs. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage, an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  4. Quantitative Safety and Security Analysis from a Communication Perspective

    Directory of Open Access Journals (Sweden)

    Boris Malinowsky

    2015-12-01

Full Text Available This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective on the communication protocols. The results are obtained using the network simulator ns-3.

  5. Quantitative Safety and Security Analysis from a Communication Perspective

    DEFF Research Database (Denmark)

    Malinowsky, Boris; Schwefel, Hans-Peter; Jung, Oliver

    2014-01-01

This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real......-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look...... at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective...

  6. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    Science.gov (United States)

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge is to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges, as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  7. A Quantitative Analysis of the Behavioral Checklist of the Movement ABC Motor Test

    Science.gov (United States)

    Ruiz, Luis Miguel; Gomez, Marta; Graupera, Jose Luis; Gutierrez, Melchor; Linaza, Jose Luis

    2007-01-01

    The fifth section of the Henderson and Sugden's Movement ABC Checklist is part of the general Checklist that accompanies The Movement ABC Battery. The authors maintain that the analysis of this section must be mainly qualitative instead of quantitative. The main objective of this study was to employ a quantitative analysis of this behavioural…

  8. [Bibliometric analysis of bacterial quantitative proteomics in English literatures].

    Science.gov (United States)

    Zhang, Xin; She, Danyang; Liu, Youning; Wang, Rui; Di, Xiuzhen; Liang, Beibei; Wang, Yue

    2014-07-01

To analyze worldwide advances in bacterial quantitative proteomics over the past fifteen years with a bibliometric approach, literature retrieval was conducted in the PubMed, Embase, and Science Citation Index (SCI) databases, using "bacterium" and "quantitative proteomics" as key words, with a search cutoff of July 2013. The articles were sorted and analyzed with EndNote X6 by publication year, first author, journal name, publishing institution, citation frequency, and publication type. After removal of duplicates, 932 English-language articles were included in our research. The first article on bacterial quantitative proteomics was reported in 1999, and publication output peaked at 163 articles in 2012. Up to July 2013, authors from more than 23 countries and regions had published articles in this field; China ranks fourth. The main publication type is original articles. The most frequently cited article is entitled "Absolute quantification of proteins by LCMSE: a virtue of parallel MS acquisition" by Silva JC, Gorenstein MV, Li GZ, et al. (Mol Cell Proteomics, 2006). The most productive author is Smith RD from the Biological Sciences Division, Pacific Northwest National Laboratory. The top journal publishing bacterial quantitative proteomics is Proteomics. More and more researchers are paying attention to quantitative proteomics, which will be widely used in bacteriology.

  9. Qualitative, quantitative and combination score systems in differential diagnosis of breast lesions by contrast-enhanced ultrasound.

    Science.gov (United States)

    Wang, YongMei; Fan, Wei; Zhao, Song; Zhang, Kai; Zhang, Li; Zhang, Ping; Ma, Rong

    2016-01-01

To assess the feasibility of score systems in the differential diagnosis of breast lesions by contrast-enhanced ultrasound (CEUS), CEUS was performed in 121 patients with 127 breast lesions on a Philips iU22 with SonoVue as the contrast agent. The Pearson chi-square (χ²) test, binary logistic regression analysis, and Student's t-test were used to identify CEUS parameters significant for differential diagnosis. Based on these significant CEUS parameters, qualitative, quantitative, and combination score systems were built by scoring 1 for a benign characteristic and 2 for a malignant characteristic. Receiver operating characteristic (ROC) curves were applied to evaluate the diagnostic efficacy of the different analytical methods. Pathological results showed 41 benign and 86 malignant lesions. Qualitative analysis and logistic regression analysis showed significant differences in enhancement degree, enhancement order, internal homogeneity, enhancement margin, surrounding vessels, and enlargement of diameters (P…). Quantitative analysis indicated that malignant lesions tended to show higher peak intensity (PI), larger area under the curve (AUC), and shorter time to peak (TTP) than benign ones (P…). Qualitative score systems showed higher diagnostic efficacy than single quantitative CEUS parameters. The corresponding areas under the ROC curve for the qualitative, quantitative, and combination score systems were 0.897, 0.716, and 0.903, respectively. A Z test showed that the area under the ROC curve of the quantitative score system was statistically smaller than those of the other score systems. The quantitative score system thus helps little in improving the diagnostic efficacy of CEUS, while the qualitative score system greatly improves the performance of CEUS in discriminating benign from malignant breast lesions; its application is clinically promising. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
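The score systems in the record above are compared by area under the ROC curve. As an illustration of how such an AUC can be computed directly from lesion scores, here is the pairwise (Mann-Whitney) formulation; the toy scores below are invented, not the study's data.

```python
def auc_from_scores(malignant_scores, benign_scores):
    """Area under the ROC curve via pairwise comparison: the fraction
    of (malignant, benign) pairs the score ranks correctly, counting
    ties as half. Equivalent to the normalized Mann-Whitney U."""
    wins = 0.0
    for m in malignant_scores:
        for b in benign_scores:
            if m > b:
                wins += 1.0
            elif m == b:
                wins += 0.5
    return wins / (len(malignant_scores) * len(benign_scores))

# Toy sums of per-characteristic scores (higher = more malignant features)
malignant = [10, 9, 9, 8, 7]
benign = [7, 6, 6, 5]
auc = auc_from_scores(malignant, benign)
```

An AUC of 1.0 would mean the score system perfectly separates the two groups, while 0.5 is chance-level discrimination, which is the scale on which the study's 0.897 vs. 0.716 comparison is made.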

  10. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    Science.gov (United States)

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality.

  11. Quantitative assessment of resilience of a water supply system under rainfall reduction due to climate change

    Science.gov (United States)

    Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha

    2016-09-01

A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand system resilience, which refers to the ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not yet been widely applied in managing water supply systems. This paper proposes three technical resilience indicators to assess the resilience of a water supply system. A case study analysis was undertaken of the Water Grid system of Queensland State, Australia, to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the use of resilience indicators is capable of identifying critical conditions in the operation of a water supply system, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, resilience indicators provided useful insight into the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability when experiencing pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.

  12. Quantitative Analysis by Isotopic Dilution Using Mass Spectroscopy: The Determination of Caffeine by GC-MS.

    Science.gov (United States)

    Hill, Devon W.; And Others

    1988-01-01

    Describes a laboratory technique for quantitative analysis of caffeine by an isotopic dilution method for coupled gas chromatography-mass spectroscopy. Discusses caffeine analysis and experimental methodology. Lists sample caffeine concentrations found in common products. (MVL)
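Record 12 quantifies caffeine by isotope dilution with GC-MS. The core calculation is a simple proportion, sketched below under the common assumption that the labeled and unlabeled forms give equal detector response; the function name and all numbers are hypothetical, not from the article.

```python
def analyte_mass(spike_mass, area_analyte, area_labeled, response_ratio=1.0):
    """Isotope-dilution quantitation: the analyte mass follows from the
    known mass of the isotopically labeled spike and the ratio of the
    two ion-current peak areas.

    response_ratio corrects for unequal detector response of the
    analyte vs. the labeled form (1.0 = identical response, a common
    assumption for heavy-isotope analogues).
    """
    return spike_mass * (area_analyte / area_labeled) / response_ratio

# Hypothetical run: 50 ug of a labeled caffeine analogue spiked into
# the sample; the analyte peak area is 1.6x the labeled peak area.
caffeine_ug = analyte_mass(50.0, area_analyte=3200, area_labeled=2000)
```

Because the spike experiences the same losses as the analyte through extraction and chromatography, the area ratio (and hence the result) is insensitive to incomplete recovery, which is the chief virtue of the method.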

  13. Quantitative analysis of norfloxacin by 1H NMR and HPLC.

    Science.gov (United States)

    Frackowiak, Anita; Kokot, Zenon J

    2012-01-01

1H NMR and previously developed HPLC methods were applied to the quantitative determination of norfloxacin in a veterinary solution formulation for pigeons. Changes in concentration can lead to significant changes in the 1H chemical shifts of non-exchangeable aromatic protons as a result of extensive self-association phenomena. This chemical shift variation of the protons was analyzed and applied to the quantitative determination of norfloxacin. The method is simple, rapid, precise, and accurate, and can be used for quality control of this drug.

  14. Quantitative analysis of autophagy using advanced 3D fluorescence microscopy.

    Science.gov (United States)

    Changou, Chun A; Wolfson, Deanna L; Ahluwalia, Balpreet Singh; Bold, Richard J; Kung, Hsing-Jien; Chuang, Frank Y S

    2013-05-03

Prostate cancer is the most common malignancy among men in the U.S. While surgery carries a significant risk of impotence and incontinence, traditional chemotherapeutic approaches have been largely unsuccessful. Hormone therapy is effective at an early stage, but often fails with the eventual development of hormone-refractory tumors. We have been interested in developing therapeutics targeting specific metabolic deficiencies of tumor cells. We recently showed that prostate tumor cells specifically lack an enzyme (argininosuccinate synthase, or ASS) involved in the synthesis of the amino acid arginine(1). This condition causes the tumor cells to become dependent on exogenous arginine, and they undergo metabolic stress when free arginine is depleted by arginine deiminase (ADI)(1,10). Indeed, we have shown that human prostate cancer cells CWR22Rv1 are effectively killed by ADI with caspase-independent apoptosis and aggressive autophagy (or macroautophagy)(1,2,3). Autophagy is an evolutionarily-conserved process that allows cells to metabolize unwanted proteins by lysosomal breakdown during nutritional starvation(4,5). Although the essential components of this pathway are well-characterized(6,7,8,9), many aspects of the molecular mechanism are still unclear - in particular, what is the role of autophagy in the death-response of prostate cancer cells after ADI treatment? In order to address this question, we required an experimental method to measure the level and extent of autophagic response in cells - and since there are no known molecular markers that can accurately track this process, we chose to develop an imaging-based approach, using quantitative 3D fluorescence microscopy(11,12). 
Using CWR22Rv1 cells specifically-labeled with fluorescent probes for autophagosomes and lysosomes, we show that 3D image stacks acquired with either widefield deconvolution microscopy (and later, with super-resolution, structured-illumination microscopy) can clearly capture the early

  15. Microchromatography of hemoglobins. VIII. A general qualitative and quantitative method in plastic drinking straws and the quantitative analysis of Hb-F.

    Science.gov (United States)

    Schroeder, W A; Pace, L A

    1978-03-01

The microchromatographic procedure for the quantitative analysis of the hemoglobin components in a hemolysate uses columns of DEAE-cellulose in a plastic drinking straw with a glycine-KCN-NaCl developer. The method may be used not only for the quantitative analysis of Hb-F but also for the analysis of the varied components in mixtures of hemoglobins.

  16. Watershed Planning within a Quantitative Scenario Analysis Framework.

    Science.gov (United States)

    Merriam, Eric R; Petty, J Todd; Strager, Michael P

    2016-07-24

    There is a critical need for tools and methodologies capable of managing aquatic systems within heavily impacted watersheds. Current efforts often fall short as a result of an inability to quantify and predict complex cumulative effects of current and future land use scenarios at relevant spatial scales. The goal of this manuscript is to provide methods for conducting a targeted watershed assessment that enables resource managers to produce landscape-based cumulative effects models for use within a scenario analysis management framework. Sites are first selected for inclusion within the watershed assessment by identifying sites that fall along independent gradients and combinations of known stressors. Field and laboratory techniques are then used to obtain data on the physical, chemical, and biological effects of multiple land use activities. Multiple linear regression analysis is then used to produce landscape-based cumulative effects models for predicting aquatic conditions. Lastly, methods for incorporating cumulative effects models within a scenario analysis framework for guiding management and regulatory decisions (e.g., permitting and mitigation) within actively developing watersheds are discussed and demonstrated for 2 sub-watersheds within the mountaintop mining region of central Appalachia. The watershed assessment and management approach provided herein enables resource managers to facilitate economic and development activity while protecting aquatic resources and producing opportunity for net ecological benefits through targeted remediation.
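Record 16 above builds landscape-based cumulative effects models with multiple linear regression. As a minimal sketch of that step, the ordinary-least-squares fit below is implemented with plain-Python normal equations; the stressor names and data are hypothetical, not from the study.

```python
def ols(X, y):
    """Ordinary least squares: solve (X'X) b = X'y by Gaussian
    elimination with partial pivoting. X is a list of predictor rows;
    an intercept column of 1s is prepended automatically."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    # Build the normal equations A b = c
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    c = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for j in range(col, k):
                A[r][j] -= f * A[col][j]
            c[r] -= f * c[col]
    beta = [0.0] * k                          # back substitution
    for i in range(k - 1, -1, -1):
        beta[i] = (c[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return beta

# Hypothetical stressor gradients: [% mined area, specific conductance],
# with a biotic index generated as y = 90 - 0.8*mined - 0.02*cond
X = [[0, 100], [5, 300], [10, 500], [20, 800], [30, 1200]]
y = [90 - 0.8 * m - 0.02 * c for m, c in X]
beta = ols(X, y)
```

The fitted coefficients can then be applied to the landscape metrics of a proposed land use scenario to predict aquatic condition, which is the scenario-analysis use the record describes.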

  17. The application of high-speed cinematography for the quantitative analysis of equine locomotion.

    Science.gov (United States)

    Fredricson, I; Drevemo, S; Dalin, G; Hjertën, G; Björne, K

    1980-04-01

Locomotor disorders constitute a serious problem in horse racing that will only be rectified by a better understanding of the causative factors associated with disturbances of gait. This study describes a system for the quantitative analysis of the locomotion of horses at speed. The method is based on high-speed cinematography with semi-automatic analysis of the films. The recordings are made with a 16 mm high-speed camera run at 500 frames per second (fps), and the films are analysed using special film-reading equipment and a minicomputer. The time and linear gait variables are presented in tabular form, and the angles and trajectories of the joints and body segments are presented graphically.

  18. A Quantitative System-Scale Characterization of the Metabolism of Clostridium acetobutylicum.

    Science.gov (United States)

    Yoo, Minyeong; Bestel-Corre, Gwenaelle; Croux, Christian; Riviere, Antoine; Meynial-Salles, Isabelle; Soucaille, Philippe

    2015-11-24

    Engineering industrial microorganisms for ambitious applications, for example, the production of second-generation biofuels such as butanol, is impeded by a lack of knowledge of primary metabolism and its regulation. A quantitative system-scale analysis was applied to the biofuel-producing bacterium Clostridium acetobutylicum, a microorganism used for the industrial production of solvent. An improved genome-scale model, iCac967, was first developed based on thorough biochemical characterizations of 15 key metabolic enzymes and on extensive literature analysis to acquire accurate fluxomic data. In parallel, quantitative transcriptomic and proteomic analyses were performed to assess the number of mRNA molecules per cell for all genes under acidogenic, solventogenic, and alcohologenic steady-state conditions as well as the number of cytosolic protein molecules per cell for approximately 700 genes under at least one of the three steady-state conditions. A complete fluxomic, transcriptomic, and proteomic analysis applied to different metabolic states allowed us to better understand the regulation of primary metabolism. Moreover, this analysis enabled the functional characterization of numerous enzymes involved in primary metabolism, including (i) the enzymes involved in the two different butanol pathways and their cofactor specificities, (ii) the primary hydrogenase and its redox partner, (iii) the major butyryl coenzyme A (butyryl-CoA) dehydrogenase, and (iv) the major glyceraldehyde-3-phosphate dehydrogenase. This study provides important information for further metabolic engineering of C. acetobutylicum to develop a commercial process for the production of n-butanol. Currently, there is a resurgence of interest in Clostridium acetobutylicum, the biocatalyst of the historical Weizmann process, to produce n-butanol for use both as a bulk chemical and as a renewable alternative transportation fuel. To develop a commercial process for the production of n-butanol via a

  19. Quantitative Analysis and Design of a Rudder Roll Damping Controller

    DEFF Research Database (Denmark)

    Hearns, G.; Blanke, M.

    1998-01-01

A rudder roll damping controller is designed using Quantitative Feedback Theory to be robust to changes in the ship's metacentric height. The analytical constraint due to the non-minimum phase behaviour of the rudder-to-roll response is analysed using the Poisson integral formula, and it is shown how...

  20. Quantitative analysis of distributed control paradigms of robot swarms

    DEFF Research Database (Denmark)

    Ngo, Trung Dung

    2010-01-01

describe the physical and simulated robots, experiment scenario, and experiment setup. Third, we present our robot controllers based on behaviour-based and neural-network-based paradigms. Fourth, we graphically show their experimental results and quantitatively analyse the results in comparison of the two...

  1. Teaching Quantitative Reasoning for Nonscience Majors through Carbon Footprint Analysis

    Science.gov (United States)

    Boose, David L.

    2014-01-01

    Quantitative reasoning is a key intellectual skill, applicable across disciplines and best taught in the context of authentic, relevant problems. Here, I describe and assess a laboratory exercise that has students calculate their "carbon footprint" and evaluate the impacts of various behavior choices on that footprint. Students gather…

  2. Quantitative security analysis for multi-threaded programs

    NARCIS (Netherlands)

    Ngo, Tri Minh; Huisman, Marieke

    2013-01-01

    Quantitative theories of information flow give us an approach to relax the absolute confidentiality properties that are difficult to satisfy for many practical programs. The classical information-theoretic approaches for sequential programs, where the program is modeled as a communication channel wi

  3. Quantitative imaging of collective cell migration during Drosophila gastrulation: multiphoton microscopy and computational analysis.

    Science.gov (United States)

    Supatto, Willy; McMahon, Amy; Fraser, Scott E; Stathopoulos, Angelike

    2009-01-01

    This protocol describes imaging and computational tools to collect and analyze live imaging data of embryonic cell migration. Our five-step protocol requires a few weeks to move through embryo preparation and four-dimensional (4D) live imaging using multi-photon microscopy, to 3D cell tracking using image processing, registration of tracking data and their quantitative analysis using computational tools. It uses commercially available equipment and requires expertise in microscopy and programming that is appropriate for a biology laboratory. Custom-made scripts are provided, as well as sample datasets to permit readers without experimental data to carry out the analysis. The protocol has offered new insights into the genetic control of cell migration during Drosophila gastrulation. With simple modifications, this systematic analysis could be applied to any developing system to define cell positions in accordance with the body plan, to decompose complex 3D movements and to quantify the collective nature of cell migration.

  4. Quantitative and Qualitative Analysis of Reported Dreams and the Problem of Double Hermeneutics in Clinical Research

    Directory of Open Access Journals (Sweden)

    Siamak Movahedi

    2012-12-01

    Full Text Available The aim of this article is to show that statistical analysis and hermeneutics are not mutually exclusive. Although statistical analysis may capture some patterns and regularities, statistical methods may themselves generate different types of interpretation and, in turn, give rise to even more interpretations. The discussion is lodged within the context of a quantitative analysis of dream content. I attempted to examine the dialogical texts of reported dreams monologically, but soon found myself returning to dialogic contexts to make sense of statistical patterns. One could cogently argue that the reported statistical relationships in this study, rather than pointing to any interaction among the “signifieds,” speak only to the relationships among the “signifiers” that were being played out through various actors on the analytic or scientific stage, since all of the constructs used in theorizing about, interpreting, and telling dreams come from the same discursive system.

  5. Enterprise Systems Analysis

    Science.gov (United States)

    2017-04-30

is unavailable until a remediation process fixes the contamination problem. The final infrastructure system is the internet. The internet operates... Enterprise Systems Analysis, Technical Report SERC-2017-TR-106, April 30, 2017. Principal Investigator: Dr. Michael Pennock, Stevens Institute... Date: April 30, 2017. Copyright © 2017 Stevens Institute of Technology. The Systems Engineering Research Center (SERC

  6. Quantitative analysis of experiments on bacterial chemotaxis to naphthalene.

    Science.gov (United States)

    Pedit, Joseph A; Marx, Randall B; Miller, Cass T; Aitken, Michael D

    2002-06-20

    A mathematical model was developed to quantify chemotaxis to naphthalene by Pseudomonas putida G7 (PpG7) and its influence on naphthalene degradation. The model was first used to estimate the three transport parameters (coefficients for naphthalene diffusion, random motility, and chemotactic sensitivity) by fitting it to experimental data on naphthalene removal from a discrete source in an aqueous system. The best-fit value of naphthalene diffusivity was close to the value estimated from molecular properties with the Wilke-Chang equation. Simulations applied to a non-chemotactic mutant strain only fit the experimental data well if random motility was negligible, suggesting that motility may be lost rapidly in the absence of substrate or that gravity may influence net random motion in a vertically oriented experimental system. For the chemotactic wild-type strain, random motility and gravity were predicted to have a negligible impact on naphthalene removal relative to the impact of chemotaxis. Based on simulations using the best-fit value of the chemotactic sensitivity coefficient, initial cell concentrations for a non-chemotactic strain would have to be several orders of magnitude higher than for a chemotactic strain to achieve similar rates of naphthalene removal under the experimental conditions we evaluated. The model was also applied to an experimental system representing an adaptation of the conventional capillary assay to evaluate chemotaxis in porous media. Our analysis suggests that it may be possible to quantify chemotaxis in porous media systems by simply adjusting the model's transport parameters to account for tortuosity, as has been suggested by others.

  7. Structural and Quantitative Analysis of Three C-Glycosylflavones by Variable Temperature Proton Quantitative Nuclear Magnetic Resonance

    Directory of Open Access Journals (Sweden)

    Jing Liu

    2017-01-01

Full Text Available Quantitative nuclear magnetic resonance is a powerful tool in drug analysis because of its speed, precision, and efficiency. In the present study, the application of variable temperature proton quantitative nuclear magnetic resonance (VT-1H-qNMR) for the calibration of three C-glycosylflavones, orientin, isoorientin, and schaftoside, as reference substances was reported. Since there was a conformational equilibrium due to the restricted rotation around the C(sp3)-C(sp2) bond in C-glycosylflavones, the conformational behaviors were investigated by VT-NMR and verified by molecular mechanics (MM) calculation. The VT-1H-qNMR method was validated for linearity, limit of quantification, precision, and stability. The results were consistent with those obtained from the mass balance approach. VT-1H-qNMR can be deployed as an effective tool in the analysis of C-glycosylflavones.

  8. Structural and Quantitative Analysis of Three C-Glycosylflavones by Variable Temperature Proton Quantitative Nuclear Magnetic Resonance

    Science.gov (United States)

    Liu, Yang; Dai, Zhong

    2017-01-01

Quantitative nuclear magnetic resonance is a powerful tool in drug analysis because of its speed, precision, and efficiency. In the present study, the application of variable temperature proton quantitative nuclear magnetic resonance (VT-1H-qNMR) for the calibration of three C-glycosylflavones, orientin, isoorientin, and schaftoside, as reference substances was reported. Since there was a conformational equilibrium due to the restricted rotation around the C(sp3)-C(sp2) bond in C-glycosylflavones, the conformational behaviors were investigated by VT-NMR and verified by molecular mechanics (MM) calculation. The VT-1H-qNMR method was validated for linearity, limit of quantification, precision, and stability. The results were consistent with those obtained from the mass balance approach. VT-1H-qNMR can be deployed as an effective tool in the analysis of C-glycosylflavones. PMID:28243484

  9. Quantitative real-time single particle analysis of virions.

    Science.gov (United States)

    Heider, Susanne; Metzner, Christoph

    2014-08-01

Providing information about single virus particles has long been mainly the domain of electron microscopy. More recently, technologies have been developed, or adapted from other fields such as nanotechnology, to allow for the real-time quantification of physical virion particles while concomitantly supplying additional information such as particle diameter. These technologies have progressed to the stage of commercialization, increasing the speed of viral titer measurements from hours to minutes and thus providing a significant advantage for many aspects of virology research and biotechnology applications. Additional advantages lie in the broad spectrum of virus species that may be measured and the possibility of determining the ratio of infectious to total particles. A number of disadvantages remain associated with these technologies, such as a low specificity for viral particles. In this review we discuss these technologies by comparing four systems for real-time single virus particle analysis and quantification.

  10. Understanding responder neurobiology in schizophrenia using a quantitative systems pharmacology model: application to iloperidone.

    Science.gov (United States)

    Geerts, Hugo; Roberts, Patrick; Spiros, Athan; Potkin, Steven

    2015-04-01

The concept of targeted therapies remains a holy grail for the pharmaceutical drug industry for identifying responder populations or new drug targets. Here we provide quantitative systems pharmacology as an alternative to the more traditional approach of retrospective responder pharmacogenomics analysis and applied it to the case of iloperidone in schizophrenia. This approach implements the actual neurophysiological effect of genotypes in a computer-based, biophysically realistic model of human neuronal circuits, is parameterized with human imaging and pathology data, and is calibrated by clinical data. We kept the drug pharmacology constant but allowed the biological model coupling values to fluctuate in a restricted range around their calibrated values, thereby simulating random genetic mutations and representing variability in patient response. Using hypothesis-free Design of Experiments methods, the coupling between the dopamine D4 receptor and the AMPA (alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid) receptor in cortical neurons was found to drive the beneficial effect of iloperidone, likely corresponding to rs2513265, upstream of the GRIA4 gene, identified in a traditional pharmacogenomics analysis. The serotonin 5-HT3 receptor-mediated effect on interneuron gamma-aminobutyric acid conductance was identified as the process that moderately drove the differentiation of iloperidone versus ziprasidone. This paper suggests that reverse-engineered quantitative systems pharmacology is a powerful alternative tool for characterizing the underlying neurobiology of a responder population and possibly identifying new targets. © The Author(s) 2015.

  11. THE ANALYSIS OF CORRUPTION IN PUBLIC ADMINISTRATION - A QUANTITATIVE METHOD

    Directory of Open Access Journals (Sweden)

    Tudorel ANDREI

    2010-06-01

Full Text Available This paper examines, starting from the case of Romania, the degree to which the decentralization process and the improvement of local governance contribute to the reduction of corruption in the short and medium term. Through the methodology used, the paper is in line with the international trend of analyzing the impact of corruption on economic and social processes at the local level. For the corruption analysis we used a simple dichotomous logistic model. The results of both the descriptive analysis and the logistic model suggest several actions for reducing corruption in local public administration: intensifying the reform of local public administration on three components (the civil service, the decentralization process, and the formulation of public policy); elaborating a long-term strategy and a specific law on the civil servant payment system; intensifying continuous training courses for local elected officials and mayors; and reducing the turnover of the technical staff of city halls caused by political changes.
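A dichotomous (binary) logistic model of the kind used in this corruption analysis can be fitted by maximum likelihood; the gradient-ascent routine and synthetic one-predictor data below are a minimal illustrative sketch, not the paper's actual model, variables, or data.

```python
import math
import random

random.seed(0)

def fit_logistic(xs, ys, lr=0.3, steps=4000):
    """Fit P(y=1|x) = 1/(1+exp(-(b0 + b1*x))) by batch gradient ascent."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient of the log-likelihood w.r.t. b0
            g1 += (y - p) * x    # gradient w.r.t. b1
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

# Synthetic data: the probability of the outcome rises with the predictor
# (true intercept 0.5, true slope 2 -- invented values for the sketch).
xs = [random.gauss(0, 1) for _ in range(400)]
ys = [1 if random.random() < 1 / (1 + math.exp(-(0.5 + 2 * x))) else 0 for x in xs]

b0, b1 = fit_logistic(xs, ys)  # estimates should recover a clearly positive slope
```

In the paper's setting, the predictor would be replaced by governance and decentralization indicators and the outcome by a corruption indicator; the fitting mechanics are the same.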

  12. Silicon sheet growth development for the large area silicon sheet task of the low cost solar array project. Quantitative analysis of defects in silicon

    Science.gov (United States)

    Natesh, R.

    1978-01-01

    The various steps involved in obtaining quantitative information of structural defects in crystalline silicon samples are described. Procedures discussed include: (1) chemical polishing; (2) chemical etching; and (3) automated image analysis of samples on the QTM 720 System.

  13. Quantitative Analysis of Criteria in University Building Maintenance in Malaysia

    Directory of Open Access Journals (Sweden)

    Olanrewaju Ashola Abdul-Lateef

    2010-10-01

Full Text Available University buildings are a significant part of university assets, and considerable resources are committed to their design, construction and maintenance. The core of maintenance management is to optimize productivity and user satisfaction with optimum resources. An important segment of the maintenance management system is the analysis of the criteria that influence building maintenance. Therefore, this paper aims to identify, quantify, rank and discuss the criteria that influence maintenance costs, maintenance backlogs, productivity and user satisfaction in Malaysian university buildings. The paper reviews the related literature and presents the outcomes of a questionnaire survey. Questionnaires were administered to 50 university maintenance organizations. Thirty-one criteria were put to the university maintenance organizations to evaluate the degree to which each criterion influences building maintenance management. With a 66% response rate, it was concluded that consideration of the criteria is critical to the university building maintenance management system. The quality of components and materials, budget constraints and the age of the building were found to be the most influential criteria, whereas information on user performance satisfaction, problems associated with the in-house workforce, and shortage of materials and components were the least influential criteria. The paper also concludes that maintenance management is a strategic function in university administration.

  14. Qualitative and quantitative analysis of Chinese herb fructus chaenomelis

    Institute of Scientific and Technical Information of China (English)

    Shuhu Du; Lina Chen; Kunfang Ma; Hongjian Ji

    2007-01-01

Objective: To establish reliable methods for evaluating the quality of the Chinese herb fructus chaenomelis. Methods: Qualitative analysis was performed by thin-layer chromatography (TLC), with Chaenomeles speciosa (Sweet) Nakai and oleanolic acid in ethanol solution as reference substances. In the high-performance liquid chromatography (HPLC) system, a Prontosil Eurobond C18 column (250 mm×4.0 mm) was used; the flow rate was 1.0 ml/min with UV detection at 210 nm, and the column temperature was maintained at room temperature. Results: In the TLC system, oleanolic acid was separated successfully. In HPLC, the linear ranges of oleanolic acid and ursolic acid were 5.89-13.73 μg (R=0.9990) and 6.84-15.96 μg (R=0.9990), respectively. The average recoveries of oleanolic acid and ursolic acid were 97.52% (RSD=2.58%) and 98.21% (RSD=2.23%), respectively. Conclusion: The established TLC method can easily distinguish the Chinese herb fructus chaenomelis from other commonly used crude drugs of the same family. The HPLC method for determining oleanolic acid and ursolic acid is simple, reproducible, accurate and feasible. The methods reported in this paper can be used scientifically and effectively to evaluate the quality of the Chinese herb fructus chaenomelis.

  15. Quantitative analysis of immobilized metalloenzymes by atomic absorption spectroscopy.

    Science.gov (United States)

    Opwis, Klaus; Knittel, Dierk; Schollmeyer, Eckhard

    2004-12-01

A new, sensitive assay for the quantitative determination of immobilized metal-containing enzymes has been developed using atomic absorption spectroscopy (AAS). In contrast with conventionally used indirect methods, the described quantitative AAS assay for metalloenzymes allows more exact analyses because the carrier material with the enzyme is investigated directly. As an example, the validity and reliability of the method were examined by fixing the iron-containing enzyme catalase on cotton fabrics using different immobilization techniques. Sample preparation was carried out by dissolving the loaded fabrics in sulfuric acid before oxidising the residues with hydrogen peroxide. The iron concentrations were determined by flame atomic absorption spectrometry after calibration of the spectrometer with solutions of the free enzyme at different concentrations.

  16. Quantitative analysis of lead in aqueous solutions by ultrasonic nebulizer assisted laser induced breakdown spectroscopy

    Science.gov (United States)

    Zhong, Shi-Lei; Lu, Yuan; Kong, Wei-Jin; Cheng, Kai; Zheng, Ronger

    2016-08-01

In this study, an ultrasonic nebulizer unit was established to improve the quantitative analysis capability of laser-induced breakdown spectroscopy (LIBS) for liquid samples, using solutions of the heavy metal lead (Pb) as an example. An analytical procedure was designed to guarantee the stability and repeatability of the LIBS signal, and a series of experiments was carried out strictly according to the procedure. The experimental parameters were optimized based on studies of the influence of pulse energy and the temporal evolution of the emission features. The plasma temperature and electron density were calculated to confirm that the plasma was in local thermodynamic equilibrium (LTE). Normalizing the intensities by the background was demonstrated to be an appropriate method in this work. The linear range of this system for Pb analysis was confirmed over a concentration range of 0-4,150 ppm by measuring 12 samples with different concentrations. The correlation coefficient of the fitted calibration curve was as high as 99.94% in the linear range, and the LOD for Pb was confirmed as 2.93 ppm. Concentration prediction experiments were performed on a further six samples. The excellent quantitative ability of the system was demonstrated by comparing the real and predicted concentrations of the samples: the lowest relative error was 0.043% and the highest was no more than 7.1%.
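The calibration workflow described in this abstract (a linear fit of background-normalized intensity against concentration, a detection limit derived from blank scatter, and prediction of unknowns from the fitted line) can be sketched as follows; the calibration points and the blank standard deviation below are invented for illustration, not the study's measurements.

```python
def linear_fit(xs, ys):
    """Ordinary least squares for y = a*x + b; returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration points: (concentration in ppm, normalized intensity)
conc = [0, 500, 1000, 2000, 3000, 4000]
sig = [0.01, 0.52, 1.01, 2.03, 2.98, 4.02]

slope, intercept = linear_fit(conc, sig)

# Common LOD convention: 3 * (std. dev. of blank replicates) / slope.
sigma_blank = 0.001  # assumed blank scatter, for illustration only
lod = 3 * sigma_blank / slope

# Predict an unknown sample's concentration from its measured signal.
unknown = (1.50 - intercept) / slope
```

With real LIBS data, `sig` would be the background-normalized Pb line intensities and `sigma_blank` would come from repeated blank measurements.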

  17. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    Science.gov (United States)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements, and the corresponding solutions, were carefully investigated for two reference MDO frameworks: a general one and an aircraft-oriented one. The reference frameworks were also quantitatively identified using AHP and QFD, and an assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
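The AHP step of such an approach, turning pairwise judgments among criteria into numerical weights, can be sketched with the row geometric-mean approximation of the principal eigenvector; the 3x3 judgment matrix below is a made-up example, not one of the paper's actual criteria matrices.

```python
from math import prod  # Python 3.8+

def ahp_weights(matrix):
    """Approximate AHP priority weights by the row geometric-mean method."""
    n = len(matrix)
    gms = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical pairwise judgments among three framework criteria
# (Saaty scale: 1 = equally important, 3 = moderately more important, ...).
# Entry [i][j] is how strongly criterion i is preferred to criterion j,
# so the matrix is reciprocal: matrix[j][i] == 1 / matrix[i][j].
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

weights = ahp_weights(pairwise)  # sums to 1; the first criterion dominates
```

In a full AHP/QFD evaluation, weights like these would multiply each framework's scores per criterion to give an overall ranking.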

  18. Quantitative analysis of pheromone-binding protein specificity

    OpenAIRE

Katti, S.; Lokhande, N.; González, D.; Cassill, A.; Renthal, R.

    2012-01-01

Many pheromones have very low water solubility, posing experimental difficulties for quantitative binding measurements. A new method is presented for determining thermodynamically valid dissociation constants for ligands binding to odorant-binding proteins (OBPs), using β-cyclodextrin as a solubilizer and transfer agent. The method is applied to LUSH, a Drosophila OBP that binds the pheromone 11-cis-vaccenyl acetate (cVA). Refolding of LUSH expressed in E. coli was assessed by measuring N-p...

  19. Quantitative Trait Locus Analysis of the Early Domestication of Sunflower

    OpenAIRE

Wills, David M.; Burke, John M.

    2007-01-01

    Genetic analyses of the domestication syndrome have revealed that domestication-related traits typically have a very similar genetic architecture across most crops, being conditioned by a small number of quantitative trait loci (QTL), each with a relatively large effect on the phenotype. To date, the domestication of sunflower (Helianthus annuus L.) stands as the only counterexample to this pattern. In previous work involving a cross between wild sunflower (also H. annuus) and a highly improv...

  20. Fluorescent microscopy approaches of quantitative soil microbial analysis

    Science.gov (United States)

    Ivanov, Konstantin; Polyanskaya, Lubov

    2015-04-01

The classical fluorescent microscopy method has been used over recent decades in various microbiological studies of terrestrial ecosystems. The method provides representative results and is simple to apply, which allows it to be used both as a routine part of large-scale research and in small laboratories. Furthermore, depending on the research targets, many modifications of the fluorescent microscopy method have been established. Combining and comparing several approaches offers an opportunity for quantitative estimation of the microbial community in soil. The first analytical part of the study was dedicated to estimating soil bacterial density by fluorescent microscopy over the course of several 30-day experiments. The purpose of the research was to estimate changes in the soil bacterial community in different soil horizons under aerobic and anaerobic conditions, with nutrients added in two experimental sets: cellulose and chitin. The nalidixic acid method was modified to inhibit DNA division in gram-negative bacteria, allowing this bacterial group to be quantified by fluorescent microscopy. The established approach detected 3-4 times more gram-negative bacterial cells in soil. The role of actinomycetes in the destruction of soil polymers is traditionally considered dominant compared to that of gram-negative bacteria. However, quantification of gram-negative bacteria in chernozem and peatland shows that the classical view underestimates this bacterial group. Chitin introduction had no positive effect on the population density of gram-negative bacteria in chernozem, but this nutrient did produce fast growth dynamics during the first 3 days of the experiment under both aerobic and anaerobic conditions, confirming the chitinolytic activity of gram-negative bacteria in soil organic matter decomposition. In the next part of the research, the modified method for quantifying soil gram-negative bacteria was compared to fluorescent in situ

  1. The effects of selection on linkage analysis for quantitative traits.

    Science.gov (United States)

    Mackinnon, M J; Georges, M A

    1992-12-01

The effects of within-sample selection on the outcome of analyses detecting linkage between genetic markers and quantitative traits were studied. It was found that selection by truncation for the trait of interest significantly reduces the differences between marker genotype means, thus reducing the power to detect linked quantitative trait loci (QTL). The size of this reduction is a function of the proportion selected, the magnitude of the QTL effect, the recombination rate between the marker locus and the QTL, and the allele frequency of the QTL. The proportion selected was the most influential of these factors: for example, for an allele substitution effect of one standard deviation unit, selecting the top 80%, 50% or 20% of the population required 2, 6 or 24 times the number of progeny, respectively, to offset the loss of power caused by this selection. The effect on power was approximately linear with respect to the size of the gene effect, almost invariant to recombination rate, and a complex function of QTL allele frequency. It was concluded that experimental samples from animal populations that have been subjected to even minor amounts of selection will be inefficient in yielding information on linkage between markers and loci influencing the quantitative trait under selection.
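The central effect described in this abstract can be reproduced with a toy Monte Carlo: select only the top fraction of a simulated population on the trait, and the difference between marker-genotype means shrinks. The population size, QTL effect (one phenotypic SD), and selected proportion below are assumptions for illustration, not the paper's simulation design.

```python
import random

random.seed(1)

def marker_contrast(select_top_fraction):
    """Difference between marker-genotype trait means after truncation selection."""
    n = 20000
    recs = []
    for _ in range(n):
        geno = random.randint(0, 1)              # marker tightly linked to the QTL
        trait = geno * 1.0 + random.gauss(0, 1)  # QTL substitution effect = 1 SD
        recs.append((trait, geno))
    recs.sort(reverse=True)                      # rank by trait, best first
    kept = recs[: int(n * select_top_fraction)]  # truncation selection
    g1 = [t for t, g in kept if g == 1]
    g0 = [t for t, g in kept if g == 0]
    return sum(g1) / len(g1) - sum(g0) / len(g0)

full = marker_contrast(1.0)      # no selection: contrast is near the true 1 SD
selected = marker_contrast(0.5)  # keep the top 50%: the contrast shrinks
```

The shrinkage happens because truncation discards mostly low-trait individuals of both genotypes, pulling the retained genotype-group means toward each other.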

  2. Analysis on the go: quantitation of drugs of abuse in dried urine with digital microfluidics and miniature mass spectrometry.

    Science.gov (United States)

    Kirby, Andrea E; Lafrenière, Nelson M; Seale, Brendon; Hendricks, Paul I; Cooks, R Graham; Wheeler, Aaron R

    2014-06-17

    We report the development of a method coupling microfluidics and a miniature mass spectrometer, applied to quantitation of drugs of abuse in urine. A custom digital microfluidic system was designed to deliver droplets of solvent to dried urine samples and then transport extracted analytes to an array of nanoelectrospray emitters for analysis. Tandem mass spectrometry (MS/MS) detection was performed using a fully autonomous 25 kg instrument. Using the new method, cocaine, benzoylecgonine, and codeine can be quantified from four samples in less than 15 min from (dried) sample to analysis. The figures of merit for the new method suggest that it is suitable for on-site screening; for example, the limit of quantitation (LOQ) for cocaine is 40 ng/mL, which is compatible with the performance criteria for laboratory analyses established by the United Nations Office on Drugs and Crime. More importantly, the LOQ of the new method is superior to the 300 ng/mL cutoff values used by the only other portable analysis systems we are aware of (relying on immunoassays). This work serves as a proof-of-concept for integration of microfluidics with miniature mass spectrometry. The system is attractive for the quantitation of drugs of abuse from urine and, more generally, may be useful for a wide range of applications that would benefit from portable, quantitative, on-site analysis.

  3. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps found between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs and those needed in practice, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization

  4. Reliability and Validity of Quantitative Video Analysis of Baseball Pitching Motion.

    Science.gov (United States)

    Oyama, Sakiko; Sosa, Araceli; Campbell, Rebekah; Correa, Alexandra

    2017-02-01

    Video recordings are used to quantitatively analyze pitchers' techniques. However, reliability and validity of such analysis is unknown. The purpose of the study was to investigate the reliability and validity of joint and segment angles identified during a pitching motion using video analysis. Thirty high school baseball pitchers participated. The pitching motion was captured using 2 high-speed video cameras and a motion capture system. Two raters reviewed the videos to digitize the body segments to calculate 2-dimensional angles. The corresponding 3-dimensional angles were calculated from the motion capture data. Intrarater reliability, interrater reliability, and validity of the 2-dimensional angles were determined. The intrarater and interrater reliability of the 2-dimensional angles were high for most variables. The trunk contralateral flexion at maximum external rotation was the only variable with high validity. Trunk contralateral flexion at ball release, trunk forward flexion at foot contact and ball release, shoulder elevation angle at foot contact, and maximum shoulder external rotation had moderate validity. Two-dimensional angles at the shoulder, elbow, and trunk could be measured with high reliability. However, the angles are not necessarily anatomically correct, and thus use of quantitative video analysis should be limited to angles that can be measured with good validity.

  5. Quantitative allochem compositional analysis of Lochkovian-Pragian boundary sections in the Prague Basin (Czech Republic)

    Science.gov (United States)

    Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich

    2017-06-01

    Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate ramp environment of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) applied to point-counted thin-section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed-nature allochem percentages and their centred log-ratio (clr) coordinates were used. Both approaches allow lowstand, transgressive and highstand systems tracts to be distinguished within the Praha Formation, which show a gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of a deep-water environment below the storm wave base. Quantitative compositional data also indicate progradational-retrogradational trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition than the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
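    The centred log-ratio (clr) transform mentioned above can be sketched in a few lines (Python; the allochem percentages and component names are invented for illustration):

```python
import numpy as np

def clr(composition):
    """Centred log-ratio coordinates of a compositional vector (strictly positive parts)."""
    x = np.asarray(composition, dtype=float)
    g = np.exp(np.log(x).mean())          # geometric mean of the parts
    return np.log(x / g)

# Hypothetical allochem point counts (%) for one thin section:
sample = [60.0, 25.0, 10.0, 5.0]          # e.g. crinoids, dacryoconarids, peloids, other
coords = clr(sample)
print(np.allclose(coords.sum(), 0.0))     # clr coordinates always sum to zero
```

    Because clr coordinates remove the constant-sum constraint of percentage data, standard multivariate methods (PCA, cluster analysis) can be applied to them without the spurious correlations that raw closed data induce.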

  6. The usefulness of 3D quantitative analysis with using MRI for measuring osteonecrosis of the femoral head

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ji Young; Lee, Sun Wha [Ewha Womans University College of Medicine, Seoul (Korea, Republic of); Park, Youn Soo [Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)

    2006-11-15

    We wanted to evaluate the usefulness of MRI 3D quantitative analysis for measuring osteonecrosis of the femoral head, in comparison with MRI 2D quantitative analysis and quantitative analysis of the specimen. Over 3 months at our hospital, 14 femoral head specimens with osteonecrosis were obtained after total hip arthroplasty. The patients' preoperative MRIs were retrospectively reviewed for quantitative analysis of the size of the necrosis. Each necrotic fraction of the femoral head was measured by 2D quantitative analysis using mid-coronal and mid-sagittal MRIs, and by 3D quantitative analysis using serial continuous coronal MRIs and 3D reconstruction software. The necrotic fraction of the specimen was physically measured by the fluid displacement method. The necrotic fraction according to MRI 2D or 3D quantitative analysis was compared with that of the specimen using Spearman's correlation test. On the correlative analysis, the necrotic fraction by MRI 2D quantitative analysis showed moderate correlation with quantitative analysis of the specimen (r = 0.657); on the other hand, the necrotic fraction by MRI 3D quantitative analysis demonstrated a strong correlation (r = 0.952) (p < 0.05). MRI 3D quantitative analysis was more accurate than 2D quantitative analysis for measuring osteonecrosis of the femoral head. Therefore, it may be useful for predicting the clinical outcome and deciding the proper treatment option.
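    The comparison above rests on Spearman's rank correlation; a minimal sketch, using invented necrotic-fraction values rather than the study's data (and assuming no tied ranks), is:

```python
import numpy as np

def spearman_r(x, y):
    """Spearman rank correlation via Pearson correlation of ranks (no tie correction)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical necrotic fractions for a few femoral heads (specimen vs. MRI 3D):
specimen = [0.12, 0.25, 0.31, 0.44, 0.58]
mri_3d   = [0.14, 0.22, 0.35, 0.47, 0.55]
print(round(spearman_r(specimen, mri_3d), 3))  # 1.0, since the rankings agree exactly
```

    Spearman's coefficient depends only on rank order, which suits a small sample (n = 14) where a linear relationship between MRI and specimen fractions cannot be assumed.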

  7. Quantitation of DNA methylation by melt curve analysis

    Directory of Open Access Journals (Sweden)

    Jones Michael E

    2009-04-01

    Full Text Available Abstract Background Methylation of DNA is a common mechanism for silencing genes, and aberrant methylation is increasingly being implicated in many diseases such as cancer. There is a need for robust, inexpensive methods to quantitate methylation across a region containing a number of CpGs. We describe and validate a rapid, in-tube method to quantitate DNA methylation using the melt data obtained following amplification of bisulfite modified DNA in a real-time thermocycler. Methods We first describe a mathematical method to normalise the raw fluorescence data generated by heating the amplified bisulfite modified DNA. From this normalised data the temperatures at which melting begins and finishes can be calculated, which reflect the less and more methylated template molecules present respectively. Also the T50, the temperature at which half the amplicons are melted, which represents the summative methylation of all the CpGs in the template mixture, can be calculated. These parameters describe the methylation characteristics of the region amplified in the original sample. Results For validation we used synthesized oligonucleotides and DNA from fresh cells and formalin fixed paraffin embedded tissue, each with known methylation. Using our quantitation we could distinguish between unmethylated, partially methylated and fully methylated oligonucleotides mixed in varying ratios. There was a linear relationship between T50 and the dilution of methylated into unmethylated DNA. We could quantitate the change in methylation over time in cell lines treated with the demethylating drug 5-aza-2'-deoxycytidine, and the differences in methylation associated with complete, clonal or no loss of MGMT expression in formalin fixed paraffin embedded tissues. Conclusion We have validated a rapid, simple in-tube method to quantify methylation which is robust and reproducible, utilizes easily designed primers and does not need proprietary algorithms or software. The
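    The T50 described above, the temperature at which half the amplicons are melted, can be estimated from normalised melt data roughly as follows (Python; the melt curve is synthetic, and the min-max normalisation is a simplified sketch rather than the authors' exact method):

```python
import numpy as np

def t50(temps, fluorescence):
    """Temperature at which half the amplicons are melted.

    Fluorescence falls as duplexes melt; normalise it to a 0-1 'fraction still
    double-stranded' scale and interpolate the 50% crossing.
    """
    f = np.asarray(fluorescence, dtype=float)
    frac = (f - f.min()) / (f.max() - f.min())   # 1 = fully duplex, 0 = fully melted
    # np.interp needs increasing x, so interpolate on the reversed (rising) curve
    return float(np.interp(0.5, frac[::-1], np.asarray(temps, float)[::-1]))

# Hypothetical melt data: fluorescence drops between 78 and 86 degrees C
temps = np.arange(74.0, 92.0, 2.0)
fluor = [100, 100, 95, 75, 50, 25, 5, 0, 0]
print(t50(temps, fluor))  # 82.0
```

    A higher T50 indicates a greater proportion of methylated (GC-retaining) template, which is what makes the parameter a summative methylation measure for the amplified region.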

  8. Quantitative XRD Analysis of Cement Clinker by the Multiphase Rietveld Method

    Institute of Scientific and Technical Information of China (English)

    HONG Han-lie; FU Zheng-yi; MIN Xin-min

    2003-01-01

    Quantitative phase analysis of Portland cement clinker samples was performed using an adaptation of the Rietveld method. The Rietveld quantitative analysis program, originally in Fortran 77, was substantially rewritten in Visual Basic with a Windows 9X graphical user interface, which frees it from the 640 KB directly addressable memory constraint and allows convenient operation under the Windows environment. The Rietveld quantitative method provides numerous advantages over conventional XRD quantitative methods, especially for intensity anomalies and peak-superposition problems. Examples of its use are given together with results from other methods. It is concluded that, at present, the Rietveld method is the most suitable one for quantitative phase analysis of Portland cement clinker.

  9. Scientific production on surgical nursing: analysis of the quantitative studies carried out between 2005 and 2009

    OpenAIRE

    2012-01-01

    This study aimed to analyze the characteristics of quantitative studies in the nursing scientific literature on the surgical area. It is research with a quantitative approach, carried out in the Virtual Healthcare Library with articles of a quantitative nature, published from 2005 to 2009 in journals classified as A1, A2 and B1 by the 2008 Capes Nursing Qualis system, with full text available online and in Portuguese. Twenty-eight articles were analyzed, and it was found that most of them are descriptive, exploratory...

  10. Acousto-Optic Tunable Filter Spectroscopic Instrumentation for Quantitative Near-Ir Analysis of Organic Materials.

    Science.gov (United States)

    Eilert, Arnold James

    1995-01-01

    The utility of near-IR spectroscopy for routine quantitative analyses of a wide variety of compositional, chemical, or physical parameters of organic materials is well understood. It can be used for relatively fast and inexpensive non-destructive bulk material analysis before, during, and after processing. It has been demonstrated as being a particularly useful technique for numerous analytical applications in cereal (food and feed) science and industry. Further fulfillment of the potential of near-IR spectroscopic analysis, both in the process and laboratory environment, is reliant upon the development of instrumentation that is capable of meeting the challenges of increasingly difficult applications. One approach to the development of near-IR spectroscopic instrumentation that holds a great deal of promise is acousto-optic tunable filter (AOTF) technology. A combination of attributes offered by AOTF spectrometry, including speed, optical throughput, wavelength reproducibility, ruggedness (no-moving-parts operation) and flexibility, make it particularly desirable for numerous applications. A series of prototype (research model) acousto-optic tunable filter instruments were developed and tested in order to investigate the feasibility of the technology for quantitative near-IR spectrometry. Development included design, component procurement, assembly and/or configuration of the optical and electronic subsystems of which each functional spectrometer arrangement was comprised, as well as computer interfacing and acquisition/control software development. Investigation of this technology involved an evolution of several operational spectrometer systems, each of which offered improvements over its predecessor. Appropriate testing was conducted at various stages of development. Demonstrations of the potential applicability of our AOTF spectrometer to quantitative process monitoring or laboratory analysis of numerous organic substances, including food materials, were

  11. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  12. Particle concentration measurement of virus samples using electrospray differential mobility analysis and quantitative amino acid analysis.

    Science.gov (United States)

    Cole, Kenneth D; Pease, Leonard F; Tsai, De-Hao; Singh, Tania; Lute, Scott; Brorson, Kurt A; Wang, Lili

    2009-07-24

    Virus reference materials are needed to develop and calibrate detection devices and instruments. We used electrospray differential mobility analysis (ES-DMA) and quantitative amino acid analysis (AAA) to determine the particle concentration of three small model viruses (bacteriophages MS2, PP7, and phiX174). The biological activity, purity, and aggregation of the virus samples were measured using plaque assays, denaturing gel electrophoresis, and size-exclusion chromatography. ES-DMA was developed to count the virus particles using gold nanoparticles as internal standards. ES-DMA additionally provides quantitative measurement of the size and extent of aggregation in the virus samples. Quantitative AAA was also used to determine the mass of the viral proteins in the pure virus samples. The samples were hydrolyzed and the masses of the well-recovered amino acids were used to calculate the equivalent concentration of viral particles in the samples. The concentration of the virus samples determined by ES-DMA was in good agreement with the concentration predicted by AAA for these purified samples. The advantages and limitations of ES-DMA and AAA to characterize virus reference materials are discussed.

  13. Quantitative analysis of food and feed samples with droplet digital PCR.

    Science.gov (United States)

    Morisset, Dany; Štebih, Dejan; Milavec, Mojca; Gruden, Kristina; Žel, Jana

    2013-01-01

    In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed.
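    Digital PCR converts droplet counts to absolute copy numbers through Poisson statistics; a minimal sketch, with invented droplet counts and an assumed nominal droplet volume (not values from the study), is:

```python
import math

def copies_per_ul(n_positive, n_total, droplet_ul=0.00085):
    """Target copies per microliter of reaction from droplet counts.

    dPCR assumes a Poisson distribution of targets across droplets:
    lam = -ln(1 - p), where p is the positive-droplet fraction.
    droplet_ul is an assumed nominal droplet volume (~0.85 nl).
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)          # mean copies per droplet
    return lam / droplet_ul

# Hypothetical duplex readout: transgene vs. reference copies give the GMO content
mon810 = copies_per_ul(1500, 15000)   # transgene-positive droplets
hmg    = copies_per_ul(14000, 15000)  # reference-gene-positive droplets
gmo_percent = 100 * mon810 / hmg
print(f"GMO content ~ {gmo_percent:.1f}%")
```

    The Poisson correction is what removes the need for calibration curves: the copy number follows directly from the fraction of negative droplets, not from a comparison against standards.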

  14. Quantitative analysis of food and feed samples with droplet digital PCR.

    Directory of Open Access Journals (Sweden)

    Dany Morisset

    Full Text Available In this study, the applicability of droplet digital PCR (ddPCR for routine analysis in food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs. Real-time quantitative polymerase chain reaction (qPCR is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed.

  15. Quantitative analysis of HIV-1 protease inhibitors in cell lysates using MALDI-FTICR mass spectrometry.

    NARCIS (Netherlands)

    Kampen, JJ van; Burgers, P.C.; Groot, R. de; Osterhaus, A.D.; Reedijk, M.L.; Verschuren, E.J.; Gruters, R.A.; Luider, T.M.

    2008-01-01

    In this report we explore the use of MALDI-FTICR mass spectrometry for the quantitative analysis of five HIV-1 protease inhibitors in cell lysates. 2,5-Dihydroxybenzoic acid (DHB) was used as the matrix. From a quantitative perspective, DHB is usually a poor matrix due to its poor shot-to-shot and p

  16. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    Science.gov (United States)

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  17. Use of MRI in Differentiation of Papillary Renal Cell Carcinoma Subtypes: Qualitative and Quantitative Analysis.

    Science.gov (United States)

    Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh

    2016-03-01

    The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all statistically significant). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were significantly greater in type 2 PRCC. A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.

  18. Complex pedigree analysis to detect quantitative trait loci in dairy cattle.

    NARCIS (Netherlands)

    Bink, M.C.A.M.

    1998-01-01

    In dairy cattle, many quantitative traits of economic importance show phenotypic variation. For breeding purposes the analysis of this phenotypic variation and uncovering the contribution of genetic factors is very important. Usually, the individual gene effects contributing to the quantitative gene

  19. Quantitative analysis of target components by comprehensive two-dimensional gas chromatography

    NARCIS (Netherlands)

    Mispelaar, V.G. van; Tas, A.C.; Smilde, A.K.; Schoenmakers, P.J.; Asten, A.C. van

    2003-01-01

    Quantitative analysis using comprehensive two-dimensional (2D) gas chromatography (GC) is still rarely reported. This is largely due to a lack of suitable software. The objective of the present study is to generate quantitative results from a large GC x GC data set, consisting of 32 chromatograms. I

  20. Quantitative Analysis of Human Pluripotency and Neural Specification by In-Depth (Phospho)Proteomic Profiling

    Directory of Open Access Journals (Sweden)

    Ilyas Singec

    2016-09-01

    Full Text Available Controlled differentiation of human embryonic stem cells (hESCs) can be utilized for precise analysis of cell type identities during early development. We established a highly efficient neural induction strategy and an improved analytical platform, and determined proteomic and phosphoproteomic profiles of hESCs and their specified multipotent neural stem cell derivatives (hNSCs). This quantitative dataset (nearly 13,000 proteins and 60,000 phosphorylation sites) provides unique molecular insights into pluripotency and neural lineage entry. Systems-level comparative analysis of proteins (e.g., transcription factors, epigenetic regulators, kinase families), phosphorylation sites, and numerous biological pathways allowed the identification of distinct signatures in pluripotent and multipotent cells. Furthermore, as predicted by the dataset, we functionally validated an autocrine/paracrine mechanism by demonstrating that the secreted protein midkine is a regulator of neural specification. This resource is freely available to the scientific community, including a searchable website, PluriProt.

  1. Quantitative Analysis of Piezoelectric and Seismoelectric Anomalies in Subsurface Geophysics

    Science.gov (United States)

    Eppelbaum, Lev

    2017-04-01

    problem was the basis for an inverse problem, i.e. revealing the depth of a body's occurrence, its location in space, and its physical properties. At the same time, this method has not received wide practical application, given the complexity of real geological media. Careful analysis of piezo- and seismoelectric anomalies shows that the advanced methodologies for quantitative analysis developed in magnetic prospecting for complex physical-geological conditions can be applied to these effects (Eppelbaum et al., 2000, 2001, 2010; Eppelbaum, 2010; 2011, 2015). Employment of these methodologies (improved modifications of the tangent, characteristic-point and areal methods) for obtaining quantitative characteristics of ore bodies, environmental features and archaeological targets (models of a horizontal circular cylinder, sphere, thin bed, thick bed and thin horizontal plate were utilized) has demonstrated their effectiveness. Case study at the archaeological site Tel Kara Hadid: field piezoelectric observations were conducted at the ancient archaeological site Tel Kara Hadid, with gold-quartz mineralization, in southern Israel within the Precambrian terrain at the northern extension of the Arabian-Nubian Shield (Neishtadt et al., 2006). The area of the archaeological site is located eight kilometers north of the town of Eilat, in an area of strong industrial noise. Ancient alluvial river terraces (extremely heterogeneous at a local scale, varying from boulders to silt) cover the quartz veins and complicate their identification. Piezoelectric measurements conducted over a quartz vein covered by surface sediments (approximately 0.4 m thick) produced a sharp (500 μV) piezoelectric anomaly. Values recorded over the host rocks (clays and shales of basic composition) were close to zero. The observed piezoelectric anomaly was successfully interpreted using the methodologies developed in magnetic prospecting. 
For effective integration of piezo- and

  2. Quantitative analysis of the individual dynamics of Psychology theses

    Directory of Open Access Journals (Sweden)

    Robles, Jaime R.

    2009-12-01

    Full Text Available Three cohorts of undergraduate psychology theses (n = 57) performed by last-year undergraduate psychology students from Universidad Católica Andrés Bello were monitored using 5 longitudinal measurements of progression. A Generalized Additive Model to predict the completion time of the theses is tested against two completion times: early and delayed. Effect size measures favor a multiple-dimension model over a global progress model. The trajectory of the indicators through the 5 measurements allows the differentiation between early and delayed completion. The completion probabilities estimated by the dimensional model allow the identification of differential oscillation levels for the distinct completion times. The initial progression indicators allow the prediction of early completion with a 71% success rate, while the final measurement shows a success rate of 89%. The results support the effectiveness of the supervisory system and the analysis of the progression dynamics of the theses from a task-delay model, focused on the relationship between the amount of task completion and the deadlines.

  3. Quantitative XPS analysis of silica-supported Cu-Co oxides

    Science.gov (United States)

    Cesar, Deborah V.; Peréz, Carlos A.; Schmal, Martin; Salim, Vera Maria M.

    2000-04-01

    Copper-cobalt oxides with Cu/Co = 5:5, 15:15 and 35:35 bulk ratios were prepared by the deposition-precipitation method at constant pH from copper and cobalt nitrate solutions. Different oxides were obtained by decomposition of the precursors at 673 K for 7 h in air and analyzed by X-ray diffraction (XRD), transmission electron microscopy (TEM) and X-ray photoelectron spectroscopy (XPS). XRD data showed the formation of different oxide phases; for the bulk atomic ratio of 15Cu:15Co, a phase containing Cu and Co with a spinel-like structure was observed, while the other bimetallic oxides presented CuO and Co3O4 as distinct phases. The XPS qualitative analysis showed that all samples exhibited Cu2+ and Co3+ species at the surface. The Cu-Co spinel presented a displacement in the Cu 2p binding energy value. A mathematical model was proposed from relative intensity ratios, which allowed the determination of the oxide particle thickness and the fraction of coverage of the support. This model described the system accurately and showed that cobalt improved the copper dispersion.

  4. Quantitative analysis of wrist electrodermal activity during sleep

    OpenAIRE

    Sano, Akane; Picard, Rosalind W.; Stickgold, Robert

    2014-01-01

    We present the first quantitative characterization of electrodermal activity (EDA) patterns on the wrists of healthy adults during sleep using dry electrodes. We compare the new results on the wrist to the prior findings on palmar or finger EDA by characterizing data measured from 80 nights of sleep consisting of 9 nights of wrist and palm EDA from 9 healthy adults sleeping at home, 56 nights of wrist and palm EDA from one healthy adult sleeping at home, and 15 nights of wrist EDA from 15 hea...

  5. Identification of Case Content with Quantitative Network Analysis

    DEFF Research Database (Denmark)

    Christensen, Martin Lolle; Olsen, Henrik Palmer; Tarissan, Fabian

    2016-01-01

    the relevant articles. In order to enhance information retrieval about case content, without relying on manual labor and subjective judgment, we propose in this paper a quantitative method that gives a better indication of case content in terms of which articles a given case is more closely associated with...... of important cases and comparing manual investigation of real content of those cases with the MAININ and MAINOUT articles. Results show that MAININ in particular is able to infer correctly the real content in most of the cases....

  6. Estimating rice yield related traits and quantitative trait loci analysis under different nitrogen treatments using a simple tower-based field phenotyping system with modified single-lens reflex cameras

    Science.gov (United States)

    Naito, Hiroki; Ogawa, Satoshi; Valencia, Milton Orlando; Mohri, Hiroki; Urano, Yutaka; Hosoi, Fumiki; Shimizu, Yo; Chavez, Alba Lucia; Ishitani, Manabu; Selvaraj, Michael Gomez; Omasa, Kenji

    2017-03-01

    Application of field-based high-throughput phenotyping (FB-HTP) methods for monitoring plant performance in real field conditions has a high potential to accelerate the breeding process. In this paper, we discuss the use of a simple tower-based remote sensing platform using modified single-lens reflex cameras for phenotyping yield traits in rice under different nitrogen (N) treatments over three years. This tower-based phenotyping platform has the advantages of simplicity, ease and stability in terms of introduction, maintenance and continual operation under field conditions. Of the six phenological stages of rice analyzed, the flowering stage was the most useful for the estimation of yield performance under field conditions. We found a high correlation between several vegetation indices (simple ratio (SR), normalized difference vegetation index (NDVI), transformed vegetation index (TVI), corrected transformed vegetation index (CTVI), soil-adjusted vegetation index (SAVI) and modified soil-adjusted vegetation index (MSAVI)) and multiple yield traits (panicle number, grain weight and shoot biomass) across three trials. Among all of the indices studied, SR exhibited the best performance in regards to the estimation of grain weight (R2 = 0.80). Under our tower-based field phenotyping system (TBFPS), we identified quantitative trait loci (QTL) for yield related traits using a mapping population of chromosome segment substitution lines (CSSLs) and a single nucleotide polymorphism data set. Our findings suggest the TBFPS can be useful for the estimation of yield performance during early crop development. This can be a major opportunity for rice breeders who desire high-throughput phenotypic selection for yield performance traits.
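    The SR and NDVI indices named above are simple band ratios computed per plot; a sketch with invented red/near-infrared reflectance values (not the study's measurements):

```python
import numpy as np

def vegetation_indices(red, nir):
    """Simple ratio (SR) and NDVI from red and near-infrared reflectance (0-1 scale)."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    sr = nir / red                        # SR = NIR / Red
    ndvi = (nir - red) / (nir + red)      # NDVI in [-1, 1]
    return sr, ndvi

# Hypothetical per-plot reflectance at the flowering stage:
red = [0.08, 0.06, 0.10]
nir = [0.48, 0.54, 0.40]
sr, ndvi = vegetation_indices(red, nir)
print(sr, ndvi)
```

    A per-plot SR computed at flowering would then be regressed against grain weight, which is the relationship the abstract reports as R2 = 0.80.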

  7. A quantitative analysis of municipal solid waste disposal charges in China.

    Science.gov (United States)

    Wu, Jian; Zhang, Weiqian; Xu, Jiaxuan; Che, Yue

    2015-03-01

    Rapid industrialization and economic development have caused a tremendous increase in municipal solid waste (MSW) generation in China. China began implementing a policy of MSW disposal fees for household waste management at the end of the last century. Three charging methods were implemented throughout the country: a fixed disposal fee, a potable-water-based disposal fee, and a plastic-bag-based disposal fee. To date, there has been little qualitative or quantitative analysis of the effectiveness of this relatively new policy. This paper provides a general overview of MSW fee policy in China, attempts to verify whether the policy is successful in reducing the amount of waste collected, and proposes an improved charging system to address current problems. The paper presents an empirical statistical analysis of policy effectiveness derived from an environmental Kuznets curve (EKC) test on panel data for China. EKC tests on different kinds of MSW charge systems were then examined for individual provinces or cities. A comparison of existing charging systems was conducted using environmental and economic criteria. The results indicate the following: (1) the MSW policies implemented over the study period were effective in the reduction of waste generation, (2) the household waste discharge fee policy did not act as a strong driver in terms of waste prevention and reduction, and (3) the plastic-bag-based disposal fee appeared to be performing well according to qualitative and quantitative analysis. Based on the current situation of waste discharge management in China, a three-stage transitional charging scheme is proposed, and both its advantages and drawbacks are discussed. Evidence suggests that a transition from a fixed disposal fee to a plastic-bag-based disposal fee involving various stakeholders should be the next objective of waste reduction efforts.
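    An EKC test of the kind described fits an inverted-U (quadratic) relationship between income and waste generation; a toy sketch with invented panel averages (the variable names and values are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical province-level averages: per-capita income (10^3 CNY) and MSW (kg/person)
income = np.array([10, 20, 30, 40, 50, 60, 70], dtype=float)
waste = np.array([210, 300, 355, 375, 360, 320, 250], dtype=float)

# EKC test: fit waste = b2*income^2 + b1*income + b0; an inverted U requires b2 < 0
b2, b1, b0 = np.polyfit(income, waste, 2)
turning_point = -b1 / (2 * b2)   # income level at which waste generation peaks
print(b2 < 0)
```

    In a full panel-data EKC analysis, fixed effects and additional controls would be included; the quadratic sign test and turning point shown here are only the core idea.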

  8. Quantitative Analysis of Lithium-Ion Battery Capacity Prediction via Adaptive Bathtub-Shaped Function

    Directory of Open Access Journals (Sweden)

    Shaomin Wu

    2013-06-01

    Full Text Available Batteries are one of the most important components in many mechatronics systems, as they supply power to the systems and their failures may lead to reduced performance or even catastrophic results. Therefore, the prediction analysis of the remaining useful life (RUL) of batteries is very important. This paper develops a quantitative approach for battery RUL prediction using an adaptive bathtub-shaped function (ABF). The ABF has been utilised to model the normalised battery cycle capacity prognostic curves, which attempt to predict the remaining battery capacity from given historical test data. An artificial fish swarm algorithm with a variable population size (AFSAVP) is employed as the optimiser for the parameter determination of the ABF curves, in which the fitness function is defined in the form of a coefficient of determination (R2). A 4 x 2 cross-validation (CV) has been devised, and the results show that the method is valuable for battery health management and battery life prediction.
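    The coefficient of determination used as the AFSAVP fitness function can be sketched directly (Python; the capacity values are invented, and a generic predicted curve stands in for the paper's ABF output):

```python
import numpy as np

def r_squared(observed, predicted):
    """Coefficient of determination, used as the optimiser's fitness function."""
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    ss_res = np.sum((obs - pred) ** 2)              # residual sum of squares
    ss_tot = np.sum((obs - obs.mean()) ** 2)        # total sum of squares
    return 1.0 - ss_res / ss_tot

# Hypothetical normalised capacity over cycles vs. a candidate curve's output:
observed = [1.00, 0.97, 0.95, 0.92, 0.88, 0.83]
predicted = [0.99, 0.97, 0.94, 0.92, 0.89, 0.84]
fitness = r_squared(observed, predicted)
print(fitness > 0.9)  # a good candidate parameter set scores near 1
```

    The optimiser searches the ABF parameter space to maximise this fitness; R2 close to 1 means the candidate curve explains nearly all the variance in the measured capacity fade.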

  9. [Quantitative histoenzymatic analysis of the adenohypophysis and adrenal cortex during the early stages of involution].

    Science.gov (United States)

    Prochukhanov, R A; Rostovtseva, T I

    1977-11-01

    A method of quantitative histoenzymatic analysis was applied to determine involutional changes in the neuroendocrine system. The activities of NAD- and NADP-reductases, acid and alkaline phosphatases, glucose-6-phosphate dehydrogenase, 3-OH-steroid dehydrogenase, and 11-hydroxysteroid dehydrogenase were investigated in the adenohypophysis and in the adrenal cortex of rats aged 4 and 12 months. The study revealed features of the structural-metabolic support of physiological rearrangements of the neuroendocrine system over the estrous cycle at the early stages of involution. In ageing animals, an initial reduction of cellular-vascular transport was demonstrated, with retention of the functional activity of the intracellular organelles.

  10. An Integrated Method of Multiradar Quantitative Precipitation Estimation Based on Cloud Classification and Dynamic Error Analysis

    Directory of Open Access Journals (Sweden)

    Yong Huang

    2017-01-01

    Full Text Available Relationships between the radar reflectivity factor and rainfall differ among precipitation cloud systems. In this study, cloud systems are first classified into five categories using radar and satellite data to improve the radar quantitative precipitation estimation (QPE) algorithm. Second, the errors of multiradar QPE algorithms are assumed to differ between convective and stratiform clouds. QPE data are then derived for the Huaihe River Basin, based on error analysis, with the Z-R relationship, Kalman filter (KF), optimum interpolation (OI), Kalman filter plus optimum interpolation (KFOI), and average calibration (AC) methods. In the flood case of early July 2007, the KFOI method was applied to obtain the QPE product. Applications show that KFOI improves the precision of precipitation estimation for multiple precipitation types.
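A common way the Kalman filter enters radar QPE, as in the KF component above, is as a scalar filter tracking the mean gauge/radar bias, which is then applied to the radar estimate. A minimal sketch, not the paper's exact scheme; the noise variances q and r and all rainfall values are assumed for illustration:

```python
import numpy as np

def kf_bias(radar, gauge, q=0.01, r=0.25):
    # Scalar Kalman filter on the gauge/radar bias ratio B_t.
    # q: process noise (bias drift), r: observation noise (assumed values).
    b, p = 1.0, 1.0                        # initial bias estimate and variance
    calibrated = []
    for z_r, z_g in zip(radar, gauge):
        p += q                             # predict: bias assumed persistent
        k = p / (p + r)                    # Kalman gain
        b += k * (z_g / z_r - b)           # update with the observed ratio
        p *= (1 - k)
        calibrated.append(b * z_r)         # bias-corrected radar estimate
    return np.array(calibrated)

radar = np.array([4.0, 5.0, 6.0, 5.5])     # toy radar QPE (mm/h)
gauge = np.array([5.0, 6.2, 7.4, 6.9])     # co-located gauge readings (mm/h)
est = kf_bias(radar, gauge)
print(est.round(2))
```

The optimum interpolation (OI) step would then spread the remaining gauge-radar differences spatially; combining both gives a KFOI-style product.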

  11. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD

    Directory of Open Access Journals (Sweden)

    Sanawar Mansur

    2016-12-01

    Full Text Available A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified through similarity evaluation, by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantitative analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
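The batch similarities above (> 0.981) are typically computed as a cosine-type congruence coefficient between chromatographic peak-area vectors. A hedged sketch with invented peak areas, the exact metric of the SFDA software is not reproduced here:

```python
import numpy as np

def similarity(a, b):
    # Cosine similarity between two fingerprint vectors (peak areas):
    # 1.0 means identical relative composition.
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

batch_1 = [12.1, 8.4, 30.2, 5.5, 14.0]     # toy peak areas for 5 markers
batch_2 = [11.8, 8.9, 29.5, 5.2, 14.6]     # a second, consistent batch
sim = similarity(batch_1, batch_2)
print(round(sim, 3))                        # close to 1 for consistent batches
```

In practice the vectors would hold the areas of all common fingerprint peaks, often after alignment against a reference (median) chromatogram.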

  12. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.

    Science.gov (United States)

    Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar

    2016-12-21

    A method based on ultra performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified through similarity evaluation, by systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%–103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantitative analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.